chenyu
e706f408cb
suppress test warnings from numpy (#15688)
2026-04-11 22:33:20 -04:00
chenyu
1aa04eab08
simple CreationMixin (#15567)
start with full_like, zeros_like, ones_like
2026-04-01 23:00:56 -04:00
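The CreationMixin commit above says it starts with full_like, zeros_like, and ones_like. A hedged sketch of that pattern (hypothetical code, not tinygrad's actual implementation): derive the zeros/ones variants from a single full_like, so any class exposing a shape picks up all three.

```python
# Hypothetical sketch of a CreationMixin in the spirit of the commit subject;
# names and structure are assumptions, not tinygrad's real code.
class CreationMixin:
    def full_like(self, fill_value):
        # build a new nested list matching self.shape, filled with fill_value
        def build(shape):
            if not shape:
                return fill_value
            return [build(shape[1:]) for _ in range(shape[0])]
        return build(self.shape)

    def zeros_like(self):
        return self.full_like(0)

    def ones_like(self):
        return self.full_like(1)

class MiniTensor(CreationMixin):
    """Toy tensor that only tracks a shape, enough to exercise the mixin."""
    def __init__(self, shape):
        self.shape = shape

t = MiniTensor((2, 3))
print(t.zeros_like())  # [[0, 0, 0], [0, 0, 0]]
print(t.ones_like())   # [[1, 1, 1], [1, 1, 1]]
```

The design point is that only full_like touches shape/dtype details; the *_like variants stay one-liners.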
George Hotz
5524916e39
llama compute gradients explicitly + 243 GB of RAM on MP=8 (#15343)
* llama compute gradients explicitly
* apply grads
* fix multi issue
* multi BUFFER_VIEW support
* simpler
* skip the flaky test
2026-03-18 19:54:40 +08:00
chenyu
fceb21c315
Tensor(uop) uses device from uop (#15340)
2026-03-18 02:56:06 -04:00
George Hotz
6109117af1
anonymous buffers are Invalid (#15336)
* anonymous buffers are Invalid
* unique_const
* work
* remove invalid writes
* test_anonymous_buffers_in_function
2026-03-18 14:52:56 +08:00
chenyu
151608aa90
update test_multiple_to_single_device (#15056)
follow-up to #14482, add SCACHE=0 to the test
2026-02-27 21:44:33 -05:00
chenyu
4424757b9a
update test_sharded_memory (#14956)
cleaned up and moved to test/null
2026-02-22 16:56:08 -05:00
George Hotz
8ef5544e4a
realized PYTHON copies (#14934)
* realized PYTHON copies
* comment that out
* fix that test
* append afters
* contig
* disk copies
* should be 124
* 332
2026-02-21 20:29:31 +08:00
chenyu
24286c5593
fix clone for multi (#14919)
also update empty_like to make sure it's backed by buffers
2026-02-20 17:21:09 -05:00
George Hotz
ab61c16730
fixes and test relaxations from prealloc_bufs (#14875)
* fixes and test relaxations from prealloc_bufs
* fix error type and guard _mop
* revert that
* contiguous makes extra/torch_backend/test_kernel_fusion.py fail
2026-02-19 11:37:25 +08:00
chenyu
0c85b93938
support shrink sharded and non-sharded axes (#14874)
simpler to just support it
2026-02-18 20:54:10 -05:00
chenyu
8c830c5b44
test_full_like_shrink_on_shard_axis (#14870)
* test_full_like_shrink_on_shard_axis
add a test case that triggers the non-copy branch in mstack_early_shrink
* 0
2026-02-18 19:23:44 -05:00
chenyu
f84a11bb9f
delete uneven shard tests and mentions (#14867)
2026-02-18 14:10:33 -05:00
George Hotz
ff60dab622
Revert "big sink is on base (#14819)" (#14825)
This reverts commit 5fc3d8109f.
2026-02-17 19:18:06 +08:00
George Hotz
5fc3d8109f
big sink is on base (#14819)
* big sink is on base
* contiguous fixes tests
2026-02-17 18:32:56 +08:00
qazal
ceccc8eb86
unskip now passing multi tests [pr] (#14759)
2026-02-15 20:30:00 +09:00
qazal
42b6bf0b7a
fix sdpa causal failing test on multi (#14762)
* simple failing test
* device is from xq
2026-02-15 16:54:33 +09:00
chenyu
ca68037f26
lazy basic setitem to unrealized Tensor (#14756)
undo the view and make it a mask; this fuses the setitem with any pending compute too.
one behavior change: for a target not backed by a buffer (const and arange), rangeify makes the output contiguous under the hood.
this is strictly better than raising and asking the user to call contiguous, as that would no longer be fuseable.
2026-02-14 20:27:03 -05:00
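The setitem commit above describes undoing the view and turning the write into a mask. A hedged, pure-Python illustration of that idea (a 1-D toy, not tinygrad's real Tensor machinery): a setitem on a slice becomes a whole-tensor where(mask, value, target), i.e. one elementwise select that can fuse with pending compute instead of an in-place write.

```python
# Hypothetical sketch of setitem-as-mask; names are assumptions for
# illustration, not tinygrad's actual code.
def setitem_as_mask(target, start, stop, value):
    # "undo the view": lift the slice [start:stop] into a full-size mask
    mask = [start <= i < stop for i in range(len(target))]
    # the setitem is now an elementwise select, where(mask, value, target)
    return [value if m else t for m, t in zip(mask, target)]

t = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
out = setitem_as_mask(t, 2, 4, 9.0)  # acts like t[2:4] = 9.0, lazily
print(out)  # [0.0, 1.0, 9.0, 9.0, 4.0, 5.0]
```

Because the result is a fresh expression over the whole tensor rather than a mutation, the original target is untouched and the update can participate in later fusion.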
George Hotz
c331798201
move tests to test/backend (#14691)
* move tests to test/backend
* fix imports
* fix CI
* revert that one
* Fix formatting in README for test command
2026-02-12 11:09:44 +08:00