chenyu
185a000882
gradient of COPY (#13760)
2025-12-19 13:33:59 -05:00
chenyu
ed962786d6
use assign in Tensor.backward (#13674)
preserve the grad object so that jit works
2025-12-13 22:43:06 -05:00
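A minimal sketch of the scenario this protects, assuming the usual TinyJit training setup (names here are illustrative, not from the PR): the jit captures buffers by identity, so the grad written inside a jitted step must stay the same underlying object across calls, which assigning into the existing grad preserves.

    from tinygrad import Tensor, TinyJit
    from tinygrad.nn.optim import SGD

    w = Tensor.randn(4, 4, requires_grad=True)
    opt = SGD([w], lr=0.01)

    @TinyJit
    def train_step(x: Tensor) -> Tensor:
      with Tensor.train():
        opt.zero_grad()
        loss = (x @ w).sum()
        loss.backward()   # writes w.grad in place, so the jit keeps seeing the same buffer
        opt.step()
        return loss.realize()

    for _ in range(4): print(train_step(Tensor.randn(2, 4)).item())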
George Hotz
32e9949052
rename lazydata to uop (#10698)
2025-06-08 08:42:22 -07:00
George Hotz
41e3d07d7f
view gradient is tricky (#10528)
* view gradient is tricky
* explicit
2025-05-26 22:28:30 -07:00
chenyu
8cc2dff4d8
only float Tensors have gradient [pr] (#10475)
2025-05-22 21:02:11 -04:00
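A minimal sketch of the rule, assuming the current tinygrad API:

    from tinygrad import Tensor, dtypes

    x = Tensor([2.0, 3.0], requires_grad=True)   # float dtype: can carry a gradient
    (x * x).sum().backward()
    print(x.grad.numpy())                        # [4. 6.]

    i = Tensor([2, 3], dtype=dtypes.int32)       # int dtype: no gradient after this change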
George Hotz
411392dfb7
move files into uop dir (#10399)
* move files into uop dir [pr]
* tinygrad.uop is a thing
* fix uop docs, no pr
* fix viz
2025-05-18 11:38:28 -07:00
qazal
14aa2395d0
allow VIEW(BUFFER) in Tensor UOps [pr] (#9210)
* allow VIEW(BUFFER) in Tensor UOps [pr]
* still reshapes
* update becomes_map tests
* bring copy folder to the scheduler
* lint
* only sgd left
* optimizer assign
* 13 kernels
* rename to test_reorder_expand + assert VIEW
2025-02-24 13:06:15 +01:00
chenyu
287de4ecc6
use torch in test_gradient (#9186)
used torch.autograd.grad, but not sure it can serve as a template the way jax can
2025-02-20 12:26:11 -05:00
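A sketch of the kind of torch reference check this implies (not the actual test code):

    import torch

    x = torch.tensor([2.0, 3.0], requires_grad=True)
    (g,) = torch.autograd.grad((x * x).sum(), [x])
    print(g)  # tensor([4., 6.])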
qazal
fd9f9ec772
realized base tensors become RESHAPE(BUFFER) [pr] (#8994)
2025-02-10 10:17:54 +01:00
George Hotz
b4bf6a7dea
switch backward to use gradient [pr] (#8235)
* switch backward to use gradient [pr]
* set device correctly, dedup
* why does that fail?
* add noop cast
* simple backward
* fix beautiful_mnist
* touchups
* set in compute_gradient
* uop_count
* uop_count was wrong
* collections
* no note
* skip that test
* update sched kernel counts
* train mnist is 65
* fix metadata and gc
* fixes
* materialize_grads
* no pathlib stuff
* add contiguous_backward, fix bugs
* add some realize
* fix multi
2025-01-26 09:12:16 +09:00
George Hotz
46a8c5e1e5
delete forced_realize (#8615)
* delete forced_realize
* put that back
* expectedFailures
* cleaner create_subbuffer
* more comments
---------
Co-authored-by: qazal <qazal.software@gmail.com>
Co-authored-by: qazal <77887910+Qazalin@users.noreply.github.com>
2025-01-20 09:40:36 -08:00
George Hotz
c85737c200
assert to prepare for grad uop [pr] (#8280)
* assert to prepare for grad uop [pr]
* fix test_nn
* fix most of test_tensor
* few more tests
* fix multi
* uniform gradient
* acc_dtype
* any for multi
* fix typing
* fix assert, CAST_BEFORE_VIEW is still the issue
* explicit test for CAST_BEFORE_VIEW
---------
Co-authored-by: qazal <77887910+Qazalin@users.noreply.github.com>
2025-01-14 13:26:56 -08:00
George Hotz
bd9c015b09
tests from grad uop path [pr] (#8313)
2024-12-18 09:25:05 -08:00
George Hotz
bcd7ea60f0
hotfix: a few more grad tests
2024-12-13 21:03:02 -08:00
George Hotz
734f2c5344
compute gradient [pr] (#8237)
* compute gradient [pr]
* schedule_step_with_grads
* second deriv works
2024-12-13 20:46:01 -08:00
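A minimal sketch of the functional gradient API this PR lands, assuming the Tensor.gradient signature in current tinygrad:

    from tinygrad import Tensor

    x = Tensor([2.0, 3.0])
    z = (x * x).sum()
    (dx,) = z.gradient(x)            # d(z)/d(x), without touching x.grad
    print(dx.numpy())                # [4. 6.]
    (d2x,) = dx.sum().gradient(x)    # "second deriv works": gradients compose
    print(d2x.numpy())               # [2. 2.]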
George Hotz
8396d90f91
non-controversial changes from optim branch [pr] (#8234)
2024-12-13 19:24:16 -08:00
George Hotz
37fa38d272
Revert "switch beautiful_mnist to use new optimizer [pr] (#8231)" (#8233)
This reverts commit e9ee39df22.
2024-12-13 19:07:09 -08:00
George Hotz
e9ee39df22
switch beautiful_mnist to use new optimizer [pr] (#8231)
* switch beautiful_mnist to use new optimizer [pr]
* fix abstractions3 + docs
* fix OptimizerGroup with schedule_step api
2024-12-13 18:27:16 -08:00
George Hotz
e2f87ecf36
start work on new gradient (#7838)
* start work on new gradient
* more correct
* working tests
* more tests
* work
* add (failing) gradient test
* add view and reduce gradient
* test_add works, many failing test_ops
* add max and reduce max
* 129 failing
* 108 failed
* better view drawing
* 101 failed
* i got 99 failures
* 94 failures
* it's tons of terrible code, but only 50 tests fail
* only 19 failures
* same 19 but shorter
* minimal doesn't matter
* shorter
* lil simpler
* simpler
* simpler
* simpler
* 13 test failures
* nine tests fail
* all ops tests pass
* add contiguous gradient + fix sched tests
* faster by removing toposort calls
* missed one
* add jax to testing
2024-12-13 16:45:53 -08:00
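A sketch of a jax ground-truth check along the lines this adds (assumed, not the actual test code):

    import jax, jax.numpy as jnp

    print(jax.grad(lambda x: (x * x).sum())(jnp.array([2.0, 3.0])))  # [4. 6.]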