chenyu
10c262ced8
update tests that use UOp.size (#15753)
2026-04-15 21:58:27 -04:00
wozeparrot
457508d5a0
llama: save more 2 (#15681)
2026-04-11 01:03:36 -07:00
George Hotz
3b75d8a7a2
fix double after bug in rangeify (#15381)
2026-03-20 14:53:46 +08:00
George Hotz
9d95321be3
set allow_implicit=False by default (#15319)
* set allow_implicit=False by default
* modernize beautiful mnist
2026-03-17 17:14:38 +08:00
George Hotz
584ec75aa2
precompile backward (#15311)
* add precompile backward support
* cleanups
* fix
* compact grad
* split v not split
* simpler
* no NOOPT
2026-03-17 15:28:40 +08:00
George Hotz
3ff03be413
call always has tuple (#15297)
* call always has tuple
* fix pre-commit and simplify
* update
* fix
* move that assert
* tuple
* fix multi
* cleanups
* fix merge
2026-03-17 10:58:46 +08:00
chenyu
3e2b7803e6
view assign replaces at buffer identity (#15298)
matches what functions capture
2026-03-16 19:58:38 -04:00
George Hotz
476276f4b4
support grads on tuples (#15287)
* support grads on tuples
* simpler
* grad_fxn works
* cleanups
* unused
2026-03-16 17:39:34 +08:00
George Hotz
08662bc4ab
add TUPLE/GETTUPLE, simple tests pass (#15286)
* simple tuple stuff passes
* resolved
2026-03-16 15:06:02 +08:00
chenyu
14d1c5fdfd
assign fusion tests on detach and contiguous_backward (#15092)
2026-03-02 15:21:51 -05:00
George Hotz
bb84e389cf
functions for llama trainer (#15045)
* functions for llama trainer
* function there
* axis match
* fix multi
* lil cleaner
* there's a bug with HK_FLASH_ATTENTION
* training functions
* for commit
2026-02-28 12:15:18 +08:00
George Hotz
010d2790ce
fix multi minimal (#15044)
2026-02-27 14:31:58 +08:00
George Hotz
fe3ee8c27e
fix symbolic shapes in calls (#15021)
* fix symbolic shapes in calls
* fix after in the big graph
* real tests
2026-02-26 17:17:18 +08:00
George Hotz
2655655a0c
call gradient creates a call (#15020)
* function creates a full subgraph
* tests
* fix var
* fix tests
* implicit assign/contig
* move kv init
2026-02-26 14:15:29 +08:00
chenyu
ed9d475a12
assign tests with test_function (#15015)
2026-02-25 16:15:59 -05:00
George Hotz
0d35b67f2c
revert realize to only be buffers (#15008)
* revert realize to only be buffers
* fix that
* broken attention
* Revert "broken attention"
This reverts commit a23c3cd96c.
* and that
2026-02-25 22:43:06 +08:00
George Hotz
68831cd852
add more tests to test_function (#15003)
* add more tests to test_function
* add function to llm
* function decorator on llm
* works
* symbolic fixups
* minimum change
* implicit inputs
* don't actually update llama yet
2026-02-25 18:42:06 +08:00
George Hotz
e3fa9896b7
start function and add walk rewrite (#14992)
* start function and add walk rewrite
* work
* add function on feed_forward
* llm progress
* stuff
* none of that
2026-02-25 13:56:27 +08:00