George Hotz
82be8abfd2
move opt under codegen (#11569)
2025-08-07 14:19:17 -07:00
George Hotz
92678e59ee
move kernel to opt (#10899)
2025-06-20 15:22:28 -07:00
George Hotz
b3b43a82c4
remove Tensor.no_grad, it's meaningless now [pr] (#10556)
2025-05-28 22:20:02 -07:00
chenyu
f4f56d7c15
move time_linearizer to extra.optimization.helpers [pr] (#9048)
no longer used in tinygrad
2025-02-12 15:49:58 -05:00
chenyu
28972418c4
s/get_linearizer/get_kernel [run_process_replay] (#5467)
2024-07-13 20:32:22 -04:00
George Hotz
ff64bcab69
move graph/search to engine (#4596)
2024-05-14 23:12:59 -07:00
forcefieldsovereign
f294bdd681
fixed imports (#2185)
2023-10-30 22:07:17 -07:00
George Hotz
c36d306606
KOPT is over, BEAM is upstream (#2071)
* create cache for q learning
* make linter happy
* global beam
* where it belongs
* bugfix
* ditch the kopt, use the beam
* faster lin and DEBUG=2 okay
* remove kopt, move search to features
2023-10-16 09:46:03 -07:00
George Hotz
c5edb3c374
train value net, improve API, add BCE (#2047)
* api cleanups, BCE losses
* valuenet
* fixup examples
* learning okay
* add valuenet runner
* net improvements
* net improvements
* 40% win rate
2023-10-12 07:56:38 -07:00
George Hotz
0ba629c7b9
add world dataset (#2045)
2023-10-11 15:54:30 -07:00
George Hotz
0c3b6f13a8
Latest opt (#2044)
* split out actions
* rl algorithm
2023-10-11 15:46:14 -07:00