tinygrad/examples
David Hou 4b95350c41 fp16 resnet (without expand backwards sum in float, doesn't work) (#3816)
* fp16 resnet

* cast running mean and var back to default float

* extra cast

* check symbolic no overflow

* add linearizer failure

* loss scaler after grad contig (see the loss-scaling sketch below)

* oops

* I think this works

* don't loss scale fp32

* remove overflow test case

* remove symbolic bounds check

* loss scaler should be float

* temporarily disable padto because of a bug (shruggie)

* make running stats in batchnorm float32? (see the batchnorm sketch below)

* calculate LARS stuff in fp32? (see the LARS sketch below)

* oops

* remove most changes

* move loss scaler out of optimizer

* no more FP16 var

* oops

---------

Co-authored-by: chenyu <chenyu@fastmail.com>
2024-03-28 01:25:37 -04:00
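Most of the bullets above are about loss scaling for fp16 training. As background, here is a minimal numpy sketch of the idea (illustrative values, not tinygrad code): gradients below float16's smallest subnormal (~6e-8) underflow to zero, so the loss is multiplied by a scale factor before backward and the gradients are unscaled in float32 before the optimizer step, which is also why the "loss scaler should be float".

```python
import numpy as np

# Hypothetical sketch of static loss scaling, not tinygrad's implementation.
true_grad = np.float32(1e-8)           # a gradient too small for float16
print(np.float16(true_grad))          # 0.0 -- the update is lost in fp16

S = np.float32(2.0 ** 13)              # loss scale; this value is a made-up choice
scaled = np.float16(true_grad * S)     # 8.192e-05 is representable in fp16
recovered = np.float32(scaled) / S     # unscale in float32, as a float, not fp16
print(recovered)                       # ~1e-08: the gradient signal survives
```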
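The batchnorm bullets ("cast running mean and var back to default float", "make running stats in batchnorm float32?") follow the same theme: the running statistics are an exponential moving average whose small per-step increments can round away in float16, where the spacing near 1.0 is ~1e-3. A framework-agnostic sketch of the pattern with a hypothetical class (tinygrad's BatchNorm differs; weight and bias are omitted):

```python
import numpy as np

class BatchNormSketch:
    """Toy batchnorm keeping running stats in float32 for fp16 activations."""
    def __init__(self, channels, momentum=0.1):
        self.momentum = np.float32(momentum)
        # running stats stay float32 even when the inputs are float16
        self.running_mean = np.zeros(channels, dtype=np.float32)
        self.running_var = np.ones(channels, dtype=np.float32)

    def __call__(self, x16):                 # x16: (N, C) float16 activations
        x32 = x16.astype(np.float32)         # compute batch stats in float32
        mean, var = x32.mean(axis=0), x32.var(axis=0)
        k = self.momentum
        self.running_mean = (1 - k) * self.running_mean + k * mean
        self.running_var = (1 - k) * self.running_var + k * var
        # normalize, then cast the output back down to float16
        return ((x32 - mean) / np.sqrt(var + 1e-5)).astype(np.float16)

x = np.random.default_rng(0).normal(size=(8, 4)).astype(np.float16)
bn = BatchNormSketch(4)
print(bn(x).dtype, bn.running_mean.dtype)    # float16 float32
```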
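And for "calculate LARS stuff in fp32?": the LARS trust ratio is built from parameter and gradient norms, and a float16 sum of squares overflows once a partial sum passes 65504, float16's maximum. A hedged sketch with a hypothetical helper (tinygrad's actual LARS optimizer differs):

```python
import numpy as np

def lars_trust_ratio(w16, g16, weight_decay=1e-4, eps=1e-12):
    # cast up first: the norms (sums of squares) are taken in float32
    w32, g32 = w16.astype(np.float32), g16.astype(np.float32)
    w_norm, g_norm = np.linalg.norm(w32), np.linalg.norm(g32)
    return w_norm / (g_norm + weight_decay * w_norm + eps)

w = np.full(1_000_000, 0.5, dtype=np.float16)    # a large fp16 weight tensor
g = np.full(1_000_000, 0.01, dtype=np.float16)

# forcing a float16 accumulator shows the failure: the sum of squares is 250000
print(np.sqrt((w * w).sum(dtype=np.float16)))    # inf
print(lars_trust_ratio(w, g))                    # ~50, a usable trust ratio
```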