Mirror of https://github.com/tinygrad/tinygrad.git (synced 2026-01-24 14:28:09 -05:00)
update llama attention casting
* updated the scaled_dot_product_attention middle cast and removed the hard-coded half cast in llama attention
* fix that
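A minimal sketch of the idea behind this change, not the actual tinygrad implementation: the attention "middle" step (the softmax over the score matrix) is computed in float32 and cast back to the inputs' dtype inside the attention helper, so a caller like the llama attention module no longer needs a hard-coded `.half()` cast on the output. The function name `sdpa_with_middle_cast` is hypothetical; the real entry point in tinygrad is `Tensor.scaled_dot_product_attention`.

```python
import math
from tinygrad import Tensor, dtypes

def sdpa_with_middle_cast(q: Tensor, k: Tensor, v: Tensor) -> Tensor:
  # hypothetical sketch: scores are computed in the inputs' dtype (e.g. half for llama)
  scores = (q @ k.transpose(-2, -1)) / math.sqrt(q.shape[-1])
  # "middle cast": softmax in float32 for stability, then cast back to the input dtype
  weights = scores.cast(dtypes.float32).softmax(-1).cast(q.dtype)
  return weights @ v

if __name__ == "__main__":
  # (batch, heads, seq, head_dim) half-precision tensors, as a llama-style attention would use
  q = Tensor.randn(1, 8, 16, 64, dtype=dtypes.half)
  k = Tensor.randn(1, 8, 16, 64, dtype=dtypes.half)
  v = Tensor.randn(1, 8, 16, 64, dtype=dtypes.half)
  out = sdpa_with_middle_cast(q, k, v)
  print(out.dtype, out.shape)  # output stays half without an extra cast at the call site
```

With the cast handled inside the attention helper, the llama model code can stay dtype-agnostic instead of forcing half at the call site.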