From 65f4e6971ba59634e5d1c423c5b01f847d05c6d7 Mon Sep 17 00:00:00 2001
From: George Hotz
Date: Thu, 23 Nov 2023 14:58:22 -0800
Subject: [PATCH] beautiful_mnist.py link

---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 332993bff7..db6cf38bd1 100644
--- a/README.md
+++ b/README.md
@@ -48,8 +48,6 @@ And we can change `DEBUG` to `4` to see the generated code.
 
 As it turns out, 90% of what you need for neural networks are a decent autograd/tensor library. Throw in an optimizer, a data loader, and some compute, and you have all you need.
 
-#### Neural network example (see examples/beautiful_mnist.py for the full thing)
-
 ```py
 from tinygrad import Tensor, nn
 
@@ -72,6 +70,8 @@ for i in range(10):
   print(i, loss.item())
 ```
 
+See [examples/beautiful_mnist.py](examples/beautiful_mnist.py) for the full version that gets 98% in ~5 seconds
+
 ## Accelerators
 
 tinygrad already supports numerous accelerators, including:
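For context, the README example this patch touches is mostly elided by the diff: the hunks show only its opening import and the tail of its training loop. Below is a minimal sketch of the kind of code that sits between them, assuming tinygrad's public API around this commit; `LinearNet` and the random stand-in batch are illustrative, not necessarily the README's exact contents.

```py
from tinygrad import Tensor, nn

class LinearNet:
  def __init__(self):
    # 784 -> 128 -> 10 MLP for flattened 28x28 MNIST digits
    self.l1 = Tensor.kaiming_uniform(784, 128)
    self.l2 = Tensor.kaiming_uniform(128, 10)
  def __call__(self, x: Tensor) -> Tensor:
    return x.flatten(1).dot(self.l1).relu().dot(self.l2)

model = LinearNet()
optim = nn.optim.Adam([model.l1, model.l2])

# stand-in batch; a real run would pull batches from an MNIST loader
x, y = Tensor.rand(4, 1, 28, 28), Tensor([2, 4, 3, 7])

Tensor.training = True  # enable training mode for the optimizer step
for i in range(10):
  optim.zero_grad()
  loss = model(x).sparse_categorical_crossentropy(y).backward()
  optim.step()
  print(i, loss.item())
```

Per the line this patch adds, the full version with a real data loader and evaluation lives in [examples/beautiful_mnist.py](examples/beautiful_mnist.py), which is what the new link points readers to in place of the removed heading.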