diff --git a/README.md b/README.md
index 332993bff7..db6cf38bd1 100644
--- a/README.md
+++ b/README.md
@@ -48,8 +48,6 @@ And we can change `DEBUG` to `4` to see the generated code.
 As it turns out, 90% of what you need for neural networks is a decent autograd/tensor library.
 Throw in an optimizer, a data loader, and some compute, and you have all you need.
 
-#### Neural network example (see examples/beautiful_mnist.py for the full thing)
-
 ```py
 from tinygrad import Tensor, nn
 
@@ -72,6 +70,8 @@ for i in range(10):
   print(i, loss.item())
 ```
 
+See [examples/beautiful_mnist.py](examples/beautiful_mnist.py) for the full version that gets 98% in ~5 seconds.
+
 ## Accelerators
 
 tinygrad already supports numerous accelerators, including:
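
For review context: the two hunks above bracket the README's MNIST snippet but elide its body (roughly old lines 56–71). Below is a minimal sketch of what such a tinygrad training loop looks like; it is an illustration, not the README's actual code, and it assumes `Tensor.kaiming_uniform`, `Tensor.randint`, `Tensor.train`, `nn.optim.Adam`, `sparse_categorical_crossentropy`, and the `tinygrad.nn.datasets.mnist` loader behave as in recent tinygrad.

```py
# Illustrative sketch only: the patch elides the README's real code block,
# so this reconstructs the general shape, not the exact contents.
from tinygrad import Tensor, nn
from tinygrad.nn.datasets import mnist  # assumed dataset helper from recent tinygrad

class LinearNet:
  def __init__(self):
    # two dense layers stored as raw tensors: 784 -> 128 -> 10
    self.l1 = Tensor.kaiming_uniform(784, 128)
    self.l2 = Tensor.kaiming_uniform(128, 10)
  def __call__(self, x: Tensor) -> Tensor:
    return x.flatten(1).dot(self.l1).relu().dot(self.l2)

model = LinearNet()
optim = nn.optim.Adam([model.l1, model.l2], lr=0.001)

X_train, Y_train, _, _ = mnist()  # images are uint8, so scale to [0, 1] below

with Tensor.train():  # enable training mode for the optimizer step
  for i in range(10):
    samples = Tensor.randint(64, high=X_train.shape[0])  # random minibatch of 64
    optim.zero_grad()
    # backward() returns the tensor itself, so the chain assigns the loss
    loss = model(X_train[samples].float() / 255).sparse_categorical_crossentropy(Y_train[samples]).backward()
    optim.step()
    print(i, loss.item())
```

The visible context lines (`for i in range(10):` / `print(i, loss.item())`) and the linked [examples/beautiful_mnist.py](examples/beautiful_mnist.py) remain the authoritative versions.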