diff --git a/README.md b/README.md
index 64e8b8f762..0e998f126b 100644
--- a/README.md
+++ b/README.md
@@ -39,7 +39,7 @@ print(y.grad)  # dz/dy
 
 ### Neural networks?
 
-It turns out, a decent autograd tensor library is 90% of what you need for neural networks. Add an optimizer (SGD, RMSprop and Adam implemented) from tinygrad.optim, write some boilerplate minibatching code, and you have all you need.
+It turns out, a decent autograd tensor library is 90% of what you need for neural networks. Add an optimizer (SGD, RMSprop, and Adam implemented) from tinygrad.optim, write some boilerplate minibatching code, and you have all you need.
 
 ### Neural network example (from test/test_mnist.py)