i like that comma

This commit is contained in:
George Hotz
2020-10-23 06:12:04 -07:00
parent eda29fa0e0
commit 49ae15a450


@@ -39,7 +39,7 @@ print(y.grad) # dz/dy
### Neural networks?
-It turns out, a decent autograd tensor library is 90% of what you need for neural networks. Add an optimizer (SGD, RMSprop and Adam implemented) from tinygrad.optim, write some boilerplate minibatching code, and you have all you need.
+It turns out, a decent autograd tensor library is 90% of what you need for neural networks. Add an optimizer (SGD, RMSprop, and Adam implemented) from tinygrad.optim, write some boilerplate minibatching code, and you have all you need.
### Neural network example (from test/test_mnist.py)
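To illustrate the "optimizer plus boilerplate minibatching code" idea the README sentence describes, here is a minimal sketch in plain NumPy — not tinygrad's actual API and not the test_mnist.py example — with a hand-derived gradient for a linear model standing in for the autograd step, updated by SGD over random minibatches:

```python
# Sketch of the minibatch + SGD pattern (NOT tinygrad code): fit a linear
# model by sampling minibatches and applying plain gradient descent updates.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=256)

w = np.zeros(3)          # model parameters
lr, batch_size = 0.1, 32
for step in range(200):
    idx = rng.integers(0, len(X), size=batch_size)  # sample a minibatch
    xb, yb = X[idx], y[idx]
    pred = xb @ w
    grad = 2 * xb.T @ (pred - yb) / batch_size      # d(MSE)/dw by hand
    w -= lr * grad                                  # SGD update
# w should now be close to true_w
```

In tinygrad the hand-written `grad` line is what `loss.backward()` replaces, and the `w -= lr * grad` line is what an optimizer from tinygrad.optim performs for every parameter.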