# tinygrad

For something in between a pytorch and a karpathy/micrograd

This may not be the best deep learning framework, but it is a deep learning framework.

The Tensor class is a wrapper around a numpy array, except it does Tensor things.
### Example

```python
import numpy as np
from tinygrad.tensor import Tensor

x = Tensor(np.eye(3))
y = Tensor(np.array([[2.0,0,-2.0]]))
z = y.dot(x).sum()
z.backward()

print(x.grad)  # dz/dx
print(y.grad)  # dz/dy
```
### Same example in torch

```python
import torch

x = torch.eye(3, requires_grad=True)
y = torch.tensor([[2.0,0,-2.0]], requires_grad=True)
z = y.matmul(x).sum()
z.backward()

print(x.grad)  # dz/dx
print(y.grad)  # dz/dy
```
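Both versions print the same thing. Since z sums the entries of y·x, each row of dz/dx is the corresponding entry of y, so `x.grad` is `[[2., 2., 2.], [0., 0., 0.], [-2., -2., -2.]]`, and dz/dy is the row sums of the identity, so `y.grad` is `[[1., 1., 1.]]`.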
You can even train neural networks with tinygrad (from test/mnist.py):

```python
from tinygrad.tensor import Tensor
import tinygrad.optim as optim
from tinygrad.utils import layer_init_uniform

class TinyBobNet:
  def __init__(self):
    self.l1 = Tensor(layer_init_uniform(784, 128))
    self.l2 = Tensor(layer_init_uniform(128, 10))

  def forward(self, x):
    return x.dot(self.l1).relu().dot(self.l2).logsoftmax()

model = TinyBobNet()
optim = optim.SGD([model.l1, model.l2], lr=0.001)

# ... and complete like pytorch
```
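The rest of the loop is left out above; as a rough sketch of what one training step might look like (the one-hot NLL-style loss, the batch size, and the `X_train`/`Y_train` arrays are assumptions here, not part of the snippet above):

```python
import numpy as np

# draw a random mini-batch; X_train/Y_train are assumed to be
# numpy arrays of flattened images and integer labels you loaded yourself
samp = np.random.randint(0, X_train.shape[0], size=(128,))
x = Tensor(X_train[samp].reshape((-1, 28*28)).astype(np.float32))

# one-hot style target: -1.0 at the correct class, so mul + mean
# acts as an NLL loss against the logsoftmax output of the network
y = np.zeros((len(samp), 10), dtype=np.float32)
y[range(y.shape[0]), Y_train[samp]] = -1.0
y = Tensor(y)

out = model.forward(x)
loss = out.mul(y).mean()
loss.backward()
optim.step()
```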
### TODO (to make a real neural network library)

- Implement gradcheck (numeric); see the sketch below
- Implement convolutions
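For the gradcheck item, here is a minimal sketch of a numeric check, assuming only the API shown above plus a `.data` attribute holding the wrapped numpy array and `.grad` being a numpy array (the helper name, `eps`, and `tol` are made up for illustration):

```python
import numpy as np
from tinygrad.tensor import Tensor

def numeric_gradcheck(f, x_np, eps=1e-4, tol=1e-2):
  # compare backprop gradients against central finite differences
  # of the scalar-valued function f
  x = Tensor(x_np.copy())
  f(x).backward()
  analytic = x.grad  # assumed to be a plain numpy array

  numeric = np.zeros_like(x_np)
  it = np.nditer(x_np, flags=['multi_index'])
  while not it.finished:
    idx = it.multi_index
    xp, xm = x_np.copy(), x_np.copy()
    xp[idx] += eps
    xm[idx] -= eps
    # .data is assumed to expose the underlying numpy array
    fp, fm = f(Tensor(xp)).data, f(Tensor(xm)).data
    numeric[idx] = (fp - fm).sum() / (2 * eps)
    it.iternext()

  return np.allclose(analytic, numeric, atol=tol)

# usage: check d(sum(x.dot(w)))/dx against finite differences
w = Tensor(np.random.randn(3, 3).astype(np.float32))
print(numeric_gradcheck(lambda t: t.dot(w).sum(), np.random.randn(1, 3).astype(np.float32)))
```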