tinygrad

For something in between a grad and a karpathy/micrograd

This may not be the best deep learning framework, but it is a deep learning framework.

The Tensor class is a wrapper around a numpy array, except it does Tensor things.
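
Concretely, that means the wrapped numpy array lives in .data and the gradient lands in .grad after backward() (a quick sketch, assuming those two attributes):

import numpy as np
from tinygrad.tensor import Tensor

t = Tensor(np.zeros((2, 3), dtype=np.float32))
print(t.data.shape)  # (2, 3), the underlying numpy array
print(t.grad)        # None until backward() populates it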

Example

import numpy as np
from tinygrad.tensor import Tensor

x = Tensor(np.eye(3))
y = Tensor(np.array([[2.0,0,-2.0]]))
z = y.dot(x).sum()
z.backward()

print(x.grad)  # dz/dx
print(y.grad)  # dz/dy
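
Both gradients are easy to check by hand: since z = sum_k sum_i y_i * x_ik, we have dz/dx_ik = y_i and dz/dy_i = sum_k x_ik. With x the 3x3 identity and y = [[2, 0, -2]], the two prints should show (up to numpy formatting):

[[ 2.  2.  2.]
 [ 0.  0.  0.]
 [-2. -2. -2.]]
[[1. 1. 1.]]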

Same example in torch

import torch

x = torch.eye(3, requires_grad=True)
y = torch.tensor([[2.0,0,-2.0]], requires_grad=True)
z = y.matmul(x).sum()
z.backward()

print(x.grad)  # dz/dx
print(y.grad)  # dz/dy

You can even train neural networks with tinygrad (from test/mnist.py)

from tinygrad.tensor import Tensor
import tinygrad.optim as optim
from tinygrad.utils import layer_init_uniform

class TinyBobNet:
  def __init__(self):
    self.l1 = Tensor(layer_init_uniform(784, 128))
    self.l2 = Tensor(layer_init_uniform(128, 10))

  def forward(self, x):
    return x.dot(self.l1).relu().dot(self.l2).logsoftmax()

model = TinyBobNet()
optimizer = optim.SGD([model.l1, model.l2], lr=0.001)

# ... then run a training loop like pytorch, with (x, y) batches of data

out = model.forward(x)
loss = out.mul(y).mean()  # NLL-style loss: y is a negative one-hot target
loss.backward()
optimizer.step()
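
Since the network ends in logsoftmax, multiplying by a negative one-hot target and taking the mean yields a (scaled) negative log likelihood. A sketch of building such a target; the helper name and the -1.0 value are illustrative assumptions, not tinygrad API:

import numpy as np
from tinygrad.tensor import Tensor

def nll_target(labels, num_classes=10):
  # integer class labels -> negative one-hot, so out.mul(y).mean() is a loss to minimize
  t = np.zeros((len(labels), num_classes), dtype=np.float32)
  t[range(len(labels)), labels] = -1.0
  return Tensor(t)

y = nll_target(np.array([3, 1, 4]))  # targets for a batch of three samples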

TODO (to make real neural network library)

  • Implement gradcheck (numeric; see the sketch after this list)
  • Implement convolutions
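
On the gradcheck item: the idea is to compare the gradients from backward() against central finite differences computed on the wrapped numpy array. A minimal sketch, assuming only Tensor's .data and .grad attributes and backward(); the function name and tolerances are made up for illustration:

import numpy as np
from tinygrad.tensor import Tensor

def numeric_gradcheck(f, x, eps=1e-6, atol=1e-4):
  # analytic gradient via backprop (f must return a scalar Tensor)
  f(x).backward()
  analytic = x.grad

  # numeric gradient via central differences, perturbing one entry at a time
  numeric = np.zeros_like(x.data)
  flat = x.data.reshape(-1)
  for i in range(flat.size):
    old = flat[i]
    flat[i] = old + eps
    plus = f(x).data
    flat[i] = old - eps
    minus = f(x).data
    flat[i] = old
    numeric.reshape(-1)[i] = ((plus - minus) / (2 * eps)).sum()

  return np.allclose(analytic, numeric, atol=atol)

x = Tensor(np.random.randn(1, 3).astype(np.float32))
print(numeric_gradcheck(lambda t: t.relu().sum(), x))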