Mirror of https://github.com/tinygrad/tinygrad.git (synced 2026-01-09 15:08:02 -05:00)
different way to write torch backend (#9197)
* different way to write torch backend
* both backends
* more work
* simpler code
* more work
* test both
* imply unwrap/wrap
* FORWARD_ONLY=1 TINY_BACKEND=1 python3 test/test_ops.py TestOps.test_add works
* ready to start making test_ops work in torch backend
* backward pass, TINY_BACKEND=1 python3 test/test_ops.py TestOps.test_add works
* FORWARD_ONLY=1 TINY_BACKEND=1 python3 test/test_ops.py TestOps.test_simple_conv2d works
* matmul backward is broken with as_strided
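The "imply unwrap/wrap" item above refers to the usual way a second, CPU-device torch backend is written: a tensor subclass that unwraps its payload before each aten op and re-wraps the result. Below is a minimal sketch of that generic wrapper-subclass / __torch_dispatch__ pattern, not the actual extra/torch_backend/backend2.py code; the WrappedTensor name is illustrative, and for simplicity the payload here is a plain CPU torch.Tensor rather than a tinygrad Tensor.

import torch
from torch.utils._pytree import tree_map

class WrappedTensor(torch.Tensor):
  # advertise a cpu device to torch while keeping the real data in .elem
  @staticmethod
  def __new__(cls, elem):
    return torch.Tensor._make_wrapper_subclass(cls, elem.size(), dtype=elem.dtype, device="cpu")
  def __init__(self, elem): self.elem = elem

  @classmethod
  def __torch_dispatch__(cls, func, types, args=(), kwargs=None):
    # unwrap inputs, run the aten op on the underlying tensors, wrap the outputs again
    unwrap = lambda x: x.elem if isinstance(x, WrappedTensor) else x
    wrap = lambda x: WrappedTensor(x) if isinstance(x, torch.Tensor) else x
    out = func(*tree_map(unwrap, args), **tree_map(unwrap, kwargs or {}))
    return tree_map(wrap, out)

a, b = WrappedTensor(torch.randn(3)), WrappedTensor(torch.ones(3))
print((a + b).elem)  # the add was dispatched to the unwrapped tensors

Because the wrapper reports device "cpu", regular torch code runs unchanged, which matches the TINY_BACKEND2 path in the diff below selecting torch.device("cpu").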
@@ -26,7 +26,10 @@ class Model(nn.Module):
     return self.lin(torch.flatten(x, 1))
 
 if __name__ == "__main__":
-  if getenv("TINY_BACKEND"):
+  if getenv("TINY_BACKEND2"):
+    import extra.torch_backend.backend2
+    device = torch.device("cpu")
+  elif getenv("TINY_BACKEND"):
     import extra.torch_backend.backend
     device = torch.device("tiny")
   else:
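For the TINY_BACKEND path, torch.device("tiny") only parses once a custom device name has been registered. A minimal sketch of how such a name is typically exposed through PyTorch's PrivateUse1 open-registration hooks follows; the actual aten kernel registration and memory management that extra/torch_backend/backend.py would supply are omitted, so treat this as an assumption about the split, not the backend's real code.

import torch

# rename PyTorch's reserved PrivateUse1 backend slot so "tiny" is a valid device string
torch.utils.rename_privateuse1_backend("tiny")
# optionally generate Tensor.tiny() / Tensor.is_tiny convenience accessors for the new name
torch.utils.generate_methods_for_privateuse1_backend()

dev = torch.device("tiny")  # parses now; actually running ops on it still requires
                            # registering aten kernels for the PrivateUse1 dispatch key,
                            # e.g. through torch.library.Library("aten", "IMPL")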