different way to write torch backend (#9197)

* different way to write torch backend

* both backends

* more work

* simpler code

* more work

* test both

* imply unwrap/wrap (see the sketch after this list)

* FORWARD_ONLY=1 TINY_BACKEND=1 python3 test/test_ops.py TestOps.test_add works

* ready to start making test_ops work in torch backend

* backward pass, TINY_BACKEND=1 python3 test/test_ops.py TestOps.test_add works

* FORWARD_ONLY=1 TINY_BACKEND=1 python3 test/test_ops.py TestOps.test_simple_conv2d works

* matmul backward is broken with as_strided
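The "imply unwrap/wrap" item above presumably refers to the boundary between the two tensor worlds: a torch tensor entering the backend is unwrapped to a tinygrad Tensor, the op runs in tinygrad, and the result is wrapped back into a torch tensor. A minimal, hypothetical sketch of that round trip follows; the helper names and the numpy copy are illustrative assumptions, not the code in extra/torch_backend/backend.py.

import torch
from tinygrad import Tensor

def unwrap(t: torch.Tensor) -> Tensor:
  # torch -> tinygrad: pull the data out through numpy (a copy, for illustration only)
  return Tensor(t.detach().cpu().numpy())

def wrap(t: Tensor) -> torch.Tensor:
  # tinygrad -> torch: realize the tinygrad Tensor and copy it back into torch
  return torch.tensor(t.numpy())

def tiny_add(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
  # the actual compute happens in tinygrad, not in torch
  return wrap(unwrap(a) + unwrap(b))

print(tiny_add(torch.ones(2, 2), torch.full((2, 2), 3.0)))  # 2x2 tensor of 4.0

A real backend would keep the tinygrad Tensor as the torch tensor's backing storage instead of copying through numpy, and would route ops through torch's dispatcher rather than calling a helper directly.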
George Hotz
2025-02-22 14:42:26 +08:00
committed by GitHub
parent 041b6d5678
commit 4e6665bda5
6 changed files with 164 additions and 58 deletions

@@ -26,7 +26,10 @@ class Model(nn.Module):
     return self.lin(torch.flatten(x, 1))
 
 if __name__ == "__main__":
-  if getenv("TINY_BACKEND"):
+  if getenv("TINY_BACKEND2"):
+    import extra.torch_backend.backend2
+    device = torch.device("cpu")
+  elif getenv("TINY_BACKEND"):
     import extra.torch_backend.backend
     device = torch.device("tiny")
   else:
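
For the device strings in this hunk: backend2 keeps tensors on the stock "cpu" device, while backend exposes a dedicated "tiny" device. One mechanism PyTorch offers for an out-of-tree device name is renaming the reserved PrivateUse1 dispatch key; the snippet below is a hedged sketch of just that naming hook (an assumption about the mechanism, not a description of extra/torch_backend/backend.py, and it registers none of the aten kernels a usable device needs).

import torch

# rename the out-of-tree PrivateUse1 dispatch key so "tiny" parses as a device string
torch.utils.rename_privateuse1_backend("tiny")

dev = torch.device("tiny")
print(dev)       # tiny
print(dev.type)  # tiny

Actually creating tensors on that device, as importing the backend module in the diff enables, additionally requires implementations for allocation, copies, and the aten ops a model touches.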