Commit Graph

116 Commits

Author SHA1 Message Date
George Hotz 13d34373d1 move gradcheck to extra, clean up unbroadcast 2020-11-16 08:03:31 -08:00
George Hotz 9ac1ad40d6 Add GPU Support! (do not merge yet) (#41)
* copy tensors to and from gpu
* add on GPU
* adding works
* we stick shapes in
* works on cpu and gpu
* test changes, not passing yet
* something else
* op tests pass
* add, mean, and sum have working forward/backward
* mul ops test
* no gpu support, no problem
* test pass, clean up later
* gpu cleanup
* cleanup test ops, don't let div fail
* revert more
* simpler dispatcher
* clean up grad
* GPU and
* grad is a Tensor now
* gate test on GPU
* cleanups
* late loading gpu
* GPU as input option
* last cleanups
2020-11-01 07:00:49 -08:00
Timothy Mc Alister 15e5988323 make default parameters work for functions 2020-10-26 12:43:36 +01:00
George Hotz b27bcbe4b4 avgpool and test refactor 2020-10-25 18:40:01 -07:00
George Hotz 567707a5f6 rename max_pool2d to match torch, remove more fast conv crap 2020-10-25 17:16:47 -07:00
George Hotz 96f9cdb8a0 woah, fastconv is wrong 2020-10-25 12:56:42 -07:00
George Hotz bb98cdfef7 improve conv testing 2020-10-25 12:46:04 -07:00
George Hotz 67506eb6ba fast im2col 2020-10-25 11:49:35 -07:00
George Hotz 935f5ddaaa always keep batch size out front 2020-10-25 08:14:07 -07:00
George Hotz b91fd3afad maxpool 2020-10-25 07:43:34 -07:00
George Hotz 5756115e57 anyone else let down by the fast conv? 2020-10-23 09:09:29 -07:00
0xNaN d95adbddb4 gradcheck now returns only a bool, refactoring of test_gradcheck 2020-10-22 01:28:52 +02:00
0xNaN adbfc67456 test jacobian and numerical_jacobian against torch.autograd.functional.jacobian 2020-10-22 01:28:52 +02:00
0xNaN 1561d3b9c0 extracting jacobian and test_jacobian 2020-10-22 01:28:52 +02:00
0xNaN 93bc3c22a0 tiny gradcheck 2020-10-22 01:28:52 +02:00
Adrian Garcia Badaracco 5afe6b1f68 rename files 2020-10-21 11:28:03 -05:00