George Hotz
e822aae9ec
reorg opts, nicer graph
2022-07-02 22:29:09 -07:00
George Hotz
f9a8412b68
make contiguous ops yellow
2022-07-02 17:54:04 -07:00
George Hotz
207b9e1df3
padding is now a param to conv2d
2022-07-02 17:11:12 -07:00
George Hotz
cde137d163
simple shapetracker tests
2022-07-02 16:02:15 -07:00
George Hotz
368c0ce2f6
NUM=-2 for ants
2022-07-02 15:47:10 -07:00
George Hotz
7276f8d6bf
improve constant folding, detach before moving tensor
2022-07-02 15:29:40 -07:00
George Hotz
0cb99d72e9
NUM=-1 is a small efficientnet for small people
2022-07-02 15:11:51 -07:00
George Hotz
8cf1aed0f4
don't track_running_stats, parameters must require_grad
2022-07-02 14:38:45 -07:00
George Hotz
07b438aa8b
move that to resolve time
2022-07-02 14:26:13 -07:00
George Hotz
dbf4aa09db
assert and tuple
2022-06-27 09:19:54 -07:00
George Hotz
37a6c0ef59
create with new ShapeTracker
2022-06-27 09:07:45 -07:00
George Hotz
e55a9833fb
a little more readable
2022-06-27 08:54:04 -07:00
George Hotz
67ff6b52fd
move padding to convs in enet
2022-06-26 23:14:31 -07:00
George Hotz
04f521a963
err, float32
2022-06-26 23:05:04 -07:00
George Hotz
8540d1f289
track membw
2022-06-26 23:03:53 -07:00
George Hotz
3a414d7f50
cleanup, add flops tracking
2022-06-26 22:43:39 -07:00
George Hotz
a699f7cb0b
debug cleanups
2022-06-26 21:58:44 -07:00
George Hotz
15a16b98e6
remove get_root
2022-06-26 21:18:02 -07:00
George Hotz
e3c2579537
flip stride to match canonical
2022-06-26 19:19:53 -07:00
George Hotz
53ab09de79
remove the SLICE on conv dw
2022-06-26 19:09:36 -07:00
George Hotz
149581b0b2
Cdx without SLICE
2022-06-26 18:51:53 -07:00
George Hotz
a04813ffe3
1 line less in cpu, fix torch tests
2022-06-26 18:11:53 -07:00
George Hotz
dffde3de5a
support both asymmetric and negative padding
2022-06-26 17:59:25 -07:00
George Hotz
49c954b389
comments
2022-06-26 17:20:25 -07:00
George Hotz
8c483fbdc9
maxpool lazy fix
2022-06-26 17:07:03 -07:00
George Hotz
f607f18006
fix backward
2022-06-25 00:00:53 -07:00
George Hotz
ec30f0402f
improve benchmark_train_efficientnet
2022-06-24 23:46:38 -07:00
George Hotz
3a147137ee
CL_DEVICE option
2022-06-24 23:22:10 -07:00
George Hotz
d748353ce5
err, okay, a bit more off
2022-06-24 22:44:57 -07:00
George Hotz
bdde95f16e
CACHE_LAZYBUFFERS options + benchmark. only a couple x from torch
2022-06-24 22:33:53 -07:00
George Hotz
6847eaf5b6
comments
2022-06-22 09:37:50 -07:00
George Hotz
1d4fb3527e
cleanups to Tensor class
2022-06-22 09:33:30 -07:00
George Hotz
3e13e3330a
UNSAFE_FLOAT4 env
2022-06-22 08:20:29 -07:00
George Hotz
73415e20ab
this fixes 2 of the conv recomputes...but it's ugh
2022-06-22 08:18:12 -07:00
George Hotz
b2d5df6049
3 convs are being recomputed
2022-06-22 07:54:52 -07:00
George Hotz
ba2defcdef
elif False
2022-06-21 23:54:09 -07:00
George Hotz
9cb0522574
noargs
2022-06-21 23:48:58 -07:00
George Hotz
1074dfbb71
unstrided
2022-06-21 23:42:21 -07:00
George Hotz
9ae01290ba
pass in shorts
2022-06-21 23:33:23 -07:00
George Hotz
18d74c01b1
float4 opt
2022-06-21 21:27:51 -07:00
George Hotz
ff3d5fe962
debugging while we compile
2022-06-21 21:12:04 -07:00
George Hotz
b12985b013
openpilot compiler
2022-06-21 20:31:18 -07:00
George Hotz
98a730dd00
benchmark on different inputs
2022-06-21 20:20:58 -07:00
George Hotz
9d06a86f7f
CL class, debugging
2022-06-21 20:16:29 -07:00
George Hotz
0b820f7966
FOLD_CONSTANTS_INTO_KERNELS and shapetracker OOB tweak
2022-06-21 19:47:15 -07:00
George Hotz
83d50e2687
move to extra.onnx
2022-06-21 19:43:44 -07:00
George Hotz
1ebc2b5545
lazy opencl works
2022-06-21 19:41:08 -07:00
George Hotz
c833886bf5
improved shapetracker
2022-06-21 19:17:25 -07:00
George Hotz
c53c91f949
opencl tests passed (#347)
2022-06-21 18:57:09 -07:00
George Hotz
8fbe2e4aed
No ctx in llops (#345)
* remove ctx from gpu ops
* ctx for the others
* this is okay
* mlops are not static. fix lazy
* cl is property, _processing_op is class method
* kernel_name
* contiguous_op
2022-06-21 10:07:49 -07:00