George Hotz
3667200df5
remove unused unstride
2022-06-15 20:03:43 -07:00
George Hotz
ff648e9510
remove convt and compute dx with conv
2022-06-15 19:54:15 -07:00
George Hotz
142c88f2e3
move to mlops
2022-06-15 18:06:07 -07:00
George Hotz
85fe25e27b
add stride support to shapetracker
2022-06-15 17:48:41 -07:00
George Hotz
827e8f67eb
comment
2022-06-15 17:31:27 -07:00
George Hotz
3d4657167b
fix tests hopefully
2022-06-15 17:26:37 -07:00
George Hotz
e4ab57e39d
oops, only stride
2022-06-15 15:25:58 -07:00
George Hotz
86f55b078d
transpose dilation was simple
2022-06-15 15:20:51 -07:00
George Hotz
2a14befb74
support padding
2022-06-15 14:46:44 -07:00
George Hotz
6d98366214
move CONVDW out of llops
2022-06-15 12:05:11 -07:00
George Hotz
fef6c82491
wow dilation support was simple
2022-06-15 11:38:23 -07:00
George Hotz
0b182029dd
support dilated convolution in torch
2022-06-14 18:03:35 -07:00
George Hotz
a690ba4588
add test for padding
2022-06-14 17:41:22 -07:00
George Hotz
e057ca23bb
add flip
2022-06-14 17:28:43 -07:00
George Hotz
a8aeebfb0c
use shapetracker to combine adj reduce axis
2022-06-14 17:08:12 -07:00
George Hotz
906cce9916
reduce with loops
2022-06-14 16:38:33 -07:00
George Hotz
6261a0639b
ShapeTracker (#328)
* start shapetracker
* that late reshape is crushing our hopes
* simple failure
* DumbShapeTracker passes tests
* improve st tests
* stacked view tracker works
* flip works
* tests pass
* shapetracker works
* use ShapeTracker in ops_gpu
* a couple lines
* fix 0 shape
* less lines
* use shapetracker for new_shape in ops.py
* simpler still
* padding with a ZeroView
* gamed it a little
2022-06-14 16:08:22 -07:00
George Hotz
e58b5711ec
simpler convdw
2022-06-13 17:56:54 -07:00
George Hotz
dcbca4fdf1
Expand Operator (#327)
* replace broadcasting with expand
* Tensor, not self
* remove broadcasting from mlops
* delete useless A operator
* expand, not repeat
* remove A op
* expand on gpu
* binary_op doesn't broadcast anymore
* expand is still total junk, but the tests should pass
2022-06-12 12:31:48 -07:00
George Hotz
5cf7649eda
register the operators outside
2022-06-12 10:26:34 -07:00
George Hotz
33f18c61a1
test_broadcasted_add
2022-06-12 10:19:58 -07:00
George Hotz
d47a421970
add cout to conv_args, don't change the first 12
2022-06-12 00:10:15 -07:00
George Hotz
af300b121b
refactor to pass conv args into llops
2022-06-11 23:08:46 -07:00
George Hotz
d747a4b9e2
add padding to conv2d function, other minor things
2022-06-11 22:29:42 -07:00
George Hotz
9a3c048724
skip broken tests, no float64 allowed
2022-06-11 17:12:04 -07:00
George Hotz
3461e1a467
A is actually a unary op
2022-06-11 17:05:29 -07:00
George Hotz
296f391403
delete old graph engine to reach line count
2022-06-11 17:02:53 -07:00
George Hotz
a4d0d3f17a
stupid numpy hack
2022-06-11 17:01:56 -07:00
George Hotz
c03c835d75
oops
2022-06-11 16:53:34 -07:00
George Hotz
361123490a
remove a lot of useless returns
2022-06-11 16:50:06 -07:00
George Hotz
35e55afe17
and processing op
2022-06-11 16:46:38 -07:00
George Hotz
6d5591f7a3
movement op to SSA
2022-06-11 16:44:24 -07:00
George Hotz
6685807df7
reduce_op in SSA format too
2022-06-11 16:40:14 -07:00
George Hotz
bbf231da34
move unary and binary op mem alloc to Ops class
2022-06-11 16:35:03 -07:00
George Hotz
1511cbf9c5
graphs of llops, sorry about the line count
2022-06-11 16:28:39 -07:00
George Hotz
9ebd472375
move ops to ops.py
2022-06-11 15:58:56 -07:00
George Hotz
b5b68e75ff
simpler onnx
2022-06-11 15:35:45 -07:00
George Hotz
2305a5347b
test_onnx works with enet also
2022-06-11 14:30:26 -07:00
George Hotz
6fdb276886
flip batchnorm function order
2022-06-11 13:20:41 -07:00
George Hotz
85d17a2acd
running resnet onnx
2022-06-11 13:17:15 -07:00
George Hotz
0225360191
fixed with one return x
2022-06-11 12:08:53 -07:00
George Hotz
094b304348
very weird += is broken
2022-06-11 11:58:50 -07:00
George Hotz
db5a632e8c
multicat + test onnx is generic onnx
2022-06-11 11:50:47 -07:00
George Hotz
a710b3a210
it's a real test now
2022-06-11 11:33:33 -07:00
George Hotz
8440dbfa5d
support inputs
2022-06-11 11:21:45 -07:00
George Hotz
08de1aa636
add flatten to tinygrad
2022-06-11 11:15:16 -07:00
George Hotz
aee251cc41
op model test
2022-06-11 11:06:03 -07:00
George Hotz
d061ce8d5e
add ELU support
2022-06-11 10:47:23 -07:00
George Hotz
50ba554a14
Ops layer of indirection
2022-06-11 10:04:45 -07:00
George Hotz
1d29780b75
use conv2d for transpose when able
2022-06-11 09:32:13 -07:00