Commit Graph

4433 Commits

George Hotz
20894991ed good changes from the M1 Tensor Core project (#730)
* good changes

* working except llvm

* llvm types

* nice acc

* archprobe

* lang.float4

* use self.acc for late acc

* fix store bug
2023-03-29 05:11:02 +04:00
Andre Slavescu
39d6e1525f Added activation ops + tests (#729)
* activation ops

* type hints + more testing

* formatting correction + parameter testing

* fixes to shape testing

* hardtanh to use clip + removed type hints

* assign val fix
2023-03-28 13:17:53 +04:00
George Hotz
23f88fb026 synchronize for honest speed compare 2023-03-24 10:24:27 -07:00
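
The "synchronize for honest speed compare" commit touches a classic benchmarking pitfall: GPU kernels are enqueued asynchronously, so a timer that never waits on the device measures only launch overhead. A minimal sketch of the honest pattern, shown with PyTorch's CUDA API for familiarity rather than tinygrad's own synchronize call:

```python
import time
import torch  # assumes a CUDA-capable install

x = torch.randn(1024, 1024, device="cuda")
torch.cuda.synchronize()             # drain pending work before starting the clock
st = time.perf_counter()
y = x @ x                            # the kernel is only *enqueued* here
torch.cuda.synchronize()             # wait for it to actually finish
print(f"matmul took {time.perf_counter() - st:.4f}s")
```
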
George Hotz
1cb5b2d015 test_enet_se 2023-03-24 10:04:30 -07:00
Jacky Lee
fafe8e9ce2 casting: support all backends and implement half (#726)
* casting: support all backends and implement half

* map torch types in ops_torch

* reuse type map for torch buffer

* inverse dict lookup
2023-03-24 09:58:03 -07:00
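
The casting commit's bullets ("map torch types in ops_torch", "reuse type map", "inverse dict lookup") suggest a single forward dtype table reused in both directions. A hedged sketch of that idea; `type_map` and `inverse_type_map` are illustrative names, not necessarily the identifiers used in ops_torch:

```python
import numpy as np
import torch

# forward map: numpy scalar type -> torch dtype (illustrative subset)
type_map = {np.float32: torch.float32, np.float16: torch.float16, np.int32: torch.int32}
# "inverse dict lookup": derive the reverse mapping instead of maintaining two tables
inverse_type_map = {v: k for k, v in type_map.items()}

assert inverse_type_map[torch.float16] is np.float16
```
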
George Hotz
e88b9bfe1e print gflops avg with DEBUG=2 2023-03-23 16:07:08 -07:00
Jacky Lee
e009b6f341 Add tests for casting (#724)
* Add tests for casting

* Skip half_matmul_upcast when TORCH=1

* Fix promotion on torch

* Fix spacing
2023-03-23 08:02:52 -07:00
George Hotz
b12b60af20 fix binop, other tests failure (#723)
* fix binop, other tests failure

* that was a bad idea

* better layernorm

* inference kernel count tests

* new style reshape pushing

* fixup replacement

* 199 kernels is okay. fix flops

* push reshape through unaryops only

* GRAPH=2 draws the phantom ops

* found resnet issue

* non working test

* mul is cheaper than div

* OPT inflation

* SHUFFLE_PAD_OPS in OPT=2
2023-03-22 18:15:07 -07:00
George Hotz
d6f4219952 LayerNorm2d for 2 lines 2023-03-20 16:58:43 -07:00
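
"LayerNorm2d for 2 lines" presumably expresses channel-wise LayerNorm on NCHW tensors as a permute, a LayerNorm over the trailing axis, and a permute back. A sketch of that shape, assuming a base class like tinygrad's nn.LayerNorm:

```python
from tinygrad.nn import LayerNorm  # assumed import path

class LayerNorm2d(LayerNorm):
    # move channels last, normalize over them, move them back
    def __call__(self, x):
        return super().__call__(x.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)
```
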
George Hotz
30b795874a remove RMSprop, nobody uses it anymore 2023-03-20 12:31:34 -07:00
George Hotz
623fb1ef28 do test_conv_with_bn test 2023-03-19 23:53:56 -07:00
George Hotz
5495c7d64e linearizer! (#714)
* linearizer outputs something

* working ish

* cstyle codegen

* clang mostly works

* fix load valid

* fix numberless loop

* fancy gen

* working

* fix enet compiler

* cleanups

* float4 upcasting

* less lines

* supports_float4

* constant folding

* mulacc

* internet tests flaky in CI

* 90% image support

* fix image generic

* bugs exposed with shapetracker and single view

* new llvm

* use vload, remove OLD

* that's really poorly done

* ending up being more lines
2023-03-19 23:43:49 -07:00
Cyril Roumégous
b629fd4cd8 add AdamW optimizer (#716)
* add AdamW optimizer

* one liner Adam optimizer
2023-03-19 12:51:06 -07:00
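
AdamW differs from Adam only in decoupling weight decay: the decay term is applied directly to the parameters instead of being folded into the gradient. A minimal numpy sketch of one update step; the names are illustrative and this is not tinygrad's optimizer code:

```python
import numpy as np

def adamw_step(theta, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8, wd=1e-2):
    m = b1 * m + (1 - b1) * g        # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * g * g    # second-moment estimate
    m_hat = m / (1 - b1 ** t)        # bias correction for step t (1-indexed)
    v_hat = v / (1 - b2 ** t)
    # decoupled weight decay: wd scales theta itself, not the gradient
    theta = theta - lr * (m_hat / (np.sqrt(v_hat) + eps) + wd * theta)
    return theta, m, v
```

Setting wd=0 recovers plain Adam, which is presumably what makes the "one liner Adam optimizer" follow-up possible.
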
George Hotz
902906f909 Fix constant folding (#713)
* fix

* codegen

* contiguous is real

* no bufs_to_delete

* don't assign rawconst

* remove neg and not

* need exec to fix custom function jit
2023-03-18 17:52:46 -07:00
George Hotz
f5467cfedc Devicebufferless (#708)
* runs one metal kernel

* conv2d works

* ops tests are passing

* const folding

* all ops work

* pre commit always passes

* torch works

* working still

* fix graph test

* tests passing

* image almost works

* image conv works

* most images

* fix custom

* fix assignment

* fix compile enet

* clean up comments

* fix realize return value

* include shapetracker in LB repr

* copy should make a copy

* reenable method cache

* fix lna

* dtypes in graph

* forward only for IMAGE=2

* simple realize

* getting close

* fixup new api, it's good except the kernel count

* back to 197 kernels

* tests should pass

* go to a real float

* no type_on_cpu

* fix the docs

* put shapetracker back in its proper place
2023-03-18 14:40:23 -07:00
Kirill
0532025b04 Fix llama 13B weights loading (#700)
* Fix llama 13B weights loading

* refactor more

* add test

* test storage offset

* fix spacing

* fix strides

* llama 13B working?

* yolo?

* better test for seeks
2023-03-15 08:59:52 -07:00
George Hotz
54f499b623 Move rawbuffer (#697)
* move GlobalCounters to helpers

* that's not part of the public api

* move InterpretedBuffer

* remove fromCPU from devicebuffer
2023-03-13 22:30:36 -07:00
George Hotz
c594a0a835 fix flip bug, add new unit tests 2023-03-12 23:55:31 -07:00
George Hotz
a4abcf0969 improve test_example 2023-03-12 22:59:40 -07:00
George Hotz
5577634cf3 tests in pre commit 2023-03-12 22:42:26 -07:00
George Hotz
ce1564b05e fix shapetracker test 2023-03-12 22:33:25 -07:00
George Hotz
fe0e8a306f jittable llama 2023-03-12 14:15:04 -07:00
George Hotz
15e0b56e39 compile works (#688)
* compile works

* runtimes

* line count

* fix custom, to tg dtype

* meh, that's fine with lazy import
2023-03-12 11:01:25 -07:00
George Hotz
dc9a6b4bb7 fix float16 in CLANG on linux 2023-03-11 21:51:22 -08:00
George Hotz
37cf6fc4c0 err, external_test_opt.py broke... fusing will have to wait; correctness over speed 2023-03-11 17:54:47 -08:00
George Hotz
305b9f2d21 multistep optim tests passing 2023-03-11 17:49:53 -08:00
George Hotz
61071f881a fix bug, and add unit test to catch failure 2023-03-11 16:57:25 -08:00
George Hotz
3ec457248c failing llama test 2023-03-11 16:28:10 -08:00
Diogo
784afc6c6f Eq magic function support (#683)
* add eq magic func

* changed from eq to __eq__

* ignore type for linter

* mypy doesn't like descriptions :(
2023-03-11 10:31:46 -08:00
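
The __eq__ bullets above record a common wrinkle: an elementwise __eq__ returns a tensor rather than a bool, which breaks the signature mypy expects from object.__eq__, hence "ignore type for linter". A toy illustration of the pattern, not tinygrad's actual Tensor:

```python
import numpy as np

class ToyTensor:
    def __init__(self, data):
        self.data = np.asarray(data)

    # elementwise result instead of bool, hence the type: ignore
    def __eq__(self, other):  # type: ignore[override]
        other = other.data if isinstance(other, ToyTensor) else other
        return ToyTensor(self.data == other)

print((ToyTensor([1, 2, 3]) == ToyTensor([1, 0, 3])).data)  # [ True False  True]
```
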
George Hotz
01f39b19dc move to shapetracker.py 2023-03-11 07:50:07 -08:00
George Hotz
0b03216cc3 losing lines (#678)
* losing lines

* FLIP -> STRIDE

* shapetracker refactor
2023-03-10 21:57:05 -08:00
George Hotz
d7cb8e3e56 multithreaded fake_torch_load_zipped 2023-03-10 19:16:27 -08:00
Connor Henderson
8b7a16cf85 Add conv binops_no_rerun test assertions (#665)
* Add conv binops_no_rerun assertions

* use assert_allclose

* widen tolerance for elu
2023-03-10 19:09:48 -08:00
George Hotz
1826ff6b89 dtypes nice and clean (#673)
* add dtype class

* dtypes

* buffers are lazy

* dtype is tracked by lazybuffer and GenericShape

* fix types in llvm

* llvm store

* dtype tests

* fix tests maybe

* fix flop counter

* fix CI

* CI fix and check format

* fix dtype and dtype check

* fix custom test

* fix test graph
2023-03-10 16:56:07 -08:00
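
The dtypes commit makes the element type an explicit object tracked by each lazybuffer. A plausible minimal shape for such a descriptor; the field names here are assumptions, not the commit's exact definition. An explicit itemsize is also what lets the very next commit make mem_estimate track bytes instead of items:

```python
from typing import NamedTuple
import numpy as np

class DType(NamedTuple):  # illustrative descriptor, field names assumed
    itemsize: int         # bytes per element
    name: str
    np: type              # matching numpy scalar type

float16 = DType(2, "half", np.float16)
float32 = DType(4, "float", np.float32)

def mem_estimate(shape, dtype: DType) -> int:
    # bytes, not items: multiply the element count by itemsize
    return int(np.prod(shape)) * dtype.itemsize
```
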
George Hotz
036737a12a mem_estimate tracks bytes, not items 2023-03-10 09:44:12 -08:00
George Hotz
1a039306d2 good changes from llama branch (#671)
* good changes from llama

* transpose behavior changed
2023-03-09 20:51:22 -08:00
George Hotz
dbbaa0bdd7 int32, and refactor pad/shrink 2023-03-09 12:57:17 -08:00
George Hotz
fb5ee9260f add pad tests to shapetracker 2023-03-09 12:51:18 -08:00
George Hotz
022c5835fc fix GPU import error and old python Tuple 2023-03-08 12:22:11 -08:00
George Hotz
c22afc52db move the custom function example to a test 2023-03-08 10:05:04 -08:00
George Hotz
00641aa45d add challenge tests 2023-03-07 19:39:04 -08:00
George Hotz
e0244baf60 3 letters for graph op 2023-03-07 19:20:48 -08:00
George Hotz
4eb880550f enable contract test 2023-03-07 17:32:28 -08:00
Alex Wang
d885d2d0f5 Allow 1s for contraction detection (#663)
* Allow 1s for contraction check

* More test cases for 1s
2023-03-07 17:31:28 -08:00
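
These contraction commits concern when a reshape merely groups contiguous runs of old axes into new ones (so movement ops can be pushed through it); size-1 axes complicate the matching because they can attach to either neighboring group. A greedy illustrative checker, not tinygrad's actual get_contraction:

```python
from typing import List, Optional, Tuple

def get_contraction(old: Tuple[int, ...], new: Tuple[int, ...]) -> Optional[List[List[int]]]:
    groups: List[List[int]] = []
    i = 0
    for n in new:
        group, prod = [], 1
        while prod < n and i < len(old):      # consume old axes until the product matches
            prod *= old[i]; group.append(i); i += 1
        if prod != n: return None             # a split (e.g. (6,)->(2,3)) is not a contraction
        groups.append(group)
    while i < len(old):                       # trailing 1s fold into the last group
        if old[i] != 1: return None
        groups[-1].append(i); i += 1
    return groups

print(get_contraction((2, 1, 3, 4), (2, 3, 4)))  # [[0], [1, 2], [3]]
```
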
George Hotz
b561256a0e allow all reduces (#661)
* allow all reduces

* push permute tests

* explicit permute reshape push

* contract w/ 1s
2023-03-07 15:36:01 -08:00
George Hotz
b14d31d6db ConvNeXt + extras (#657)
* simple convnext implementation

* shorter function names

* need to realize the random functions now

* creating an optimizer realizes all params

* assign contiguous

* fix lazy lazy

* why was I doing that... add convnext to tests

* LazyNumpyArray

* enable assert + comment

* no two tiny
2023-03-06 22:10:56 -08:00
George Hotz
8c5dea8d72 fix CUDA float4 issues 2023-03-06 07:16:38 -08:00
George Hotz
7dbcc26582 fix up external tests 2023-03-06 06:52:28 -08:00
George Hotz
50012f679b move get_contraction to shapetracker 2023-03-06 06:42:57 -08:00
Alex Wang
64ecbd91b5 Refactor contraction and add integration test cases for push permute (#650)
* Refactor contraction and add unit tests

* Fix typo; Fix TestConv.test_elu failure due to some ones in old_shape

* Add push permute test cases

* Fix mypy type annotation check error

* Add contraction unit test; Reshape to higher dimension is not contraction
2023-03-06 06:36:55 -08:00