George Hotz
4013c9848c
don't use tons of memory for tests when not in CI [pr] ( #7209 )
...
* don't use tons of memory for tests
* fix import and clean up pre-commit
* use pathlib
* no shm on windows
* Revert "use pathlib"
This reverts commit 7c38489820 .
* run pre-commit hooks in test
* ugh, fix later
2024-10-22 15:04:51 +08:00
George Hotz
4438d6a467
Tensor.from_url API [pr] ( #7210 )
...
* Tensor.fetch API [pr]
* update docs
* from_url
2024-10-22 14:54:17 +08:00
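A `from_url`-style API typically downloads to a local cache keyed by the URL, then loads from disk. Below is a minimal sketch of that pattern; the `fetch` helper and cache layout are hypothetical illustrations, not tinygrad's actual implementation.

```python
import hashlib, pathlib, tempfile, urllib.request

def fetch(url: str, cache_dir: pathlib.Path) -> pathlib.Path:
    """Download url into a cache file named by the URL's hash (hypothetical helper)."""
    cache_dir.mkdir(parents=True, exist_ok=True)
    dest = cache_dir / hashlib.sha256(url.encode()).hexdigest()
    if not dest.exists():
        with urllib.request.urlopen(url) as r:
            dest.write_bytes(r.read())
    return dest

# demo with a file:// URL so no network access is needed
tmp = pathlib.Path(tempfile.mkdtemp())
src = tmp / "weights.bin"
src.write_bytes(b"\x00\x01\x02\x03")
cached = fetch(src.as_uri(), tmp / "cache")
print(cached.read_bytes())  # b'\x00\x01\x02\x03'
```

A second call with the same URL hits the cache and skips the download.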
George Hotz
be64ac417e
move GGUF test to its own file [pr] ( #7208 )
...
* move GGUF test to its own file [pr]
* skip tests if modules aren't installed
2024-10-22 13:24:55 +08:00
George Hotz
ccf4843945
use substitute instead of replace_uop [pr] ( #7207 )
2024-10-22 13:24:38 +08:00
George Hotz
3b4587fbf9
no need to DEFINE_VAR arg sort [pr] ( #7206 )
2024-10-22 12:17:50 +08:00
nimlgen
21acfc39d4
qcom cleanup allocs ( #7200 )
...
* qcom cleanup allocs
* oops
2024-10-21 23:20:15 +03:00
chenyu
f37e6b453b
load_gguf -> gguf_load in doc and test ( #7199 )
2024-10-21 14:03:33 -04:00
chenyu
f93bd9e2b9
ggml_data_to_tensor touchups ( #7196 )
...
* ggml_data_to_tensor touchups
tiny reordering and variable name changes
* return type
* pylint
2024-10-21 13:29:59 -04:00
leopf
815e1a340c
GGUF Cleanup - raise if type is not supported ( #7194 )
...
* raise if ggml type is unsupported
* test raise
2024-10-21 11:32:11 -04:00
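Raising on an unsupported quantization type can be sketched as a lookup with an explicit failure path; the mapping below is illustrative (the ids echo the ggml enum, e.g. Q8_0 = 8), not tinygrad's actual dequantizer table.

```python
# Hypothetical mapping of ggml quantization type ids to dequantizer names.
GGML_DEQUANT = {0: "F32", 1: "F16", 8: "Q8_0"}  # ids illustrative of the ggml enum

def dequantize(ggml_type: int, blob: bytes):
    if ggml_type not in GGML_DEQUANT:
        raise ValueError(f"GGML type {ggml_type} is not supported")
    return GGML_DEQUANT[ggml_type], blob

print(dequantize(8, b"\x00")[0])  # Q8_0
try:
    dequantize(99, b"")
except ValueError as e:
    print(e)  # GGML type 99 is not supported
```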
qazal
bc9eb324dc
group stores by buffer uops [pr] ( #7190 )
...
* group stores by buffer uops [pr]
* dedup
2024-10-21 18:04:44 +03:00
leopf
87877d7a91
GGUF cleanup ( #7192 )
...
* cleanup
* remove vocab size hard code
2024-10-21 10:44:54 -04:00
chenyu
08a3b97ddc
more generic lt_folding ( #7171 )
...
* more generic lt_folding
instead of checking the gcd of all uops, check the gcd of the ones that have const_factor() > 1; it can still simplify if the others are smallish
* fixed that stride too
2024-10-21 09:41:02 -04:00
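The core idea of gcd-based `<` folding can be shown on integer linear terms: if every coefficient is divisible by g, then `g*S < rhs` is equivalent to `S < ceil(rhs/g)`. This toy model uses the full gcd of the coefficients; the commit generalizes it to only take the gcd of large-factor terms, and none of these names are tinygrad's.

```python
from math import gcd
from functools import reduce

def fold_lt(coeffs, rhs):
    """Simplify sum(c_i * x_i) < rhs by dividing through by g = gcd(c_i).
    For integer terms, g*S < rhs  <=>  S < ceil(rhs / g)."""
    g = reduce(gcd, coeffs)
    if g <= 1:
        return coeffs, rhs
    return [c // g for c in coeffs], -(-rhs // g)  # -(-a // b) is ceil division

print(fold_lt([6], 9))      # ([1], 2): 6*x < 9      <=>  x < 2
print(fold_lt([4, 6], 13))  # ([2, 3], 7): 4x+6y < 13 <=> 2x+3y < 7
```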
chenyu
abd99bb744
unwrap2 is not used ( #7187 )
2024-10-21 09:40:15 -04:00
qazal
37b829ef0d
track metadata with uops [pr] ( #7188 )
2024-10-21 16:35:46 +03:00
ignaciosica
5551cf6689
add rlshift and rrshift special methods ( #7185 )
2024-10-21 08:37:02 -04:00
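The reflected shift protocol is what makes expressions like `int << tensor` work: when the left operand's `__lshift__` returns NotImplemented, Python dispatches to `__rlshift__` on the right operand (and likewise `__rrshift__` for `>>`). A toy class illustrating the mechanism, unrelated to tinygrad's Tensor:

```python
class Shiftable:
    """Toy wrapper showing Python's reflected shift special methods."""
    def __init__(self, v):
        self.v = v
    def __rlshift__(self, other):
        return Shiftable(other << self.v)  # handles `int << Shiftable`
    def __rrshift__(self, other):
        return Shiftable(other >> self.v)  # handles `int >> Shiftable`

print((1 << Shiftable(4)).v)    # 16
print((256 >> Shiftable(4)).v)  # 16
```

Without these methods, `1 << Shiftable(4)` raises TypeError because plain `int` doesn't know how to shift by the custom type.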
qazal
8f375b71c5
post-schedule lazybuf from Buffer [pr] ( #7170 )
2024-10-21 15:11:32 +03:00
qazal
7a9f3dea54
assert a schedule double realize ( #7178 )
...
* assert this
* maybe use lazycache
* Revert "maybe use lazycache"
This reverts commit 7368102906 .
* set enable_cache=True
* assert 1 schedule
2024-10-21 14:16:21 +03:00
George Hotz
31fcccc779
hotfix: flip if order
2024-10-21 17:34:23 +08:00
qazal
6c0c3aff14
keep srcs in all ops ( #7175 )
2024-10-21 12:34:02 +03:00
George Hotz
be1806df47
fast sym infer [pr] ( #7177 )
...
* fast sym infer [pr]
* fix pylint
2024-10-21 17:31:32 +08:00
George Hotz
4af228e9fc
hotfix: pin mypy
2024-10-21 16:22:24 +08:00
leopf
b6d9b276bb
GGUF support ( #7046 )
...
* basic loader, untested
* testing
* remove utils import in test
* q8_0
* q4_1
* end to end testing
* minor cleanup
* fix casting
* moved to state
* move tests
* move dequant to fn
* fix lint elif
* remove gguf from extra
* fix dict union
* q6_k simpler
* naming and spacing
* gpt2-gguf example
* cleanup
* move gguf example
* minor cleanup
---------
Co-authored-by: George Hotz <72895+geohot@users.noreply.github.com >
2024-10-21 16:15:34 +08:00
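A GGUF file opens with a small fixed preamble: a 4-byte magic, a u32 version, a u64 tensor count, and a u64 metadata key-value count, all little-endian. The parser below is a sketch based on the GGUF spec, not tinygrad's loader, and is demoed against an in-memory fake header.

```python
import struct

def parse_gguf_header(data: bytes):
    """Parse the fixed GGUF preamble: 4-byte magic, u32 version,
    u64 tensor count, u64 metadata-kv count (little-endian)."""
    magic, version, n_tensors, n_kv = struct.unpack_from("<4sIQQ", data)
    if magic != b"GGUF":
        raise ValueError(f"not a GGUF file: {magic!r}")
    return version, n_tensors, n_kv

# build a fake 24-byte header in memory to demo the round trip
hdr = struct.pack("<4sIQQ", b"GGUF", 3, 5, 12)
print(parse_gguf_header(hdr))  # (3, 5, 12)
```

The metadata key-value pairs (architecture, vocab, quantization info) follow the preamble, then the tensor infos and the tensor data itself.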
George Hotz
17e7d8f10e
hotfix: fix sz on windows
2024-10-21 16:02:23 +08:00
ignaciosica
87a1e76745
Refactor hip_bfloat16 cast into uop ( #7143 )
...
* refactor hip_bfloat16 cast into uops
* hotfix: linter issue
* hotfix: comment decorator in test
---------
Co-authored-by: George Hotz <72895+geohot@users.noreply.github.com >
2024-10-21 15:17:14 +08:00
qazal
8074c0ec8f
skip test_bfloat16_unary on AMD ( #7169 )
2024-10-21 01:00:47 +03:00
qazal
713461129b
scheduler ast rewrite reorders from big graph [pr] ( #7168 )
...
* scheduler ast rewrite reorders from big graph [pr]
* update test_uops.py
2024-10-21 00:47:58 +03:00
nimlgen
81349213c0
nv min regs count is 16 ( #7166 )
2024-10-20 20:03:55 +03:00
qazal
1383df95af
track_rewrites by function call [pr] ( #7165 )
...
* named track_rewrites [pr]
* group all of create_schedule_with_vars
2024-10-20 17:45:25 +03:00
chenyu
a9ab7db054
don't raise ValueError in uop_given_valid [pr] ( #7163 )
2024-10-19 20:05:04 -04:00
chenyu
98de58260b
simplify valid itself ( #7112 )
2024-10-19 19:39:25 -04:00
chenyu
f511ad9103
No pyint again ( #7156 )
...
* Revert "bring back pyint (#7150 )"
This reverts commit 37e83ca6fc .
* remove truncate in const folding
* truncate_output=False
2024-10-19 13:48:59 -04:00
qazal
30989fb459
changes from the big graph branch [pr] ( #7160 )
...
* metaops srcs
* delete multioutput ctx var
* always has metadata
* shorter path for realized
* this still needs inputs
This reverts commit a59cbb2886 .
2024-10-19 16:22:37 +03:00
chenyu
11beb67400
fix import of truncate ( #7157 )
...
truncate was moved to dtype
2024-10-18 18:41:41 -04:00
nimlgen
54c6a317f8
test_failure_54 ( #7155 )
...
* test_failure_54
* metal
2024-10-18 23:31:18 +03:00
nimlgen
99fb115791
cuda correct pointer type ( #7153 )
2024-10-18 22:39:59 +03:00
chenyu
37e83ca6fc
bring back pyint ( #7150 )
...
fixed test_failure_52 and resnet. need to understand this better
2024-10-18 14:54:37 -04:00
Jacky Lee
c8b59416d0
fix: find_library can be None ( #7145 )
2024-10-18 20:50:52 +03:00
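`ctypes.util.find_library` returns None when no matching library is found, so the result needs a None check before being handed to `CDLL` (which would otherwise fail with a confusing error). A minimal guard, with a hypothetical `load_lib` wrapper name:

```python
import ctypes, ctypes.util

def load_lib(name: str):
    """Look up a shared library by name, failing clearly if it's absent."""
    path = ctypes.util.find_library(name)
    if path is None:  # find_library can be None
        raise FileNotFoundError(f"library {name!r} not found")
    return ctypes.CDLL(path)

print(ctypes.util.find_library("surely_not_a_real_library_xyz"))  # None
```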
George Hotz
b0a13896d7
PtrDType is dataclass [pr] ( #7125 )
...
* PtrDType is dataclass [pr]
* new dataset
---------
Co-authored-by: chenyu <chenyu@fastmail.com >
2024-10-18 09:40:33 -04:00
chenyu
ea016b55d1
don't throw in fuzz_linearizer ( #7148 )
...
already broken on master and needs a fix; don't throw so it doesn't block other PRs
2024-10-18 09:28:30 -04:00
chenyu
ea2efbf508
Add Opt(op=OptOps.LOCAL, axis=6, amt=2) to actions ( #7147 )
...
* Add Opt(op=OptOps.LOCAL, axis=6, amt=2) to actions
it's missing if we rebuild all kernels, not just the first 2k.
```
PYTHONPATH="." GPU=1 python3 extra/optimization/get_action_space.py
29%|█████████████████████████████████████▋ | 3682/12701 [01:42<04:11, 35.83it/s]
Traceback (most recent call last):
File "/Users/chenyu/code/tinygrad/extra/optimization/get_action_space.py", line 27, in <module>
test_rebuild(lin)
File "/Users/chenyu/code/tinygrad/extra/optimization/get_action_space.py", line 11, in test_rebuild
assert o in actions, f"{o} is not in actions"
^^^^^^^^^^^^
AssertionError: Opt(op=OptOps.LOCAL, axis=6, amt=2) is not in actions
```
* break
2024-10-18 09:03:24 -04:00
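The `o in actions` assertion above works because `Opt` compares and hashes by field values, so a rebuilt `Opt` matches the one stored in the action set. A toy reconstruction of that mechanism using a frozen dataclass; the enum members and action-space contents here are illustrative, not tinygrad's real search space.

```python
from dataclasses import dataclass
from enum import Enum, auto

class OptOps(Enum):
    LOCAL = auto()
    UPCAST = auto()

@dataclass(frozen=True)
class Opt:
    op: OptOps
    axis: int
    amt: int

# frozen dataclasses hash by value, so set membership matches rebuilt Opts
actions = {Opt(OptOps.LOCAL, axis, amt) for axis in range(7) for amt in (2, 4)}
o = Opt(op=OptOps.LOCAL, axis=6, amt=2)
assert o in actions, f"{o} is not in actions"
print("ok")
```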
qazal
4cf7cca91a
delete fuzz_schedule [pr] ( #7144 )
2024-10-18 15:09:39 +03:00
Bhavya Gada
b7b2017cb9
only ignore warnings not errors ( #7146 )
2024-10-18 07:41:11 -04:00
ignaciosica
8bcdd7c97d
Refactor AMD pm rules to remove handwritten bf16 bool alus ( #7136 )
...
* refactor pm rules
- remove unused handwritten methods
- refactor amd pm rules to fix bug with bool alu
* add bf16 bool alu tests
* add bf16 tests
* hotfix: make atol consistent
2024-10-18 09:00:46 +08:00
Bhavya Gada
534597e753
fix all test warnings ( #7024 )
...
* fix pytorch warning in nn.conv2d for same padding
* fix future warning in torch load
* fix overflow warning in tensor list test: https://github.com/numpy/numpy/issues/23606#issuecomment-1512752172
* fix floating point warnings in dtype tests using docs https://numpy.org/doc/stable/reference/generated/numpy.errstate.html and a neat solution https://stackoverflow.com/questions/53634965/change-np-seterr-behavior-inside-a-function-only
* put err state in one place; comment taken care of by function hover
* enter np errstate context manager on test setup
* put decorator on class
2024-10-18 08:56:40 +08:00
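The `np.errstate` approach silences expected floating-point warnings locally (or, as a context manager entered in test setup, for a whole test class) instead of muting them globally. A minimal sketch, assuming NumPy is installed:

```python
import numpy as np

# Suppress only the floating-point warnings we expect, only in this scope.
with np.errstate(divide="ignore", over="ignore"):
    r = np.float32(1.0) / np.float32(0.0)    # would otherwise warn: divide by zero
    big = np.float32(3e38) * np.float32(10)  # would otherwise warn: overflow
print(r, big)  # inf inf
```

Outside the `with` block, NumPy's error-state settings revert, so unexpected warnings elsewhere still surface.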
chenyu
0cd4b93441
remove CStyleLanguage from test_uop_symbolic ( #7142 )
2024-10-17 19:39:34 -04:00
chenyu
72ed66205d
enable test_resnet_half ( #7141 )
...
already worked so just fixed the test
2024-10-17 19:02:20 -04:00
nimlgen
211d9753f8
nv more lc checks ( #7139 )
...
* nv more lc checks
* revert
* linter
2024-10-18 00:21:53 +03:00
chenyu
12ff52b88b
test_failure_52 fails on real METAL ( #7138 )
2024-10-17 15:37:28 -04:00
chenyu
84e98900e8
test linearizer failure 53 ( #7137 )
...
a variable scope issue caused a compile error
2024-10-17 15:23:43 -04:00
qazal
a64e5d0430
graph rewrite all metaops ( #7134 )
2024-10-17 18:49:20 +03:00