more readme cleanup

This commit is contained in:
George Hotz
2021-10-30 16:51:25 -07:00
parent fc6597a6d9
commit 114f6ca3fd

@@ -8,7 +8,9 @@
 For something in between a [pytorch](https://github.com/pytorch/pytorch) and a [karpathy/micrograd](https://github.com/karpathy/micrograd)
-This may not be the best deep learning framework, but it is a deep learning framework. The core of it is in `tinygrad/`, and it's under 1000 lines.
+This may not be the best deep learning framework, but it is a deep learning framework.
+The core of it is in `tinygrad/`
 Due to its extreme simplicity, it aims to be the easiest framework to add new accelerators to, with support for both inference and training. Support the simple basic ops, and you get SOTA [vision](https://arxiv.org/abs/1905.11946) `models/efficientnet.py` and [language](https://arxiv.org/abs/1706.03762) `models/transformer.py` models. We are working on support for the Apple Neural Engine.
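The claim above — support a handful of basic ops and backprop comes for free — is the micrograd idea the README references. A minimal illustrative sketch (pure Python, hypothetical names, not tinygrad's actual API): implementing just `add`, `mul`, and `relu` with local backward rules is enough for reverse-mode autodiff.

```python
# Illustrative sketch only: a micrograd-style scalar autograd.
# Class and method names here are hypothetical, not tinygrad's API.
class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():  # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():  # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def relu(self):
        out = Value(max(0.0, self.data), (self,))
        def _backward():  # gradient passes only where the input was positive
            self.grad += (out.data > 0) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological sort, then one reverse-mode sweep
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x, y = Value(3.0), Value(-2.0)
z = (x * y + x).relu()   # relu(3*-2 + 3) = relu(-3) = 0
z.backward()
print(z.data, x.grad)    # the relu gated the gradient to zero
```

An accelerator backend in this scheme only has to supply fast kernels for each primitive op's forward and backward; the graph machinery stays the same.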
@@ -18,6 +20,11 @@ Eventually, [we will build custom hardware](https://geohot.github.io/blog/jekyll
 ```bash
 pip3 install git+https://github.com/geohot/tinygrad.git --upgrade
+# or for development
+git clone https://github.com/geohot/tinygrad.git
+cd tinygrad
+python3 setup.py develop
 ```
 ### Example
@@ -153,14 +160,3 @@ tinygrad will always be below 1000 lines. If it isn't, we will revert commits un
 python3 -m pytest
 ```
-### TODO (updated)
-```bash
-PYTHONPATH="." DEBUG=1 CHERRY=1 python3 examples/efficientnet.py https://upload.wikimedia.org/wikipedia/commons/4/41/Chicken.jpg
-```
-* ~~Add reduce ops to CHERRY, and fully support forward pass. See `extra/ops_risk.py` and `extra/risk.py`~~
-* Switch convolution backward pass to CHERRY instead of the numpy placeholder
-* Confirm EfficientNet backward pass fully uses CHERRY instructions
-* Benchmark that and transformers