Enable CPU benchmarks on test-models workflows. (#299)

* Update test-models.yml

* Update README.md
Ean Garvey
2022-09-07 01:22:58 -05:00
committed by GitHub
parent 3824d37d27
commit d453f2e49d
2 changed files with 12 additions and 5 deletions


@@ -99,7 +99,7 @@ pytest tank/tf/hf_masked_lm/albert-base-v2_test.py::AlbertBaseModuleTest::test_m
<details>
-<summary>Testing</summary>
+<summary>Testing and Benchmarks</summary>
### Run all model tests on CPU/GPU/VULKAN/Metal
```shell
@@ -122,7 +122,11 @@ pytest tank/<MODEL_NAME> -k "keyword"
### Run benchmarks on SHARK tank pytests and generate bench_results.csv with results.
(requires source installation with `IMPORTER=1 ./setup_venv.sh`)
+Note: Latest benchmarks on our canonical machines can be found here:
+https://storage.googleapis.com/shark-public/builder/bench_results/latest/bench_results_cpu_latest.csv
+https://storage.googleapis.com/shark-public/builder/bench_results/latest/bench_results_gpu_latest.csv
+(the following requires source installation with `IMPORTER=1 ./setup_venv.sh`)
```shell
pytest --benchmark tank