Author | Commit | Message | Date
--- | --- | --- | ---
Alex O'Connell | 1ab0d82045 | Merge pull request #94 from acon96/release/v0.2.9: Release v0.2.9 (tag: v0.2.9) | 2024-03-21 03:24:08 +00:00
Alex O'Connell | 7637c758a5 | more docs updates + code cleanup | 2024-03-20 23:22:07 -04:00
Alex O'Connell | 93dacffa20 | update todo list | 2024-03-20 23:16:55 -04:00
Alex O'Connell | a2eb8d9b99 | changelog update | 2024-03-20 23:13:24 -04:00
Alex O'Connell | 7bd072623c | add updated llama-cpp-python x64 wheels | 2024-03-20 23:07:24 -04:00
Alex O'Connell | c67759e16f | experiment notes | 2024-03-20 23:05:22 -04:00
Alex O'Connell | fa31682c51 | working version of in context examples | 2024-03-20 23:03:31 -04:00
Alex O'Connell | 0f1c773bff | Fix model download + llama-cpp-python install steps | 2024-03-20 21:41:04 -04:00
Alex O'Connell | f1659893d7 | start working on dpo for the datasets | 2024-03-19 21:31:34 -04:00
Alex O'Connell | b9d394f860 | actually fix cover service names | 2024-03-16 13:50:04 -04:00
Alex O'Connell | 4978901412 | start working on icl examples | 2024-03-07 18:14:44 -05:00
Alex O'Connell | 9b71a1860b | fix "cover" type system call names | 2024-03-06 20:40:25 -05:00
Alex O'Connell | 4f6ed08be9 | split out service call argument allow list + properly parse rgb color arguments | 2024-03-06 17:58:14 -05:00
Alex O'Connell | 41b7ceae57 | Add Ollama options to docs | 2024-03-05 21:52:37 -05:00
Alex O'Connell | 263b21151f | Merge branch 'main' into develop | 2024-03-05 21:47:38 -05:00
Alex O'Connell | 316459b6cd | Merge pull request #85 from acon96/release/v0.2.8: Release/v0.2.8 (tag: v0.2.8) | 2024-03-05 21:47:02 -05:00
Alex O'Connell | 841beb5e77 | Release v0.2.8 | 2024-03-05 21:46:25 -05:00
Alex O'Connell | 569f6e848a | properly handle colons in ollama model names | 2024-03-05 17:39:41 -05:00
Alex O'Connell | 5a51c8eb7b | Merge branch 'main' into develop | 2024-03-04 22:58:50 -05:00
Alex O'Connell | 474901a367 | Merge pull request #82 from acon96/release/v0.2.7: Release v0.2.7 (tag: v0.2.7) | 2024-03-04 22:57:41 -05:00
Alex O'Connell | 0f3d518826 | Bump version number + change notes | 2024-03-04 22:56:48 -05:00
Alex O'Connell | 1c5414b8af | fix ollama keep alive properly + check if model exists for ollama too | 2024-03-04 22:40:22 -05:00
Alex O'Connell | c13d706879 | bump text-generation-webui version | 2024-03-04 20:56:04 -05:00
Alex O'Connell | 6cc3f47096 | fix deprecated configflow behavior | 2024-03-02 22:20:55 -05:00
Alex O'Connell | 7b0b021b59 | handle voice assistant aliases as duplicate devices | 2024-03-01 22:59:38 -05:00
Alex O'Connell | b50919340b | Ollama keep_alive + better docs | 2024-03-01 22:07:03 -05:00
Alex O'Connell | b197632b3e | Update llama-cpp-python and text-generation-webui | 2024-02-25 17:43:57 -05:00
Alex O'Connell | 869b91b6ab | add model references in the setup guide | 2024-02-22 21:31:05 -05:00
Alex O'Connell | 32338242f2 | Merge pull request #68 from acon96/feature/dataset-customization: Make Dataset easier to customize and Support Training StableLM | 2024-02-22 21:24:27 -05:00
Alex O'Connell | fc171ac04d | Merge branch 'develop' into feature/dataset-customization | 2024-02-22 21:20:59 -05:00
Alex O'Connell | d22ef571b8 | update readme | 2024-02-22 21:20:20 -05:00
Alex O'Connell | 5dec71eae2 | finalize new model version | 2024-02-22 21:12:38 -05:00
Alex O'Connell | 4910a34f11 | Create issue templates | 2024-02-18 11:49:22 -05:00
Alex O'Connell | c285e3c6a9 | instructions for adding personas | 2024-02-17 23:13:45 -05:00
Alex O'Connell | a78e57031c | cleanup + table of contents | 2024-02-17 23:06:21 -05:00
Alex O'Connell | b2836bc250 | tweak readme structure | 2024-02-17 23:05:53 -05:00
Zeno Jiricek | ccf2c2c293 | docs: Format README for better usability (#50) | 2024-02-18 04:04:46 +00:00
Alex O'Connell | 19ed596614 | resolve conflicts | 2024-02-17 23:04:19 -05:00
Alex O'Connell | 43510d1d7c | finish up dataset changes for new device types | 2024-02-17 21:16:56 -05:00
Alex O'Connell | 984cf2c0a3 | add vacuum, todo, and timer device names | 2024-02-17 19:55:19 -05:00
Alex O'Connell | 6cc3b13c2a | Merge branch 'develop' into feature/dataset-customization | 2024-02-17 19:55:07 -05:00
colino17 | e192a8aee6 | Add additional data for new entity types and services (#67) | 2024-02-18 00:49:58 +00:00
Alex O'Connell | 3f3fb21cd9 | Handle python 3.12 upgrade smoother + make utils file | 2024-02-16 23:44:48 -05:00
Alex O'Connell | bcd67aef37 | start working on new entities | 2024-02-16 23:21:22 -05:00
Alex O'Connell | 9c6d64f373 | add note to readme about wheels | 2024-02-16 18:30:17 -05:00
Alex O'Connell | 5edb66c7f2 | build python 3.12 wheels too | 2024-02-16 18:28:37 -05:00
Alex O'Connell | fdfea02e1d | typo | 2024-02-15 00:04:45 -05:00
Alex O'Connell | 5660666bcc | Bump included llama-cpp-python to 0.2.42 | 2024-02-14 08:44:19 -05:00
Alex O'Connell | 4e873a873c | Merge branch 'develop' into feature/dataset-customization | 2024-02-13 20:23:39 -05:00
Alex O'Connell | 411276408b | train new models based on stablelm + properly add new response types | 2024-02-13 20:21:51 -05:00