Author | Commit | Message | Date
Alex O'Connell | feb791cb51 | fix typo in build config (tag: v0.3.7) | 2024-12-15 14:24:16 -05:00
Alex O'Connell | 2658d65155 | Merge pull request #237 from acon96/release/v0.3.7 (Release v0.3.7) | 2024-12-15 19:10:13 +00:00
Alex O'Connell | e78e8e1ef7 | Release v0.3.7 | 2024-12-15 14:08:24 -05:00
Alex O'Connell | 4b19df7c40 | use unit for attribute if it is provided | 2024-12-15 14:01:47 -05:00
Alex O'Connell | 9fa795bbd9 | fix wheel install + huggingface quant detection | 2024-12-15 11:50:58 -05:00
Alex O'Connell | ed5312e9ff | update llama cpp python and build python 13 wheels now | 2024-12-15 10:51:56 -05:00
Nixellion | 51e3b145a2 | Improve multi turn tool use and conversation history (#227) | 2024-12-15 00:24:16 +00:00
pstrueb | fdd80a6a32 | Add in context examples file in german (#219); Co-authored-by: pstruebi <pstruebi@propedal.at> | 2024-10-03 00:28:11 +00:00
Alex O'Connell | 4f145d2922 | update webcolors and relax huggingface-hub requirement | 2024-09-14 11:44:18 -04:00
Teagan Glenn | 2da7d14e2d | fix(utils): Update closest_color logic to use the exported properties and methods from webcolors library (#217) | 2024-09-14 15:41:55 +00:00
Teagan Glenn | bf0ff32827 | fix: Initialize the tool_response (#218) | 2024-09-14 15:38:46 +00:00
Alex O'Connell | 90c0edc090 | Merge branch 'main' into develop | 2024-08-21 18:46:44 -04:00
Alex O'Connell | f0372412c2 | Merge pull request #210 from acon96/release/v0.3.6 (Release v0.3.6) (tag: v0.3.6) | 2024-08-21 18:46:20 -04:00
Alex O'Connell | 506aaf40c9 | update readme | 2024-08-21 18:43:19 -04:00
Alex O'Connell | 79264545e9 | incorporate find_split updates | 2024-08-21 18:42:20 -04:00
Alex O'Connell | 9cdfffc530 | better handling for llama.cpp context size errors | 2024-08-21 11:47:52 -04:00
Alex O'Connell | 011d5a0b0d | enforce version numbers in build process | 2024-08-21 11:47:35 -04:00
Alex O'Connell | 57a65934cf | prevent publishing component with version mismatches | 2024-08-20 20:22:02 -04:00
Alex O'Connell | 34fc39423a | Merge branch 'main' into develop | 2024-08-20 19:02:22 -04:00
Alex O'Connell | 9c24ff8de0 | fix version number (tag: v0.3.5) | 2024-08-20 19:00:46 -04:00
Alex O'Connell | 22c9469f66 | try to support SFT for models without system prompts | 2024-08-17 22:43:05 -04:00
Alex O'Connell | 93b0d8e3a1 | Merge branch 'main' into develop | 2024-08-17 13:46:32 -04:00
Alex O'Connell | 6930b6a91e | Merge pull request #207 from acon96/release/v0.3.5 (Release v0.3.5) | 2024-08-17 13:46:15 -04:00
Alex O'Connell | 91b4f7f13c | another readme tweak | 2024-08-17 13:30:58 -04:00
Alex O'Connell | decc05b8ac | Readme update | 2024-08-17 13:18:37 -04:00
Alex O'Connell | 094e469fd9 | don't use old builder version | 2024-08-17 13:14:24 -04:00
Alex O'Connell | 1d50c3f589 | fix wheel creation + bump llama.cpp version again | 2024-08-17 00:00:41 -04:00
Alex O'Connell | 9d0295b4f5 | clean up training docs | 2024-08-16 22:45:27 -04:00
Alex O'Connell | 0487a82f67 | make find_split better and make train.py convert prefix + suffix tokens to int | 2024-08-16 22:29:25 -04:00
Isabella Nightshade | 1ccbe62d3c | Fix Home LLM v1-v3 service call processing (#206) | 2024-08-17 00:01:31 +00:00
Witold Gren | e1284324cd | fix typo in script arguments (#205) | 2024-08-17 00:01:23 +00:00
Witold Gren | 13c233cfc0 | Reorganization of documentation and adding information more on how to train models (#204) | 2024-08-16 00:23:42 +00:00
Alex O'Connell | 75c74022d9 | add find split helper | 2024-08-14 21:41:08 -04:00
Alex O'Connell | fc6c962afd | add training script fixes | 2024-08-13 08:30:11 -04:00
Alex O'Connell | 1aa19ec0d9 | Merge branch 'main' into develop | 2024-08-11 21:38:44 -04:00
Alex O'Connell | f6cb96938c | Merge pull request #199 from acon96/release/v0.3.4-fixed (Release v0.3.4 (fixed)) (tag: v0.3.4) | 2024-08-11 21:37:41 -04:00
Alex O'Connell | 47ce389ca1 | Update readme | 2024-08-11 21:34:01 -04:00
Alex O'Connell | 48e5205cd6 | housekeeping | 2024-08-11 21:24:49 -04:00
Alex O'Connell | 2f172acf1b | update minimum HA version | 2024-08-11 21:20:48 -04:00
Alex O'Connell | d9ff5026e8 | fix issue with custom_custom_serializer | 2024-08-11 21:19:45 -04:00
Alex O'Connell | 984954d317 | Add a platform so agents show up in the UI correctly + bump llama-cpp-python | 2024-08-11 18:11:50 -04:00
Witold Gren | 2837af8443 | Added full translations for all languages during generate data and creating default prompt system (#196) | 2024-08-11 22:05:20 +00:00
Witold Gren | cf89f9f478 | Added support for Polish language (#193) | 2024-08-04 19:19:10 +00:00
Alex O'Connell | 11f966bc30 | Merge branch 'main' into develop | 2024-07-29 21:45:08 -04:00
Alex O'Connell | 80e07fd3b6 | Merge pull request #192 from acon96/release/v0.3.4 (Release v0.3.4) | 2024-07-29 21:44:16 -04:00
Alex O'Connell | 9c8dd2576a | Release v0.3.4 | 2024-07-29 21:42:13 -04:00
Alex O'Connell | 21ecc293ee | better error handling | 2024-07-29 21:38:25 -04:00
Alex O'Connell | 00be01c49f | fix build script and update included llama cpp version | 2024-07-29 21:23:56 -04:00
Alex O'Connell | bf59e5b082 | reset manifest because that isn't a thing | 2024-06-16 15:10:21 -04:00
Alex O'Connell | 9a6cde3684 | detect experimental models in integration | 2024-06-15 20:19:23 -04:00