| Author | Commit | Message | Date |
|---|---|---|---|
| Alex O'Connell | adae87addd | training fixes, default values + other fixes | 2024-04-21 23:40:28 -04:00 |
| Alex O'Connell | 3326bd7d6e | handle other languages in component | 2024-04-21 20:38:46 -04:00 |
| Alex O'Connell | d61b9b9242 | wrong version number | 2024-04-11 00:23:13 -04:00 |
| Alex O'Connell | 7262a2057a | add min p and typical p samplers | 2024-04-10 23:55:01 -04:00 |
| Alex O'Connell | 1b22c06215 | finish agent tests | 2024-04-10 23:22:22 -04:00 |
| Alex O'Connell | 1577950137 | more tests, fix missing default options, and load ICL as utf8 | 2024-04-08 20:58:07 -04:00 |
| Alex O'Connell | 9c3a3db696 | more random settings | 2024-04-07 22:27:13 -04:00 |
| Alex O'Connell | 5def7669f0 | add recommended models | 2024-04-06 22:55:52 -04:00 |
| Alex O'Connell | 5afc5014eb | Release v0.2.11 | 2024-04-06 20:13:01 -04:00 |
| Alex O'Connell | 793b36f215 | Merge branch 'feature/prime-kv-cache' into develop | 2024-04-06 18:24:18 -04:00 |
| Alex O'Connell | f226dda4fc | Use github actions to build wheels | 2024-04-06 18:03:53 -04:00 |
| Alex O'Connell | 4058a42ee5 | clean up UI + hook up other llama.cpp settings | 2024-03-31 17:08:52 -04:00 |
| Alex O'Connell | fdd9f1bc67 | wire up llama cpp runtime options | 2024-03-31 11:08:07 -04:00 |
| Alex O'Connell | 0008f6bb5e | hook up prompt caching to config flow | 2024-03-30 16:50:24 -04:00 |
| Alex O'Connell | bf04cc3e6e | remove default host since it didn't make sense | 2024-03-24 11:02:49 -04:00 |
| Alex O'Connell | 46e1c4fc1d | reset model_config when starting the configflow | 2024-03-24 10:56:50 -04:00 |
| Alex O'Connell | ac7b71ca4f | Make initial configuration easier + rewrite quickstart guide | 2024-03-24 00:05:07 -04:00 |
| Alex O'Connell | fa31682c51 | working version of in context examples | 2024-03-20 23:03:31 -04:00 |
| Alex O'Connell | 4978901412 | start working on icl examples | 2024-03-07 18:14:44 -05:00 |
| Alex O'Connell | 4f6ed08be9 | split out service call argument allow list + properly parse rgb color arguments | 2024-03-06 17:58:14 -05:00 |
| Alex O'Connell | 1c5414b8af | fix ollama keep alive properly + check if model exists for ollama too | 2024-03-04 22:40:22 -05:00 |
| Alex O'Connell | ff13770f7e | add Zephyr prompt format | 2024-02-08 20:40:05 -05:00 |
| Alex O'Connell | cc48465575 | Add support for HTTPS endpoints | 2024-02-08 19:26:58 -05:00 |
| Isabella Nightshade | b1fb7cf184 | Config to remember conversation/limit number of messages (#53) | 2024-02-08 01:21:28 +00:00 |
| Alex O'Connell | 8e4c602685 | Merge branch 'feature/proper-functioncalling-args' into develop | 2024-01-27 15:22:07 -05:00 |
| Alex O'Connell | 3134c4f954 | update info about model in readme | 2024-01-27 15:19:10 -05:00 |
| Alex O'Connell | 946623713f | add "extra exposed attributes" to dataset as function call arguments + fix pile template inconsistencies | 2024-01-26 22:36:34 -05:00 |
| Alex O'Connell | 5860028990 | fix chatml prompt for multi-turn conversations | 2024-01-26 21:18:12 -05:00 |
| Alex O'Connell | 1594844962 | fix auth, add llama-cpp-python server backend, better in-app docs | 2024-01-21 21:38:23 -05:00 |
| Alex O'Connell | 27963d06f9 | completions endpoint works but it doesn't apply prompt formatting | 2024-01-19 00:00:29 -05:00 |
| Alex O'Connell | d6b0c32aa3 | make upgrades not break the options window | 2024-01-16 20:28:52 -05:00 |
| Alex O'Connell | db0a3a317c | expose a bunch more options + add auth to remote backends | 2024-01-15 23:11:14 -05:00 |
| Alex O'Connell | 9c4c25585c | Expose more settings in UI + mark new models as still unavailable for now | 2024-01-14 11:39:53 -05:00 |
| Alex O'Connell | 62adce7a22 | add generic openai backend type | 2024-01-13 23:56:32 -05:00 |
| Alex O'Connell | 1e5207e899 | Add new prompt formats + fix exposed entities | 2024-01-13 14:20:25 -05:00 |
| Alex O'Connell | fb20caefe2 | allow exposing some entity attributes + work on climate type | 2024-01-06 16:06:02 -05:00 |
| Alex O'Connell | e2a6bfa8c3 | start adding new device types, handle json function calling in HA component, and add more data for underrepresented existing device types | 2024-01-04 23:34:36 -05:00 |
| Alex O'Connell | a026ea7ea9 | fix wheel installation + add output grammar | 2023-12-31 23:25:17 -05:00 |
| Alex O'Connell | 6e348ac472 | make the addon actually work | 2023-12-28 13:46:21 -05:00 |
| Alex O'Connell | 17739b6a4d | prevent loading multiple models at once locally | 2023-12-22 22:57:10 -05:00 |
| Alex O'Connell | 5f87824b38 | multi step config flow | 2023-12-22 17:04:29 -05:00 |
| Alex O'Connell | 387c46ee71 | more docs + work on local backend in HA component | 2023-12-20 23:12:21 -05:00 |
| Alex O'Connell | 3ae9053b2f | set defaults properly | 2023-12-14 00:48:13 -05:00 |
| Alex O'Connell | 1d5f8288c3 | make home assistant component use chatml format | 2023-12-14 00:27:34 -05:00 |
| Alex O'Connell | bbca917c19 | version that actually works | 2023-10-30 23:15:27 -04:00 |
| Alex O'Connell | cb7cc1ed53 | more plugin work. it passes info around properly | 2023-10-29 23:22:59 -04:00 |
| Alex O'Connell | 33c9161e61 | add prompt building to hass component | 2023-10-28 22:15:18 -04:00 |
| Alex O'Connell | 7c1ba32caf | rename integration folder | 2023-10-26 21:48:36 -04:00 |