Author | Commit | Message | Date
Alex O'Connell | 569f6e848a | properly handle colons in ollama model names | 2024-03-05 17:39:41 -05:00
Alex O'Connell | 1c5414b8af | fix ollama keep alive properly + check if model exists for ollama too | 2024-03-04 22:40:22 -05:00
Alex O'Connell | 7b0b021b59 | handle voice assistant aliases as duplicate devices | 2024-03-01 22:59:38 -05:00
Alex O'Connell | b50919340b | Ollama keep_alive + better docs | 2024-03-01 22:07:03 -05:00
Alex O'Connell | 3f3fb21cd9 | Handle python 3.12 upgrade smoother + make utils file | 2024-02-16 23:44:48 -05:00
Alex O'Connell | cc48465575 | Add support for HTTPS endpoints | 2024-02-08 19:26:58 -05:00
Isabella Nightshade | b1fb7cf184 | Config to remember conversation/limit number of messages (#53) | 2024-02-08 01:21:28 +00:00
Alex O'Connell | 2f4d157555 | remove extra else branch | 2024-01-30 22:08:14 -05:00
Alex O'Connell | 96d5c2a809 | fix for unusually formatted vol schemas in service registry | 2024-01-29 20:35:35 -05:00
Alex O'Connell | 8e4c602685 | Merge branch 'feature/proper-functioncalling-args' into develop | 2024-01-27 15:22:07 -05:00
Alex O'Connell | be99ac2d12 | fix missing setup config for text-generation-webui + pass arguments to service calls | 2024-01-27 14:50:57 -05:00
Fixt | 107f8cd740 | fix: Max tokens setting for Ollama API (#40) | 2024-01-26 22:59:54 +00:00
Alex O'Connell | 9cdedb021a | expose arguments to function calls via HA integration | 2024-01-25 20:46:59 -05:00
Alex O'Connell | a9b042de1d | Fix api key on load model for text-generation-webui | 2024-01-25 20:23:57 -05:00
Fixt | a43c6ed27e | feat: Ollama API Support (#34) | 2024-01-23 02:10:50 +00:00
Alex O'Connell | 1594844962 | fix auth, add llama-cpp-python server backend, better in-app docs | 2024-01-21 21:38:23 -05:00
Alex O'Connell | 7c30bb57cf | split up even more + add llama-cpp-python server | 2024-01-21 16:25:50 -05:00
Alex O'Connell | cc3dd4884a | rewrite using class inheritance to cut down on repeated code | 2024-01-21 14:38:06 -05:00
Alex O'Connell | 9290d9b388 | properly pass chat mode + docs | 2024-01-21 13:07:19 -05:00
Alex O'Connell | 27963d06f9 | completions endpoint works but it doesn't apply prompt formatting | 2024-01-19 00:00:29 -05:00
Alex O'Connell | 96d095f8c5 | start working on chat completions API | 2024-01-18 20:14:56 -05:00
Alex O'Connell | 2a0dbd8806 | fix training script for Phi-2 lora | 2024-01-16 21:47:15 -05:00
Alex O'Connell | b1dd2353c9 | add docs + fix preset not showing up in UI | 2024-01-16 20:51:50 -05:00
Alex O'Connell | d6b0c32aa3 | make upgrades not break the options window | 2024-01-16 20:28:52 -05:00
Alex O'Connell | 351fb8a5c4 | fix config entries not reloading properly | 2024-01-15 23:21:33 -05:00
Alex O'Connell | db0a3a317c | expose a bunch more options + add auth to remote backends | 2024-01-15 23:11:14 -05:00
Alex O'Connell | 83cfb89e65 | fixes from looking at release merge | 2024-01-14 12:45:35 -05:00
Alex O'Connell | 9c4c25585c | Expose more settings in UI + mark new models as still unavailable for now | 2024-01-14 11:39:53 -05:00
Alex O'Connell | ec69334ab0 | fix bug + update readme | 2024-01-14 11:01:33 -05:00
Alex O'Connell | 62adce7a22 | add generic openai backend type | 2024-01-13 23:56:32 -05:00
Alex O'Connell | 4aab796cd3 | working with wheel install + grammar | 2024-01-13 23:01:19 -05:00
Alex O'Connell | 70a02d68f9 | check if model is already loaded | 2024-01-13 15:57:05 -05:00
Alex O'Connell | 59984c1b62 | handle old and new model formats properly | 2024-01-13 15:48:19 -05:00
Alex O'Connell | 9bfd3bc49d | fix outputs | 2024-01-13 15:37:28 -05:00
Alex O'Connell | 1e5207e899 | Add new prompt formats + fix exposed entities | 2024-01-13 14:20:25 -05:00
Alex O'Connell | fac7cc2b03 | start working on rgb lighting | 2024-01-11 00:43:18 -05:00
Alex O'Connell | 5c34c7e3b9 | climate should mostly work + make media status requests better | 2024-01-11 00:18:23 -05:00
Alex O'Connell | fb20caefe2 | allow exposing some entity attributes + work on climate type | 2024-01-06 16:06:02 -05:00
Alex O'Connell | e2a6bfa8c3 | start adding new device types, handle json function calling in HA component, and add more data for underrepresented existing device types | 2024-01-04 23:34:36 -05:00
Alex O'Connell | a026ea7ea9 | fix wheel installation + add output grammar | 2023-12-31 23:25:17 -05:00
Alex O'Connell | 6e348ac472 | make the addon actually work | 2023-12-28 13:46:21 -05:00
Alex O'Connell | 2b880136b9 | add warning + only run services for exposed entities | 2023-12-27 00:10:31 -05:00
Alex O'Connell | 17739b6a4d | prevent loading multiple models at once locally | 2023-12-22 22:57:10 -05:00
Alex O'Connell | c904665f0a | add validation to remote backend connections | 2023-12-22 18:45:36 -05:00
Alex O'Connell | 5f87824b38 | multi step config flow | 2023-12-22 17:04:29 -05:00
Alex O'Connell | 387c46ee71 | more docs + work on local backend in HA component | 2023-12-20 23:12:21 -05:00
Alex O'Connell | 1d5f8288c3 | make home assistant component use chatml format | 2023-12-14 00:27:34 -05:00
Alex O'Connell | bbca917c19 | version that actually works | 2023-10-30 23:15:27 -04:00
Alex O'Connell | cb7cc1ed53 | more plugin work. it passes info around properly | 2023-10-29 23:22:59 -04:00
Alex O'Connell | 33c9161e61 | add prompt building to hass component | 2023-10-28 22:15:18 -04:00