Author | Commit | Message | Date
eddie | 0b91769baf | Ensuring tool_calls in deltas dict is not None | 2025-11-16 19:57:54 -07:00
eddie | 7b9f8df0a4 | Removed extra version segment from OpenAI API call | 2025-11-16 19:51:41 -07:00
Alex O'Connell | b27d3c1494 | Merge branch 'main' into develop | 2025-11-02 13:00:43 -05:00
Alex O'Connell | 08d3c6d2ee | Release v0.4.3 | 2025-11-02 12:58:36 -05:00
Alex O'Connell | 03989e37b5 | add initial implementation for ai task entities | 2025-10-26 21:47:23 -04:00
Oscar | 0ac4c3afcf | Fix config_flow set of model_config | 2025-10-26 19:56:10 +01:00
Alex O'Connell | 4c678726d4 | Release v0.4.2 | 2025-10-25 23:17:29 -04:00
Alex O'Connell | 0206673303 | use qwen3 instead of mistral as ollama example | 2025-10-25 22:57:29 -04:00
Alex O'Connell | 050a539f72 | fix issue with model default value detection, fix model tool usage with home-llm api, and add better descriptions for certain settings | 2025-10-25 22:42:47 -04:00
Alex O'Connell | f50997d1a3 | properly clear llm API when updating settings | 2025-10-23 21:22:33 -04:00
Alex O'Connell | 455ad3f3d2 | Merge pull request #306 from syn-nick/develop: Handle empty finish_reason | 2025-10-23 21:20:18 -04:00
loopy321 | 3443ae3f47 | Update config_flow.py to fix slicer errors (as per https://github.com/acon96/home-llm/issues/303) | 2025-10-22 08:33:09 -05:00
syn-nick | 089aa8cff0 | handle empty finish_reason | 2025-10-22 11:19:30 +02:00
Alex O'Connell | e28132a17a | Release v0.4.1 | 2025-10-12 10:37:46 -04:00
Alex O'Connell | 5cb910d71c | add missing downloaded file config attribute | 2025-10-12 10:28:26 -04:00
Alex O'Connell | fed5aa8309 | Release v0.4 | 2025-10-08 21:27:24 -04:00
Alex O'Connell | 2df454985d | Build llama.cpp wheels in forked repo + support reinstallation | 2025-10-08 21:19:06 -04:00
Alex O'Connell | 286cf9a888 | build all cpu backend variants for releases | 2025-10-05 14:27:29 -04:00
Alex O'Connell | a508a53d37 | Fix a setup issue with invalid hostnames | 2025-10-04 11:37:52 -04:00
Alex O'Connell | 1e79426f1a | clean up docs with new install steps + remove deprecated addon | 2025-09-21 20:29:05 -04:00
Alex O'Connell | d339bfa1bd | fix output parsing to exclude multi-token prefixes | 2025-09-21 13:01:54 -04:00
Alex O'Connell | dda3f99208 | backends should all work now | 2025-09-20 11:25:44 -04:00
Alex O'Connell | b02118705d | fix migration + config sub-entry setup | 2025-09-17 18:24:08 -04:00
Alex O'Connell | bfc6a5a753 | work through todos | 2025-09-15 22:10:49 -04:00
Alex O'Connell | 18a34a3d5b | more code cleanup and config_flow fixes | 2025-09-15 22:10:49 -04:00
Alex O'Connell | 0e6a86315d | generic openai, llama.cpp and ollama should all work now | 2025-09-15 22:10:49 -04:00
Alex O'Connell | 33362bf9b2 | config sub-entry flow is mostly working now | 2025-09-15 22:10:49 -04:00
Alex O'Connell | bcfa1dca0a | continue rewrite of config flow to support sub entries | 2025-09-15 22:10:49 -04:00
Alex O'Connell | 4245c665b5 | start working on config subentries | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 18575c7eb3 | decouple entity options from llm backend classes to allow model re-use between entities | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 8d40fbf095 | fix labels | 2025-09-15 22:10:25 -04:00
Alex O'Connell | c2b8461633 | clean up the options page | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 1394dcb5d2 | fix openai responses backend | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 1425413fc9 | add docker compose stack for testing + backends are mostly working at this point | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 9b8baeed07 | make response parsing universal + working generic openai/llamacpp | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 82ebf988aa | random changes | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 05e3ceff7b | ollama and generic openai theoretically work | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 61d52ae4d1 | actually working llamacpp agent | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 69f5464bf7 | sort of working llamacpp agent | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 843f99d64a | mostly working implementation. still needs debugging | 2025-09-15 22:10:09 -04:00
Alex O'Connell | da0a0e4dbc | remove intermediate dict format and pass around home assistant model object | 2025-09-15 22:10:09 -04:00
Alex O'Connell | 53052af641 | Split backends into separate files and start implementing streaming + tool support | 2025-09-15 22:10:09 -04:00
Alex O'Connell | fc49f61a49 | Release v0.3.11 | 2025-09-15 22:03:35 -04:00
Ryan Georgi | 650d55c016 | Removing async_migrate_engine: removes the async_migrate_engine call on assist_pipeline to address a breaking change in Home Assistant | 2025-08-30 17:39:54 -07:00
Paul Triebel | 62f6dcb047 | Fix area_id: fix the usage of the area_id variable to also use device area | 2025-08-27 15:16:01 +02:00
Alex O'Connell | 6fc6ceb273 | Release v0.3.10 | 2025-07-05 13:56:15 -04:00
Alex O'Connell | b8de866113 | Fix for most recent HA release | 2025-07-05 13:53:01 -04:00
Simon Redman | b2958b62d5 | Correctly match expected API field for successful status | 2025-06-01 00:32:32 -04:00
Simon Redman | 4e3f2f76f5 | Add logging | 2025-06-01 00:07:58 -04:00
Simon Redman | 18b076dbea | Consume user-configured time when considering last_response_id usage | 2025-05-31 23:57:56 -04:00