Commit Graph

551 Commits

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Alex O'Connell | 1e79426f1a | clean up docs with new install steps + remove deprecated addon | 2025-09-21 20:29:05 -04:00 |
| Alex O'Connell | d339bfa1bd | fix output parsing to exclude multi-token prefixes | 2025-09-21 13:01:54 -04:00 |
| Alex O'Connell | dda3f99208 | backends should all work now | 2025-09-20 11:25:44 -04:00 |
| Alex O'Connell | b02118705d | fix migration + config sub-entry setup | 2025-09-17 18:24:08 -04:00 |
| Alex O'Connell | bfc6a5a753 | work through todos | 2025-09-15 22:10:49 -04:00 |
| Alex O'Connell | 18a34a3d5b | more code cleanup and config_flow fixes | 2025-09-15 22:10:49 -04:00 |
| Alex O'Connell | 0e6a86315d | generic openai, llama.cpp and ollama should all work now | 2025-09-15 22:10:49 -04:00 |
| Alex O'Connell | 33362bf9b2 | config sub-entry flow is mostly working now | 2025-09-15 22:10:49 -04:00 |
| Alex O'Connell | bcfa1dca0a | continue rewrite of config flow to support sub entries | 2025-09-15 22:10:49 -04:00 |
| Alex O'Connell | 4245c665b5 | start working on config subentries | 2025-09-15 22:10:25 -04:00 |
| Alex O'Connell | 18575c7eb3 | decouple entity options from llm backend classes to allow model re-use between entities | 2025-09-15 22:10:25 -04:00 |
| Alex O'Connell | 8d40fbf095 | fix labels | 2025-09-15 22:10:25 -04:00 |
| Alex O'Connell | c2b8461633 | clean up the options page | 2025-09-15 22:10:25 -04:00 |
| Alex O'Connell | 1394dcb5d2 | fix openai responses backend | 2025-09-15 22:10:25 -04:00 |
| Alex O'Connell | 1425413fc9 | add docker compose stack for testing + backends are mostly working at this point | 2025-09-15 22:10:25 -04:00 |
| Alex O'Connell | 9b8baeed07 | make response parsing universal + working generic openai/llamacpp | 2025-09-15 22:10:25 -04:00 |
| Alex O'Connell | 82ebf988aa | random changes | 2025-09-15 22:10:25 -04:00 |
| Alex O'Connell | 05e3ceff7b | ollama and generic openai theoretically work | 2025-09-15 22:10:25 -04:00 |
| Alex O'Connell | 61d52ae4d1 | actually working llamacpp agent | 2025-09-15 22:10:25 -04:00 |
| Alex O'Connell | 69f5464bf7 | sort of working llamacpp agent | 2025-09-15 22:10:25 -04:00 |
| Alex O'Connell | 843f99d64a | mostly working implementation. still needs debugging | 2025-09-15 22:10:09 -04:00 |
| Alex O'Connell | da0a0e4dbc | remove intermediate dict format and pass around home assistant model object | 2025-09-15 22:10:09 -04:00 |
| Alex O'Connell | 53052af641 | Split backends into separate files and start implementing streaming + tool support | 2025-09-15 22:10:09 -04:00 |
| Alex O'Connell | d48ccbc271 | Merge branch 'main' into develop | 2025-09-15 22:09:54 -04:00 |
| Alex O'Connell | 8cb2d33205 | Merge pull request #294 from acon96/release/v0.3.11 (Release v0.3.11) (tag: v0.3.11) | 2025-09-15 22:09:21 -04:00 |
| Alex O'Connell | fc49f61a49 | Release v0.3.11 | 2025-09-15 22:03:35 -04:00 |
| Alex O'Connell | dfd7f2ba9b | Merge pull request #289 from rgeorgi/fix-migration-code (Removing async_migrate_engine For HomeAssistant Breaking Change) | 2025-08-31 13:59:42 -04:00 |
| Ryan Georgi | 650d55c016 | Removing async_migrate_engine (Removing the async_migrate_engine call on assist_pipeline to address breaking change in homeassistant) | 2025-08-30 17:39:54 -07:00 |
| Alex O'Connell | 016e8fae95 | Merge pull request #288 from PooruTorie/patch-1 (Fix area_id) | 2025-08-27 20:59:06 -04:00 |
| Paul Triebel | 62f6dcb047 | Fix area_id (Fix the usage of the area_id variable to also use device area) | 2025-08-27 15:16:01 +02:00 |
| Alex O'Connell | ebd001fee2 | Merge branch 'main' into develop | 2025-07-05 14:07:49 -04:00 |
| Alex O'Connell | fdb726d27f | Merge pull request #281 from acon96/release/v0.3.10 (Release v0.3.10) (tag: v0.3.10) | 2025-07-05 14:07:26 -04:00 |
| Alex O'Connell | 6fc6ceb273 | Release v0.3.10 | 2025-07-05 13:56:15 -04:00 |
| Alex O'Connell | b8de866113 | Fix for most recent HA release | 2025-07-05 13:53:01 -04:00 |
| Alex O'Connell | dc1e7613b8 | Update readme with new model + update util scripts | 2025-06-05 00:14:24 -04:00 |
| Alex O'Connell | e203e36cc9 | update TODO | 2025-06-05 00:14:24 -04:00 |
| Alex O'Connell | e676c3f694 | Merge pull request #279 from sredman/work/sredman/openai-responses (Implement basic OpenAI Responses API) | 2025-06-01 22:01:29 -04:00 |
| Simon Redman | b2958b62d5 | Correctly match expected API field for successful status | 2025-06-01 00:32:32 -04:00 |
| Simon Redman | 4e3f2f76f5 | Add logging | 2025-06-01 00:07:58 -04:00 |
| Simon Redman | 18b076dbea | Consume user-configured time when considering last_response_id usage | 2025-05-31 23:57:56 -04:00 |
| Simon Redman | 647e233925 | Add config flow for time to remember current conversation | 2025-05-31 23:53:24 -04:00 |
| Simon Redman | aeac99ab22 | Add time-based handling of previous_conversation_id | 2025-05-31 23:38:12 -04:00 |
| Simon Redman | a2be61154a | Update human-readable config names to fit in with the generic_openai strings | 2025-05-31 20:03:12 -04:00 |
| Simon Redman | dd11d1d040 | Implement _extract_response for responses API | 2025-05-31 16:31:03 -04:00 |
| Simon Redman | 7ad8d03dd0 | Offload responsibility for _extract_response to child classes | 2025-05-31 16:30:48 -04:00 |
| Simon Redman | 92042f629d | Tweak config flow to remove chat completions switch from response flow | 2025-05-31 16:30:22 -04:00 |
| Simon Redman | 8a059fab29 | Extract most of _async_generate to helper method | 2025-05-31 15:24:31 -04:00 |
| Simon Redman | 82621674e4 | Move _extract_response to base class | 2025-05-31 15:05:38 -04:00 |
| Simon Redman | c9c03fc39d | Introduce BaseOpenAICompatibleAPIAgent | 2025-05-31 15:05:38 -04:00 |
| Simon Redman | 057cce485f | Rough pass implement GenericOpenAIResponsesAPIAgent | 2025-05-31 15:05:38 -04:00 |