313 Commits

Author          SHA1        Message                                                                                    Date
Alex O'Connell  286cf9a888  build all cpu backend variants for releases                                                2025-10-05 14:27:29 -04:00
Alex O'Connell  a508a53d37  Fix a setup issue with invalid hostnames                                                   2025-10-04 11:37:52 -04:00
Alex O'Connell  1e79426f1a  clean up docs with new install steps + remove deprecated addon                             2025-09-21 20:29:05 -04:00
Alex O'Connell  d339bfa1bd  fix output parsing to exclude multi-token prefixes                                         2025-09-21 13:01:54 -04:00
Alex O'Connell  dda3f99208  backends should all work now                                                               2025-09-20 11:25:44 -04:00
Alex O'Connell  b02118705d  fix migration + config sub-entry setup                                                     2025-09-17 18:24:08 -04:00
Alex O'Connell  bfc6a5a753  work through todos                                                                         2025-09-15 22:10:49 -04:00
Alex O'Connell  18a34a3d5b  more code cleanup and config_flow fixes                                                    2025-09-15 22:10:49 -04:00
Alex O'Connell  0e6a86315d  generic openai, llama.cpp and ollama should all work now                                   2025-09-15 22:10:49 -04:00
Alex O'Connell  33362bf9b2  config sub-entry flow is mostly working now                                                2025-09-15 22:10:49 -04:00
Alex O'Connell  bcfa1dca0a  continue rewrite of config flow to support sub entries                                     2025-09-15 22:10:49 -04:00
Alex O'Connell  4245c665b5  start working on config subentries                                                         2025-09-15 22:10:25 -04:00
Alex O'Connell  18575c7eb3  decouple entity options from llm backend classes to allow model re-use between entities    2025-09-15 22:10:25 -04:00
Alex O'Connell  8d40fbf095  fix labels                                                                                 2025-09-15 22:10:25 -04:00
Alex O'Connell  c2b8461633  clean up the options page                                                                  2025-09-15 22:10:25 -04:00
Alex O'Connell  1394dcb5d2  fix openai responses backend                                                               2025-09-15 22:10:25 -04:00
Alex O'Connell  1425413fc9  add docker compose stack for testing + backends are mostly working at this point           2025-09-15 22:10:25 -04:00
Alex O'Connell  9b8baeed07  make response parsing universal + working generic openai/llamacpp                          2025-09-15 22:10:25 -04:00
Alex O'Connell  82ebf988aa  random changes                                                                             2025-09-15 22:10:25 -04:00
Alex O'Connell  05e3ceff7b  ollama and generic openai theoretically work                                               2025-09-15 22:10:25 -04:00
Alex O'Connell  61d52ae4d1  actually working llamacpp agent                                                            2025-09-15 22:10:25 -04:00
Alex O'Connell  69f5464bf7  sort of working llamacpp agent                                                             2025-09-15 22:10:25 -04:00
Alex O'Connell  843f99d64a  mostly working implementation. still needs debugging                                       2025-09-15 22:10:09 -04:00
Alex O'Connell  da0a0e4dbc  remove intermediate dict format and pass around home assistant model object                2025-09-15 22:10:09 -04:00
Alex O'Connell  53052af641  Split backends into separate files and start implementing streaming + tool support         2025-09-15 22:10:09 -04:00
Alex O'Connell  fc49f61a49  Release v0.3.11                                                                            2025-09-15 22:03:35 -04:00
Ryan Georgi     650d55c016  Removing async_migrate_engine                                                              2025-08-30 17:39:54 -07:00
                            Removing the async_migrate_engine call on assist_pipeline to address breaking change in homeassistant.
Paul Triebel    62f6dcb047  Fix area_id                                                                                2025-08-27 15:16:01 +02:00
                            Fix the usage of the area_id variable to also use device area
Alex O'Connell  6fc6ceb273  Release v0.3.10                                                                            2025-07-05 13:56:15 -04:00
Alex O'Connell  b8de866113  Fix for most recent HA release                                                             2025-07-05 13:53:01 -04:00
Simon Redman    b2958b62d5  Correctly match expected API field for successful status                                   2025-06-01 00:32:32 -04:00
Simon Redman    4e3f2f76f5  Add logging                                                                                2025-06-01 00:07:58 -04:00
Simon Redman    18b076dbea  Consume user-configured time when considering last_response_id usage                       2025-05-31 23:57:56 -04:00
Simon Redman    647e233925  Add config flow for time to remember current conversation                                  2025-05-31 23:53:24 -04:00
Simon Redman    aeac99ab22  Add time-based handling of previous_conversation_id                                        2025-05-31 23:38:12 -04:00
Simon Redman    a2be61154a  Update human-readable config names to fit in with the generic_openai strings               2025-05-31 20:03:12 -04:00
Simon Redman    dd11d1d040  Implement _extract_response for responses API                                              2025-05-31 16:31:03 -04:00
Simon Redman    7ad8d03dd0  Offload responsibility for _extract_response to child classes                              2025-05-31 16:30:48 -04:00
Simon Redman    92042f629d  Tweak config flow to remove chat completions switch from response flow                     2025-05-31 16:30:22 -04:00
Simon Redman    8a059fab29  Extract most of _async_generate to helper method                                           2025-05-31 15:24:31 -04:00
Simon Redman    82621674e4  Move _extract_response to base class                                                       2025-05-31 15:05:38 -04:00
Simon Redman    c9c03fc39d  Introduce BaseOpenAICompatibleAPIAgent                                                     2025-05-31 15:05:38 -04:00
Simon Redman    057cce485f  Rough pass implement GenericOpenAIResponsesAPIAgent                                        2025-05-31 15:05:38 -04:00
Simon Redman    58efdb9601  Clean up whitespace                                                                        2025-05-31 15:05:38 -04:00
Simon Redman    30d9f48006  Implement config flow for Open AI Responses API handler                                    2025-05-31 15:05:38 -04:00
Alex O'Connell  f361692dbe  Release v0.3.9                                                                             2025-05-26 21:22:01 -04:00
Alex O'Connell  a6fc8da458  re-enable conversation history that was accidentally commented out                         2025-05-26 21:20:07 -04:00
Alex O'Connell  e0c75038c9  use local build versions to comply with PEP wheel name convention                          2025-05-26 20:53:08 -04:00
Alex O'Connell  f7ca910dc5  Fix processor type remapping                                                               2025-04-21 16:54:38 -04:00
Alex O'Connell  fe39e481ba  set up release                                                                             2025-04-13 18:13:08 -04:00