Commit     | Author         | Date                       | Message
3443ae3f47 | loopy321       | 2025-10-22 08:33:09 -05:00 | Update config_flow.py to fix slicer errors (As per https://github.com/acon96/home-llm/issues/303)
089aa8cff0 | syn-nick       | 2025-10-22 11:19:30 +02:00 | handle empty finish_reason
99aea64f57 | Alex O'Connell | 2025-10-12 10:39:46 -04:00 | Merge branch 'main' into develop
5676733abb | Alex O'Connell | 2025-10-12 10:39:15 -04:00 | Merge pull request #300 from acon96/release/v0.4.1: Release v0.4.1 (tag: v0.4.1)
e28132a17a | Alex O'Connell | 2025-10-12 10:37:46 -04:00 | Release v0.4.1
5cb910d71c | Alex O'Connell | 2025-10-12 10:28:26 -04:00 | add missing downloaded file config attribute
43632eef88 | Alex O'Connell | 2025-10-08 21:40:55 -04:00 | Merge branch 'main' into develop
3b5417b3a9 | Alex O'Connell | 2025-10-08 21:40:10 -04:00 | Merge pull request #297 from acon96/release/v0.4: Release v0.4 (tag: v0.4)
fed5aa8309 | Alex O'Connell | 2025-10-08 21:27:24 -04:00 | Release v0.4
2df454985d | Alex O'Connell | 2025-10-08 21:19:06 -04:00 | Build llama.cpp wheels in forked repo + support reinstallation
286cf9a888 | Alex O'Connell | 2025-10-05 14:27:29 -04:00 | build all cpu backend variants for releases
a508a53d37 | Alex O'Connell | 2025-10-04 11:37:52 -04:00 | Fix a setup issue with invalid hostnames
78fe539078 | Alex O'Connell | 2025-09-21 20:42:19 -04:00 | Merge pull request #283 from acon96/feature/agents-and-tools: Agentic Tool Use & Config Entry re-write
1e79426f1a | Alex O'Connell | 2025-09-21 20:29:05 -04:00 | clean up docs with new install steps + remove deprecated addon
d339bfa1bd | Alex O'Connell | 2025-09-21 13:01:54 -04:00 | fix output parsing to exclude multi-token prefixes
dda3f99208 | Alex O'Connell | 2025-09-20 11:25:44 -04:00 | backends should all work now
b02118705d | Alex O'Connell | 2025-09-17 18:24:08 -04:00 | fix migration + config sub-entry setup
bfc6a5a753 | Alex O'Connell | 2025-09-15 22:10:49 -04:00 | work through todos
18a34a3d5b | Alex O'Connell | 2025-09-15 22:10:49 -04:00 | more code cleanup and config_flow fixes
0e6a86315d | Alex O'Connell | 2025-09-15 22:10:49 -04:00 | generic openai, llama.cpp and ollama should all work now
33362bf9b2 | Alex O'Connell | 2025-09-15 22:10:49 -04:00 | config sub-entry flow is mostly working now
bcfa1dca0a | Alex O'Connell | 2025-09-15 22:10:49 -04:00 | continue rewrite of config flow to support sub entries
4245c665b5 | Alex O'Connell | 2025-09-15 22:10:25 -04:00 | start working on config subentries
18575c7eb3 | Alex O'Connell | 2025-09-15 22:10:25 -04:00 | decouple entity options from llm backend classes to allow model re-use between entities
8d40fbf095 | Alex O'Connell | 2025-09-15 22:10:25 -04:00 | fix labels
c2b8461633 | Alex O'Connell | 2025-09-15 22:10:25 -04:00 | clean up the options page
1394dcb5d2 | Alex O'Connell | 2025-09-15 22:10:25 -04:00 | fix openai responses backend
1425413fc9 | Alex O'Connell | 2025-09-15 22:10:25 -04:00 | add docker compose stack for testing + backends are mostly working at this point
9b8baeed07 | Alex O'Connell | 2025-09-15 22:10:25 -04:00 | make response parsing universal + working generic openai/llamacpp
82ebf988aa | Alex O'Connell | 2025-09-15 22:10:25 -04:00 | random changes
05e3ceff7b | Alex O'Connell | 2025-09-15 22:10:25 -04:00 | ollama and generic openai theoretically work
61d52ae4d1 | Alex O'Connell | 2025-09-15 22:10:25 -04:00 | actually working llamacpp agent
69f5464bf7 | Alex O'Connell | 2025-09-15 22:10:25 -04:00 | sort of working llamacpp agent
843f99d64a | Alex O'Connell | 2025-09-15 22:10:09 -04:00 | mostly working implementation. still needs debugging
da0a0e4dbc | Alex O'Connell | 2025-09-15 22:10:09 -04:00 | remove intermediate dict format and pass around home assistant model object
53052af641 | Alex O'Connell | 2025-09-15 22:10:09 -04:00 | Split backends into separate files and start implementing streaming + tool support
d48ccbc271 | Alex O'Connell | 2025-09-15 22:09:54 -04:00 | Merge branch 'main' into develop
8cb2d33205 | Alex O'Connell | 2025-09-15 22:09:21 -04:00 | Merge pull request #294 from acon96/release/v0.3.11: Release v0.3.11 (tag: v0.3.11)
fc49f61a49 | Alex O'Connell | 2025-09-15 22:03:35 -04:00 | Release v0.3.11
dfd7f2ba9b | Alex O'Connell | 2025-08-31 13:59:42 -04:00 | Merge pull request #289 from rgeorgi/fix-migration-code: Removing async_migrate_engine For HomeAssistant Breaking Change
650d55c016 | Ryan Georgi    | 2025-08-30 17:39:54 -07:00 | Removing async_migrate_engine: Removing the async_migrate_engine call on assist_pipeline to address breaking change in homeassistant.
016e8fae95 | Alex O'Connell | 2025-08-27 20:59:06 -04:00 | Merge pull request #288 from PooruTorie/patch-1: Fix area_id
62f6dcb047 | Paul Triebel   | 2025-08-27 15:16:01 +02:00 | Fix area_id: Fix the usage of the area_id variable to also use device area
ebd001fee2 | Alex O'Connell | 2025-07-05 14:07:49 -04:00 | Merge branch 'main' into develop
fdb726d27f | Alex O'Connell | 2025-07-05 14:07:26 -04:00 | Merge pull request #281 from acon96/release/v0.3.10: Release v0.3.10 (tag: v0.3.10)
6fc6ceb273 | Alex O'Connell | 2025-07-05 13:56:15 -04:00 | Release v0.3.10
b8de866113 | Alex O'Connell | 2025-07-05 13:53:01 -04:00 | Fix for most recent HA release
dc1e7613b8 | Alex O'Connell | 2025-06-05 00:14:24 -04:00 | Update readme with new model + update util scripts
e203e36cc9 | Alex O'Connell | 2025-06-05 00:14:24 -04:00 | update TODO
e676c3f694 | Alex O'Connell | 2025-06-01 22:01:29 -04:00 | Merge pull request #279 from sredman/work/sredman/openai-responses: Implement basic OpenAI Responses API
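
Commit 089aa8cff0 ("handle empty finish_reason") in the table above concerns OpenAI-compatible backends that send a missing or blank `finish_reason` on streamed chunks. The snippet below is a minimal sketch of that general idea, not the actual patch; the helper name `is_terminal_chunk` and the set of terminal reasons are assumptions for illustration.

```python
# Hypothetical illustration of "handle empty finish_reason"; not the code from 089aa8cff0.
# Some OpenAI-compatible servers report None or "" on intermediate streaming chunks,
# so only an explicit terminal reason should end the response.
TERMINAL_REASONS = {"stop", "length", "tool_calls", "content_filter"}

def is_terminal_chunk(finish_reason: str | None) -> bool:
    """Return True only when the server reports an explicit terminal finish_reason."""
    if not finish_reason:  # covers both None and the empty string
        return False
    return finish_reason in TERMINAL_REASONS
```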
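Similarly, commit 62f6dcb047 ("Fix area_id: Fix the usage of the area_id variable to also use device area") describes falling back to the device's area when an entity has no area of its own. A rough sketch of that pattern using Home Assistant's entity and device registries follows; the function name `resolve_area_id` is hypothetical and this is not the diff merged in PR #288.

```python
# Illustrative sketch of the "also use device area" idea from 62f6dcb047.
from homeassistant.core import HomeAssistant
from homeassistant.helpers import device_registry as dr, entity_registry as er

def resolve_area_id(hass: HomeAssistant, entity_id: str) -> str | None:
    """Prefer the entity's own area_id, otherwise fall back to its device's area."""
    entity_entry = er.async_get(hass).async_get(entity_id)
    if entity_entry is None:
        return None
    if entity_entry.area_id:
        return entity_entry.area_id
    if entity_entry.device_id:
        device_entry = dr.async_get(hass).async_get(entity_entry.device_id)
        if device_entry is not None:
            return device_entry.area_id
    return None
```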