Alex O'Connell
77fd4b63bb
readme/localization fixes
2026-01-04 10:03:00 -05:00
Alex O'Connell
0c6b0d229a
Release v0.4.6
2026-01-04 09:48:38 -05:00
Alex O'Connell
136a25f91b
code cleanup to remove explicit constant name references to openai and anthropic
2026-01-04 09:10:29 -05:00
Thomas
ec741fdf6b
Simplify backend label to 'Anthropic API'
2025-12-29 22:54:07 -05:00
Thomas
489c1fa1e0
Address PR feedback for Anthropic backend
- Remove cloud-first defaults, require base URL to be specified
- Use CONF_OPENAI_API_KEY instead of separate CONF_ANTHROPIC_API_KEY
- Merge anthropic_connection_schema into remote_connection_schema
- Use existing get_file_contents_base64 utility from utils
- Use Anthropic models list endpoint instead of hardcoded list
- Update anthropic SDK version requirement to >=0.75.0
- Remove trademark reference from UI labels
2025-12-29 22:43:53 -05:00
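A minimal sketch of the connection-schema shape the feedback above converges on: the Anthropic backend reuses the shared remote-connection fields and the existing CONF_OPENAI_API_KEY constant, with a required base URL instead of a cloud-first default. The CONF_REMOTE_URL name and the constant values are assumptions, not the repository's actual definitions.

```python
import voluptuous as vol

# Only the CONF_OPENAI_API_KEY name comes from the commit; the values and
# CONF_REMOTE_URL are assumptions for illustration.
CONF_OPENAI_API_KEY = "openai_api_key"   # reused for the Anthropic backend
CONF_REMOTE_URL = "remote_url"           # hypothetical base-URL field

# anthropic_connection_schema folded into the shared remote schema:
remote_connection_schema = vol.Schema(
    {
        vol.Required(CONF_REMOTE_URL): str,      # no cloud-first default
        vol.Optional(CONF_OPENAI_API_KEY): str,
    }
)
```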
Thomas
202bc4aade
Remove debug logging
2025-12-28 23:15:45 -05:00
Thomas
1c8f50e5f0
Fix: merge entry.data with options so API key and base_url are available
2025-12-28 23:08:03 -05:00
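A minimal sketch of the fix in 1c8f50e5f0, assuming options saved later should override the data stored at initial setup: read connection settings from a merge of the two ConfigEntry mappings so the API key and base_url stay visible after the options flow runs.

```python
from homeassistant.config_entries import ConfigEntry


def merged_settings(entry: ConfigEntry) -> dict:
    """Combine entry.data (initial setup) with entry.options (later edits)."""
    return {**entry.data, **entry.options}
```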
Thomas
c00e357b33
Add debug logging for model fetching
2025-12-28 23:07:14 -05:00
Thomas
fd0a1d4926
Fetch available models from API for compatible endpoints
2025-12-28 23:03:30 -05:00
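A hedged sketch of fetching the model list from an Anthropic-compatible endpoint as fd0a1d4926 describes. The anthropic SDK exposes client.models.list(); whether the integration goes through the SDK or issues a raw HTTP request to /v1/models is an assumption.

```python
import anthropic


def fetch_model_ids(api_key: str, base_url: str) -> list[str]:
    """Return the model ids advertised by the endpoint (SDK pages are iterable)."""
    client = anthropic.Anthropic(api_key=api_key, base_url=base_url)
    return [model.id for model in client.models.list()]
```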
Thomas
55248216b6
Fix auth headers for z.ai - remove Bearer prefix
2025-12-28 22:59:32 -05:00
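Read together with the two auth commits below, the header handling appears to converge on sending the raw key without a "Bearer " prefix, which z.ai rejects. The exact final header set is an assumption reconstructed from the commit subjects only.

```python
def compatible_auth_headers(api_key: str) -> dict[str, str]:
    """Headers for Anthropic-compatible gateways; this combination is an assumption."""
    return {
        "x-api-key": api_key,              # 21b71d86a7: send both header styles
        "Authorization": api_key,          # 55248216b6: raw key, "Bearer " prefix removed
        "anthropic-version": "2023-06-01",
    }
```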
Thomas
21b71d86a7
Fix auth for compatible APIs - use both Bearer and x-api-key headers
2025-12-28 22:48:14 -05:00
Thomas
c90676e081
Add Bearer auth header for Anthropic-compatible APIs
2025-12-28 22:43:36 -05:00
Thomas
738671d6b8
Add better error logging for Anthropic connection validation
2025-12-28 22:38:12 -05:00
Thomas
3c2257d9a6
Fix blocking SSL call by creating Anthropic client in executor
2025-12-28 22:23:40 -05:00
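A sketch of the pattern 3c2257d9a6 names: constructing the SDK client loads SSL certificates, which is blocking I/O, so it is moved off the event loop with hass.async_add_executor_job. The use of AsyncAnthropic and functools.partial here is an assumption.

```python
from functools import partial

import anthropic
from homeassistant.core import HomeAssistant


async def async_create_client(
    hass: HomeAssistant, api_key: str, base_url: str
) -> anthropic.AsyncAnthropic:
    """Build the client in the executor so SSL setup never blocks the event loop."""
    return await hass.async_add_executor_job(
        partial(anthropic.AsyncAnthropic, api_key=api_key, base_url=base_url)
    )
```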
Thomas
e84e4c21f8
Add Anthropic API backend support
Adds a new backend for the Anthropic API alongside existing backends
(Ollama, OpenAI, llama.cpp, etc.).
Features:
- Support for official Anthropic API (api.anthropic.com)
- Custom base URL support for Anthropic-compatible servers
- Streaming responses via Messages API
- Tool calling support with Home Assistant integration
- Vision support for image attachments
Files changed:
- backends/anthropic.py: New AnthropicAPIClient implementation
- const.py: Added backend type and configuration constants
- __init__.py: Registered new backend
- config_flow.py: Added connection schema and options
- manifest.json: Added anthropic>=0.40.0 dependency
- translations/en.json: Added UI labels
2025-12-28 21:51:23 -05:00
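A hedged sketch of the streaming path this commit adds, using the public anthropic SDK directly rather than the integration's actual AnthropicAPIClient; the model id and max_tokens are placeholders.

```python
import anthropic


async def stream_reply(client: anthropic.AsyncAnthropic, prompt: str) -> str:
    """Stream a Messages API response and return the concatenated text."""
    chunks: list[str] = []
    async with client.messages.stream(
        model="claude-sonnet-4-20250514",   # placeholder model id
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    ) as stream:
        async for text in stream.text_stream:
            chunks.append(text)
    return "".join(chunks)
```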
Alex O'Connell
1811a907f7
manually set roles to train on
2025-12-21 22:09:18 -05:00
Alex O'Connell
cf01fd29ae
synthesize new data, update training job/configs
2025-12-21 14:14:31 -05:00
Alex O'Connell
0b776c0a23
add defaults for functiongemma
2025-12-20 22:41:26 -05:00
Alex O'Connell
0e4031ef43
wire up options to support functiongemma properly
2025-12-20 22:25:03 -05:00
Alex O'Connell
29d839eea8
mostly working gemma implementation
2025-12-20 20:29:09 -05:00
Alex O'Connell
672a9de65c
extract tool calls from multiple keys
2025-12-20 18:07:27 -05:00
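A sketch of the parsing change in 672a9de65c, assuming the local model may emit its tool call under any of several JSON keys; the key names here are illustrative, not the integration's actual list.

```python
def extract_tool_call(parsed: dict) -> dict | list | None:
    """Return the first tool-call payload found under any known key."""
    for key in ("tool_call", "tool_calls", "function_call"):
        value = parsed.get(key)
        if value:
            return value
    return None
```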
Alex O'Connell
3b159178fb
review code
2025-12-14 20:22:17 -05:00
Alex O'Connell
a351c103ff
refine error retry loop
2025-12-14 18:24:26 -05:00
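A generic sketch of the kind of error retry loop a351c103ff refines: retry the generation a bounded number of times and feed the failure back to the model. The attempt count and message wording are assumptions.

```python
from collections.abc import Awaitable, Callable


async def generate_with_retries(
    generate: Callable[[list[dict]], Awaitable[str]],
    messages: list[dict],
    attempts: int = 3,
) -> str:
    """Call `generate` until it succeeds or the attempt budget runs out."""
    last_error: Exception | None = None
    for _ in range(attempts):
        try:
            return await generate(messages)
        except ValueError as err:  # e.g. an unparseable tool call
            last_error = err
            messages = messages + [
                {"role": "user", "content": f"The previous response was invalid ({err}); try again."}
            ]
    raise last_error  # attempts >= 1, so this is always set by now
```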
Alex O'Connell
f5fe6b36e3
make ai tasks more usable
2025-12-14 12:35:41 -05:00
Alex O'Connell
1f078d0a41
working ai task entities
2025-12-14 10:34:21 -05:00
Alex O'Connell
b547da286f
support structured output for AI tasks
2025-12-14 02:30:58 -05:00
Alex O'Connell
6010bdf26c
rewrite tests from scratch
2025-12-14 01:07:23 -05:00
Alex O'Connell
c8a5b30e5b
clean up tool response extraction
2025-12-14 00:32:18 -05:00
Alex O'Connell
b89a0b44b6
support multiple LLM APIs at once
2025-12-13 19:03:58 -05:00
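A sketch of the idea in b89a0b44b6, assuming the selected LLM APIs are now a list and their tools are merged into one set; the attribute names on the API objects are assumptions.

```python
def collect_tools(apis: list) -> list:
    """Union of the tools across all selected APIs; first definition wins."""
    seen: set[str] = set()
    tools: list = []
    for api in apis:
        for tool in api.tools:
            if tool.name not in seen:
                seen.add(tool.name)
                tools.append(tool)
    return tools
```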
Alex O'Connell
5f48b403d4
Use the ollama Python client to better handle compatibility
2025-12-13 18:26:04 -05:00
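A hedged sketch of what 5f48b403d4 switches to: the official ollama Python client instead of hand-rolled HTTP, so protocol differences between server versions are handled by the library. The host and model are placeholders.

```python
from ollama import AsyncClient


async def chat_once(prompt: str) -> str:
    client = AsyncClient(host="http://localhost:11434")   # placeholder host
    response = await client.chat(
        model="qwen3",                                     # placeholder model
        messages=[{"role": "user", "content": prompt}],
    )
    return response["message"]["content"]
```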
Alex O'Connell
f12b016b51
gate startup to prevent loading of broken config entries
2025-11-30 14:09:25 -05:00
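A sketch of the startup gate in f12b016b51, assuming setup validates the stored entry and aborts cleanly instead of half-loading a broken one; the required-field check is hypothetical, the exception is Home Assistant's (transient connection failures would raise ConfigEntryNotReady instead).

```python
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryError


async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
    settings = {**entry.data, **entry.options}
    if not settings.get("model"):  # hypothetical required field
        # Permanent problem: fail setup with a clear message instead of loading the entry.
        raise ConfigEntryError("Config entry has no model set; please re-run the config flow")
    return True
```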
Alex O'Connell
8d62b52f0f
Release v0.4.4
2025-11-22 09:40:39 -05:00
Alex O'Connell
d07c0d73a3
always pass tool variables to prompt
2025-11-22 09:25:23 -05:00
eddie
0b91769baf
Ensure tool_calls in the deltas dict is not None
2025-11-16 19:57:54 -07:00
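A sketch of the guard 0b91769baf adds, assuming OpenAI-style streamed chunks where the delta dict can carry an explicit "tool_calls": None.

```python
def merge_tool_call_deltas(delta: dict, accumulated: list[dict]) -> None:
    """Append streamed tool-call fragments, tolerating a missing or None field."""
    tool_calls = delta.get("tool_calls")
    if not tool_calls:  # covers both an absent key and an explicit None
        return
    accumulated.extend(tool_calls)
```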
eddie
7b9f8df0a4
Removed extra version segment from OpenAI API call
2025-11-16 19:51:41 -07:00
Alex O'Connell
b27d3c1494
Merge branch 'main' into develop
2025-11-02 13:00:43 -05:00
Alex O'Connell
08d3c6d2ee
Release v0.4.3
2025-11-02 12:58:36 -05:00
Alex O'Connell
03989e37b5
add initial implementation for ai task entities
2025-10-26 21:47:23 -04:00
Oscar
0ac4c3afcf
Fix config_flow set of model_config
2025-10-26 19:56:10 +01:00
Alex O'Connell
4c678726d4
Release v0.4.2
2025-10-25 23:17:29 -04:00
Alex O'Connell
0206673303
use qwen3 instead of mistral as ollama example
2025-10-25 22:57:29 -04:00
Alex O'Connell
050a539f72
fix issue with model default value detection, fix model tool usage with the home-llm API, and add better descriptions for certain settings
2025-10-25 22:42:47 -04:00
Alex O'Connell
f50997d1a3
properly clear llm API when updating settings
2025-10-23 21:22:33 -04:00
Alex O'Connell
455ad3f3d2
Merge pull request #306 from syn-nick/develop
Handle empty finish_reason
2025-10-23 21:20:18 -04:00
loopy321
3443ae3f47
Update config_flow.py to fix slicer errors
As per https://github.com/acon96/home-llm/issues/303
2025-10-22 08:33:09 -05:00
syn-nick
089aa8cff0
handle empty finish_reason
2025-10-22 11:19:30 +02:00
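A minimal sketch of the fix in 089aa8cff0 (merged as PR #306): some backends stream chunks whose finish_reason is an empty string rather than null, so any falsy value is treated as "still generating".

```python
def normalize_finish_reason(finish_reason: str | None) -> str | None:
    """Map "" and None to None so only a real value ends the stream."""
    return finish_reason or None
```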
Alex O'Connell
e28132a17a
Release v0.4.1
2025-10-12 10:37:46 -04:00
Alex O'Connell
5cb910d71c
add missing downloaded file config attribute
2025-10-12 10:28:26 -04:00
Alex O'Connell
fed5aa8309
Release v0.4
2025-10-08 21:27:24 -04:00
Alex O'Connell
2df454985d
Build llama.cpp wheels in forked repo + support reinstallation
2025-10-08 21:19:06 -04:00