Author | Commit | Message | Date
Alex O'Connell | f5fe6b36e3 | make ai tasks more usable | 2025-12-14 12:35:41 -05:00
Alex O'Connell | b547da286f | support structured output for AI tasks | 2025-12-14 02:30:58 -05:00
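The structured-output commit above (b547da286f) presumably leans on the ollama client's `format` parameter, which in recent versions accepts a JSON schema. A minimal sketch; the model name and schema are illustrative, not taken from this repo:

```python
import json
from ollama import Client

client = Client(host="http://localhost:11434")

# Illustrative schema: constrain the reply to a small JSON object.
schema = {
    "type": "object",
    "properties": {
        "summary": {"type": "string"},
        "success": {"type": "boolean"},
    },
    "required": ["summary", "success"],
}

resp = client.chat(
    model="qwen3",
    messages=[{"role": "user", "content": "Summarize the state of the living room."}],
    format=schema,  # recent ollama clients accept a JSON schema here
)
data = json.loads(resp["message"]["content"])
```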
Alex O'Connell | 5f48b403d4 | Use the ollama python client to better handle compatibility | 2025-12-13 18:26:04 -05:00
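Switching to the official `ollama` Python package (5f48b403d4) replaces hand-rolled HTTP calls with a maintained client. A minimal usage sketch; the host and model are placeholders:

```python
from ollama import Client

# Point the client at the Ollama server; 11434 is Ollama's default port.
client = Client(host="http://localhost:11434")

resp = client.chat(
    model="qwen3",
    messages=[{"role": "user", "content": "Turn on the kitchen lights."}],
)
print(resp["message"]["content"])
```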
Alex O'Connell | 8d62b52f0f | Release v0.4.4 | 2025-11-22 09:40:39 -05:00
Alex O'Connell | 08d3c6d2ee | Release v0.4.3 | 2025-11-02 12:58:36 -05:00
Alex O'Connell | 4c678726d4 | Release v0.4.2 | 2025-10-25 23:17:29 -04:00
Alex O'Connell | 0206673303 | use qwen3 instead of mistral as ollama example | 2025-10-25 22:57:29 -04:00
Alex O'Connell | 050a539f72 | fix issue with model default value detection, fix model tool usage with home-llm api, and add better descriptions for certain settings | 2025-10-25 22:42:47 -04:00
Alex O'Connell | e28132a17a | Release v0.4.1 | 2025-10-12 10:37:46 -04:00
Alex O'Connell | fed5aa8309 | Release v0.4 | 2025-10-08 21:27:24 -04:00
Alex O'Connell | 2df454985d | Build llama.cpp wheels in forked repo + support reinstallation | 2025-10-08 21:19:06 -04:00
Alex O'Connell | b02118705d | fix migration + config sub-entry setup | 2025-09-17 18:24:08 -04:00
Alex O'Connell | 33362bf9b2 | config sub-entry flow is mostly working now | 2025-09-15 22:10:49 -04:00
Alex O'Connell | bcfa1dca0a | continue rewrite of config flow to support sub entries | 2025-09-15 22:10:49 -04:00
Alex O'Connell | c2b8461633 | clean up the options page | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 1425413fc9 | add docker compose stack for testing + backends are mostly working at this point | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 9b8baeed07 | make response parsing universal + working generic openai/llamacpp | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 82ebf988aa | random changes | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 61d52ae4d1 | actually working llamacpp agent | 2025-09-15 22:10:25 -04:00
Alex O'Connell | 69f5464bf7 | sort of working llamacpp agent | 2025-09-15 22:10:25 -04:00
Alex O'Connell | fc49f61a49 | Release v0.3.11 | 2025-09-15 22:03:35 -04:00
Alex O'Connell | 6fc6ceb273 | Release v0.3.10 | 2025-07-05 13:56:15 -04:00
Simon Redman | 647e233925 | Add config flow for time to remember current conversation | 2025-05-31 23:53:24 -04:00
Simon Redman | 58efdb9601 | Clean up whitespace | 2025-05-31 15:05:38 -04:00
Simon Redman | 30d9f48006 | Implement config flow for Open AI Responses API handler | 2025-05-31 15:05:38 -04:00
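The Responses API handler (30d9f48006) presumably wraps calls like the following from the official `openai` package; the model name is a placeholder:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

resp = client.responses.create(
    model="gpt-4o-mini",
    input="Which rooms have lights on?",
)
print(resp.output_text)  # convenience accessor for the concatenated text output
```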
Alex O'Connell | f361692dbe | Release v0.3.9 | 2025-05-26 21:22:01 -04:00
Alex O'Connell | e0c75038c9 | use local build versions to comply with PEP wheel name convention | 2025-05-26 20:53:08 -04:00
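PEP 440 local version identifiers (the `+` suffix, e0c75038c9) let multiple build variants share one public version while keeping wheel filenames valid. The `packaging` library parses them; the variant tag below is hypothetical:

```python
from packaging.version import Version

# Hypothetical local version for a CPU/AVX2 build variant of a wheel.
v = Version("0.3.8+cpuavx2")
print(v.public)  # 0.3.8   -- the public release version
print(v.local)   # cpuavx2 -- the local build segment
```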
Alex O'Connell | fe39e481ba | set up release | 2025-04-13 18:13:08 -04:00
Wang, Xiao | b91368d17f | Use local time in CURRENT_DATE_PROMPT for English (#241) | 2025-04-13 14:22:48 +00:00
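The fix in #241 swaps UTC for host-local time when rendering the date into the English prompt. A rough reconstruction; the exact template string is an assumption:

```python
from datetime import datetime

# datetime.now() uses the host's local timezone, unlike datetime.utcnow(),
# so the model sees the user's actual calendar date.
CURRENT_DATE_PROMPT = f"The current date is {datetime.now().strftime('%B %d, %Y')}."
```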
Alex O'Connell | b1d0191310 | filter out disallowed services from the prompt when the home-llm API is selected | 2025-02-08 08:40:02 -05:00
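Filtering the prompt down to permitted services (b1d0191310) might look like this sketch; the allowlist contents and data shapes are hypothetical:

```python
# Hypothetical allowlist of services the selected API lets the model call.
ALLOWED_SERVICES = {"light.turn_on", "light.turn_off", "switch.toggle"}

def filter_services(exposed: list[str]) -> list[str]:
    """Drop services the model is not permitted to call before prompting."""
    return [svc for svc in exposed if svc in ALLOWED_SERVICES]
```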
Fixt | ccd073775e | Remove Think Blocks to enhance support for chain-of-thought language models (#247) | 2025-01-31 13:54:42 +00:00
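Chain-of-thought models wrap their reasoning in `<think>...</think>` tags, and stripping those before the reply is shown is a small regex job. A sketch of the idea behind #247:

```python
import re

# Non-greedy match across newlines so nested prose inside the block is removed.
THINK_BLOCK = re.compile(r"<think>.*?</think>\s*", flags=re.DOTALL)

def strip_think_blocks(text: str) -> str:
    """Remove <think>...</think> reasoning blocks from a model response."""
    return THINK_BLOCK.sub("", text).strip()
```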
Alex O'Connell | e78e8e1ef7 | Release v0.3.7 | 2024-12-15 14:08:24 -05:00
Alex O'Connell | ed5312e9ff | update llama-cpp-python and build Python 3.13 wheels now | 2024-12-15 10:51:56 -05:00
Alex O'Connell | 011d5a0b0d | enforce version numbers in build process | 2024-08-21 11:47:35 -04:00
Alex O'Connell | 1d50c3f589 | fix wheel creation + bump llama.cpp version again | 2024-08-17 00:00:41 -04:00
Alex O'Connell | 984954d317 | Add a platform so agents show up in the UI correctly + bump llama-cpp-python | 2024-08-11 18:11:50 -04:00
Witold Gren | 2837af8443 | Added full translations for all languages when generating data and creating the default system prompt (#196) | 2024-08-11 22:05:20 +00:00
Witold Gren | cf89f9f478 | Added support for Polish language (#193) | 2024-08-04 19:19:10 +00:00
Alex O'Connell | 9c8dd2576a | Release v0.3.4 | 2024-07-29 21:42:13 -04:00
Alex O'Connell | 00be01c49f | fix build script and update included llama cpp version | 2024-07-29 21:23:56 -04:00
Alex O'Connell | 9a6cde3684 | detect experimental models in integration | 2024-06-15 20:19:23 -04:00
Alex O'Connell | 7393eae33d | Release v0.3.3 | 2024-06-15 18:18:42 -04:00
Alex O'Connell | 1c3e532af4 | improve generic openai backend | 2024-06-11 21:21:46 -04:00
Alex O'Connell | 35b766d540 | Release v0.3.2 | 2024-06-08 16:38:17 -04:00
Alex O'Connell | 0a6e558fda | Release v0.3.1 | 2024-06-08 12:14:24 -04:00
Alex O'Connell | 249298bb99 | better device prompting with area support + fix circular import | 2024-06-07 17:42:03 -04:00
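Area-aware device prompting (249298bb99) likely groups exposed entities by area before rendering them into the system prompt; a sketch where the tuple shape is an assumption:

```python
from collections import defaultdict

def devices_by_area(devices: list[tuple[str | None, str, str]]) -> str:
    """Render (area, entity_id, state) tuples as an area-grouped prompt block."""
    grouped: dict[str, list[str]] = defaultdict(list)
    for area, entity_id, state in devices:
        grouped[area or "No Area"].append(f"{entity_id} = {state}")
    return "\n".join(
        f"{area}:\n" + "\n".join(f"  {line}" for line in lines)
        for area, lines in grouped.items()
    )
```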
Alex O'Connell | 5ddf0d09d5 | restrict services that can be called and add format url function to make behavior standard | 2024-06-07 08:48:50 -04:00
Alex O'Connell | 21640dc321 | release notes, fix service call args, and other release prep/cleanup | 2024-06-06 23:54:08 -04:00
Alex O'Connell | ab32942006 | more fixes for llm API | 2024-06-06 23:08:04 -04:00
Alex O'Connell | 36e29bedf0 | add an LLM API to support the existing models | 2024-06-06 22:40:59 -04:00
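The LLM API commit (36e29bedf0) coincides with Home Assistant 2024.6, which introduced the `homeassistant.helpers.llm` helper. A rough sketch of registering a custom API; the class and helper names follow that release, but the details below are assumptions, not this repo's actual code:

```python
from homeassistant.core import HomeAssistant
from homeassistant.helpers import llm

class HomeLLMAPI(llm.API):
    """Expose tools and prompt text for the existing Home-LLM models (sketch)."""

    def __init__(self, hass: HomeAssistant) -> None:
        super().__init__(hass=hass, id="home-llm", name="Home-LLM")

    async def async_get_api_instance(self, llm_context: llm.LLMContext) -> llm.APIInstance:
        # The instance bundles the prompt text and callable tools for one request.
        return llm.APIInstance(
            api=self,
            api_prompt="Control devices using the tools listed below.",
            llm_context=llm_context,
            tools=[],  # llm.Tool implementations would be listed here
        )

# At integration setup time (sketch):
# llm.async_register_api(hass, HomeLLMAPI(hass))
```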