Release v0.3.9

Alex O'Connell
2025-05-26 21:22:01 -04:00
parent a6fc8da458
commit f361692dbe
3 changed files with 3 additions and 2 deletions

@@ -150,6 +150,7 @@ In order to facilitate running the project entirely on the system where Home Ass
## Version History
| Version | Description |
|---------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| v0.3.9 | Update llama.cpp version, fix installation bugs, fix conversation history not working |
| v0.3.8 | Update llama.cpp, remove think blocks from "thinking" models, fix wheel detection for some Intel CPUs, Fixes for compatibility with latest Home Assistant version (2025.4), other small bug fixes |
| v0.3.7 | Update llama.cpp version to support newer models, Update minimum Home Assistant version to 2024.12.3, Add German In-Context Learning examples, Fix multi-turn use, Fix an issue with webcolors |
| v0.3.6 | Small llama.cpp backend fixes |

@@ -403,5 +403,5 @@ OPTIONS_OVERRIDES = {
},
}
INTEGRATION_VERSION = "0.3.8"
INTEGRATION_VERSION = "0.3.9"
EMBEDDED_LLAMA_CPP_PYTHON_VERSION = "0.3.9"

@@ -1,7 +1,7 @@
{
"domain": "llama_conversation",
"name": "Local LLM Conversation",
"version": "0.3.8",
"version": "0.3.9",
"codeowners": ["@acon96"],
"config_flow": true,
"dependencies": ["conversation"],