Release v0.3.4

Alex O'Connell
2024-07-29 21:42:13 -04:00
parent 21ecc293ee
commit 9c8dd2576a
3 changed files with 3 additions and 2 deletions


@@ -136,6 +136,7 @@ In order to facilitate running the project entirely on the system where Home Ass
## Version History
| Version | Description |
|---------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| v0.3.4 | Update bundled llama-cpp-python to support new models, various bug fixes |
| v0.3.3 | Improvements to the Generic OpenAI Backend, improved area handling, fix issue using RGB colors, remove EOS token from responses, replace requests dependency with aiohttp included with Home Assistant |
| v0.3.2 | Fix for exposed script entities causing errors, fix missing GBNF error, trim whitespace from model output |
| v0.3.1 | Adds basic area support in prompting, Fix for broken requirements, fix for issue with formatted tools, fix custom API not registering on startup properly |


@@ -340,5 +340,5 @@ OPTIONS_OVERRIDES = {
},
}
-INTEGRATION_VERSION = "0.3.3"
+INTEGRATION_VERSION = "0.3.4"
EMBEDDED_LLAMA_CPP_PYTHON_VERSION = "0.2.84"


@@ -1,7 +1,7 @@
{
"domain": "llama_conversation",
"name": "Local LLM Conversation",
-    "version": "0.3.3",
+    "version": "0.3.4",
"codeowners": ["@acon96"],
"config_flow": true,
"dependencies": ["conversation"],
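The version string is duplicated between the `INTEGRATION_VERSION` constant and `manifest.json`, so a release bump like this one has to touch both. A minimal sketch of a sync check that could guard against the two drifting apart (the function name and the idea of a standalone check are assumptions, not part of this repository):

```python
import json

# Mirrors the constant bumped in this commit.
INTEGRATION_VERSION = "0.3.4"

def versions_match(manifest_json: str, expected: str = INTEGRATION_VERSION) -> bool:
    """Return True if the manifest's "version" field matches the code constant."""
    manifest = json.loads(manifest_json)
    return manifest.get("version") == expected
```

Run against the real `manifest.json` contents, this would pass for this commit (both read "0.3.4") and fail if only one of the two files had been bumped.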