documentation updates
@@ -44,15 +44,12 @@ After the wheel file has been copied to the correct folder, attempt the wheel in
Pre-built wheel files (`*.whl`) are provided as part of the [GitHub release](https://github.com/acon96/home-llm/releases/latest) for the integration.

To ensure compatibility with your Home Assistant and Python versions, select the correct `.whl` file for your hardware's architecture:

- For Home Assistant `2024.1.4` and older, use the Python 3.11 wheels (`cp311`)
- For Home Assistant `2024.2.0` and newer, use the Python 3.12 wheels (`cp312`)
- **ARM devices** (e.g., Raspberry Pi 4/5):
  - Example filenames:
    - `llama_cpp_python-{version}-cp311-cp311-musllinux_1_2_aarch64.whl`
    - `llama_cpp_python-{version}-cp312-cp312-musllinux_1_2_aarch64.whl`
- **x86_64 devices** (e.g., Intel/AMD desktops):
  - Example filenames:
    - `llama_cpp_python-{version}-cp311-cp311-musllinux_1_2_x86_64.whl`
    - `llama_cpp_python-{version}-cp312-cp312-musllinux_1_2_x86_64.whl`
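
If you are not sure which wheel matches your installation, the Python tag and CPU architecture can be checked directly with the same Python that Home Assistant runs. This is only an illustrative sketch (it is not part of the integration), and `{version}` still has to be replaced with the actual release version:

```python
import platform
import sys

# Illustrative sketch only: work out which pre-built llama-cpp-python wheel
# filename matches this machine. Run it with the same Python that Home
# Assistant uses (e.g. inside the container).
python_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"  # cp311 or cp312
arch = platform.machine()  # 'aarch64' on Raspberry Pi 4/5, 'x86_64' on Intel/AMD desktops

wheel_name = f"llama_cpp_python-{{version}}-{python_tag}-{python_tag}-musllinux_1_2_{arch}.whl"
print(f"Look for a wheel named: {wheel_name}")
```
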
## Build your own
@@ -1,19 +1,101 @@
# Model Prompting
This integration allows for full customization of the system prompt using Home Assistant's [built in templating engine](https://www.home-assistant.io/docs/configuration/templating/). This gives it access to all of the information that it could possibly need out of the box, including entity states and attributes, and allows you to expose as much or as little information to the model as you want. In addition to all of that information, extra variables have been added to make it easier to build a useful prompt.

## System Prompt Template
The default system prompt for non-fine-tuned models is:
```
You are 'Al', a helpful AI Assistant that controls the devices in a house. Complete the following task as instructed with the information provided only.
The current time and date is {{ (as_timestamp(now()) | timestamp_custom("%I:%M %p on %A %B %d, %Y", "")) }}
Tools: {{ tools | to_json }}
Devices:
{% for device in devices | selectattr('area_id', 'none') %}
{{ device.entity_id }} '{{ device.name }}' = {{ device.state }}{{ ([""] + device.attributes) | join(";") }}
{% endfor %}
{% for area in devices | rejectattr('area_id', 'none') | groupby('area_name') %}
## Area: {{ area.grouper }}
{% for device in area.list %}
{{ device.entity_id }} '{{ device.name }}' = {{ device.state }};{{ device.attributes | join(";") }}
{% endfor %}
{% endfor %}
{% for item in response_examples %}
{{ item.request }}
{{ item.response }}
<functioncall> {{ item.tool | to_json }}
{% endfor %}
```
This prompt provides the following pieces of information to the model:
1. Gives the model a quick personality
2. Provides the current time and date
3. Provides the available tools
    - Most models understand JSON, so you can simply convert the provided variable to JSON and insert it into the prompt
4. Provides the exposed devices
    - Uses the `selectattr` filter to gather all the devices that do not have an area and puts them at the top
    - Uses the `rejectattr` filter to gather the opposite set of devices (the ones that do have areas) and then uses `groupby` to make groups for the devices that are in an area (see the sketch after the example output below)
5. Provides "in-context-learning" examples to the model, so that it better understands the format that it should produce

This all results in something that looks like this:
```
You are 'Al', a helpful AI Assistant that controls the devices in a house. Complete the following task as instructed with the information provided only.
The current time and date is 09:40 PM on Friday June 07, 2024
Tools: [{"name":"HassTurnOn","description":"Turns on/opens a device or entity","parameters":{"properties":{"name":"string","area":"string","floor":"string","domain":"string","device_class":"string"},"required":[]}},{"name":"HassTurnOff","description":"Turns off/closes a device or entity","parameters":{"properties":{"name":"string","area":"string","floor":"string","domain":"string","device_class":"string"},"required":[]}},{"name":"HassSetPosition","description":"Sets the position of a device or entity","parameters":{"properties":{"name":"string","area":"string","floor":"string","domain":"string","device_class":"string","position":"integer"},"required":["position"]}},{"name":"HassListAddItem","description":"Add item to a todo list","parameters":{"properties":{"item":"string","name":"string"},"required":[]}},{"name":"HassHumidifierSetpoint","description":"Set desired humidity level","parameters":{"properties":{"name":"string","humidity":"integer"},"required":["name","humidity"]}},{"name":"HassHumidifierMode","description":"Set humidifier mode","parameters":{"properties":{"name":"string","mode":"string"},"required":["name","mode"]}},{"name":"HassLightSet","description":"Sets the brightness or color of a light","parameters":{"properties": {"name":"string","area":"string","floor":"string","domain":"string","device_class":"string","color":"string","temperature":"integer","brightness":"integer"},"required":[]}},{"name":"HassMediaUnpause","description":"Resumes a media player","parameters":{"properties":{"name":"string","area":"string","floor":"string","domain":"string","device_class":"string"},"required":[]}},{"name":"HassMediaPause","description":"Pauses a media player","parameters":{"properties":{"name":"string","area":"string","floor":"string","domain":"string","device_class":"string"},"required":[]}},{"name":"HassVacuumStart","description":"Starts a vacuum","parameters":{"properties":{"name":"string","area":"string","floor":"string","domain":"string","device_class":"string"},"required":[]}},{"name":"HassVacuumReturnToBase","description":"Returns a vacuum to base","parameters":{"properties":{"name":"string","area":"string","floor":"string","domain":"string","device_class":"string"},"required":[]}}]
Devices:
button.push 'Push' = unknown
climate.heatpump 'HeatPump' = heat;68F
climate.ecobee_thermostat 'Ecobee Thermostat' = cool;70F;67.4%;on_high
cover.kitchen_window 'Kitchen Window' = closed
cover.hall_window 'Hall Window' = open
cover.living_room_window 'Living Room Window' = open
cover.garage_door 'Garage Door' = closed
fan.ceiling_fan 'Ceiling Fan' = off
fan.percentage_full_fan 'Percentage Full Fan' = off
humidifier.humidifier 'Humidifier' = on;68%
light.bed_light 'Bed Light' = off
light.ceiling_lights 'Ceiling Lights' = on;sandybrown (255, 164, 81);70%
light.ceiling_lights 'Dan's Lights' = on;sandybrown (255, 164, 81);70%
light.kitchen_lights 'Kitchen Lights' = on;tomato (255, 63, 111);70%
light.office_rgbw_lights 'Office RGBW Lights' = on;salmon (255, 128, 128);70%
light.living_room_rgbww_lights 'Living Room RGBWW Lights' = on;salmon (255, 127, 125);70%
lock.front_door 'Front Door' = locked
lock.kitchen_door 'Kitchen Door' = unlocked
lock.poorly_installed_door 'Poorly Installed Door' = unlocked
lock.openable_lock 'Openable Lock' = locked
sensor.carbon_dioxide 'Carbon dioxide' = 54
switch.decorative_lights 'Decorative Lights' = on
vacuum.1_first_floor '1_First_floor' = docked
todo.shopping_list 'Shopping_list' = 2
## Area: Living Room
fan.living_room_fan 'Living Room Fan' = on;
Make the lights in Living Room greenyellow
The color should be changed now.
<functioncall> {"name":"HassLightSet","arguments":{"area":"Living Room","color":"greenyellow"}}
Set the brightness for light.bed_light to 0.47
Setting the brightness now.
<functioncall> {"name":"HassLightSet","arguments":{"name":"light.bed_light","brightness":0.47}}
Can you open the cover.hall_window?
Opening the garage door for you.
<functioncall> {"name":"HassOpenCover","arguments":{"name":"cover.hall_window"}}
Stop the vacuum.1_first_floor vacuum
Sending the vacuum back to its base.
<functioncall> {"name":"HassVacuumReturnToBase","arguments":{"name":"vacuum.1_first_floor"}}
```
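
To see how the `selectattr`, `rejectattr`, and `groupby` filters produce the layout above, here is a minimal sketch using plain Jinja2 outside of Home Assistant; the device entries are simplified stand-ins for the integration's `devices` variable rather than its exact structure:

```python
from jinja2 import Template

# Simplified stand-ins for the integration's `devices` variable.
devices = [
    {"entity_id": "light.bed_light", "name": "Bed Light", "state": "off",
     "area_id": None, "area_name": None},
    {"entity_id": "fan.living_room_fan", "name": "Living Room Fan", "state": "on",
     "area_id": "living_room", "area_name": "Living Room"},
]

template = Template(
    "Devices:\n"
    "{% for device in devices | selectattr('area_id', 'none') %}"
    "{{ device.entity_id }} '{{ device.name }}' = {{ device.state }}\n"
    "{% endfor %}"
    "{% for area in devices | rejectattr('area_id', 'none') | groupby('area_name') %}"
    "## Area: {{ area.grouper }}\n"
    "{% for device in area.list %}"
    "{{ device.entity_id }} '{{ device.name }}' = {{ device.state }}\n"
    "{% endfor %}{% endfor %}"
)

# Devices without an area are listed first, then each area gets its own block.
print(template.render(devices=devices))
```
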
A few variables are exposed to the template so that the prompt can describe the devices in your home as well as the tools that the model can call.

Prompt Variables:
- `devices`: each item in the provided array contains the `entity_id`, `name`, `state`, and `attributes` properties
- `tools`: can be one of 3 formats, as selected by the user (see the sketch after this list):
  - Minimal: tools are passed as an array of strings in a Python-inspired function definition format. Uses the fewest tokens. Example: `climate.set_hvac_mode(hvac_mode)`
  - Reduced: tools are passed as an array of dictionaries where each tool contains the following fields: `name`, `description`, and `parameters`
  - Full: tools are passed as an array of dictionaries where the structure of each tool matches the tool format used by the OpenAI APIs. Uses the most tokens.
- `formatted_devices`: expands into a multi-line block where each line has the format `<entity_id> '<friendly_name>' = <state>;<extra_attributes_to_expose>`
- `formatted_tools`: when the Reduced or Full tool format is selected, the entire array is converted to JSON; when Minimal is selected, the tools are joined into a comma separated list
- `response_examples`: an array of randomly generated in-context-learning examples, each containing the `request`, `response`, and `tool` properties, that can be used to build a properly formatted ICL example
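
As a rough illustration of the three `tools` formats, here is the same tool written out in each one. The Reduced example mirrors the Tools line in the rendered prompt above, the Full example follows the OpenAI tool-calling schema, and the exact field layout emitted by the integration may differ:

```python
# The same tool in the three formats described above (abridged, illustrative).

# Minimal: Python-inspired signature string, as in the docs' example
minimal = "climate.set_hvac_mode(hvac_mode)"

# Reduced: name / description / parameters
reduced = {
    "name": "HassLightSet",
    "description": "Sets the brightness or color of a light",
    "parameters": {
        "properties": {"name": "string", "color": "string", "brightness": "integer"},
        "required": [],
    },
}

# Full: OpenAI-style tool definition (more verbose, uses the most tokens)
full = {
    "type": "function",
    "function": {
        "name": "HassLightSet",
        "description": "Sets the brightness or color of a light",
        "parameters": {
            "type": "object",
            "properties": {
                "name": {"type": "string"},
                "color": {"type": "string"},
                "brightness": {"type": "integer"},
            },
            "required": [],
        },
    },
}
```
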
`formatted_devices` and `formatted_tools` are provided for simplicity in exposing the correct devices and tools to the model if you are using the Home-LLM model or do not want to customize the formatting of the devices and tools.

The examples used for the `response_examples` variable are loaded from the `in_context_examples.csv` file in the `/custom_components/llama_conversation/` folder.
### Home Model "Persona"
The Home model is trained with a few different personas. They can be activated by using the corresponding system prompts found below:
@@ -62,39 +144,8 @@ Currently supported prompt formats are:
4. Mistral
5. Zephyr w/ eos token `<|endoftext|>`
6. Zephyr w/ eos token `</s>`
7. Zephyr w/ eos token `<|end|>`
8. Llama 3
9. Command-R
10. None (useful for foundation models)
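
For intuition, here is roughly how two of these formats wrap a system prompt and user message. The sketch follows the models' published chat templates and is not the integration's exact implementation, so details such as BOS tokens may differ:

```python
# Rough sketch of two of the prompt formats above, based on the models'
# published chat templates; the integration's exact handling may differ.
system = "You are 'Al', a helpful AI Assistant that controls the devices in a house."
user = "Turn on the kitchen lights."

# Zephyr-style: the trailing eos token (</s> here) is what differs between
# the three Zephyr entries in the list above.
zephyr = f"<|system|>\n{system}</s>\n<|user|>\n{user}</s>\n<|assistant|>\n"

# Llama 3-style
llama3 = (
    f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
    f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
    f"<|start_header_id|>assistant<|end_header_id|>\n\n"
)

print(zephyr)
print(llama3)
```
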
## Prompting other models with In Context Learning
It is possible to use models that are not fine-tuned with the dataset by using In Context Learning (ICL) examples. These examples condition the model to output the correct JSON schema without any additional fine-tuning.

Here is an example configuration using Mixtral-7B-Instruct-v0.2.

First, download and set up the model on the desired backend.

Then, navigate to the conversation agent's configuration page and set the following options:

System Prompt:
```
You are 'Al', a helpful AI Assistant that controls the devices in a house. Complete the following task as instructed with the information provided only.
Services: {{ services }}
Devices:
{{ devices }}

Respond to the following user instruction by responding in the same format as the following examples:
{{ response_examples }}

User instruction:
```
Prompt Format: `Mistral`

Service Call Regex: `({[\S \t]*?})`

Enable in context learning (ICL) examples: Checked
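
The Service Call Regex is presumably what the integration applies to the raw model output to pull out the JSON service call. The sketch below shows the idea with a made-up model response:

```python
import json
import re

# Sketch: extract the first JSON object from a (made-up) model response using
# the Service Call Regex configured above.
SERVICE_CALL_REGEX = r"({[\S \t]*?})"

model_output = (
    "Switching off the fan as requested.\n"
    '{"to_say": "Switching off the fan as requested.", '
    '"service": "fan.turn_off", "target_device": "fan.ceiling_fan"}'
)

match = re.search(SERVICE_CALL_REGEX, model_output)
if match:
    service_call = json.loads(match.group(1))
    print(service_call["service"], "->", service_call["target_device"])  # fan.turn_off -> fan.ceiling_fan
```
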
### Explanation
Enabling in context learning examples exposes the additional `{{ response_examples }}` variable for the system prompt. This variable is expanded to include various examples in the following format:
```
{"to_say": "Switching off the fan as requested.", "service": "fan.turn_off", "target_device": "fan.ceiling_fan"}
{"to_say": "the todo has been added to your todo list.", "service": "todo.add_item", "target_device": "todo.shopping_list"}
{"to_say": "Starting media playback.", "service": "media_player.media_play", "target_device": "media_player.bedroom"}
```
These examples are loaded from the `in_context_examples.csv` file in the `/custom_components/llama_conversation/` folder.
@@ -57,7 +57,7 @@ The next step is to specify which model will be used by the integration. You may
**Model Name**: Use either `acon96/Home-3B-v3-GGUF` or `acon96/Home-1B-v3-GGUF`

**Quantization Level**: The model will be downloaded in the selected quantization level from the HuggingFace repository. If unsure which level to choose, select `Q4_K_M`.

Pressing `Submit` will download the model from HuggingFace. The downloaded files will be stored by default in `/media/models/`.

**Note for Docker/sandboxed HA install users:** The model download may fail if the integration does not have the permissions to create the `media` folder in your Home Assistant install. To fix this, manually create a folder named `media` beside your existing `config` folder and set its permissions so that the addon can access it. If you're using Docker or similar, you may also need to map the folder in your Compose file and `Update the Stack`. Once the folder is created and the stack updated, open the model download screen again and the download should proceed as normal.
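
If the automatic download keeps failing, one possible workaround (a sketch, not something the documentation prescribes) is to fetch the GGUF file manually with `huggingface_hub` and place it in the storage folder mentioned above. The exact filename inside the repository is an assumption here; check the repo's file listing:

```python
# Workaround sketch only: manually download the quantized model and place it
# in the integration's default storage folder. The GGUF filename below is an
# assumption -- verify it against the files listed in the HuggingFace repo.
from huggingface_hub import hf_hub_download

hf_hub_download(
    repo_id="acon96/Home-3B-v3-GGUF",
    filename="Home-3B-v3.q4_k_m.gguf",  # assumed name of the Q4_K_M file
    local_dir="/media/models",          # default download location mentioned above
)
```
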