Update text-generation-webui version for addon

Alex O'Connell
2024-01-15 17:20:40 -05:00
parent d53d366766
commit 7b9a6179c2
3 changed files with 3 additions and 3 deletions


@@ -116,7 +116,7 @@ You need the following settings to configure the local backend from HuggingFace:
**Setting up the "remote" backends**:
You need the following settings in order to configure the "remote" backend:
-1. Hostname: the host of the machine where the text-generation-webui API is hosted. If you are using the provided add-on, the hostname is `local-text-generation-webui` or `<random hex string>-text-generation-webui`. The actual hostname can be found on the addon's page.
+1. Hostname: the host of the machine where the text-generation-webui API is hosted. If you are using the provided add-on, the hostname is `local-text-generation-webui` or `2ad5febb-text-generation-webui`, depending on how the addon was installed.
2. Port: the port for accessing the text-generation-webui API. NOTE: this is not the same as the UI port. (Usually 5000)
3. Name of the Model: This name must EXACTLY match the name as it appears in `text-generation-webui`
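A quick way to confirm the values above is to query the API directly before saving the settings. This is a minimal sketch, assuming the add-on's default hostname and API port from steps 1–2 and that the OpenAI-compatible API extension is enabled; adjust the hostname to match your installation.

```sh
# Hypothetical sanity check: ask the text-generation-webui API which models it
# knows about. The hostname and port below are the add-on defaults described above.
curl -s http://local-text-generation-webui:5000/v1/models
```

If the request returns a response, the hostname and port are correct; the model name entered in step 3 must still match the name shown in `text-generation-webui` exactly.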


@@ -23,7 +23,7 @@ RUN \
python3-venv \
python3-pip \
\
-&& git clone https://github.com/oobabooga/text-generation-webui.git ${APP_DIR} --branch snapshot-2024-01-07 \
+&& git clone https://github.com/oobabooga/text-generation-webui.git ${APP_DIR} --branch snapshot-2024-01-14 \
&& python3 -m pip install torch torchvision torchaudio py-cpuinfo==9.0.0 \
&& python3 -m pip install -r ${APP_DIR}/requirements_cpu_only_noavx2.txt -r ${APP_DIR}/extensions/openai/requirements.txt llama-cpp-python \
&& apt-get purge -y --auto-remove \
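When the `--branch` value is bumped, it is worth confirming that the upstream snapshot branch actually exists before rebuilding the add-on image. A minimal sketch, assuming only that upstream continues to publish `snapshot-*` branches:

```sh
# List the upstream snapshot branches; the branch named in the Dockerfile
# (snapshot-2024-01-14 here) should appear in the output before building.
git ls-remote --heads https://github.com/oobabooga/text-generation-webui.git "snapshot-*"
```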


@@ -1,6 +1,6 @@
---
name: oobabooga-text-generation-webui
-version: dev
+version: 2024.01.14
slug: text-generation-webui
description: "A tool for running Large Language Models"
url: "https://github.com/oobabooga/text-generation-webui"