feat(backend): Remove deprecated LLM models and add migration script (#11331)

The following models are deprecated:
- deepseek-r1-distill-llama-70b
- gemma2-9b-it
- llama3-70b-8192
- llama3-8b-8192
- google/gemini-flash-1.5

I have removed them and set up a migration that converts any graph still
using an old model to its replacement. The models are remapped as
follows:

- llama3-70b-8192 → llama-3.3-70b-versatile
- llama3-8b-8192 → llama-3.1-8b-instant
- google/gemini-flash-1.5 → google/gemini-2.5-flash
- deepseek-r1-distill-llama-70b → gpt-5-chat-latest
- gemma2-9b-it → gpt-5-chat-latest 
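The remap above boils down to a name-for-name lookup. A minimal sketch (the names `DEPRECATED_MODEL_MAP` and `migrate_model_name` are illustrative, not the actual identifiers used in the migration script):

```python
# Sketch only: deprecated model names and their replacements, per this PR.
DEPRECATED_MODEL_MAP = {
    "llama3-70b-8192": "llama-3.3-70b-versatile",
    "llama3-8b-8192": "llama-3.1-8b-instant",
    "google/gemini-flash-1.5": "google/gemini-2.5-flash",
    "deepseek-r1-distill-llama-70b": "gpt-5-chat-latest",
    "gemma2-9b-it": "gpt-5-chat-latest",
}

def migrate_model_name(name: str) -> str:
    """Return the replacement for a deprecated model name; other names pass through."""
    return DEPRECATED_MODEL_MAP.get(name, name)
```

The pass-through default matters: graphs already on current models must come out of the migration unchanged.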

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Check that the old models were removed
  - [x] Check that the migration converted old models to the new ones in existing graphs
Commit dcecb17bd1 (parent a056d9e71a), authored by Bently on 2025-11-06 12:36:42 +00:00, committed by GitHub.
4 changed files with 53 additions and 34 deletions.


@@ -22,8 +22,6 @@ from .schema import (

 class GroqModelName(str, enum.Enum):
-    LLAMA3_8B = "llama3-8b-8192"
-    LLAMA3_70B = "llama3-70b-8192"
     MIXTRAL_8X7B = "mixtral-8x7b-32768"
     GEMMA_7B = "gemma-7b-it"
@@ -31,22 +29,6 @@ class GroqModelName(str, enum.Enum):
 GROQ_CHAT_MODELS = {
     info.name: info
     for info in [
-        ChatModelInfo(
-            name=GroqModelName.LLAMA3_8B,
-            provider_name=ModelProviderName.GROQ,
-            prompt_token_cost=0.05 / 1e6,
-            completion_token_cost=0.10 / 1e6,
-            max_tokens=8192,
-            has_function_call_api=True,
-        ),
-        ChatModelInfo(
-            name=GroqModelName.LLAMA3_70B,
-            provider_name=ModelProviderName.GROQ,
-            prompt_token_cost=0.59 / 1e6,
-            completion_token_cost=0.79 / 1e6,
-            max_tokens=8192,
-            has_function_call_api=True,
-        ),
         ChatModelInfo(
             name=GroqModelName.MIXTRAL_8X7B,
             provider_name=ModelProviderName.GROQ,
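After the removals in this hunk, the enum keeps only the members still visible in the diff context. A sketch of the resulting shape (restricted to the members shown above; the real enum may have more):

```python
import enum

# Sketch: GroqModelName after the llama3 members are dropped, showing only
# the members visible in the diff hunk above.
class GroqModelName(str, enum.Enum):
    MIXTRAL_8X7B = "mixtral-8x7b-32768"
    GEMMA_7B = "gemma-7b-it"
```

Because the enum subclasses `str`, stored model-name strings still compare equal to the remaining members, so only rows naming the deleted models need the migration.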