fix(blocks): add ollama_host to AIConversationBlock (#9351)


### Changes 🏗️

The AI Conversation block builds its request using ``AIStructuredResponseGeneratorBlock.Input``, but it does not pass ``ollama_host`` along. As a result, when the block is used with Ollama, the configured host is ignored. This PR fixes that by adding ``ollama_host=input_data.ollama_host`` to the input constructed in ``AIConversationBlock(AIBlockBase)``:

```diff
AIStructuredResponseGeneratorBlock.Input(
    prompt="",
    credentials=input_data.credentials,
    model=input_data.model,
    conversation_history=input_data.messages,
    max_tokens=input_data.max_tokens,
    expected_format={},
+   ollama_host=input_data.ollama_host,
),
```
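The failure mode is easy to reproduce in isolation: if the outer block omits an optional field when constructing the inner block's input, the inner block silently falls back to its default. A minimal sketch, using hypothetical simplified stand-ins for the real input models (the names and the ``localhost:11434`` default mirror the PR, but the actual classes live in the AutoGPT codebase):

```python
from dataclasses import dataclass

# Hypothetical, simplified stand-in for AIStructuredResponseGeneratorBlock.Input
@dataclass
class StructuredInput:
    model: str
    ollama_host: str = "localhost:11434"  # default used when the field is not forwarded

# Hypothetical, simplified stand-in for AIConversationBlock's input
@dataclass
class ConversationInput:
    model: str
    ollama_host: str = "localhost:11434"

def build_inner_input(outer: ConversationInput, forward_host: bool) -> StructuredInput:
    """Construct the inner block's input from the outer block's input.

    Before the fix, ollama_host was omitted (forward_host=False), so the
    inner block always used its default host instead of the user's value.
    """
    if forward_host:
        return StructuredInput(model=outer.model, ollama_host=outer.ollama_host)
    return StructuredInput(model=outer.model)

outer = ConversationInput(model="llama3.2", ollama_host="192.168.1.50:11434")
print(build_inner_input(outer, forward_host=False).ollama_host)  # → localhost:11434
print(build_inner_input(outer, forward_host=True).ollama_host)   # → 192.168.1.50:11434
```

The one-line fix is exactly the ``forward_host=True`` case: forward the field explicitly so the user-supplied host reaches the inner block.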


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Run the AI Conversation block with a normal provider like GPT-4 to confirm it still works as before
  - [x] Run the AI Conversation block with Ollama to confirm the fix works
Committed by Bently via GitHub, 2025-01-28 09:23:05 +01:00 · parent ac8a466cda · commit 40897ce874


```diff
@@ -924,6 +924,7 @@ class AIConversationBlock(AIBlockBase):
     conversation_history=input_data.messages,
     max_tokens=input_data.max_tokens,
     expected_format={},
+    ollama_host=input_data.ollama_host,
 ),
 credentials=credentials,
 )
```