feat(blocks): Add Wolfram Alpha LLM API Block (#10486)

### Changes 🏗️

This PR adds a new Wolfram Alpha block that integrates with Wolfram's
LLM API endpoint:

- **Ask Wolfram Block**: Lets users ask Wolfram Alpha questions and
receive structured answers
- **API Integration**: Implements the Wolfram LLM API endpoint
(`/api/v1/llm-api`) for natural-language queries
- **Simple Authentication**: App-ID-based authentication via API key
credentials
- **Error Handling**: Surfaces API failures with descriptive error
messages

The block enables users to leverage Wolfram Alpha's computational
knowledge engine for:
- Mathematical calculations and explanations
- Scientific data and facts
- Unit conversions
- Historical information
- And many other knowledge-based queries
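For reference, the block's call to the endpoint is a plain GET with the App ID and the question passed as query parameters. A minimal sketch of how such a request URL could be assembled (the helper name is illustrative and not part of this PR; the `appid`/`input` parameter names match those used in `llm_api_call`):

```python
from urllib.parse import urlencode

# Illustrative helper (not part of this PR): builds the GET URL the block
# sends to Wolfram's LLM API endpoint, with the App ID and question
# encoded as query parameters.
def build_llm_api_url(app_id: str, question: str) -> str:
    base = "https://www.wolframalpha.com/api/v1/llm-api"
    return f"{base}?{urlencode({'appid': app_id, 'input': question})}"
```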

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Verified the block appears in the UI and can be added to workflows
  - [x] Tested API authentication with valid Wolfram App ID
  - [x] Tested various query types (math, science, general knowledge)
  - [x] Verified error handling for invalid credentials
  - [x] Confirmed proper response formatting

**Configuration changes:**
- Users need to add `WOLFRAM_APP_ID` to their environment variables or
provide it through the UI credentials field
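For local setups, the App ID can be supplied as an environment variable before starting the backend (the exact env file location depends on your deployment; the value shown is a placeholder):

```shell
# Placeholder value; obtain a real App ID from the Wolfram Alpha developer portal
export WOLFRAM_APP_ID="XXXXXX-XXXXXXXXXX"
```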
Commit metadata:
- Author: Swifty
- Date: 2025-07-30 13:04:46 +02:00
- Committed by: GitHub
- Parent: 02f5e92167
- Commit: bb71492c8f
- 3 changed files with 64 additions and 0 deletions

@@ -0,0 +1,14 @@

```python
from backend.sdk import APIKeyCredentials, Requests


async def llm_api_call(credentials: APIKeyCredentials, question: str) -> str:
    params = {"appid": credentials.api_key.get_secret_value(), "input": question}
    response = await Requests().get(
        "https://www.wolframalpha.com/api/v1/llm-api", params=params
    )
    if not response.ok:
        raise ValueError(f"API request failed: {response.status} {response.text()}")
    answer = response.text() if response.text() else ""
    return answer
```

@@ -0,0 +1,50 @@

```python
from backend.sdk import (
    APIKeyCredentials,
    Block,
    BlockCategory,
    BlockCostType,
    BlockOutput,
    BlockSchema,
    CredentialsMetaInput,
    ProviderBuilder,
    SchemaField,
)

from ._api import llm_api_call

wolfram = (
    ProviderBuilder("wolfram")
    .with_api_key("WOLFRAM_APP_ID", "Wolfram Alpha App ID")
    .with_base_cost(1, BlockCostType.RUN)
    .build()
)


class AskWolframBlock(Block):
    """
    Ask Wolfram Alpha a question.
    """

    class Input(BlockSchema):
        credentials: CredentialsMetaInput = wolfram.credentials_field(
            description="Wolfram Alpha API credentials"
        )
        question: str = SchemaField(description="The question to ask")

    class Output(BlockSchema):
        answer: str = SchemaField(description="The answer to the question")

    def __init__(self):
        super().__init__(
            id="b7710ce4-68ef-4e82-9a2f-f0b874ef9c7d",
            description="Ask Wolfram Alpha a question",
            categories={BlockCategory.SEARCH},
            input_schema=self.Input,
            output_schema=self.Output,
        )

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        answer = await llm_api_call(credentials, input_data.question)
        yield "answer", answer
```