Compare commits

..

28 Commits

Author SHA1 Message Date
seer-by-sentry[bot]
fb0d7fa31e feat(backend): Add admin diagnostics routes for execution monitoring 2025-11-03 20:32:20 +00:00
Ubbe
5359f20070 feat(frontend): Google Drive Picker component (#11286)
## Changes 🏗️

<img width="800" height="876" alt="Screenshot_2025-10-29_at_22 56 43"
src="https://github.com/user-attachments/assets/e1d9cf62-0a81-4658-82c2-6e673d636479"
/>

New `<GoogleDrivePicker />` component that, when rendered:
- re-uses existing Google credentials OR asks the user to SSO
- uses the Google Drive Picker script to launch a modal for the user to
select files

We will need these 3 new environment variables on the frontend for it to
work:
```
# Google Drive Picker
NEXT_PUBLIC_GOOGLE_CLIENT_ID=
NEXT_PUBLIC_GOOGLE_API_KEY=
NEXT_PUBLIC_GOOGLE_APP_ID=
```
Updated `.env.default` with them.

### Next

We need to figure out how to map this to an agent input type and update
the Back-end to accept the files as input.

## Checklist 📋

### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] I tried the whole flow

### For configuration changes:

- [x] `.env.default` is updated or already compatible with my changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)
2025-11-03 13:48:28 +00:00
Abhimanyu Yadav
427c7eb1d4 feat(frontend): Add dynamic input dialog for agent execution with credential support (#11301)
### Changes 🏗️

This PR enhances the agent execution functionality by introducing a
dynamic input dialog that collects both regular inputs and credentials
before running agents.

<img width="1309" height="826" alt="Screenshot 2025-11-03 at 10 16
38 AM"
src="https://github.com/user-attachments/assets/2015da5d-055d-49c5-8e7e-31bd0fe369f4"
/>

#### New Features
- **Dynamic Input Dialog**: Added a new `RunInputDialog` component that
automatically detects when agents require inputs or credentials and
prompts users before execution
- **Credential Management**: Integrated credential input handling
directly into the execution flow, supporting various credential types
(API keys, OAuth, passwords)
- **Enhanced Run Controls**: Improved the `RunGraph` component with
better state management and visual feedback for running/stopping agents
- **Form Renderer**: Created a new unified `FormRenderer` component for
consistent input rendering across the application

#### 🔧 Refactoring
- **Input Renderer Migration**: Moved input renderer components from
FlowEditor-specific location to a shared components directory for better
reusability:
  - Migrated fields (AnyOfField, CredentialField, ObjectField)
- Migrated widgets (ArrayEditor, DateInput, SelectWidget, TextInput,
etc.)
  - Migrated templates (FieldTemplate, ArrayFieldTemplate)
- **State Management**: Enhanced `graphStore` with schemas for inputs
and credentials, including helper methods to check for their presence
- **Component Organization**: Restructured BuilderActions components for
better modularity

#### 🗑️ Cleanup
- Removed outdated FlowEditor documentation files (FORM_CREATOR.md,
README.md)
- Removed deprecated `RunGraph` and `useRunGraph` implementations from
FlowEditor
- Consolidated duplicate functionality into new shared components

#### 🎨 UI/UX Improvements
- Added gradient styling to Run/Stop button for better visual appeal
- Improved dialog layout with clear sections for Credentials and Inputs
- Enhanced form fields with size variants (small, medium, large) for
better responsiveness
- Added loading states and proper error handling during execution

### Technical Details
- The new system automatically detects input requirements from the graph
schema
- Credentials are handled separately with special UI treatment based on
credential type
- The dialog only appears when inputs or credentials are actually
required
- Execution flow: Save graph → Check for inputs/credentials → Show
dialog if needed → Execute with provided values

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Create an agent without inputs and verify it runs directly without
dialog
- [x] Create an agent with input blocks and verify the dialog appears
with correct fields
- [x] Create an agent requiring credentials and verify credential
selection/creation works
  - [x] Test agent execution with both inputs and credentials
  - [x] Verify Stop Agent functionality during execution
  - [x] Test error handling for invalid inputs or missing credentials
  - [x] Verify that the dialog closes properly after submission
  - [x] Test that execution state is properly reflected in the UI
2025-11-03 12:05:45 +00:00
Krzysztof Czerwinski
c17a2f807d fix(frontend): Reset beads on run (#11303)
Beads are reset when saving but not on run, which can result in beads
from previous runs accumulating on the opened graph.

### Changes 🏗️

- Move bead reset code to function and call it before run

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Beads reset on every run
2025-11-03 09:23:39 +00:00
Krzysztof Czerwinski
f80739d38c Merge branch 'master' into dev 2025-11-03 10:28:57 +09:00
Krzysztof Czerwinski
f97e19f418 hotfix: Patch onboarding (#11299)
### Changes 🏗️

- Prevent removing progress of user onboarding tasks by merging arrays
on the backend instead of replacing them
- New endpoint for onboarding reset
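
A minimal sketch of the merge-instead-of-replace idea described above (the helper name and types are illustrative, not the actual backend code):

```python
def merge_completed_steps(existing: list[str], incoming: list[str]) -> list[str]:
    # Union of both lists, preserving order and dropping duplicates, so a partial
    # update can never erase previously completed onboarding tasks.
    return list(dict.fromkeys([*existing, *incoming]))
```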

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Tasks are not being reset
  - [x] `/onboarding/reset` works
2025-11-01 10:19:55 +01:00
Reinier van der Leer
42b9facd4a hotfix(backend/scheduler): Bump apscheduler to DST-fixed version 3.11.1 (#11294)
- #11273

- Bump `apscheduler` to v3.11.1 which contains a fix for the issue

- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] "It's a rather ugly solution but the test proves that it works."
~the maintainer
  - [x] CI passes
2025-10-31 23:09:28 +01:00
Reinier van der Leer
a02b8d9ad7 fix(backend/scheduler): Bump apscheduler to DST-fixed version 3.11.1 (#11294)
- #11273

### Changes 🏗️

- Bump `apscheduler` to v3.11.1 which contains a fix for the issue

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] "It's a rather ugly solution but the test proves that it works."
~the maintainer
  - [x] CI passes
2025-10-31 21:40:44 +00:00
Nicholas Tindle
834617d221 hotfix(backend): Clarify prompt requirements for list generation for our friend claude (#11293) 2025-10-31 12:28:05 -05:00
Lluis Agusti
e6fb649ced Merge 'master' into 'dev' 2025-10-30 20:05:55 +07:00
Zamil Majdy
2f8cdf62ba feat(backend): Standardize error handling with BlockSchemaInput & BlockSchemaOutput base class (#11257)
<!-- Clearly explain the need for these changes: -->

This PR addresses the need for consistent error handling across all
blocks in the AutoGPT platform. Previously, each block had to manually
define an `error` field in their output schema, leading to code
duplication and potential inconsistencies. Some blocks might forget to
include the error field, making error handling unpredictable.

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->

- **Created `BlockSchemaOutput` base class**: New base class that
extends `BlockSchema` with a standardized `error` field
- **Created `BlockSchemaInput` base class**: Added for consistency and
future extensibility
- **Updated 140+ block implementations**: Changed all block `Output`
classes from `class Output(BlockSchema):` to `class
Output(BlockSchemaOutput):`
- **Removed manual error field definitions**: Eliminated hundreds of
duplicate `error: str = SchemaField(...)` definitions
- **Updated type annotations**: Changed `Block[BlockSchema,
BlockSchema]` to `Block[BlockSchemaInput, BlockSchemaOutput]` throughout
the codebase
- **Fixed imports**: Added `BlockSchemaInput` and `BlockSchemaOutput`
imports to all relevant files
- **Maintained backward compatibility**: Updated `EmptySchema` to
inherit from `BlockSchemaOutput`

**Key Benefits:**
- Consistent error handling across all blocks
- Reduced code duplication (removed ~200 lines of repetitive error field
definitions)
- Type safety improvements with distinct input/output schema types
- Blocks can still override error field with more specific descriptions
when needed
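
A minimal sketch of what the two base classes might look like, inferred from the description and the diffs below (the exact field description text is an assumption; the real definitions live in `backend/data/block.py`):

```python
from backend.data.block import BlockSchema
from backend.data.model import SchemaField


class BlockSchemaInput(BlockSchema):
    """Base class for block input schemas; added for symmetry and future extensibility."""


class BlockSchemaOutput(BlockSchema):
    """Base class for block output schemas with a standardized `error` field."""

    error: str = SchemaField(description="Error message if the block execution failed")


# Blocks then declare:
#   class Input(BlockSchemaInput): ...
#   class Output(BlockSchemaOutput): ...  # `error` is inherited but can be overridden
```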

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Verified `poetry run format` passes (all linting, formatting, and
type checking)
- [x] Tested block instantiation works correctly (MediaDurationBlock,
UnrealTextToSpeechBlock)
- [x] Confirmed error fields are automatically present in all updated
blocks
- [x] Verified block loading system works (successfully loads 353+
blocks)
  - [x] Tested backward compatibility with EmptySchema
- [x] Confirmed blocks can still override error field with custom
descriptions
  - [x] Validated core schema inheritance chain works correctly

#### For configuration changes:

- [x] `.env.default` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my
changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)

*Note: No configuration changes were needed for this refactoring.*

🤖 Generated with [Claude Code](https://claude.ai/code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Lluis Agusti <hi@llu.lu>
Co-authored-by: Ubbe <hi@ubbe.dev>
2025-10-30 12:28:08 +00:00
seer-by-sentry[bot]
3dc5208f71 feat(backend): Increase max_field_size in aiohttp requests (#11261)
### Changes 🏗️

- Increased `max_field_size` in `aiohttp.ClientSession` to 16KB to
handle servers with large headers (e.g., long CSP headers).
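
A minimal sketch of the change (the PR states `max_field_size` is set on `aiohttp.ClientSession`; the default per-field limit is 8190 bytes):

```python
import aiohttp


async def fetch(url: str) -> str:
    # Raise the per-header-field limit to 16KB so servers that send very long
    # headers (e.g. large CSP headers) no longer cause a parse error.
    async with aiohttp.ClientSession(max_field_size=16384) as session:
        async with session.get(url) as resp:
            return await resp.text()
```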

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x]  Add unit test that checks it can now parse headers over 8k size

---------

Co-authored-by: seer-by-sentry[bot] <157164994+seer-by-sentry[bot]@users.noreply.github.com>
Co-authored-by: Swifty <craigswift13@gmail.com>
Co-authored-by: Ubbe <hi@ubbe.dev>
2025-10-30 10:41:22 +00:00
Ubbe
04493598e2 fix(frontend): more wallet popover fixes (#11285)
## Changes 🏗️

<img width="800" height="547" alt="Screenshot 2025-10-29 at 22 11 35"
src="https://github.com/user-attachments/assets/5c700ddc-d770-48ef-9847-7e652c5dedcb"
/>
<br /><br />

- Use
[`react-currency-input-field`](https://www.npmjs.com/package/react-currency-input-field)
for `<Input type="amount" />` under the hood
  - so it formats numbers nicely with `,` and `.`
- Simplify form logic
- Make the popover cover the trigger button when open
- Re-organize imports
- Show a `$` prefix in front of the amount inputs

## Checklist 📋

### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Login
  - [x] Open the wallet with credits enabled
  - [x] Play with the inputs

---------

Co-authored-by: Swifty <craigswift13@gmail.com>
2025-10-30 14:44:29 +04:00
seer-by-sentry[bot]
4140331731 fix(blocks/llm): Validate LLM summary responses are strings (#11275)
### Changes 🏗️

- Added validation to ensure that the `summary` and `final_summary`
returned by the LLM are strings.
- Raises a `ValueError` if the LLM returns a list or other non-string
type, providing a descriptive error message to aid debugging.

Fixes
[AUTOGPT-SERVER-6M4](https://sentry.io/organizations/significant-gravitas/issues/6978480131/).
The issue was that the LLM returned a list of strings instead of a
single string summary, causing `_combine_summaries` to fail on `join`.
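
A minimal sketch of the validation described above (the helper name is illustrative; the real check lives in the LLM summarization block):

```python
def ensure_string_summary(summary: object) -> str:
    # The LLM occasionally returns a list of strings instead of a single string,
    # which later makes `_combine_summaries` fail on `join`; fail fast with a clear error.
    if not isinstance(summary, str):
        raise ValueError(
            f"Expected the LLM summary to be a string, got {type(summary).__name__}: {summary!r}"
        )
    return summary
```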

This fix was generated by Seer in Sentry, triggered by Craig Swift. 👁️
Run ID: 2230933

Not quite right? [Click here to continue debugging with
Seer.](https://sentry.io/organizations/significant-gravitas/issues/6978480131/?seerDrawer=true)

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Added a unit test to verify that a ValueError is raised when the
LLM returns a list instead of a string for summary or final_summary.

---------

Co-authored-by: seer-by-sentry[bot] <157164994+seer-by-sentry[bot]@users.noreply.github.com>
Co-authored-by: Swifty <craigswift13@gmail.com>
2025-10-30 09:52:50 +00:00
Swifty
594b1adcf7 fix(frontend): Fix marketplace sort by (#11284)
The marketplace sort-by functionality was not working on the frontend.
This PR fixes it.

### Changes 🏗️

- Add type hints for sort by
- Fix marketplace sort by drop downs


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] tested locally
2025-10-30 08:46:11 +00:00
Swifty
cab6590908 fix(frontend): Safely parse error response body in handleFetchError (#11274)
### Changes 🏗️

- Ensures `handleFetchError` can handle non-JSON error responses (e.g.,
HTML error pages).
- Attempts to parse the response body as JSON, but falls back to text if
JSON parsing fails.
- Logs a warning to the console if JSON parsing fails.
- Sets `responseData` to null if parsing fails.

Fixes
[BUILDER-482](https://sentry.io/organizations/significant-gravitas/issues/6958135748/).
The issue was that the frontend error handler unconditionally called
`response.json()` on a non-JSON HTML error page starting with 'A'.

This fix was generated by Seer in Sentry, triggered by Craig Swift. 👁️
Run ID: 2206951

Not quite right? [Click here to continue debugging with
Seer.](https://sentry.io/organizations/significant-gravitas/issues/6958135748/?seerDrawer=true)

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Test Plan:
    - [x] Created unit tests for the issue that caused the error
    - [x] Created unit tests to ensure responses are parsed gracefully
2025-10-29 16:22:47 +00:00
Swifty
a1ac109356 fix(backend): Further enhance sanitization of SQL raw queries (#11279)
### Changes 🏗️

Enhanced SQL query security in the store search functionality by
implementing proper parameterization to prevent SQL injection
vulnerabilities.

**Security Improvements:**
- Replaced string interpolation with PostgreSQL positional parameters
(`$1`, `$2`, etc.) for all user inputs
- Added ORDER BY whitelist validation to prevent injection via
`sorted_by` parameter
- Parameterized search term, creators array, category, and pagination
values
- Fixed variable naming conflict (`sql_where_clause` vs `where_clause`)
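
A hypothetical sketch of the parameterization pattern (table, column, and function names are illustrative, not the actual store query):

```python
# ORDER BY cannot be parameterized, so it is validated against a whitelist instead.
ALLOWED_ORDER_COLUMNS = {"rating": "rating", "runs": "runs", "updated": "updatedAt"}


def build_search_query(
    search_term: str, category: str, sorted_by: str, page: int, page_size: int
) -> tuple[str, list]:
    order_column = ALLOWED_ORDER_COLUMNS.get(sorted_by, "rating")
    sql = f"""
        SELECT * FROM "StoreListing"
        WHERE search_text @@ plainto_tsquery($1) AND category = $2
        ORDER BY "{order_column}" DESC
        LIMIT $3 OFFSET $4
    """
    # User-controlled values only ever travel as positional parameters, never as
    # interpolated strings, so malicious input cannot alter the query structure.
    params = [search_term, category, page_size, (page - 1) * page_size]
    return sql, params
```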

**Testing:**
- Added 4 comprehensive tests validating SQL injection prevention across
different attack vectors
- Tests verify that malicious input in search queries, filters, sorting,
and categories are safely handled
- All 10 tests in db_test.py pass successfully

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] All existing tests pass (10/10 tests passing)
  - [x] New security tests validate SQL injection prevention
  - [x] Verified parameterized queries handle malicious input safely
  - [x] Code formatting passes (`poetry run format`)

#### For configuration changes:
- [x] `.env.default` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my
changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)

*Note: No configuration changes required for this security fix*
2025-10-29 15:21:27 +00:00
Zamil Majdy
5506d59da1 fix(backend/executor): make graph execution permission check version-agnostic (#11283)
## Summary
Fix critical issue where pre-execution permission validation broke
execution of graphs that reference older versions of sub-graphs.

## Problem
The `validate_graph_execution_permissions` function was checking for the
specific version of a graph in the user's library. This caused failures
when:
1. A parent graph references an older version of a sub-graph  
2. The user updates the sub-graph to a newer version
3. The older version is no longer in their library
4. Execution of the parent graph fails with `GraphNotInLibraryError`

## Root Cause
In `backend/executor/utils.py` line 523, the function was checking for
the exact version, but sub-graphs legitimately reference older versions
that may no longer be in the library.

## Solution

### 1. Remove Version-Specific Check (backend/executor/utils.py)
- Remove `graph_version=graph.version` parameter from validation call
- Add explanatory comment about version-agnostic behavior
- Now only checks that the graph ID exists in user's library (any
version)

### 2. Enhance Documentation (backend/data/graph.py)  
- Update function docstring to explain version-agnostic behavior
- Document that `None` (now default) allows execution of any version
- Clarify this is important for sub-graph version compatibility

## Technical Details
The `validate_graph_execution_permissions` function was already designed
to handle version-agnostic checks when `graph_version=None`. By omitting
the version parameter, we skip the version check and only verify:
- Graph exists in user's library  
- Graph is not deleted/archived
- User has execution permissions
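
A minimal sketch of the changed call site (keyword names are taken from the description and the assumed import location may differ):

```python
from backend.data.graph import validate_graph_execution_permissions  # assumed location


async def check_execution_permissions(user_id: str, graph) -> None:
    # graph_version is intentionally omitted (defaults to None), so any version of the
    # graph present in the user's library satisfies the check. Sub-graphs may
    # legitimately reference older versions that are no longer in the library.
    await validate_graph_execution_permissions(
        user_id=user_id,
        graph_id=graph.id,
    )
```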

## Impact
-  Parent graphs can execute even when they reference older sub-graph
versions
-  Sub-graph updates don't break existing parent graphs  
-  Maintains security: still checks library membership and permissions
-  No breaking changes: version-specific validation still available
when needed

## Example Scenario Fixed
1. User creates parent graph that uses sub-graph v1
2. User updates sub-graph to v2 (v1 removed from library)  
3. Parent graph still references sub-graph v1
4. **Before**: Execution fails with `GraphNotInLibraryError`
5. **After**: Execution succeeds (version-agnostic permission check)

## Testing
- [x] Code formatting and linting passes
- [x] Type checking passes
- [x] No breaking changes to existing functionality
- [x] Security still maintained through library membership checks

## Files Changed
- `backend/executor/utils.py`: Remove version-specific permission check
- `backend/data/graph.py`: Enhanced documentation for version-agnostic
behavior

Closes #[issue-number-if-applicable]

Co-authored-by: Claude <noreply@anthropic.com>
2025-10-29 14:13:23 +00:00
Ubbe
749341100b fix(frontend): prevent Wallet rendering twice (#11282)
## Changes 🏗️

The `<Wallet />` was being rendered twice (one instance hidden with CSS
`hidden`) because of the Navbar layout, which caused logic issues within
the wallet. I changed it to render conditionally via JavaScript instead,
which is better practice than using `hidden`, especially for components
with actual logic.

I also moved the component files closer to where they are used (in the
navbar).

I have a Cursor plugin that removes unused imports but annoyingly
re-organizes them, hence the changes around that...

## Checklist 📋

### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Login
  - [x] There is only 1 Wallet in the DOM
2025-10-29 14:09:13 +00:00
Zamil Majdy
4922f88851 feat(backend/executor): Implement cascading stop for nested graph executions (#11277)
## Summary
Fixes critical issue where child executions spawned by
`AgentExecutorBlock` continue running after parent execution is stopped.
Implements parent-child execution tracking and recursive cascading stop
logic to ensure entire execution trees are terminated together.

## Background
When a parent graph execution containing `AgentExecutorBlock` nodes is
stopped, only the parent was terminated. Child executions continued
running, leading to:
-  Orphaned child executions consuming credits
-  No user control over execution trees  
-  Race conditions where children start after parent stops
-  Resource leaks from abandoned executions

## Core Changes

### 1. Database Schema (`schema.prisma` + migration)
```sql
-- Add nullable parent tracking field
ALTER TABLE "AgentGraphExecution" ADD COLUMN "parentGraphExecutionId" TEXT;

-- Add self-referential foreign key with graceful deletion
ALTER TABLE "AgentGraphExecution" ADD CONSTRAINT "AgentGraphExecution_parentGraphExecutionId_fkey" 
  FOREIGN KEY ("parentGraphExecutionId") REFERENCES "AgentGraphExecution"("id") 
  ON DELETE SET NULL ON UPDATE CASCADE;

-- Add index for efficient child queries
CREATE INDEX "AgentGraphExecution_parentGraphExecutionId_idx" 
  ON "AgentGraphExecution"("parentGraphExecutionId");
```

### 2. Parent ID Propagation (`backend/blocks/agent.py`)
```python
# Extract current graph execution ID and pass as parent to child
execution = add_graph_execution(
    # ... other params
    parent_graph_exec_id=graph_exec_id,  # NEW: Track parent relationship
)
```

### 3. Data Layer (`backend/data/execution.py`)
```python
async def get_child_graph_executions(parent_exec_id: str) -> list[GraphExecution]:
    """Get all child executions of a parent execution."""
    children = await AgentGraphExecution.prisma().find_many(
        where={"parentGraphExecutionId": parent_exec_id, "isDeleted": False}
    )
    return [GraphExecution.from_db(child) for child in children]
```

### 4. Cascading Stop Logic (`backend/executor/utils.py`)
```python
async def stop_graph_execution(
    user_id: str,
    graph_exec_id: str,
    wait_timeout: float = 15.0,
    cascade: bool = True,  # NEW parameter
):
    # 1. Find all child executions
    if cascade:
        children = await _get_child_executions(graph_exec_id)
        
        # 2. Stop all children recursively in parallel
        if children:
            await asyncio.gather(
                *[stop_graph_execution(user_id, child.id, wait_timeout, True) 
                  for child in children],
                return_exceptions=True,  # Don't fail parent if child fails
            )
    
    # 3. Stop the parent execution
    # ... existing stop logic
```

### 5. Race Condition Prevention (`backend/executor/manager.py`)
```python
# Before executing queued child, check if parent was terminated
if parent_graph_exec_id:
    parent_exec = get_db_client().get_graph_execution_meta(parent_graph_exec_id, user_id)
    if parent_exec and parent_exec.status == ExecutionStatus.TERMINATED:
        # Skip execution, mark child as terminated
        get_db_client().update_graph_execution_stats(
            graph_exec_id=graph_exec_id,
            status=ExecutionStatus.TERMINATED,
        )
        return  # Don't start orphaned child
```

## How It Works

### Before (Broken)
```
User stops parent execution
    ↓
Parent terminates ✓
    ↓
Child executions keep running ✗
    ↓
User cannot stop children ✗
```

### After (Fixed)
```
User stops parent execution
    ↓
Query database for all children
    ↓
Recursively stop all children in parallel
    ↓
Wait for children to terminate
    ↓
Stop parent execution
    ↓
All executions in tree stopped ✓
```

### Race Prevention
```
Child in QUEUED status
    ↓
Parent stopped
    ↓
Child picked up by executor
    ↓
Pre-flight check: parent TERMINATED?
    ↓
Yes → Skip execution, mark child TERMINATED
    ↓
Child never runs ✓
```

## Edge Cases Handled
- **Deep nesting** - Recursive cascading handles multi-level trees
- **Queued children** - Pre-flight check prevents execution
- **Race conditions** - Child spawned during stop operation
- **Partial failures** - `return_exceptions=True` continues on error
- **Multiple children** - Parallel stop via `asyncio.gather()`
- **No parent** - Backward compatible (nullable field)
- **Already completed** - Existing status check handles it

## Performance Impact
- **Stop operation**: O(depth) with parallel execution vs O(1) before
- **Memory**: +36 bytes per execution (one UUID reference)
- **Database**: +1 query per tree level, indexed for efficiency

## API Changes (Backward Compatible)

### `stop_graph_execution()` - New Optional Parameter
```python
# Before
async def stop_graph_execution(user_id: str, graph_exec_id: str, wait_timeout: float = 15.0)

# After  
async def stop_graph_execution(user_id: str, graph_exec_id: str, wait_timeout: float = 15.0, cascade: bool = True)
```
**Default `cascade=True`** means existing callers get the new behavior
automatically.

### `add_graph_execution()` - New Optional Parameter
```python
async def add_graph_execution(..., parent_graph_exec_id: Optional[str] = None)
```

## Security & Safety
-  **User verification** - Users can only stop their own executions
(parent + children)
-  **No cycles** - Self-referential FK prevents infinite loops  
-  **Graceful degradation** - Errors in child stops don't block parent
stop
-  **Rate limits** - Existing execution rate limits still apply

## Testing Checklist

### Database Migration
- [x] Migration runs successfully  
- [x] Prisma client regenerates without errors
- [x] Existing tests pass

### Core Functionality  
- [ ] Manual test: Stop parent with running child → child stops
- [ ] Manual test: Stop parent with queued child → child never starts
- [ ] Unit test: Cascading stop with multiple children
- [ ] Unit test: Deep nesting (3+ levels)
- [ ] Integration test: Race condition prevention

## Breaking Changes
**None** - All changes are backward compatible with existing code.

## Rollback Plan
If issues arise:
1. **Code rollback**: Revert PR, redeploy
2. **Database rollback**: Drop column and constraints (non-destructive)

---

**Note**: This branch contains additional unrelated changes from merging
with `dev`. The core cascading stop feature involves only:
- `schema.prisma` + migration
- `backend/data/execution.py` 
- `backend/executor/utils.py`
- `backend/blocks/agent.py`
- `backend/executor/manager.py`

All other file changes are from dev branch updates and not part of this
feature.

🤖 Generated with [Claude Code](https://claude.ai/code)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Nested graph executions: parent-child tracking and retrieval of child
executions

* **Improvements**
* Cascading stop: stopping a parent optionally terminates child
executions
  * Parent execution IDs propagated through runs and surfaced in logs
  * Per-user/graph concurrent execution limits enforced

* **Bug Fixes**
* Skip enqueuing children if parent is terminated; robust handling when
parent-status checks fail

* **Tests**
  * Updated tests to cover parent linkage in graph creation
<!-- end of auto-generated comment: release notes by coderabbit.ai -->

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-10-29 11:11:22 +00:00
Zamil Majdy
5fb142c656 fix(backend/executor): ensure cluster lock release on all execution submission failures (#11281)
## Root Cause
During rolling deployment, execution
`97058338-052a-4528-87f4-98c88416bb7f` got stuck in QUEUED state
because:

1. Pod acquired cluster lock successfully during shutdown  
2. Subsequent setup operations failed (ThreadPoolExecutor shutdown,
resource exhaustion, etc.)
3. **No error handling existed** around the critical section after lock
acquisition
4. Cluster lock remained stuck in Redis for 5 minutes (TTL timeout)
5. Other pods couldn't acquire the lock, leaving execution permanently
queued

## The Fix

### Problem: Critical Section Not Protected
The original code had no error handling for the entire critical section
after successful lock acquisition:
```python
# Original code - no error handling after lock acquired
current_owner = cluster_lock.try_acquire()
if current_owner != self.executor_id:
    return  # didn't get lock
    
# CRITICAL SECTION - any failure here leaves lock stuck
self._execution_locks[graph_exec_id] = cluster_lock  # Could fail: memory
logger.info("Acquired cluster lock...")              # Could fail: logging  
cancel_event = threading.Event()                     # Could fail: resources
future = self.executor.submit(...)                   # Could fail: shutdown
self.active_graph_runs[...] = (future, cancel_event) # Could fail: memory
```

### Solution: Wrap Entire Critical Section  
Protect ALL operations after successful lock acquisition:
```python
# Fixed code - comprehensive error handling
current_owner = cluster_lock.try_acquire()
if current_owner != self.executor_id:
    return  # didn't get lock

# Wrap ENTIRE critical section after successful acquisition
try:
    self._execution_locks[graph_exec_id] = cluster_lock
    logger.info("Acquired cluster lock...")
    cancel_event = threading.Event()
    future = self.executor.submit(...)
    self.active_graph_runs[...] = (future, cancel_event)
except Exception as e:
    # Release cluster lock before requeue
    cluster_lock.release()
    del self._execution_locks[graph_exec_id] 
    _ack_message(reject=True, requeue=True)
    return
```

### Why This Comprehensive Approach Works
- **Complete protection**: Any failure in critical section → lock
released
- **Proper cleanup order**: Lock released → message requeued → another
pod can try
- **Uses existing infrastructure**: Leverages established
`_ack_message()` requeue logic
- **Handles all scenarios**: ThreadPoolExecutor shutdown, resource
exhaustion, memory issues, logging failures

## Protected Failure Scenarios
1. **Memory exhaustion**: `_execution_locks` assignment or
`active_graph_runs` assignment
2. **Resource exhaustion**: `threading.Event()` creation fails
3. **ThreadPoolExecutor shutdown**: `executor.submit()` with "cannot
schedule new futures after shutdown"
4. **Logging system failures**: `logger.info()` calls fail
5. **Any unexpected exceptions**: Network issues, disk problems, etc.

## Validation
-  All existing tests pass  
-  Maintains exact same success path behavior
-  Comprehensive error handling for all failure points
-  Minimal code change with maximum protection

## Impact
- **Eliminates stuck executions** during pod lifecycle events (rolling
deployments, scaling, crashes)
- **Faster recovery**: Immediate requeue vs 5-minute Redis TTL wait
- **Higher reliability**: Handles ANY failure in the critical section
- **Production-ready**: Comprehensive solution for distributed lock
management

This prevents the exact race condition that caused execution
`97058338-052a-4528-87f4-98c88416bb7f` to be stuck for >300 seconds,
plus many other potential failure scenarios.

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-10-29 08:56:24 +00:00
Pratyush Singh
e14594ff4a fix: handle oversized notifications by sending summary email (#11119) (#11130)
📨 Fix: Handle Oversized Notification Emails

### Summary

This PR adds logic to detect and handle oversized notification emails
exceeding Postmark's 5 MB limit. Instead of retrying indefinitely, the
system now sends a lightweight summary email with key stats and a
dashboard link.

### Changes

- Added size check in `EmailSender.send_templated()`
- Sends summary email when payload > ~4.5 MB
- Prevents infinite retries and queue clogging
- Added logs for oversized detection

Fixes #11119
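
A hedged sketch of the size-check decision (the ~4.5 MB threshold comes from the description above; the function is illustrative, not the actual `EmailSender` API):

```python
import logging

logger = logging.getLogger(__name__)

# Stay safely under Postmark's 5 MB payload limit.
MAX_EMAIL_BYTES = int(4.5 * 1024 * 1024)


def choose_email_payload(full_email: str, summary_email: str) -> str:
    """Return the full notification email unless it exceeds the size threshold,
    in which case fall back to the lightweight summary email."""
    size = len(full_email.encode("utf-8"))
    if size > MAX_EMAIL_BYTES:
        logger.warning("Notification payload is %d bytes; sending summary email instead", size)
        return summary_email
    return full_email
```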

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-10-29 00:57:13 +00:00
Zamil Majdy
de70ede54a fix(backend): prevent execution of deleted agents and cleanup orphaned resources (#11243)
## Summary
Fix critical bug where deleted agents continue running scheduled and
triggered executions indefinitely, consuming credits without user
control.

## Problem
When agents are deleted from user libraries, their schedules and webhook
triggers remain active, leading to:
-  Uncontrolled resource consumption 
-  "Unknown agent" executions that charge credits
-  No way for users to stop orphaned executions
-  Accumulation of orphaned database records

## Solution

### 1. Prevention: Library Validation Before Execution
- Add `is_graph_in_user_library()` function with efficient database
queries
- Validate graph accessibility before all executions in
`validate_and_construct_node_execution_input()`
- Use specific `GraphNotInLibraryError` for clear error handling

### 2. Cleanup: Remove Schedules & Webhooks on Deletion
- Enhanced `delete_library_agent()` to clean up associated schedules and
webhooks
- Comprehensive cleanup functions for both scheduled and triggered
executions
- Proper database transaction handling

### 3. Error-Based Cleanup: Handle Existing Orphaned Resources
- Catch `GraphNotInLibraryError` in scheduler and webhook handlers
- Automatically clean up orphaned resources when execution fails
- Graceful degradation without breaking existing workflows

### 4. Migration: Clean Up Historical Orphans
- SQL migration to remove existing orphaned schedules and webhooks
- Performance index for faster cleanup queries
- Proper logging and error handling

## Key Changes

### Core Library Validation
```python
# backend/data/graph.py - Single source of truth
async def is_graph_in_user_library(graph_id: str, user_id: str, graph_version: Optional[int] = None) -> bool:
    where_clause = {"userId": user_id, "agentGraphId": graph_id, "isDeleted": False, "isArchived": False}
    if graph_version is not None:
        where_clause["agentGraphVersion"] = graph_version
    count = await LibraryAgent.prisma().count(where=where_clause)
    return count > 0
```

### Enhanced Agent Deletion
```python
# backend/server/v2/library/db.py
async def delete_library_agent(library_agent_id: str, user_id: str, soft_delete: bool = True) -> None:
    # ... existing deletion logic ...
    await _cleanup_schedules_for_graph(graph_id=graph_id, user_id=user_id)
    await _cleanup_webhooks_for_graph(graph_id=graph_id, user_id=user_id)
```

### Execution Prevention
```python
# backend/executor/utils.py
if not await gdb.is_graph_in_user_library(graph_id=graph_id, user_id=user_id, graph_version=graph.version):
    raise GraphNotInLibraryError(f"Graph #{graph_id} is not accessible in your library")
```

### Error-Based Cleanup
```python
# backend/executor/scheduler.py & backend/server/integrations/router.py
except GraphNotInLibraryError as e:
    logger.warning(f"Execution blocked for deleted/archived graph {graph_id}")
    await _cleanup_orphaned_resources_for_graph(graph_id, user_id)
```

## Technical Implementation

### Database Efficiency
- Use `count()` instead of `find_first()` for faster queries
- Add performance index: `idx_library_agent_user_graph_active`
- Follow existing `prisma.is_connected()` patterns

### Error Handling Hierarchy
- **`GraphNotInLibraryError`**: Specific exception for deleted/archived
graphs
- **`NotAuthorizedError`**: Generic authorization errors (preserved for
user ID mismatches)
- Clear error messages for better debugging

### Code Organization
- Single source of truth for library validation in
`backend/data/graph.py`
- Import from centralized location to avoid duplication
- Top-level imports following codebase conventions

## Testing & Validation

### Functional Testing
-  Library validation prevents execution of deleted agents
-  Cleanup functions remove schedules and webhooks properly  
-  Error-based cleanup handles orphaned resources gracefully
-  Migration removes existing orphaned records

### Integration Testing
-  All existing tests pass (including `test_store_listing_graph`)
-  No breaking changes to existing functionality
-  Proper error propagation and handling

### Performance Testing
-  Efficient database queries with proper indexing
-  Minimal overhead for normal execution flows
-  Cleanup operations don't impact performance

## Impact

### User Experience
- 🎯 **Immediate**: Deleted agents stop running automatically
- 🎯 **Ongoing**: No more unexpected credit charges from orphaned
executions
- 🎯 **Cleanup**: Historical orphaned resources are removed

### System Reliability
- 🔒 **Security**: Users can only execute agents they have access to
- 🧹 **Cleanup**: Automatic removal of orphaned database records
- 📈 **Performance**: Efficient validation with minimal overhead

### Developer Experience
- 🎯 **Clear Errors**: Specific exception types for better debugging
- 🔧 **Maintainable**: Centralized library validation logic
- 📚 **Documented**: Comprehensive error handling patterns

## Files Modified
- `backend/data/graph.py` - Library validation function
- `backend/server/v2/library/db.py` - Enhanced agent deletion with
cleanup
- `backend/executor/utils.py` - Execution validation and prevention
- `backend/executor/scheduler.py` - Error-based cleanup for schedules
- `backend/server/integrations/router.py` - Error-based cleanup for
webhooks
- `backend/util/exceptions.py` - Specific error type for deleted graphs
- `migrations/20251023000000_cleanup_orphaned_schedules_and_webhooks/migration.sql` - Historical cleanup

## Breaking Changes
None. All changes are backward compatible and preserve existing
functionality.

## Follow-up Tasks
- [ ] Monitor cleanup effectiveness in production
- [ ] Consider adding metrics for orphaned resource detection
- [ ] Potential optimization of cleanup batch operations

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-10-28 23:48:35 +00:00
Ubbe
59657eb42e fix(frontend): onboarding step 5 adjustments (#11276)
## Changes 🏗️

A couple of improvements on **Onboarding Step 5**:
- Show a spinner when the page is loading (better contrast/context than
a skeleton in this case)
- Prevent the run button from being disabled if credentials fail to load
- while disabling is otherwise good/expected behavior, this will help us
debug the production issue where credentials fail to load silently,
since running the agent will throw an error we can see

## Checklist 📋

### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Create a new account/signup
  - [x] On Onboarding Step 5 test the above
2025-10-28 13:58:04 +00:00
Reinier van der Leer
5e5f45a713 fix(backend): Fix various warnings (#11252)
- Resolves #11251

This fixes all the warnings mentioned in #11251, reducing noise and
making our logs and error alerts more useful :)

### Changes 🏗️

- Remove "Block {block_name} has multiple credential inputs" warning
(not actually an issue)
- Rename `json` attribute of `MainCodeExecutionResult` to `json_data`;
retain serialized name through a field alias
- Replace `Path(regex=...)` with `Path(pattern=...)` in
`get_shared_execution` endpoint parameter config
- Change Uvicorn's WebSocket module to new Sans-I/O implementation for
WS server
- Disable Uvicorn's WebSocket module for REST server
- Remove deprecated `enable_cleanup_closed=True` argument in
`CloudStorageHandler` implementation
- Replace Prisma transaction timeout `int` argument with a `timedelta`
value
- Update Sentry SDK to latest version (v2.42.1)
- Broaden filter for cleanup warnings from indirect dependency `litellm`
- Fix handling of `MissingConfigError` in REST server endpoints
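
As an illustration of the `Path(regex=...)` to `Path(pattern=...)` change, a sketch assuming FastAPI's `Path` helper (the route and pattern shown here are placeholders, not the real endpoint):

```python
from fastapi import FastAPI, Path

app = FastAPI()


@app.get("/executions/shared/{share_token}")
async def get_shared_execution(
    # `regex=` is deprecated and emits a warning under Pydantic v2; `pattern=` replaces it.
    share_token: str = Path(pattern=r"^[A-Za-z0-9_-]+$"),
):
    return {"share_token": share_token}
```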

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Check that the warnings are actually gone
- [x] Deploy to dev environment and run a graph; check for any warnings
  - Test WebSocket server
- [x] Run an agent in the Builder; make sure real-time execution updates
still work
2025-10-28 13:18:45 +00:00
Swifty
b31d60276a fix(backend/store): Sanitize all sql terms (#11228)
Categories and creators were not sanitized in the full-text search

- apply sanitization to categories and creators

- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] run tests to check it still works
2025-10-27 13:16:37 +01:00
Toran Bruce Richards
b52e95e1fc fix(blocks): Add missing error output pins to all Firecrawl blocks (#11256)
Added error output pins to all Firecrawl blocks as standard on the
AutoGPT platform. The base block execution code already handles error
yielding, so no try-catch logic was needed.

- FirecrawlScrapeBlock: Added error output pin for scrape failures
- FirecrawlCrawlBlock: Added error output pin for crawl failures
- FirecrawlExtractBlock: Added error output pin for extraction failures
- FirecrawlMapBlock: Added error output pin for map failures
- FirecrawlSearchBlock: Added error output pin for search failures

Resolves #11253
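
An abridged sketch of what the added error pin looks like on one of these blocks (only the output schema is shown; the other fields are illustrative):

```python
from backend.data.block import BlockSchema
from backend.data.model import SchemaField


class Output(BlockSchema):
    data: dict = SchemaField(description="Scraped page content")
    # New standard error output pin; the base block execution code already yields
    # to it on failure, so no try/except logic is needed inside the block's run().
    error: str = SchemaField(description="Error message if the scrape failed")
```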

<!-- Clearly explain the need for these changes: -->

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:

- [ ] `.env.default` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

Co-authored-by: claude[bot] <41898282+claude[bot]@users.noreply.github.com>
Co-authored-by: Toran Bruce Richards <Torantulino@users.noreply.github.com>
2025-10-27 08:36:28 +00:00
Bently
f4ba02f2f1 feat(blocks/revid): Add cost configs for revid video blocks (#11242)
Updated block costs in `backend/backend/data/block_cost_config.py`:
  - **AIShortformVideoCreatorBlock**: Updated from 50 credits to 307
  - **AIAdMakerVideoCreatorBlock**: Added cost of 714 credits
  - **AIScreenshotToVideoAdBlock**: Added cost of 612 credits
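
A hedged sketch of what the entries in `block_cost_config.py` might look like (import paths and the `BlockCost`/`BlockCostType` helpers are assumptions based on the existing cost config, not verified here):

```python
# Assumed import locations; the actual module paths may differ.
from backend.blocks.ai_shortform_video_block import (
    AIAdMakerVideoCreatorBlock,
    AIScreenshotToVideoAdBlock,
    AIShortformVideoCreatorBlock,
)
from backend.data.cost import BlockCost, BlockCostType

BLOCK_COSTS = {
    AIShortformVideoCreatorBlock: [BlockCost(cost_amount=307, cost_type=BlockCostType.RUN)],  # was 50
    AIAdMakerVideoCreatorBlock: [BlockCost(cost_amount=714, cost_type=BlockCostType.RUN)],    # new
    AIScreenshotToVideoAdBlock: [BlockCost(cost_amount=612, cost_type=BlockCostType.RUN)],    # new
}
```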

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Verify AIShortformVideoCreatorBlock costs 307 credits when executed
  - [x] Verify AIAdMakerVideoCreatorBlock costs 714 credits when executed
  - [x] Verify AIScreenshotToVideoAdBlock costs 612 credits when executed
2025-10-24 18:35:37 +01:00
271 changed files with 4583 additions and 2436 deletions

View File

@@ -7,6 +7,7 @@ from backend.data.block import (
BlockInput,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockType,
get_block,
)
@@ -19,7 +20,7 @@ _logger = logging.getLogger(__name__)
class AgentExecutorBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
user_id: str = SchemaField(description="User ID")
graph_id: str = SchemaField(description="Graph ID")
graph_version: int = SchemaField(description="Graph Version")
@@ -53,6 +54,7 @@ class AgentExecutorBlock(Block):
return validate_with_jsonschema(cls.get_input_schema(data), data)
class Output(BlockSchema):
# Use BlockSchema to avoid automatic error field that could clash with graph outputs
pass
def __init__(self):
@@ -65,7 +67,13 @@ class AgentExecutorBlock(Block):
categories={BlockCategory.AGENT},
)
async def run(self, input_data: Input, **kwargs) -> BlockOutput:
async def run(
self,
input_data: Input,
*,
graph_exec_id: str,
**kwargs,
) -> BlockOutput:
from backend.executor import utils as execution_utils
@@ -75,6 +83,7 @@ class AgentExecutorBlock(Block):
user_id=input_data.user_id,
inputs=input_data.inputs,
nodes_input_masks=input_data.nodes_input_masks,
parent_graph_exec_id=graph_exec_id,
)
logger = execution_utils.LogMetadata(

View File

@@ -10,7 +10,12 @@ from backend.blocks.llm import (
LLMResponse,
llm_call,
)
from backend.data.block import BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import APIKeyCredentials, NodeExecutionStats, SchemaField
@@ -23,7 +28,7 @@ class AIConditionBlock(AIBlockBase):
It provides the same yes/no data pass-through functionality as the standard ConditionBlock.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
input_value: Any = SchemaField(
description="The input value to evaluate with the AI condition",
placeholder="Enter the value to be evaluated (text, number, or any data)",
@@ -50,7 +55,7 @@ class AIConditionBlock(AIBlockBase):
)
credentials: AICredentials = AICredentialsField()
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: bool = SchemaField(
description="The result of the AI condition evaluation (True or False)"
)

View File

@@ -5,7 +5,13 @@ from pydantic import SecretStr
from replicate.client import Client as ReplicateClient
from replicate.helpers import FileOutput
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
@@ -42,7 +48,7 @@ TEST_CREDENTIALS_INPUT = {
class AIImageCustomizerBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.REPLICATE], Literal["api_key"]
] = CredentialsField(
@@ -68,9 +74,8 @@ class AIImageCustomizerBlock(Block):
title="Output Format",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
image_url: MediaFileType = SchemaField(description="URL of the generated image")
error: str = SchemaField(description="Error message if generation failed")
def __init__(self):
super().__init__(

View File

@@ -5,7 +5,7 @@ from pydantic import SecretStr
from replicate.client import Client as ReplicateClient
from replicate.helpers import FileOutput
from backend.data.block import Block, BlockCategory, BlockSchema
from backend.data.block import Block, BlockCategory, BlockSchemaInput, BlockSchemaOutput
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
@@ -101,7 +101,7 @@ class ImageGenModel(str, Enum):
class AIImageGeneratorBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.REPLICATE], Literal["api_key"]
] = CredentialsField(
@@ -135,9 +135,8 @@ class AIImageGeneratorBlock(Block):
title="Image Style",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
image_url: str = SchemaField(description="URL of the generated image")
error: str = SchemaField(description="Error message if generation failed")
def __init__(self):
super().__init__(

View File

@@ -6,7 +6,13 @@ from typing import Literal
from pydantic import SecretStr
from replicate.client import Client as ReplicateClient
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
@@ -54,7 +60,7 @@ class NormalizationStrategy(str, Enum):
class AIMusicGeneratorBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.REPLICATE], Literal["api_key"]
] = CredentialsField(
@@ -107,9 +113,8 @@ class AIMusicGeneratorBlock(Block):
title="Normalization Strategy",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: str = SchemaField(description="URL of the generated audio file")
error: str = SchemaField(description="Error message if the model run failed")
def __init__(self):
super().__init__(

View File

@@ -6,7 +6,13 @@ from typing import Literal
from pydantic import SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
@@ -148,7 +154,7 @@ logger = logging.getLogger(__name__)
class AIShortformVideoCreatorBlock(Block):
"""Creates a shortform texttovideo clip using stock or AI imagery."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.REVID], Literal["api_key"]
] = CredentialsField(
@@ -187,9 +193,8 @@ class AIShortformVideoCreatorBlock(Block):
placeholder=VisualMediaType.STOCK_VIDEOS,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
video_url: str = SchemaField(description="The URL of the created video")
error: str = SchemaField(description="Error message if the request failed")
async def create_webhook(self) -> tuple[str, str]:
"""Create a new webhook URL for receiving notifications."""
@@ -336,7 +341,7 @@ class AIShortformVideoCreatorBlock(Block):
class AIAdMakerVideoCreatorBlock(Block):
"""Generates a 30second vertical AI advert using optional usersupplied imagery."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.REVID], Literal["api_key"]
] = CredentialsField(
@@ -364,9 +369,8 @@ class AIAdMakerVideoCreatorBlock(Block):
description="Restrict visuals to supplied images only.", default=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
video_url: str = SchemaField(description="URL of the finished advert")
error: str = SchemaField(description="Error message on failure")
async def create_webhook(self) -> tuple[str, str]:
"""Create a new webhook URL for receiving notifications."""
@@ -524,7 +528,7 @@ class AIAdMakerVideoCreatorBlock(Block):
class AIScreenshotToVideoAdBlock(Block):
"""Creates an advert where the supplied screenshot is narrated by an AI avatar."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.REVID], Literal["api_key"]
] = CredentialsField(description="Revid.ai API key")
@@ -542,9 +546,8 @@ class AIScreenshotToVideoAdBlock(Block):
default=AudioTrack.DONT_STOP_ME_ABSTRACT_FUTURE_BASS
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
video_url: str = SchemaField(description="Rendered video URL")
error: str = SchemaField(description="Error, if encountered")
async def create_webhook(self) -> tuple[str, str]:
"""Create a new webhook URL for receiving notifications."""

View File

@@ -9,7 +9,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
SchemaField,
)
@@ -23,7 +24,7 @@ class AirtableCreateBaseBlock(Block):
Creates a new base in an Airtable workspace, or returns existing base if one with the same name exists.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
@@ -53,7 +54,7 @@ class AirtableCreateBaseBlock(Block):
],
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
base_id: str = SchemaField(description="The ID of the created or found base")
tables: list[dict] = SchemaField(description="Array of table objects")
table: dict = SchemaField(description="A single table object")
@@ -118,7 +119,7 @@ class AirtableListBasesBlock(Block):
Lists all bases in an Airtable workspace that the user has access to.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
@@ -129,7 +130,7 @@ class AirtableListBasesBlock(Block):
description="Pagination offset from previous request", default=""
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
bases: list[dict] = SchemaField(description="Array of base objects")
offset: Optional[str] = SchemaField(
description="Offset for next page (null if no more bases)", default=None

View File

@@ -9,7 +9,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
SchemaField,
)
@@ -31,7 +32,7 @@ class AirtableListRecordsBlock(Block):
Lists records from an Airtable table with optional filtering, sorting, and pagination.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
@@ -65,7 +66,7 @@ class AirtableListRecordsBlock(Block):
default=False,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
records: list[dict] = SchemaField(description="Array of record objects")
offset: Optional[str] = SchemaField(
description="Offset for next page (null if no more records)", default=None
@@ -137,7 +138,7 @@ class AirtableGetRecordBlock(Block):
Retrieves a single record from an Airtable table by its ID.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
@@ -153,7 +154,7 @@ class AirtableGetRecordBlock(Block):
default=False,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
id: str = SchemaField(description="The record ID")
fields: dict = SchemaField(description="The record fields")
created_time: str = SchemaField(description="The record created time")
@@ -217,7 +218,7 @@ class AirtableCreateRecordsBlock(Block):
Creates one or more records in an Airtable table.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
@@ -239,7 +240,7 @@ class AirtableCreateRecordsBlock(Block):
default=None,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
records: list[dict] = SchemaField(description="Array of created record objects")
details: dict = SchemaField(description="Details of the created records")
@@ -290,7 +291,7 @@ class AirtableUpdateRecordsBlock(Block):
Updates one or more existing records in an Airtable table.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
@@ -306,7 +307,7 @@ class AirtableUpdateRecordsBlock(Block):
default=None,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
records: list[dict] = SchemaField(description="Array of updated record objects")
def __init__(self):
@@ -339,7 +340,7 @@ class AirtableDeleteRecordsBlock(Block):
Deletes one or more records from an Airtable table.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
@@ -351,7 +352,7 @@ class AirtableDeleteRecordsBlock(Block):
description="Array of upto 10 record IDs to delete"
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
records: list[dict] = SchemaField(description="Array of deletion results")
def __init__(self):

View File

@@ -7,7 +7,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
Requests,
SchemaField,
@@ -23,13 +24,13 @@ class AirtableListSchemaBlock(Block):
fields, and views.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
base_id: str = SchemaField(description="The Airtable base ID")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
base_schema: dict = SchemaField(
description="Complete base schema with tables, fields, and views"
)
@@ -66,7 +67,7 @@ class AirtableCreateTableBlock(Block):
Creates a new table in an Airtable base with specified fields and views.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
@@ -77,7 +78,7 @@ class AirtableCreateTableBlock(Block):
default=[{"name": "Name", "type": "singleLineText"}],
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
table: dict = SchemaField(description="Created table object")
table_id: str = SchemaField(description="ID of the created table")
@@ -109,7 +110,7 @@ class AirtableUpdateTableBlock(Block):
Updates an existing table's properties such as name or description.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
@@ -125,7 +126,7 @@ class AirtableUpdateTableBlock(Block):
description="The date dependency of the table to update", default=None
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
table: dict = SchemaField(description="Updated table object")
def __init__(self):
@@ -157,7 +158,7 @@ class AirtableCreateFieldBlock(Block):
Adds a new field (column) to an existing Airtable table.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
@@ -176,7 +177,7 @@ class AirtableCreateFieldBlock(Block):
description="The options of the field to create", default=None
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
field: dict = SchemaField(description="Created field object")
field_id: str = SchemaField(description="ID of the created field")
@@ -209,7 +210,7 @@ class AirtableUpdateFieldBlock(Block):
Updates an existing field's properties in an Airtable table.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
@@ -225,7 +226,7 @@ class AirtableUpdateFieldBlock(Block):
advanced=False,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
field: dict = SchemaField(description="Updated field object")
def __init__(self):

View File

@@ -3,7 +3,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
BlockType,
BlockWebhookConfig,
CredentialsMetaInput,
@@ -32,7 +33,7 @@ class AirtableWebhookTriggerBlock(Block):
Thin wrapper just forwards the payloads one at a time to the next block.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = airtable.credentials_field(
description="Airtable API credentials"
)
@@ -43,7 +44,7 @@ class AirtableWebhookTriggerBlock(Block):
description="Airtable webhook event filter"
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
payload: WebhookPayload = SchemaField(description="Airtable webhook payload")
def __init__(self):

View File

@@ -10,14 +10,20 @@ from backend.blocks.apollo.models import (
PrimaryPhone,
SearchOrganizationsRequest,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import CredentialsField, SchemaField
class SearchOrganizationsBlock(Block):
"""Search for organizations in Apollo"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
organization_num_employees_range: list[int] = SchemaField(
description="""The number range of employees working for the company. This enables you to find companies based on headcount. You can add multiple ranges to expand your search results.
@@ -69,7 +75,7 @@ To find IDs, identify the values for organization_id when you call this endpoint
description="Apollo credentials",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
organizations: list[Organization] = SchemaField(
description="List of organizations found",
default_factory=list,

View File
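The Apollo blocks (and other non-SDK blocks further down) take the same two bases from `backend.data.block` instead of `backend.sdk`, which is why their one-line imports are expanded into the parenthesized form. Only the import changes; the sketch below copies the before/after shape from the hunk above and assumes nothing else:

```python
# Before: single BlockSchema base pulled in on one line
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema

# After: split input/output bases, multi-line import
from backend.data.block import (
    Block,
    BlockCategory,
    BlockOutput,
    BlockSchemaInput,
    BlockSchemaOutput,
)
```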

@@ -14,14 +14,20 @@ from backend.blocks.apollo.models import (
SearchPeopleRequest,
SenorityLevels,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import CredentialsField, SchemaField
class SearchPeopleBlock(Block):
"""Search for people in Apollo"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
person_titles: list[str] = SchemaField(
description="""Job titles held by the people you want to find. For a person to be included in search results, they only need to match 1 of the job titles you add. Adding more job titles expands your search results.
@@ -109,7 +115,7 @@ class SearchPeopleBlock(Block):
description="Apollo credentials",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
people: list[Contact] = SchemaField(
description="List of people found",
default_factory=list,

View File

@@ -6,14 +6,20 @@ from backend.blocks.apollo._auth import (
ApolloCredentialsInput,
)
from backend.blocks.apollo.models import Contact, EnrichPersonRequest
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import CredentialsField, SchemaField
class GetPersonDetailBlock(Block):
"""Get detailed person data with Apollo API, including email reveal"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
person_id: str = SchemaField(
description="Apollo person ID to enrich (most accurate method)",
default="",
@@ -68,7 +74,7 @@ class GetPersonDetailBlock(Block):
description="Apollo credentials",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
contact: Contact = SchemaField(
description="Enriched contact information",
)

View File

@@ -3,7 +3,7 @@ from typing import Optional
from pydantic import BaseModel, Field
from backend.data.block import BlockSchema
from backend.data.block import BlockSchemaInput
from backend.data.model import SchemaField, UserIntegrations
from backend.integrations.ayrshare import AyrshareClient
from backend.util.clients import get_database_manager_async_client
@@ -17,7 +17,7 @@ async def get_profile_key(user_id: str):
return user_integrations.managed_credentials.ayrshare_profile_key
class BaseAyrshareInput(BlockSchema):
class BaseAyrshareInput(BlockSchemaInput):
"""Base input model for Ayrshare social media posts with common fields."""
post: str = SchemaField(

View File
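The per-network Ayrshare post blocks that follow (Bluesky through YouTube) take their inputs from this shared `BaseAyrshareInput`, so their hunks only swap the `Output` base and only add `BlockSchemaOutput` to the import list; the Reddit block even keeps a bare `pass  # Uses all base fields` as its Input body. A compact sketch of the resulting shape, with the block name hypothetical and `BaseAyrshareInput`, `PostResponse`, and `PostIds` assumed to come from the Ayrshare modules not shown in this diff:

```python
# Illustrative shape of a migrated Ayrshare post block; only the imported names
# shown in the hunks are taken from the codebase, the rest is assumed.
# BaseAyrshareInput, PostResponse, and PostIds live in the Ayrshare modules
# (paths not shown in this comparison).
from backend.sdk import Block, BlockOutput, BlockSchemaOutput, SchemaField


class PostToExampleNetworkBlock(Block):      # hypothetical name
    class Input(BaseAyrshareInput):          # shared input base (migrated above)
        pass  # uses all base fields

    class Output(BlockSchemaOutput):         # was: class Output(BlockSchema)
        post_result: PostResponse = SchemaField(description="The result of the post")
        post: PostIds = SchemaField(description="The result of the post")
```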

@@ -3,7 +3,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -38,7 +38,7 @@ class PostToBlueskyBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -3,7 +3,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -101,7 +101,7 @@ class PostToFacebookBlock(Block):
description="URL for custom link preview", default="", advanced=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -3,7 +3,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -94,7 +94,7 @@ class PostToGMBBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -5,7 +5,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -94,7 +94,7 @@ class PostToInstagramBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -3,7 +3,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -94,7 +94,7 @@ class PostToLinkedInBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -3,7 +3,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -73,7 +73,7 @@ class PostToPinterestBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -3,7 +3,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -19,7 +19,7 @@ class PostToRedditBlock(Block):
pass # Uses all base fields
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -3,7 +3,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -43,7 +43,7 @@ class PostToSnapchatBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -3,7 +3,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -38,7 +38,7 @@ class PostToTelegramBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -3,7 +3,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -31,7 +31,7 @@ class PostToThreadsBlock(Block):
advanced=False,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -5,7 +5,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -98,7 +98,7 @@ class PostToTikTokBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -3,7 +3,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -97,7 +97,7 @@ class PostToXBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -6,7 +6,7 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaOutput,
BlockType,
SchemaField,
)
@@ -119,7 +119,7 @@ class PostToYouTubeBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_result: PostResponse = SchemaField(description="The result of the post")
post: PostIds = SchemaField(description="The result of the post")

View File

@@ -9,7 +9,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
SchemaField,
)
@@ -23,7 +24,7 @@ class BaasBotJoinMeetingBlock(Block):
Deploy a bot immediately or at a scheduled start_time to join and record a meeting.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = baas.credentials_field(
description="Meeting BaaS API credentials"
)
@@ -57,7 +58,7 @@ class BaasBotJoinMeetingBlock(Block):
description="Custom metadata to attach to the bot", default={}
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
bot_id: str = SchemaField(description="UUID of the deployed bot")
join_response: dict = SchemaField(
description="Full response from join operation"
@@ -103,13 +104,13 @@ class BaasBotLeaveMeetingBlock(Block):
Force the bot to exit the call.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = baas.credentials_field(
description="Meeting BaaS API credentials"
)
bot_id: str = SchemaField(description="UUID of the bot to remove from meeting")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
left: bool = SchemaField(description="Whether the bot successfully left")
def __init__(self):
@@ -138,7 +139,7 @@ class BaasBotFetchMeetingDataBlock(Block):
Pull MP4 URL, transcript & metadata for a completed meeting.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = baas.credentials_field(
description="Meeting BaaS API credentials"
)
@@ -147,7 +148,7 @@ class BaasBotFetchMeetingDataBlock(Block):
description="Include transcript data in response", default=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
mp4_url: str = SchemaField(
description="URL to download the meeting recording (time-limited)"
)
@@ -185,13 +186,13 @@ class BaasBotDeleteRecordingBlock(Block):
Purge MP4 + transcript data for privacy or storage management.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = baas.credentials_field(
description="Meeting BaaS API credentials"
)
bot_id: str = SchemaField(description="UUID of the bot whose data to delete")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
deleted: bool = SchemaField(
description="Whether the data was successfully deleted"
)

View File

@@ -11,7 +11,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
Requests,
SchemaField,
@@ -27,7 +28,7 @@ TEST_CREDENTIALS = APIKeyCredentials(
)
class TextModification(BlockSchema):
class TextModification(BlockSchemaInput):
name: str = SchemaField(
description="The name of the layer to modify in the template"
)
@@ -60,7 +61,7 @@ class TextModification(BlockSchema):
class BannerbearTextOverlayBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = bannerbear.credentials_field(
description="API credentials for Bannerbear"
)
@@ -96,7 +97,7 @@ class BannerbearTextOverlayBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
success: bool = SchemaField(
description="Whether the image generation was successfully initiated"
)
@@ -105,7 +106,6 @@ class BannerbearTextOverlayBlock(Block):
)
uid: str = SchemaField(description="Unique identifier for the generated image")
status: str = SchemaField(description="Status of the image generation")
error: str = SchemaField(description="Error message if the operation failed")
def __init__(self):
super().__init__(

View File
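Several `Output` classes in this comparison also lose their per-block `error: str = SchemaField(...)` line (here, and again in the E2B, dictionary/list, DataForSEO, and Enrichlayer blocks below). The diff does not show the definition of `BlockSchemaOutput`, so the most likely reading, that the new base already declares a shared `error` field and the local copies are now redundant, is an assumption; a sketch under that assumption:

```python
# Assumption: BlockSchemaOutput is presumed to declare a common `error` field,
# which would explain why the per-block declarations are deleted in these hunks.
class Output(BlockSchemaOutput):
    success: bool = SchemaField(description="Whether the image generation was successfully initiated")
    uid: str = SchemaField(description="Unique identifier for the generated image")
    status: str = SchemaField(description="Status of the image generation")
    # error: str = SchemaField(...)  # removed; assumed to be inherited from the base
```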

@@ -1,14 +1,21 @@
import enum
from typing import Any
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema, BlockType
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
BlockType,
)
from backend.data.model import SchemaField
from backend.util.file import store_media_file
from backend.util.type import MediaFileType, convert
class FileStoreBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
file_in: MediaFileType = SchemaField(
description="The file to store in the temporary directory, it can be a URL, data URI, or local path."
)
@@ -19,7 +26,7 @@ class FileStoreBlock(Block):
title="Produce Base64 Output",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
file_out: MediaFileType = SchemaField(
description="The relative path to the stored file in the temporary directory."
)
@@ -57,7 +64,7 @@ class StoreValueBlock(Block):
The block output will be static; the output can be consumed multiple times.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
input: Any = SchemaField(
description="Trigger the block to produce the output. "
"The value is only used when `data` is None."
@@ -68,7 +75,7 @@ class StoreValueBlock(Block):
default=None,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
output: Any = SchemaField(description="The stored data retained in the block.")
def __init__(self):
@@ -94,10 +101,10 @@ class StoreValueBlock(Block):
class PrintToConsoleBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
text: Any = SchemaField(description="The data to print to the console.")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
output: Any = SchemaField(description="The data printed to the console.")
status: str = SchemaField(description="The status of the print operation.")
@@ -121,10 +128,10 @@ class PrintToConsoleBlock(Block):
class NoteBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
text: str = SchemaField(description="The text to display in the sticky note.")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
output: str = SchemaField(description="The text to display in the sticky note.")
def __init__(self):
@@ -154,15 +161,14 @@ class TypeOptions(enum.Enum):
class UniversalTypeConverterBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
value: Any = SchemaField(
description="The value to convert to a universal type."
)
type: TypeOptions = SchemaField(description="The type to convert the value to.")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
value: Any = SchemaField(description="The converted value.")
error: str = SchemaField(description="Error message if conversion failed.")
def __init__(self):
super().__init__(
@@ -195,10 +201,10 @@ class ReverseListOrderBlock(Block):
A block which takes in a list and returns it in the opposite order.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
input_list: list[Any] = SchemaField(description="The list to reverse")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
reversed_list: list[Any] = SchemaField(description="The list in reversed order")
def __init__(self):

View File

@@ -2,7 +2,13 @@ import os
import re
from typing import Type
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
@@ -15,12 +21,12 @@ class BlockInstallationBlock(Block):
for development purposes only.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
code: str = SchemaField(
description="Python code of the block to be installed",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
success: str = SchemaField(
description="Success message if the block is installed successfully",
)

View File

@@ -1,7 +1,13 @@
from enum import Enum
from typing import Any
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.type import convert
@@ -16,7 +22,7 @@ class ComparisonOperator(Enum):
class ConditionBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
value1: Any = SchemaField(
description="Enter the first value for comparison",
placeholder="For example: 10 or 'hello' or True",
@@ -40,7 +46,7 @@ class ConditionBlock(Block):
default=None,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: bool = SchemaField(
description="The result of the condition evaluation (True or False)"
)
@@ -111,7 +117,7 @@ class ConditionBlock(Block):
class IfInputMatchesBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
input: Any = SchemaField(
description="The input to match against",
placeholder="For example: 10 or 'hello' or True",
@@ -131,7 +137,7 @@ class IfInputMatchesBlock(Block):
default=None,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: bool = SchemaField(
description="The result of the condition evaluation (True or False)"
)

View File

@@ -4,9 +4,15 @@ from typing import Any, Literal, Optional
from e2b_code_interpreter import AsyncSandbox
from e2b_code_interpreter import Result as E2BExecutionResult
from e2b_code_interpreter.charts import Chart as E2BExecutionResultChart
from pydantic import BaseModel, JsonValue, SecretStr
from pydantic import BaseModel, Field, JsonValue, SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
@@ -61,7 +67,7 @@ class MainCodeExecutionResult(BaseModel):
jpeg: Optional[str] = None
pdf: Optional[str] = None
latex: Optional[str] = None
json: Optional[JsonValue] = None # type: ignore (reportIncompatibleMethodOverride)
json_data: Optional[JsonValue] = Field(None, alias="json")
javascript: Optional[str] = None
data: Optional[dict] = None
chart: Optional[Chart] = None
@@ -159,7 +165,7 @@ class ExecuteCodeBlock(Block, BaseE2BExecutorMixin):
# TODO : Add support to upload and download files
# NOTE: Currently, you can only customize the CPU and Memory
# by creating a pre customized sandbox template
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.E2B], Literal["api_key"]
] = CredentialsField(
@@ -217,7 +223,7 @@ class ExecuteCodeBlock(Block, BaseE2BExecutorMixin):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
main_result: MainCodeExecutionResult = SchemaField(
title="Main Result", description="The main result from the code execution"
)
@@ -232,7 +238,6 @@ class ExecuteCodeBlock(Block, BaseE2BExecutorMixin):
description="Standard output logs from execution"
)
stderr_logs: str = SchemaField(description="Standard error logs from execution")
error: str = SchemaField(description="Error message if execution failed")
def __init__(self):
super().__init__(
@@ -296,7 +301,7 @@ class ExecuteCodeBlock(Block, BaseE2BExecutorMixin):
class InstantiateCodeSandboxBlock(Block, BaseE2BExecutorMixin):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.E2B], Literal["api_key"]
] = CredentialsField(
@@ -346,7 +351,7 @@ class InstantiateCodeSandboxBlock(Block, BaseE2BExecutorMixin):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
sandbox_id: str = SchemaField(description="ID of the sandbox instance")
response: str = SchemaField(
title="Text Result",
@@ -356,7 +361,6 @@ class InstantiateCodeSandboxBlock(Block, BaseE2BExecutorMixin):
description="Standard output logs from execution"
)
stderr_logs: str = SchemaField(description="Standard error logs from execution")
error: str = SchemaField(description="Error message if execution failed")
def __init__(self):
super().__init__(
@@ -421,7 +425,7 @@ class InstantiateCodeSandboxBlock(Block, BaseE2BExecutorMixin):
class ExecuteCodeStepBlock(Block, BaseE2BExecutorMixin):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.E2B], Literal["api_key"]
] = CredentialsField(
@@ -454,7 +458,7 @@ class ExecuteCodeStepBlock(Block, BaseE2BExecutorMixin):
default=False,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
main_result: MainCodeExecutionResult = SchemaField(
title="Main Result", description="The main result from the code execution"
)
@@ -469,7 +473,6 @@ class ExecuteCodeStepBlock(Block, BaseE2BExecutorMixin):
description="Standard output logs from execution"
)
stderr_logs: str = SchemaField(description="Standard error logs from execution")
error: str = SchemaField(description="Error message if execution failed")
def __init__(self):
super().__init__(

View File
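One change in the E2B code-executor file goes beyond the base swap: `MainCodeExecutionResult.json` becomes `json_data` with `Field(None, alias="json")`, so the serialized key stays `json` while the attribute no longer needs the `# type: ignore` for overriding a base-class member. A small self-contained pydantic sketch of that alias pattern (independent of the backend package; the model name here is illustrative):

```python
from typing import Optional

from pydantic import BaseModel, Field, JsonValue


class ResultSketch(BaseModel):
    # Attribute is json_data in Python, but validates from / dumps to the "json" key.
    json_data: Optional[JsonValue] = Field(None, alias="json")


r = ResultSketch.model_validate({"json": {"answer": 42}})
print(r.json_data)                   # {'answer': 42}
print(r.model_dump(by_alias=True))   # {'json': {'answer': 42}}
```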

@@ -1,17 +1,23 @@
import re
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
class CodeExtractionBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
text: str = SchemaField(
description="Text containing code blocks to extract (e.g., AI response)",
placeholder="Enter text containing code blocks",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
html: str = SchemaField(description="Extracted HTML code")
css: str = SchemaField(description="Extracted CSS code")
javascript: str = SchemaField(description="Extracted JavaScript code")

View File

@@ -5,7 +5,8 @@ from backend.data.block import (
BlockCategory,
BlockManualWebhookConfig,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.integrations.providers import ProviderName
@@ -27,10 +28,10 @@ class TranscriptionDataModel(BaseModel):
class CompassAITriggerBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
payload: TranscriptionDataModel = SchemaField(hidden=True)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
transcription: str = SchemaField(
description="The contents of the compass transcription."
)

View File

@@ -1,16 +1,22 @@
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
class WordCharacterCountBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
text: str = SchemaField(
description="Input text to count words and characters",
placeholder="Enter your text here",
advanced=False,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
word_count: int = SchemaField(description="Number of words in the input text")
character_count: int = SchemaField(
description="Number of characters in the input text"

View File

@@ -1,6 +1,12 @@
from typing import Any, List
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.json import loads
from backend.util.mock import MockObject
@@ -12,13 +18,13 @@ from backend.util.prompt import estimate_token_count_str
class CreateDictionaryBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
values: dict[str, Any] = SchemaField(
description="Key-value pairs to create the dictionary with",
placeholder="e.g., {'name': 'Alice', 'age': 25}",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
dictionary: dict[str, Any] = SchemaField(
description="The created dictionary containing the specified key-value pairs"
)
@@ -62,7 +68,7 @@ class CreateDictionaryBlock(Block):
class AddToDictionaryBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
dictionary: dict[Any, Any] = SchemaField(
default_factory=dict,
description="The dictionary to add the entry to. If not provided, a new dictionary will be created.",
@@ -86,11 +92,10 @@ class AddToDictionaryBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
updated_dictionary: dict = SchemaField(
description="The dictionary with the new entry added."
)
error: str = SchemaField(description="Error message if the operation failed.")
def __init__(self):
super().__init__(
@@ -141,11 +146,11 @@ class AddToDictionaryBlock(Block):
class FindInDictionaryBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
input: Any = SchemaField(description="Dictionary to lookup from")
key: str | int = SchemaField(description="Key to lookup in the dictionary")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
output: Any = SchemaField(description="Value found for the given key")
missing: Any = SchemaField(
description="Value of the input that missing the key"
@@ -201,7 +206,7 @@ class FindInDictionaryBlock(Block):
class RemoveFromDictionaryBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
dictionary: dict[Any, Any] = SchemaField(
description="The dictionary to modify."
)
@@ -210,12 +215,11 @@ class RemoveFromDictionaryBlock(Block):
default=False, description="Whether to return the removed value."
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
updated_dictionary: dict[Any, Any] = SchemaField(
description="The dictionary after removal."
)
removed_value: Any = SchemaField(description="The removed value if requested.")
error: str = SchemaField(description="Error message if the operation failed.")
def __init__(self):
super().__init__(
@@ -251,19 +255,18 @@ class RemoveFromDictionaryBlock(Block):
class ReplaceDictionaryValueBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
dictionary: dict[Any, Any] = SchemaField(
description="The dictionary to modify."
)
key: str | int = SchemaField(description="Key to replace the value for.")
value: Any = SchemaField(description="The new value for the given key.")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
updated_dictionary: dict[Any, Any] = SchemaField(
description="The dictionary after replacement."
)
old_value: Any = SchemaField(description="The value that was replaced.")
error: str = SchemaField(description="Error message if the operation failed.")
def __init__(self):
super().__init__(
@@ -300,10 +303,10 @@ class ReplaceDictionaryValueBlock(Block):
class DictionaryIsEmptyBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
dictionary: dict[Any, Any] = SchemaField(description="The dictionary to check.")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
is_empty: bool = SchemaField(description="True if the dictionary is empty.")
def __init__(self):
@@ -327,7 +330,7 @@ class DictionaryIsEmptyBlock(Block):
class CreateListBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
values: List[Any] = SchemaField(
description="A list of values to be combined into a new list.",
placeholder="e.g., ['Alice', 25, True]",
@@ -343,11 +346,10 @@ class CreateListBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
list: List[Any] = SchemaField(
description="The created list containing the specified values."
)
error: str = SchemaField(description="Error message if list creation failed.")
def __init__(self):
super().__init__(
@@ -404,7 +406,7 @@ class CreateListBlock(Block):
class AddToListBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
list: List[Any] = SchemaField(
default_factory=list,
advanced=False,
@@ -425,11 +427,10 @@ class AddToListBlock(Block):
description="The position to insert the new entry. If not provided, the entry will be appended to the end of the list.",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
updated_list: List[Any] = SchemaField(
description="The list with the new entry added."
)
error: str = SchemaField(description="Error message if the operation failed.")
def __init__(self):
super().__init__(
@@ -484,11 +485,11 @@ class AddToListBlock(Block):
class FindInListBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
list: List[Any] = SchemaField(description="The list to search in.")
value: Any = SchemaField(description="The value to search for.")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
index: int = SchemaField(description="The index of the value in the list.")
found: bool = SchemaField(
description="Whether the value was found in the list."
@@ -526,15 +527,14 @@ class FindInListBlock(Block):
class GetListItemBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
list: List[Any] = SchemaField(description="The list to get the item from.")
index: int = SchemaField(
description="The 0-based index of the item (supports negative indices)."
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
item: Any = SchemaField(description="The item at the specified index.")
error: str = SchemaField(description="Error message if the operation failed.")
def __init__(self):
super().__init__(
@@ -561,7 +561,7 @@ class GetListItemBlock(Block):
class RemoveFromListBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
list: List[Any] = SchemaField(description="The list to modify.")
value: Any = SchemaField(
default=None, description="Value to remove from the list."
@@ -574,10 +574,9 @@ class RemoveFromListBlock(Block):
default=False, description="Whether to return the removed item."
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
updated_list: List[Any] = SchemaField(description="The list after removal.")
removed_item: Any = SchemaField(description="The removed item if requested.")
error: str = SchemaField(description="Error message if the operation failed.")
def __init__(self):
super().__init__(
@@ -618,17 +617,16 @@ class RemoveFromListBlock(Block):
class ReplaceListItemBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
list: List[Any] = SchemaField(description="The list to modify.")
index: int = SchemaField(
description="Index of the item to replace (supports negative indices)."
)
value: Any = SchemaField(description="The new value for the given index.")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
updated_list: List[Any] = SchemaField(description="The list after replacement.")
old_item: Any = SchemaField(description="The item that was replaced.")
error: str = SchemaField(description="Error message if the operation failed.")
def __init__(self):
super().__init__(
@@ -663,10 +661,10 @@ class ReplaceListItemBlock(Block):
class ListIsEmptyBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
list: List[Any] = SchemaField(description="The list to check.")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
is_empty: bool = SchemaField(description="True if the list is empty.")
def __init__(self):

View File

@@ -8,7 +8,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
SchemaField,
UserPasswordCredentials,
@@ -18,7 +19,7 @@ from ._api import DataForSeoClient
from ._config import dataforseo
class KeywordSuggestion(BlockSchema):
class KeywordSuggestion(BlockSchemaInput):
"""Schema for a keyword suggestion result."""
keyword: str = SchemaField(description="The keyword suggestion")
@@ -45,7 +46,7 @@ class KeywordSuggestion(BlockSchema):
class DataForSeoKeywordSuggestionsBlock(Block):
"""Block for getting keyword suggestions from DataForSEO Labs."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = dataforseo.credentials_field(
description="DataForSEO credentials (username and password)"
)
@@ -77,7 +78,7 @@ class DataForSeoKeywordSuggestionsBlock(Block):
le=3000,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
suggestions: List[KeywordSuggestion] = SchemaField(
description="List of keyword suggestions with metrics"
)
@@ -90,7 +91,6 @@ class DataForSeoKeywordSuggestionsBlock(Block):
seed_keyword: str = SchemaField(
description="The seed keyword used for the query"
)
error: str = SchemaField(description="Error message if the API call failed")
def __init__(self):
super().__init__(
@@ -213,12 +213,12 @@ class DataForSeoKeywordSuggestionsBlock(Block):
class KeywordSuggestionExtractorBlock(Block):
"""Extracts individual fields from a KeywordSuggestion object."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
suggestion: KeywordSuggestion = SchemaField(
description="The keyword suggestion object to extract fields from"
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
keyword: str = SchemaField(description="The keyword suggestion")
search_volume: Optional[int] = SchemaField(
description="Monthly search volume", default=None

View File

@@ -8,7 +8,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
SchemaField,
UserPasswordCredentials,
@@ -18,7 +19,7 @@ from ._api import DataForSeoClient
from ._config import dataforseo
class RelatedKeyword(BlockSchema):
class RelatedKeyword(BlockSchemaInput):
"""Schema for a related keyword result."""
keyword: str = SchemaField(description="The related keyword")
@@ -45,7 +46,7 @@ class RelatedKeyword(BlockSchema):
class DataForSeoRelatedKeywordsBlock(Block):
"""Block for getting related keywords from DataForSEO Labs."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = dataforseo.credentials_field(
description="DataForSEO credentials (username and password)"
)
@@ -85,7 +86,7 @@ class DataForSeoRelatedKeywordsBlock(Block):
le=4,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
related_keywords: List[RelatedKeyword] = SchemaField(
description="List of related keywords with metrics"
)
@@ -98,7 +99,6 @@ class DataForSeoRelatedKeywordsBlock(Block):
seed_keyword: str = SchemaField(
description="The seed keyword used for the query"
)
error: str = SchemaField(description="Error message if the API call failed")
def __init__(self):
super().__init__(
@@ -231,12 +231,12 @@ class DataForSeoRelatedKeywordsBlock(Block):
class RelatedKeywordExtractorBlock(Block):
"""Extracts individual fields from a RelatedKeyword object."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
related_keyword: RelatedKeyword = SchemaField(
description="The related keyword object to extract fields from"
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
keyword: str = SchemaField(description="The related keyword")
search_volume: Optional[int] = SchemaField(
description="Monthly search volume", default=None

View File

@@ -1,17 +1,23 @@
import codecs
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
class TextDecoderBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
text: str = SchemaField(
description="A string containing escaped characters to be decoded",
placeholder='Your entire text block with \\n and \\" escaped characters',
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
decoded_text: str = SchemaField(
description="The decoded text with escape sequences processed"
)

View File

@@ -7,7 +7,13 @@ from typing import Any
import discord
from pydantic import SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import APIKeyCredentials, SchemaField
from backend.util.file import store_media_file
from backend.util.request import Requests
@@ -28,10 +34,10 @@ TEST_CREDENTIALS_INPUT = TEST_BOT_CREDENTIALS_INPUT
class ReadDiscordMessagesBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: DiscordCredentials = DiscordCredentialsField()
class Output(BlockSchema):
class Output(BlockSchemaOutput):
message_content: str = SchemaField(
description="The content of the message received"
)
@@ -164,7 +170,7 @@ class ReadDiscordMessagesBlock(Block):
class SendDiscordMessageBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: DiscordCredentials = DiscordCredentialsField()
message_content: str = SchemaField(
description="The content of the message to send"
@@ -178,7 +184,7 @@ class SendDiscordMessageBlock(Block):
default="",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(
description="The status of the operation (e.g., 'Message sent', 'Error')"
)
@@ -310,7 +316,7 @@ class SendDiscordMessageBlock(Block):
class SendDiscordDMBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: DiscordCredentials = DiscordCredentialsField()
user_id: str = SchemaField(
description="The Discord user ID to send the DM to (e.g., '123456789012345678')"
@@ -319,7 +325,7 @@ class SendDiscordDMBlock(Block):
description="The content of the direct message to send"
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(description="The status of the operation")
message_id: str = SchemaField(description="The ID of the sent message")
@@ -399,7 +405,7 @@ class SendDiscordDMBlock(Block):
class SendDiscordEmbedBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: DiscordCredentials = DiscordCredentialsField()
channel_identifier: str = SchemaField(
description="Channel ID or channel name to send the embed to"
@@ -436,7 +442,7 @@ class SendDiscordEmbedBlock(Block):
default=[],
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(description="Operation status")
message_id: str = SchemaField(description="ID of the sent embed message")
@@ -586,7 +592,7 @@ class SendDiscordEmbedBlock(Block):
class SendDiscordFileBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: DiscordCredentials = DiscordCredentialsField()
channel_identifier: str = SchemaField(
description="Channel ID or channel name to send the file to"
@@ -607,7 +613,7 @@ class SendDiscordFileBlock(Block):
description="Optional message to send with the file", default=""
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(description="Operation status")
message_id: str = SchemaField(description="ID of the sent message")
@@ -788,7 +794,7 @@ class SendDiscordFileBlock(Block):
class ReplyToDiscordMessageBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: DiscordCredentials = DiscordCredentialsField()
channel_id: str = SchemaField(
description="The channel ID where the message to reply to is located"
@@ -799,7 +805,7 @@ class ReplyToDiscordMessageBlock(Block):
description="Whether to mention the original message author", default=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(description="Operation status")
reply_id: str = SchemaField(description="ID of the reply message")
@@ -913,13 +919,13 @@ class ReplyToDiscordMessageBlock(Block):
class DiscordUserInfoBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: DiscordCredentials = DiscordCredentialsField()
user_id: str = SchemaField(
description="The Discord user ID to get information about"
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
user_id: str = SchemaField(
description="The user's ID (passed through for chaining)"
)
@@ -1030,7 +1036,7 @@ class DiscordUserInfoBlock(Block):
class DiscordChannelInfoBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: DiscordCredentials = DiscordCredentialsField()
channel_identifier: str = SchemaField(
description="Channel name or channel ID to look up"
@@ -1041,7 +1047,7 @@ class DiscordChannelInfoBlock(Block):
default="",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
channel_id: str = SchemaField(description="The channel's ID")
channel_name: str = SchemaField(description="The channel's name")
server_id: str = SchemaField(description="The server's ID")

View File

@@ -2,7 +2,13 @@
Discord OAuth-based blocks.
"""
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import OAuth2Credentials, SchemaField
from ._api import DiscordOAuthUser, get_current_user
@@ -21,12 +27,12 @@ class DiscordGetCurrentUserBlock(Block):
This block requires Discord OAuth2 credentials (not bot tokens).
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: DiscordOAuthCredentialsInput = DiscordOAuthCredentialsField(
["identify"]
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
user_id: str = SchemaField(description="The authenticated user's Discord ID")
username: str = SchemaField(description="The user's username")
avatar_url: str = SchemaField(description="URL to the user's avatar image")

View File

@@ -5,7 +5,13 @@ from typing import Literal
from pydantic import BaseModel, ConfigDict, SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
CredentialsField,
CredentialsMetaInput,
@@ -51,7 +57,7 @@ class SMTPConfig(BaseModel):
class SendEmailBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
to_email: str = SchemaField(
description="Recipient email address", placeholder="recipient@example.com"
)
@@ -67,7 +73,7 @@ class SendEmailBlock(Block):
)
credentials: SMTPCredentialsInput = SMTPCredentialsField()
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(description="Status of the email sending operation")
error: str = SchemaField(
description="Error message if the email sending failed"

View File

@@ -8,7 +8,13 @@ which provides access to LinkedIn profile data and related information.
import logging
from typing import Optional
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import APIKeyCredentials, CredentialsField, SchemaField
from backend.util.type import MediaFileType
@@ -29,7 +35,7 @@ logger = logging.getLogger(__name__)
class GetLinkedinProfileBlock(Block):
"""Block to fetch LinkedIn profile data using Enrichlayer API."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
"""Input schema for GetLinkedinProfileBlock."""
linkedin_url: str = SchemaField(
@@ -80,13 +86,12 @@ class GetLinkedinProfileBlock(Block):
description="Enrichlayer API credentials"
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
"""Output schema for GetLinkedinProfileBlock."""
profile: PersonProfileResponse = SchemaField(
description="LinkedIn profile data"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
"""Initialize GetLinkedinProfileBlock."""
@@ -199,7 +204,7 @@ class GetLinkedinProfileBlock(Block):
class LinkedinPersonLookupBlock(Block):
"""Block to look up LinkedIn profiles by person's information using Enrichlayer API."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
"""Input schema for LinkedinPersonLookupBlock."""
first_name: str = SchemaField(
@@ -242,13 +247,12 @@ class LinkedinPersonLookupBlock(Block):
description="Enrichlayer API credentials"
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
"""Output schema for LinkedinPersonLookupBlock."""
lookup_result: PersonLookupResponse = SchemaField(
description="LinkedIn profile lookup result"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
"""Initialize LinkedinPersonLookupBlock."""
@@ -346,7 +350,7 @@ class LinkedinPersonLookupBlock(Block):
class LinkedinRoleLookupBlock(Block):
"""Block to look up LinkedIn profiles by role in a company using Enrichlayer API."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
"""Input schema for LinkedinRoleLookupBlock."""
role: str = SchemaField(
@@ -366,13 +370,12 @@ class LinkedinRoleLookupBlock(Block):
description="Enrichlayer API credentials"
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
"""Output schema for LinkedinRoleLookupBlock."""
role_lookup_result: RoleLookupResponse = SchemaField(
description="LinkedIn role lookup result"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
"""Initialize LinkedinRoleLookupBlock."""
@@ -449,7 +452,7 @@ class LinkedinRoleLookupBlock(Block):
class GetLinkedinProfilePictureBlock(Block):
"""Block to get LinkedIn profile pictures using Enrichlayer API."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
"""Input schema for GetLinkedinProfilePictureBlock."""
linkedin_profile_url: str = SchemaField(
@@ -460,13 +463,12 @@ class GetLinkedinProfilePictureBlock(Block):
description="Enrichlayer API credentials"
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
"""Output schema for GetLinkedinProfilePictureBlock."""
profile_picture_url: MediaFileType = SchemaField(
description="LinkedIn profile picture URL"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
"""Initialize GetLinkedinProfilePictureBlock."""

View File

@@ -4,7 +4,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
Requests,
SchemaField,
@@ -49,7 +50,7 @@ class CostDollars(BaseModel):
class ExaAnswerBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = exa.credentials_field(
description="The Exa integration requires an API Key."
)
@@ -69,7 +70,7 @@ class ExaAnswerBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
answer: str = SchemaField(
description="The generated answer based on search results"
)

View File

@@ -3,7 +3,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
Requests,
SchemaField,
@@ -14,7 +15,7 @@ from .helpers import ContentSettings
class ExaContentsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = exa.credentials_field(
description="The Exa integration requires an API Key."
)
@@ -27,7 +28,7 @@ class ExaContentsBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
results: list = SchemaField(
description="List of document contents", default_factory=list
)

View File

@@ -5,7 +5,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
Requests,
SchemaField,
@@ -16,7 +17,7 @@ from .helpers import ContentSettings
class ExaSearchBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = exa.credentials_field(
description="The Exa integration requires an API Key."
)
@@ -63,7 +64,7 @@ class ExaSearchBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
results: list = SchemaField(
description="List of search results", default_factory=list
)

View File

@@ -6,7 +6,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
Requests,
SchemaField,
@@ -17,7 +18,7 @@ from .helpers import ContentSettings
class ExaFindSimilarBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = exa.credentials_field(
description="The Exa integration requires an API Key."
)
@@ -65,7 +66,7 @@ class ExaFindSimilarBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
results: list[Any] = SchemaField(
description="List of similar documents with title, URL, published date, author, and score",
default_factory=list,

View File

@@ -9,7 +9,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
BlockType,
BlockWebhookConfig,
CredentialsMetaInput,
@@ -84,7 +85,7 @@ class ExaWebsetWebhookBlock(Block):
including creation, updates, searches, and exports.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = exa.credentials_field(
description="Exa API credentials for webhook management"
)
@@ -104,7 +105,7 @@ class ExaWebsetWebhookBlock(Block):
description="Webhook payload data", default={}, hidden=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
event_type: str = SchemaField(description="Type of event that occurred")
event_id: str = SchemaField(description="Unique identifier for this event")
webset_id: str = SchemaField(description="ID of the affected webset")

View File

@@ -31,7 +31,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
Requests,
SchemaField,
@@ -104,7 +105,7 @@ class Webset(BaseModel):
class ExaCreateWebsetBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = exa.credentials_field(
description="The Exa integration requires an API Key."
)
@@ -219,7 +220,7 @@ class ExaCreateWebsetBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
webset: Webset = SchemaField(
description="The unique identifier for the created webset"
)
@@ -404,7 +405,7 @@ class ExaCreateWebsetBlock(Block):
class ExaUpdateWebsetBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = exa.credentials_field(
description="The Exa integration requires an API Key."
)
@@ -417,7 +418,7 @@ class ExaUpdateWebsetBlock(Block):
description="Key-value pairs to associate with this webset (set to null to clear)",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
webset_id: str = SchemaField(description="The unique identifier for the webset")
status: str = SchemaField(description="The status of the webset")
external_id: Optional[str] = SchemaField(
@@ -475,7 +476,7 @@ class ExaUpdateWebsetBlock(Block):
class ExaListWebsetsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = exa.credentials_field(
description="The Exa integration requires an API Key."
)
@@ -497,7 +498,7 @@ class ExaListWebsetsBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
websets: list[Webset] = SchemaField(
description="List of websets", default_factory=list
)
@@ -550,7 +551,7 @@ class ExaListWebsetsBlock(Block):
class ExaGetWebsetBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = exa.credentials_field(
description="The Exa integration requires an API Key."
)
@@ -559,7 +560,7 @@ class ExaGetWebsetBlock(Block):
placeholder="webset-id-or-external-id",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
webset_id: str = SchemaField(description="The unique identifier for the webset")
status: str = SchemaField(description="The status of the webset")
external_id: Optional[str] = SchemaField(
@@ -637,7 +638,7 @@ class ExaGetWebsetBlock(Block):
class ExaDeleteWebsetBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = exa.credentials_field(
description="The Exa integration requires an API Key."
)
@@ -646,7 +647,7 @@ class ExaDeleteWebsetBlock(Block):
placeholder="webset-id-or-external-id",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
webset_id: str = SchemaField(
description="The unique identifier for the deleted webset"
)
@@ -695,7 +696,7 @@ class ExaDeleteWebsetBlock(Block):
class ExaCancelWebsetBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = exa.credentials_field(
description="The Exa integration requires an API Key."
)
@@ -704,7 +705,7 @@ class ExaCancelWebsetBlock(Block):
placeholder="webset-id-or-external-id",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
webset_id: str = SchemaField(description="The unique identifier for the webset")
status: str = SchemaField(
description="The status of the webset after cancellation"
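
The same mechanical change repeats across the files in this diff: nested `Input` schemas switch their base class from `BlockSchema` to `BlockSchemaInput`, nested `Output` schemas switch to `BlockSchemaOutput`, and the import lists are updated to match. A minimal sketch of the pattern — `MyBlock` and its fields are hypothetical; only the base classes and import path come from the diff:

```python
# Minimal sketch of the recurring change in these diffs (MyBlock and its
# fields are hypothetical; the base classes come straight from the diff).
from backend.data.block import (
    Block,
    BlockSchemaInput,   # previously: BlockSchema for both nested classes
    BlockSchemaOutput,
)
from backend.data.model import SchemaField


class MyBlock(Block):
    class Input(BlockSchemaInput):      # was: class Input(BlockSchema):
        query: str = SchemaField(description="Search query")

    class Output(BlockSchemaOutput):    # was: class Output(BlockSchema):
        result: str = SchemaField(description="Search result")
```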

View File

@@ -10,7 +10,13 @@ from backend.blocks.fal._auth import (
FalCredentialsField,
FalCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.request import ClientResponseError, Requests
@@ -24,7 +30,7 @@ class FalModel(str, Enum):
class AIVideoGeneratorBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
prompt: str = SchemaField(
description="Description of the video to generate.",
placeholder="A dog running in a field.",
@@ -36,7 +42,7 @@ class AIVideoGeneratorBlock(Block):
)
credentials: FalCredentialsInput = FalCredentialsField()
class Output(BlockSchema):
class Output(BlockSchemaOutput):
video_url: str = SchemaField(description="The URL of the generated video.")
error: str = SchemaField(
description="Error message if video generation failed."

View File

@@ -9,7 +9,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
SchemaField,
)
@@ -19,7 +20,7 @@ from ._format_utils import convert_to_format_options
class FirecrawlCrawlBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = firecrawl.credentials_field()
url: str = SchemaField(description="The URL to crawl")
limit: int = SchemaField(description="The number of pages to crawl", default=10)
@@ -39,7 +40,7 @@ class FirecrawlCrawlBlock(Block):
description="The format of the crawl", default=[ScrapeFormat.MARKDOWN]
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
data: list[dict[str, Any]] = SchemaField(description="The result of the crawl")
markdown: str = SchemaField(description="The markdown of the crawl")
html: str = SchemaField(description="The html of the crawl")
@@ -55,6 +56,10 @@ class FirecrawlCrawlBlock(Block):
change_tracking: dict[str, Any] = SchemaField(
description="The change tracking of the crawl"
)
error: str = SchemaField(
description="Error message if the crawl failed",
default="",
)
def __init__(self):
super().__init__(
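
Alongside the base-class swap, the Firecrawl blocks also gain an explicit `error` output declared with an empty-string default. A sketch of that addition under assumed names — only the `error` declaration and its description mirror the diff, the rest is illustrative:

```python
# Hypothetical crawl block showing the explicit error output added in the
# Firecrawl diffs, declared with an empty-string default.
from typing import Any

from backend.data.block import Block, BlockSchemaInput, BlockSchemaOutput
from backend.data.model import SchemaField


class ExampleCrawlBlock(Block):
    class Input(BlockSchemaInput):
        url: str = SchemaField(description="The URL to crawl")

    class Output(BlockSchemaOutput):
        data: list[dict[str, Any]] = SchemaField(description="The result of the crawl")
        error: str = SchemaField(
            description="Error message if the crawl failed",
            default="",
        )
```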

View File

@@ -9,7 +9,8 @@ from backend.sdk import (
BlockCost,
BlockCostType,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
SchemaField,
cost,
@@ -20,7 +21,7 @@ from ._config import firecrawl
@cost(BlockCost(2, BlockCostType.RUN))
class FirecrawlExtractBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = firecrawl.credentials_field()
urls: list[str] = SchemaField(
description="The URLs to crawl - at least one is required. Wildcards are supported. (/*)"
@@ -37,8 +38,12 @@ class FirecrawlExtractBlock(Block):
default=False,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
data: dict[str, Any] = SchemaField(description="The result of the crawl")
error: str = SchemaField(
description="Error message if the extraction failed",
default="",
)
def __init__(self):
super().__init__(

View File

@@ -7,7 +7,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
SchemaField,
)
@@ -16,16 +17,20 @@ from ._config import firecrawl
class FirecrawlMapWebsiteBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = firecrawl.credentials_field()
url: str = SchemaField(description="The website url to map")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
links: list[str] = SchemaField(description="List of URLs found on the website")
results: list[dict[str, Any]] = SchemaField(
description="List of search results with url, title, and description"
)
error: str = SchemaField(
description="Error message if the map failed",
default="",
)
def __init__(self):
super().__init__(

View File

@@ -8,7 +8,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
SchemaField,
)
@@ -18,7 +19,7 @@ from ._format_utils import convert_to_format_options
class FirecrawlScrapeBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = firecrawl.credentials_field()
url: str = SchemaField(description="The URL to crawl")
limit: int = SchemaField(description="The number of pages to crawl", default=10)
@@ -38,7 +39,7 @@ class FirecrawlScrapeBlock(Block):
description="The format of the crawl", default=[ScrapeFormat.MARKDOWN]
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
data: dict[str, Any] = SchemaField(description="The result of the crawl")
markdown: str = SchemaField(description="The markdown of the crawl")
html: str = SchemaField(description="The html of the crawl")
@@ -54,6 +55,10 @@ class FirecrawlScrapeBlock(Block):
change_tracking: dict[str, Any] = SchemaField(
description="The change tracking of the crawl"
)
error: str = SchemaField(
description="Error message if the scrape failed",
default="",
)
def __init__(self):
super().__init__(

View File

@@ -9,7 +9,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
SchemaField,
)
@@ -19,7 +20,7 @@ from ._format_utils import convert_to_format_options
class FirecrawlSearchBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = firecrawl.credentials_field()
query: str = SchemaField(description="The query to search for")
limit: int = SchemaField(description="The number of pages to crawl", default=10)
@@ -35,9 +36,13 @@ class FirecrawlSearchBlock(Block):
description="Returns the content of the search if specified", default=[]
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
data: dict[str, Any] = SchemaField(description="The result of the search")
site: dict[str, Any] = SchemaField(description="The site of the search")
error: str = SchemaField(
description="Error message if the search failed",
default="",
)
def __init__(self):
super().__init__(

View File

@@ -5,7 +5,13 @@ from pydantic import SecretStr
from replicate.client import Client as ReplicateClient
from replicate.helpers import FileOutput
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
@@ -57,7 +63,7 @@ class AspectRatio(str, Enum):
class AIImageEditorBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.REPLICATE], Literal["api_key"]
] = CredentialsField(
@@ -90,11 +96,10 @@ class AIImageEditorBlock(Block):
title="Model",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
output_image: MediaFileType = SchemaField(
description="URL of the transformed image"
)
error: str = SchemaField(description="Error message if generation failed")
def __init__(self):
super().__init__(

View File

@@ -3,7 +3,8 @@ from backend.sdk import (
BlockCategory,
BlockManualWebhookConfig,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
ProviderBuilder,
ProviderName,
SchemaField,
@@ -19,14 +20,14 @@ generic_webhook = (
class GenericWebhookTriggerBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
payload: dict = SchemaField(hidden=True, default_factory=dict)
constants: dict = SchemaField(
description="The constants to be set when the block is put on the graph",
default_factory=dict,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
payload: dict = SchemaField(
description="The complete webhook payload that was received from the generic webhook."
)

View File

@@ -3,7 +3,13 @@ from typing import Optional
from pydantic import BaseModel
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from ._api import get_api
@@ -39,7 +45,7 @@ class ChecksConclusion(Enum):
class GithubCreateCheckRunBlock(Block):
"""Block for creating a new check run on a GitHub repository."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo:status")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
@@ -76,7 +82,7 @@ class GithubCreateCheckRunBlock(Block):
default="",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class CheckRunResult(BaseModel):
id: int
html_url: str
@@ -211,7 +217,7 @@ class GithubCreateCheckRunBlock(Block):
class GithubUpdateCheckRunBlock(Block):
"""Block for updating an existing check run on a GitHub repository."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo:status")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
@@ -239,7 +245,7 @@ class GithubUpdateCheckRunBlock(Block):
default=None,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class CheckRunResult(BaseModel):
id: int
html_url: str
@@ -249,7 +255,6 @@ class GithubUpdateCheckRunBlock(Block):
check_run: CheckRunResult = SchemaField(
description="Details of the updated check run"
)
error: str = SchemaField(description="Error message if check run update failed")
def __init__(self):
super().__init__(
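
The GitHub blocks move in the opposite direction: several `Output` classes drop the `error: str = SchemaField(...)` line they previously declared (the `-249,7 +255,6` hunk above shrinks by exactly that one line). A reasonable reading — an assumption, though the comment added later in the agent input/output blocks ("avoid automatic error field") supports it — is that `BlockSchemaOutput` now supplies a standard error output, making per-block declarations redundant:

```python
# Hypothetical block illustrating the removals: with BlockSchemaOutput as
# the base, the per-block error declaration is dropped (assumes the base
# class provides an automatic error field, as the io.py comment suggests).
from backend.data.block import Block, BlockSchemaInput, BlockSchemaOutput
from backend.data.model import SchemaField


class ExampleListBlock(Block):
    class Input(BlockSchemaInput):
        repo_url: str = SchemaField(description="URL of the GitHub repository")

    class Output(BlockSchemaOutput):
        items: list[dict] = SchemaField(description="Items returned by the query")
        # error: str = SchemaField(...)  # removed; inherited behaviour covers it
```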

View File

@@ -5,7 +5,13 @@ from typing import Optional
from typing_extensions import TypedDict
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from ._api import get_api
@@ -37,7 +43,7 @@ class CheckRunConclusion(Enum):
class GithubGetCIResultsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo: str = SchemaField(
description="GitHub repository",
@@ -60,7 +66,7 @@ class GithubGetCIResultsBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class CheckRunItem(TypedDict, total=False):
id: int
name: str
@@ -104,7 +110,6 @@ class GithubGetCIResultsBlock(Block):
total_checks: int = SchemaField(description="Total number of CI checks")
passed_checks: int = SchemaField(description="Number of passed checks")
failed_checks: int = SchemaField(description="Number of failed checks")
error: str = SchemaField(description="Error message if the operation failed")
def __init__(self):
super().__init__(

View File

@@ -3,7 +3,13 @@ from urllib.parse import urlparse
from typing_extensions import TypedDict
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from ._api import convert_comment_url_to_api_endpoint, get_api
@@ -24,7 +30,7 @@ def is_github_url(url: str) -> bool:
# --8<-- [start:GithubCommentBlockExample]
class GithubCommentBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
issue_url: str = SchemaField(
description="URL of the GitHub issue or pull request",
@@ -35,7 +41,7 @@ class GithubCommentBlock(Block):
placeholder="Enter your comment",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
id: int = SchemaField(description="ID of the created comment")
url: str = SchemaField(description="URL to the comment on GitHub")
error: str = SchemaField(
@@ -112,7 +118,7 @@ class GithubCommentBlock(Block):
class GithubUpdateCommentBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
comment_url: str = SchemaField(
description="URL of the GitHub comment",
@@ -135,7 +141,7 @@ class GithubUpdateCommentBlock(Block):
placeholder="Enter your comment",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
id: int = SchemaField(description="ID of the updated comment")
url: str = SchemaField(description="URL to the comment on GitHub")
error: str = SchemaField(
@@ -219,14 +225,14 @@ class GithubUpdateCommentBlock(Block):
class GithubListCommentsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
issue_url: str = SchemaField(
description="URL of the GitHub issue or pull request",
placeholder="https://github.com/owner/repo/issues/1",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class CommentItem(TypedDict):
id: int
body: str
@@ -239,7 +245,6 @@ class GithubListCommentsBlock(Block):
comments: list[CommentItem] = SchemaField(
description="List of comments with their ID, body, user, and URL"
)
error: str = SchemaField(description="Error message if listing comments failed")
def __init__(self):
super().__init__(
@@ -335,7 +340,7 @@ class GithubListCommentsBlock(Block):
class GithubMakeIssueBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
@@ -348,7 +353,7 @@ class GithubMakeIssueBlock(Block):
description="Body of the issue", placeholder="Enter the issue body"
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
number: int = SchemaField(description="Number of the created issue")
url: str = SchemaField(description="URL of the created issue")
error: str = SchemaField(
@@ -410,14 +415,14 @@ class GithubMakeIssueBlock(Block):
class GithubReadIssueBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
issue_url: str = SchemaField(
description="URL of the GitHub issue",
placeholder="https://github.com/owner/repo/issues/1",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
title: str = SchemaField(description="Title of the issue")
body: str = SchemaField(description="Body of the issue")
user: str = SchemaField(description="User who created the issue")
@@ -483,14 +488,14 @@ class GithubReadIssueBlock(Block):
class GithubListIssuesBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
placeholder="https://github.com/owner/repo",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class IssueItem(TypedDict):
title: str
url: str
@@ -501,7 +506,6 @@ class GithubListIssuesBlock(Block):
issues: list[IssueItem] = SchemaField(
description="List of issues with their title and URL"
)
error: str = SchemaField(description="Error message if listing issues failed")
def __init__(self):
super().__init__(
@@ -573,7 +577,7 @@ class GithubListIssuesBlock(Block):
class GithubAddLabelBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
issue_url: str = SchemaField(
description="URL of the GitHub issue or pull request",
@@ -584,7 +588,7 @@ class GithubAddLabelBlock(Block):
placeholder="Enter the label",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(description="Status of the label addition operation")
error: str = SchemaField(
description="Error message if the label addition failed"
@@ -633,7 +637,7 @@ class GithubAddLabelBlock(Block):
class GithubRemoveLabelBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
issue_url: str = SchemaField(
description="URL of the GitHub issue or pull request",
@@ -644,7 +648,7 @@ class GithubRemoveLabelBlock(Block):
placeholder="Enter the label",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(description="Status of the label removal operation")
error: str = SchemaField(
description="Error message if the label removal failed"
@@ -694,7 +698,7 @@ class GithubRemoveLabelBlock(Block):
class GithubAssignIssueBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
issue_url: str = SchemaField(
description="URL of the GitHub issue",
@@ -705,7 +709,7 @@ class GithubAssignIssueBlock(Block):
placeholder="Enter the username",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(
description="Status of the issue assignment operation"
)
@@ -760,7 +764,7 @@ class GithubAssignIssueBlock(Block):
class GithubUnassignIssueBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
issue_url: str = SchemaField(
description="URL of the GitHub issue",
@@ -771,7 +775,7 @@ class GithubUnassignIssueBlock(Block):
placeholder="Enter the username",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(
description="Status of the issue unassignment operation"
)

View File

@@ -2,7 +2,13 @@ import re
from typing_extensions import TypedDict
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from ._api import get_api
@@ -16,14 +22,14 @@ from ._auth import (
class GithubListPullRequestsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
placeholder="https://github.com/owner/repo",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class PRItem(TypedDict):
title: str
url: str
@@ -108,7 +114,7 @@ class GithubListPullRequestsBlock(Block):
class GithubMakePullRequestBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
@@ -135,7 +141,7 @@ class GithubMakePullRequestBlock(Block):
placeholder="Enter the base branch",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
number: int = SchemaField(description="Number of the created pull request")
url: str = SchemaField(description="URL of the created pull request")
error: str = SchemaField(
@@ -209,7 +215,7 @@ class GithubMakePullRequestBlock(Block):
class GithubReadPullRequestBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
pr_url: str = SchemaField(
description="URL of the GitHub pull request",
@@ -221,7 +227,7 @@ class GithubReadPullRequestBlock(Block):
advanced=False,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
title: str = SchemaField(description="Title of the pull request")
body: str = SchemaField(description="Body of the pull request")
author: str = SchemaField(description="User who created the pull request")
@@ -325,7 +331,7 @@ class GithubReadPullRequestBlock(Block):
class GithubAssignPRReviewerBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
pr_url: str = SchemaField(
description="URL of the GitHub pull request",
@@ -336,7 +342,7 @@ class GithubAssignPRReviewerBlock(Block):
placeholder="Enter the reviewer's username",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(
description="Status of the reviewer assignment operation"
)
@@ -392,7 +398,7 @@ class GithubAssignPRReviewerBlock(Block):
class GithubUnassignPRReviewerBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
pr_url: str = SchemaField(
description="URL of the GitHub pull request",
@@ -403,7 +409,7 @@ class GithubUnassignPRReviewerBlock(Block):
placeholder="Enter the reviewer's username",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(
description="Status of the reviewer unassignment operation"
)
@@ -459,14 +465,14 @@ class GithubUnassignPRReviewerBlock(Block):
class GithubListPRReviewersBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
pr_url: str = SchemaField(
description="URL of the GitHub pull request",
placeholder="https://github.com/owner/repo/pull/1",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class ReviewerItem(TypedDict):
username: str
url: str

View File

@@ -2,7 +2,13 @@ import base64
from typing_extensions import TypedDict
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from ._api import get_api
@@ -16,14 +22,14 @@ from ._auth import (
class GithubListTagsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
placeholder="https://github.com/owner/repo",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class TagItem(TypedDict):
name: str
url: str
@@ -34,7 +40,6 @@ class GithubListTagsBlock(Block):
tags: list[TagItem] = SchemaField(
description="List of tags with their name and file tree browser URL"
)
error: str = SchemaField(description="Error message if listing tags failed")
def __init__(self):
super().__init__(
@@ -111,14 +116,14 @@ class GithubListTagsBlock(Block):
class GithubListBranchesBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
placeholder="https://github.com/owner/repo",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class BranchItem(TypedDict):
name: str
url: str
@@ -130,7 +135,6 @@ class GithubListBranchesBlock(Block):
branches: list[BranchItem] = SchemaField(
description="List of branches with their name and file tree browser URL"
)
error: str = SchemaField(description="Error message if listing branches failed")
def __init__(self):
super().__init__(
@@ -207,7 +211,7 @@ class GithubListBranchesBlock(Block):
class GithubListDiscussionsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
@@ -217,7 +221,7 @@ class GithubListDiscussionsBlock(Block):
description="Number of discussions to fetch", default=5
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class DiscussionItem(TypedDict):
title: str
url: str
@@ -323,14 +327,14 @@ class GithubListDiscussionsBlock(Block):
class GithubListReleasesBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
placeholder="https://github.com/owner/repo",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class ReleaseItem(TypedDict):
name: str
url: str
@@ -342,7 +346,6 @@ class GithubListReleasesBlock(Block):
releases: list[ReleaseItem] = SchemaField(
description="List of releases with their name and file tree browser URL"
)
error: str = SchemaField(description="Error message if listing releases failed")
def __init__(self):
super().__init__(
@@ -414,7 +417,7 @@ class GithubListReleasesBlock(Block):
class GithubReadFileBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
@@ -430,7 +433,7 @@ class GithubReadFileBlock(Block):
default="master",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
text_content: str = SchemaField(
description="Content of the file (decoded as UTF-8 text)"
)
@@ -438,7 +441,6 @@ class GithubReadFileBlock(Block):
description="Raw base64-encoded content of the file"
)
size: int = SchemaField(description="The size of the file (in bytes)")
error: str = SchemaField(description="Error message if the file reading failed")
def __init__(self):
super().__init__(
@@ -501,7 +503,7 @@ class GithubReadFileBlock(Block):
class GithubReadFolderBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
@@ -517,7 +519,7 @@ class GithubReadFolderBlock(Block):
default="master",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class DirEntry(TypedDict):
name: str
path: str
@@ -625,7 +627,7 @@ class GithubReadFolderBlock(Block):
class GithubMakeBranchBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
@@ -640,7 +642,7 @@ class GithubMakeBranchBlock(Block):
placeholder="source_branch_name",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(description="Status of the branch creation operation")
error: str = SchemaField(
description="Error message if the branch creation failed"
@@ -705,7 +707,7 @@ class GithubMakeBranchBlock(Block):
class GithubDeleteBranchBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
@@ -716,7 +718,7 @@ class GithubDeleteBranchBlock(Block):
placeholder="branch_name",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(description="Status of the branch deletion operation")
error: str = SchemaField(
description="Error message if the branch deletion failed"
@@ -766,7 +768,7 @@ class GithubDeleteBranchBlock(Block):
class GithubCreateFileBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
@@ -789,7 +791,7 @@ class GithubCreateFileBlock(Block):
default="Create new file",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
url: str = SchemaField(description="URL of the created file")
sha: str = SchemaField(description="SHA of the commit")
error: str = SchemaField(
@@ -868,7 +870,7 @@ class GithubCreateFileBlock(Block):
class GithubUpdateFileBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
@@ -891,10 +893,9 @@ class GithubUpdateFileBlock(Block):
default="Update file",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
url: str = SchemaField(description="URL of the updated file")
sha: str = SchemaField(description="SHA of the commit")
error: str = SchemaField(description="Error message if the file update failed")
def __init__(self):
super().__init__(
@@ -974,7 +975,7 @@ class GithubUpdateFileBlock(Block):
class GithubCreateRepositoryBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
name: str = SchemaField(
description="Name of the repository to create",
@@ -998,7 +999,7 @@ class GithubCreateRepositoryBlock(Block):
default="",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
url: str = SchemaField(description="URL of the created repository")
clone_url: str = SchemaField(description="Git clone URL of the repository")
error: str = SchemaField(
@@ -1077,14 +1078,14 @@ class GithubCreateRepositoryBlock(Block):
class GithubListStargazersBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
placeholder="https://github.com/owner/repo",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class StargazerItem(TypedDict):
username: str
url: str

View File

@@ -4,7 +4,13 @@ from typing import Any, List, Optional
from typing_extensions import TypedDict
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from ._api import get_api
@@ -26,7 +32,7 @@ class ReviewEvent(Enum):
class GithubCreatePRReviewBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
class ReviewComment(TypedDict, total=False):
path: str
position: Optional[int]
@@ -61,7 +67,7 @@ class GithubCreatePRReviewBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
review_id: int = SchemaField(description="ID of the created review")
state: str = SchemaField(
description="State of the review (e.g., PENDING, COMMENTED, APPROVED, CHANGES_REQUESTED)"
@@ -197,7 +203,7 @@ class GithubCreatePRReviewBlock(Block):
class GithubListPRReviewsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo: str = SchemaField(
description="GitHub repository",
@@ -208,7 +214,7 @@ class GithubListPRReviewsBlock(Block):
placeholder="123",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class ReviewItem(TypedDict):
id: int
user: str
@@ -223,7 +229,6 @@ class GithubListPRReviewsBlock(Block):
reviews: list[ReviewItem] = SchemaField(
description="List of all reviews on the pull request"
)
error: str = SchemaField(description="Error message if listing reviews failed")
def __init__(self):
super().__init__(
@@ -317,7 +322,7 @@ class GithubListPRReviewsBlock(Block):
class GithubSubmitPendingReviewBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo: str = SchemaField(
description="GitHub repository",
@@ -336,7 +341,7 @@ class GithubSubmitPendingReviewBlock(Block):
default=ReviewEvent.COMMENT,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
state: str = SchemaField(description="State of the submitted review")
html_url: str = SchemaField(description="URL of the submitted review")
error: str = SchemaField(
@@ -415,7 +420,7 @@ class GithubSubmitPendingReviewBlock(Block):
class GithubResolveReviewDiscussionBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo: str = SchemaField(
description="GitHub repository",
@@ -434,9 +439,8 @@ class GithubResolveReviewDiscussionBlock(Block):
default=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
success: bool = SchemaField(description="Whether the operation was successful")
error: str = SchemaField(description="Error message if the operation failed")
def __init__(self):
super().__init__(
@@ -579,7 +583,7 @@ class GithubResolveReviewDiscussionBlock(Block):
class GithubGetPRReviewCommentsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo: str = SchemaField(
description="GitHub repository",
@@ -596,7 +600,7 @@ class GithubGetPRReviewCommentsBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class CommentItem(TypedDict):
id: int
user: str
@@ -616,7 +620,6 @@ class GithubGetPRReviewCommentsBlock(Block):
comments: list[CommentItem] = SchemaField(
description="List of all review comments on the pull request"
)
error: str = SchemaField(description="Error message if getting comments failed")
def __init__(self):
super().__init__(
@@ -744,7 +747,7 @@ class GithubGetPRReviewCommentsBlock(Block):
class GithubCreateCommentObjectBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
path: str = SchemaField(
description="The file path to comment on",
placeholder="src/main.py",
@@ -781,7 +784,7 @@ class GithubCreateCommentObjectBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
comment_object: dict = SchemaField(
description="The comment object formatted for GitHub API"
)

View File

@@ -3,7 +3,13 @@ from typing import Optional
from pydantic import BaseModel
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from ._api import get_api
@@ -26,7 +32,7 @@ class StatusState(Enum):
class GithubCreateStatusBlock(Block):
"""Block for creating a commit status on a GitHub repository."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubFineGrainedAPICredentialsInput = (
GithubFineGrainedAPICredentialsField("repo:status")
)
@@ -54,7 +60,7 @@ class GithubCreateStatusBlock(Block):
advanced=False,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
class StatusResult(BaseModel):
id: int
url: str
@@ -66,7 +72,6 @@ class GithubCreateStatusBlock(Block):
updated_at: str
status: StatusResult = SchemaField(description="Details of the created status")
error: str = SchemaField(description="Error message if status creation failed")
def __init__(self):
super().__init__(

View File

@@ -8,7 +8,8 @@ from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
BlockWebhookConfig,
)
from backend.data.model import SchemaField
@@ -26,7 +27,7 @@ logger = logging.getLogger(__name__)
# --8<-- [start:GithubTriggerExample]
class GitHubTriggerBase:
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GithubCredentialsInput = GithubCredentialsField("repo")
repo: str = SchemaField(
description=(
@@ -40,7 +41,7 @@ class GitHubTriggerBase:
payload: dict = SchemaField(hidden=True, default_factory=dict)
# --8<-- [end:example-payload-field]
class Output(BlockSchema):
class Output(BlockSchemaOutput):
payload: dict = SchemaField(
description="The complete webhook payload that was received from GitHub. "
"Includes information about the affected resource (e.g. pull request), "

View File

@@ -8,7 +8,13 @@ from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build
from pydantic import BaseModel
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.settings import Settings
@@ -43,7 +49,7 @@ class CalendarEvent(BaseModel):
class GoogleCalendarReadEventsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/calendar.readonly"]
)
@@ -73,7 +79,7 @@ class GoogleCalendarReadEventsBlock(Block):
description="Include events you've declined", default=False
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
events: list[CalendarEvent] = SchemaField(
description="List of calendar events in the requested time range",
default_factory=list,
@@ -379,7 +385,7 @@ class RecurringEvent(BaseModel):
class GoogleCalendarCreateEventBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/calendar"]
)
@@ -433,12 +439,11 @@ class GoogleCalendarCreateEventBlock(Block):
default_factory=lambda: [ReminderPreset.TEN_MINUTES],
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
event_id: str = SchemaField(description="ID of the created event")
event_link: str = SchemaField(
description="Link to view the event in Google Calendar"
)
error: str = SchemaField(description="Error message if event creation failed")
def __init__(self):
super().__init__(

View File

@@ -14,7 +14,13 @@ from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build
from pydantic import BaseModel, Field
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.file import MediaFileType, get_exec_file_path, store_media_file
from backend.util.settings import Settings
@@ -320,7 +326,7 @@ class GmailBase(Block, ABC):
class GmailReadBlock(GmailBase):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/gmail.readonly"]
)
@@ -333,7 +339,7 @@ class GmailReadBlock(GmailBase):
default=10,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
email: Email = SchemaField(
description="Email data",
)
@@ -516,7 +522,7 @@ class GmailSendBlock(GmailBase):
- Attachment support for multiple files
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/gmail.send"]
)
@@ -540,7 +546,7 @@ class GmailSendBlock(GmailBase):
description="Files to attach", default_factory=list, advanced=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: GmailSendResult = SchemaField(
description="Send confirmation",
)
@@ -618,7 +624,7 @@ class GmailCreateDraftBlock(GmailBase):
- Attachment support for multiple files
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/gmail.modify"]
)
@@ -642,7 +648,7 @@ class GmailCreateDraftBlock(GmailBase):
description="Files to attach", default_factory=list, advanced=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: GmailDraftResult = SchemaField(
description="Draft creation result",
)
@@ -721,12 +727,12 @@ class GmailCreateDraftBlock(GmailBase):
class GmailListLabelsBlock(GmailBase):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/gmail.labels"]
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: list[dict] = SchemaField(
description="List of labels",
)
@@ -779,7 +785,7 @@ class GmailListLabelsBlock(GmailBase):
class GmailAddLabelBlock(GmailBase):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/gmail.modify"]
)
@@ -790,7 +796,7 @@ class GmailAddLabelBlock(GmailBase):
description="Label name to add",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: GmailLabelResult = SchemaField(
description="Label addition result",
)
@@ -865,7 +871,7 @@ class GmailAddLabelBlock(GmailBase):
class GmailRemoveLabelBlock(GmailBase):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/gmail.modify"]
)
@@ -876,7 +882,7 @@ class GmailRemoveLabelBlock(GmailBase):
description="Label name to remove",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: GmailLabelResult = SchemaField(
description="Label removal result",
)
@@ -941,17 +947,16 @@ class GmailRemoveLabelBlock(GmailBase):
class GmailGetThreadBlock(GmailBase):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/gmail.readonly"]
)
threadId: str = SchemaField(description="Gmail thread ID")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
thread: Thread = SchemaField(
description="Gmail thread with decoded message bodies"
)
error: str = SchemaField(description="Error message if any")
def __init__(self):
super().__init__(
@@ -1218,7 +1223,7 @@ class GmailReplyBlock(GmailBase):
- Full Unicode/emoji support with UTF-8 encoding
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
[
"https://www.googleapis.com/auth/gmail.send",
@@ -1246,14 +1251,13 @@ class GmailReplyBlock(GmailBase):
description="Files to attach", default_factory=list, advanced=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
messageId: str = SchemaField(description="Sent message ID")
threadId: str = SchemaField(description="Thread ID")
message: dict = SchemaField(description="Raw Gmail message object")
email: Email = SchemaField(
description="Parsed email object with decoded body and attachments"
)
error: str = SchemaField(description="Error message if any")
def __init__(self):
super().__init__(
@@ -1368,7 +1372,7 @@ class GmailDraftReplyBlock(GmailBase):
- Full Unicode/emoji support with UTF-8 encoding
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
[
"https://www.googleapis.com/auth/gmail.modify",
@@ -1396,12 +1400,11 @@ class GmailDraftReplyBlock(GmailBase):
description="Files to attach", default_factory=list, advanced=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
draftId: str = SchemaField(description="Created draft ID")
messageId: str = SchemaField(description="Draft message ID")
threadId: str = SchemaField(description="Thread ID")
status: str = SchemaField(description="Draft creation status")
error: str = SchemaField(description="Error message if any")
def __init__(self):
super().__init__(
@@ -1482,14 +1485,13 @@ class GmailDraftReplyBlock(GmailBase):
class GmailGetProfileBlock(GmailBase):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/gmail.readonly"]
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
profile: Profile = SchemaField(description="Gmail user profile information")
error: str = SchemaField(description="Error message if any")
def __init__(self):
super().__init__(
@@ -1555,7 +1557,7 @@ class GmailForwardBlock(GmailBase):
- Manual content type override option
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
[
"https://www.googleapis.com/auth/gmail.send",
@@ -1589,11 +1591,10 @@ class GmailForwardBlock(GmailBase):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
messageId: str = SchemaField(description="Forwarded message ID")
threadId: str = SchemaField(description="Thread ID")
status: str = SchemaField(description="Forward status")
error: str = SchemaField(description="Error message if any")
def __init__(self):
super().__init__(

View File

@@ -5,7 +5,13 @@ from typing import Any
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.settings import Settings
@@ -195,7 +201,7 @@ class BatchOperationType(str, Enum):
CLEAR = "clear"
class BatchOperation(BlockSchema):
class BatchOperation(BlockSchemaInput):
type: BatchOperationType = SchemaField(
description="The type of operation to perform"
)
@@ -206,7 +212,7 @@ class BatchOperation(BlockSchema):
class GoogleSheetsReadBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/spreadsheets.readonly"]
)
@@ -218,7 +224,7 @@ class GoogleSheetsReadBlock(Block):
description="The A1 notation of the range to read",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: list[list[str]] = SchemaField(
description="The data read from the spreadsheet",
)
@@ -274,7 +280,7 @@ class GoogleSheetsReadBlock(Block):
class GoogleSheetsWriteBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/spreadsheets"]
)
@@ -289,7 +295,7 @@ class GoogleSheetsWriteBlock(Block):
description="The data to write to the spreadsheet",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: dict = SchemaField(
description="The result of the write operation",
)
@@ -363,7 +369,7 @@ class GoogleSheetsWriteBlock(Block):
class GoogleSheetsAppendBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/spreadsheets"]
)
@@ -403,9 +409,8 @@ class GoogleSheetsAppendBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: dict = SchemaField(description="Append API response")
error: str = SchemaField(description="Error message, if any")
def __init__(self):
super().__init__(
@@ -503,7 +508,7 @@ class GoogleSheetsAppendBlock(Block):
class GoogleSheetsClearBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/spreadsheets"]
)
@@ -515,7 +520,7 @@ class GoogleSheetsClearBlock(Block):
description="The A1 notation of the range to clear",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: dict = SchemaField(
description="The result of the clear operation",
)
@@ -571,7 +576,7 @@ class GoogleSheetsClearBlock(Block):
class GoogleSheetsMetadataBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/spreadsheets.readonly"]
)
@@ -580,7 +585,7 @@ class GoogleSheetsMetadataBlock(Block):
title="Spreadsheet ID or URL",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: dict = SchemaField(
description="The metadata of the spreadsheet including sheets info",
)
@@ -652,7 +657,7 @@ class GoogleSheetsMetadataBlock(Block):
class GoogleSheetsManageSheetBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/spreadsheets"]
)
@@ -672,9 +677,8 @@ class GoogleSheetsManageSheetBlock(Block):
description="New sheet name for copy", default=""
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: dict = SchemaField(description="Operation result")
error: str = SchemaField(description="Error message, if any")
def __init__(self):
super().__init__(
@@ -760,7 +764,7 @@ class GoogleSheetsManageSheetBlock(Block):
class GoogleSheetsBatchOperationsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/spreadsheets"]
)
@@ -772,7 +776,7 @@ class GoogleSheetsBatchOperationsBlock(Block):
description="List of operations to perform",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: dict = SchemaField(
description="The result of the batch operations",
)
@@ -877,7 +881,7 @@ class GoogleSheetsBatchOperationsBlock(Block):
class GoogleSheetsFindReplaceBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/spreadsheets"]
)
@@ -904,7 +908,7 @@ class GoogleSheetsFindReplaceBlock(Block):
default=False,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: dict = SchemaField(
description="The result of the find/replace operation including number of replacements",
)
@@ -987,7 +991,7 @@ class GoogleSheetsFindReplaceBlock(Block):
class GoogleSheetsFindBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/spreadsheets.readonly"]
)
@@ -1020,7 +1024,7 @@ class GoogleSheetsFindBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: dict = SchemaField(
description="The result of the find operation including locations and count",
)
@@ -1255,7 +1259,7 @@ class GoogleSheetsFindBlock(Block):
class GoogleSheetsFormatBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/spreadsheets"]
)
@@ -1270,9 +1274,8 @@ class GoogleSheetsFormatBlock(Block):
italic: bool = SchemaField(default=False)
font_size: int = SchemaField(default=10)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: dict = SchemaField(description="API response or success flag")
error: str = SchemaField(description="Error message, if any")
def __init__(self):
super().__init__(
@@ -1383,7 +1386,7 @@ class GoogleSheetsFormatBlock(Block):
class GoogleSheetsCreateSpreadsheetBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/spreadsheets"]
)
@@ -1395,7 +1398,7 @@ class GoogleSheetsCreateSpreadsheetBlock(Block):
default=["Sheet1"],
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: dict = SchemaField(
description="The result containing spreadsheet ID and URL",
)

View File

@@ -3,7 +3,13 @@ from typing import Literal
import googlemaps
from pydantic import BaseModel, SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
@@ -37,7 +43,7 @@ class Place(BaseModel):
class GoogleMapsSearchBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.GOOGLE_MAPS], Literal["api_key"]
] = CredentialsField(description="Google Maps API Key")
@@ -58,9 +64,8 @@ class GoogleMapsSearchBlock(Block):
le=60,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
place: Place = SchemaField(description="Place found")
error: str = SchemaField(description="Error message if the search failed")
def __init__(self):
super().__init__(

View File

@@ -8,7 +8,13 @@ from typing import Literal
import aiofiles
from pydantic import SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
CredentialsField,
CredentialsMetaInput,
@@ -62,7 +68,7 @@ class HttpMethod(Enum):
class SendWebRequestBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
url: str = SchemaField(
description="The URL to send the request to",
placeholder="https://api.example.com",
@@ -93,7 +99,7 @@ class SendWebRequestBlock(Block):
default_factory=list,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
response: object = SchemaField(description="The response from the server")
client_error: object = SchemaField(description="Errors on 4xx status codes")
server_error: object = SchemaField(description="Errors on 5xx status codes")

View File

@@ -3,13 +3,19 @@ from backend.blocks.hubspot._auth import (
HubSpotCredentialsField,
HubSpotCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.request import Requests
class HubSpotCompanyBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: HubSpotCredentialsInput = HubSpotCredentialsField()
operation: str = SchemaField(
description="Operation to perform (create, update, get)", default="get"
@@ -22,7 +28,7 @@ class HubSpotCompanyBlock(Block):
description="Company domain for get/update operations", default=""
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
company: dict = SchemaField(description="Company information")
status: str = SchemaField(description="Operation status")

View File

@@ -3,13 +3,19 @@ from backend.blocks.hubspot._auth import (
HubSpotCredentialsField,
HubSpotCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.request import Requests
class HubSpotContactBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: HubSpotCredentialsInput = HubSpotCredentialsField()
operation: str = SchemaField(
description="Operation to perform (create, update, get)", default="get"
@@ -22,7 +28,7 @@ class HubSpotContactBlock(Block):
description="Email address for get/update operations", default=""
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
contact: dict = SchemaField(description="Contact information")
status: str = SchemaField(description="Operation status")

View File

@@ -5,13 +5,19 @@ from backend.blocks.hubspot._auth import (
HubSpotCredentialsField,
HubSpotCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.request import Requests
class HubSpotEngagementBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: HubSpotCredentialsInput = HubSpotCredentialsField()
operation: str = SchemaField(
description="Operation to perform (send_email, track_engagement)",
@@ -29,7 +35,7 @@ class HubSpotEngagementBlock(Block):
default=30,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: dict = SchemaField(description="Operation result")
status: str = SchemaField(description="Operation status")

View File

@@ -4,7 +4,13 @@ from typing import Any, Dict, Literal, Optional
from pydantic import SecretStr
from requests.exceptions import RequestException
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
@@ -84,7 +90,7 @@ class UpscaleOption(str, Enum):
class IdeogramModelBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.IDEOGRAM], Literal["api_key"]
] = CredentialsField(
@@ -154,9 +160,8 @@ class IdeogramModelBlock(Block):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: str = SchemaField(description="Generated image URL")
error: str = SchemaField(description="Error message if the model run failed")
def __init__(self):
super().__init__(

View File

@@ -2,7 +2,14 @@ import copy
from datetime import date, time
from typing import Any, Optional
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema, BlockType
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockType,
)
from backend.data.model import SchemaField
from backend.util.file import store_media_file
from backend.util.mock import MockObject
@@ -22,7 +29,7 @@ class AgentInputBlock(Block):
It Outputs the value passed as input.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
name: str = SchemaField(description="The name of the input.")
value: Any = SchemaField(
description="The value to be passed as input.",
@@ -60,6 +67,7 @@ class AgentInputBlock(Block):
return schema
class Output(BlockSchema):
# Use BlockSchema to avoid automatic error field for interface definition
result: Any = SchemaField(description="The value passed as input.")
def __init__(self, **kwargs):
@@ -109,7 +117,7 @@ class AgentOutputBlock(Block):
If formatting fails or no `format` is provided, the raw `value` is output.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
value: Any = SchemaField(
description="The value to be recorded as output.",
default=None,
@@ -151,6 +159,7 @@ class AgentOutputBlock(Block):
return self.get_field_schema("value")
class Output(BlockSchema):
# Use BlockSchema to avoid automatic error field for interface definition
output: Any = SchemaField(description="The value recorded as output.")
name: Any = SchemaField(description="The name of the value recorded as output.")

View File
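
One deliberate exception in `agent.py` above: the `Output` schemas of `AgentInputBlock` and `AgentOutputBlock` stay on plain `BlockSchema`, with a new comment noting that this avoids the automatic error field in the interface definition. A small sketch of the contrast, with the class names below invented for illustration:

```python
from typing import Any

from backend.data.block import BlockSchema, BlockSchemaOutput
from backend.data.model import SchemaField


class NormalBlockOutput(BlockSchemaOutput):
    # Regular blocks: the base class is assumed to append an `error` output,
    # so only the success-path fields are declared here.
    result: Any = SchemaField(description="The block's result")


class InterfaceBlockOutput(BlockSchema):
    # Interface blocks (agent input/output): plain BlockSchema, so the agent's
    # public interface contains exactly the fields declared here and nothing else.
    result: Any = SchemaField(description="The value passed through the interface")
```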

@@ -1,12 +1,18 @@
from typing import Any
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.json import loads
class StepThroughItemsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
items: list = SchemaField(
advanced=False,
description="The list or dictionary of items to iterate over",
@@ -26,7 +32,7 @@ class StepThroughItemsBlock(Block):
default="",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
item: Any = SchemaField(description="The current item in the iteration")
key: Any = SchemaField(
description="The key or index of the current item in the iteration",

View File

@@ -3,13 +3,19 @@ from backend.blocks.jina._auth import (
JinaCredentialsField,
JinaCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.request import Requests
class JinaChunkingBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
texts: list = SchemaField(description="List of texts to chunk")
credentials: JinaCredentialsInput = JinaCredentialsField()
@@ -20,7 +26,7 @@ class JinaChunkingBlock(Block):
description="Whether to return token information", default=False
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
chunks: list = SchemaField(description="List of chunked texts")
tokens: list = SchemaField(
description="List of token information for each chunk",

View File

@@ -3,13 +3,19 @@ from backend.blocks.jina._auth import (
JinaCredentialsField,
JinaCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.request import Requests
class JinaEmbeddingBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
texts: list = SchemaField(description="List of texts to embed")
credentials: JinaCredentialsInput = JinaCredentialsField()
model: str = SchemaField(
@@ -17,7 +23,7 @@ class JinaEmbeddingBlock(Block):
default="jina-embeddings-v2-base-en",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
embeddings: list = SchemaField(description="List of embeddings")
def __init__(self):

View File

@@ -8,7 +8,13 @@ from backend.blocks.jina._auth import (
JinaCredentialsField,
JinaCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.request import Requests
@@ -20,13 +26,13 @@ class Reference(TypedDict):
class FactCheckerBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
statement: str = SchemaField(
description="The statement to check for factuality"
)
credentials: JinaCredentialsInput = JinaCredentialsField()
class Output(BlockSchema):
class Output(BlockSchemaOutput):
factuality: float = SchemaField(
description="The factuality score of the statement"
)
@@ -36,7 +42,6 @@ class FactCheckerBlock(Block):
description="List of references supporting or contradicting the statement",
default=[],
)
error: str = SchemaField(description="Error message if the check fails")
def __init__(self):
super().__init__(

View File

@@ -8,20 +8,25 @@ from backend.blocks.jina._auth import (
JinaCredentialsInput,
)
from backend.blocks.search import GetRequest
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
class SearchTheWebBlock(Block, GetRequest):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: JinaCredentialsInput = JinaCredentialsField()
query: str = SchemaField(description="The search query to search the web for")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
results: str = SchemaField(
description="The search results including content from top 5 URLs"
)
error: str = SchemaField(description="Error message if the search fails")
def __init__(self):
super().__init__(
@@ -58,7 +63,7 @@ class SearchTheWebBlock(Block, GetRequest):
class ExtractWebsiteContentBlock(Block, GetRequest):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: JinaCredentialsInput = JinaCredentialsField()
url: str = SchemaField(description="The URL to scrape the content from")
raw_content: bool = SchemaField(
@@ -68,7 +73,7 @@ class ExtractWebsiteContentBlock(Block, GetRequest):
advanced=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
content: str = SchemaField(description="The scraped content from the given URL")
error: str = SchemaField(
description="Error message if the content cannot be retrieved"

View File

@@ -3,7 +3,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
OAuth2Credentials,
SchemaField,
@@ -22,7 +23,7 @@ from .models import CreateCommentResponse
class LinearCreateCommentBlock(Block):
"""Block for creating comments on Linear issues"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = linear.credentials_field(
description="Linear credentials with comment creation permissions",
required_scopes={LinearScope.COMMENTS_CREATE},
@@ -30,12 +31,11 @@ class LinearCreateCommentBlock(Block):
issue_id: str = SchemaField(description="ID of the issue to comment on")
comment: str = SchemaField(description="Comment text to add to the issue")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
comment_id: str = SchemaField(description="ID of the created comment")
comment_body: str = SchemaField(
description="Text content of the created comment"
)
error: str = SchemaField(description="Error message if comment creation failed")
def __init__(self):
super().__init__(

View File
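
The Linear blocks in this stretch of the diff import from `backend.sdk` rather than `backend.data.block`, and the migration is otherwise identical: swap the schema base classes and drop the explicit `error` output. A minimal sketch of that variant, with the block and field names invented for illustration:

```python
from backend.sdk import Block, BlockSchemaInput, BlockSchemaOutput, SchemaField


class ExampleSdkBlock(Block):
    class Input(BlockSchemaInput):
        term: str = SchemaField(description="Search term")

    class Output(BlockSchemaOutput):
        # No explicit `error` field; the sdk output base class is assumed to add it.
        results: list[str] = SchemaField(description="Matching items")
```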

@@ -3,7 +3,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
OAuth2Credentials,
SchemaField,
@@ -22,7 +23,7 @@ from .models import CreateIssueResponse, Issue
class LinearCreateIssueBlock(Block):
"""Block for creating issues on Linear"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = linear.credentials_field(
description="Linear credentials with issue creation permissions",
required_scopes={LinearScope.ISSUES_CREATE},
@@ -43,10 +44,9 @@ class LinearCreateIssueBlock(Block):
default=None,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
issue_id: str = SchemaField(description="ID of the created issue")
issue_title: str = SchemaField(description="Title of the created issue")
error: str = SchemaField(description="Error message if issue creation failed")
def __init__(self):
super().__init__(
@@ -129,14 +129,14 @@ class LinearCreateIssueBlock(Block):
class LinearSearchIssuesBlock(Block):
"""Block for searching issues on Linear"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
term: str = SchemaField(description="Term to search for issues")
credentials: CredentialsMetaInput = linear.credentials_field(
description="Linear credentials with read permissions",
required_scopes={LinearScope.READ},
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
issues: list[Issue] = SchemaField(description="List of issues")
def __init__(self):

View File

@@ -3,7 +3,8 @@ from backend.sdk import (
Block,
BlockCategory,
BlockOutput,
BlockSchema,
BlockSchemaInput,
BlockSchemaOutput,
CredentialsMetaInput,
OAuth2Credentials,
SchemaField,
@@ -22,16 +23,15 @@ from .models import Project
class LinearSearchProjectsBlock(Block):
"""Block for searching projects on Linear"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput = linear.credentials_field(
description="Linear credentials with read permissions",
required_scopes={LinearScope.READ},
)
term: str = SchemaField(description="Term to search for projects")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
projects: list[Project] = SchemaField(description="List of projects")
error: str = SchemaField(description="Error message if issue creation failed")
def __init__(self):
super().__init__(

View File

@@ -16,7 +16,13 @@ from anthropic.types import ToolParam
from groq import AsyncGroq
from pydantic import BaseModel, SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
@@ -766,7 +772,7 @@ class AIBlockBase(Block, ABC):
class AIStructuredResponseGeneratorBlock(AIBlockBase):
class Input(BlockSchema):
class Input(BlockSchemaInput):
prompt: str = SchemaField(
description="The prompt to send to the language model.",
placeholder="Enter your prompt here...",
@@ -833,12 +839,11 @@ class AIStructuredResponseGeneratorBlock(AIBlockBase):
description="Ollama host for local models",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
response: dict[str, Any] | list[dict[str, Any]] = SchemaField(
description="The response object generated by the language model."
)
prompt: list = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
super().__init__(
@@ -1207,7 +1212,7 @@ def trim_prompt(s: str) -> str:
class AITextGeneratorBlock(AIBlockBase):
class Input(BlockSchema):
class Input(BlockSchemaInput):
prompt: str = SchemaField(
description="The prompt to send to the language model. You can use any of the {keys} from Prompt Values to fill in the prompt with values from the prompt values dictionary by putting them in curly braces.",
placeholder="Enter your prompt here...",
@@ -1245,12 +1250,11 @@ class AITextGeneratorBlock(AIBlockBase):
description="The maximum number of tokens to generate in the chat completion.",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
response: str = SchemaField(
description="The response generated by the language model."
)
prompt: list = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
super().__init__(
@@ -1304,7 +1308,7 @@ class SummaryStyle(Enum):
class AITextSummarizerBlock(AIBlockBase):
class Input(BlockSchema):
class Input(BlockSchemaInput):
text: str = SchemaField(
description="The text to summarize.",
placeholder="Enter the text to summarize here...",
@@ -1344,10 +1348,9 @@ class AITextSummarizerBlock(AIBlockBase):
description="Ollama host for local models",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
summary: str = SchemaField(description="The final summary of the text.")
prompt: list = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
super().__init__(
@@ -1451,7 +1454,20 @@ class AITextSummarizerBlock(AIBlockBase):
credentials=credentials,
)
return llm_response["summary"]
summary = llm_response["summary"]
# Validate that the LLM returned a string and not a list or other type
if not isinstance(summary, str):
from backend.util.truncate import truncate
truncated_summary = truncate(summary, 500)
raise ValueError(
f"LLM generation failed: Expected a string summary, but received {type(summary).__name__}. "
f"The language model incorrectly formatted its response. "
f"Received value: {json.dumps(truncated_summary)}"
)
return summary
async def _combine_summaries(
self, summaries: list[str], input_data: Input, credentials: APIKeyCredentials
@@ -1473,7 +1489,20 @@ class AITextSummarizerBlock(AIBlockBase):
credentials=credentials,
)
return llm_response["final_summary"]
final_summary = llm_response["final_summary"]
# Validate that the LLM returned a string and not a list or other type
if not isinstance(final_summary, str):
from backend.util.truncate import truncate
truncated_final_summary = truncate(final_summary, 500)
raise ValueError(
f"LLM generation failed: Expected a string final summary, but received {type(final_summary).__name__}. "
f"The language model incorrectly formatted its response. "
f"Received value: {json.dumps(truncated_final_summary)}"
)
return final_summary
else:
# If combined summaries are still too long, recursively summarize
block = AITextSummarizerBlock()
@@ -1491,7 +1520,7 @@ class AITextSummarizerBlock(AIBlockBase):
class AIConversationBlock(AIBlockBase):
class Input(BlockSchema):
class Input(BlockSchemaInput):
prompt: str = SchemaField(
description="The prompt to send to the language model.",
placeholder="Enter your prompt here...",
@@ -1518,12 +1547,11 @@ class AIConversationBlock(AIBlockBase):
description="Ollama host for local models",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
response: str = SchemaField(
description="The model's response to the conversation."
)
prompt: list = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
super().__init__(
@@ -1590,7 +1618,7 @@ class AIConversationBlock(AIBlockBase):
class AIListGeneratorBlock(AIBlockBase):
class Input(BlockSchema):
class Input(BlockSchemaInput):
focus: str | None = SchemaField(
description="The focus of the list to generate.",
placeholder="The top 5 most interesting news stories in the data.",
@@ -1627,15 +1655,12 @@ class AIListGeneratorBlock(AIBlockBase):
description="Ollama host for local models",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
generated_list: List[str] = SchemaField(description="The generated list.")
list_item: str = SchemaField(
description="Each individual item in the list.",
)
prompt: list = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(
description="Error message if the list generation failed."
)
def __init__(self):
super().__init__(
@@ -1753,6 +1778,7 @@ class AIListGeneratorBlock(AIBlockBase):
|```
|Do not include any explanations or additional text, just respond with the list in the format specified above.
|Do not include code fences or any other formatting, just the raw list.
"""
# If a focus is provided, add it to the prompt
if input_data.focus:

View File
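
Besides the schema change, the `AITextSummarizerBlock` hunks above add a type check on the value the LLM returns before passing it along. A standalone sketch of that guard, with `ensure_str` as a hypothetical helper name and a plain slice standing in for `backend.util.truncate.truncate`:

```python
import json
from typing import Any


def ensure_str(value: Any, field_name: str, max_len: int = 500) -> str:
    """Return `value` if it is a string; otherwise raise a descriptive error."""
    if isinstance(value, str):
        return value
    # Stand-in for backend.util.truncate.truncate: keep the error message short.
    truncated = repr(value)[:max_len]
    raise ValueError(
        f"LLM generation failed: expected a string for {field_name!r}, "
        f"but received {type(value).__name__}. "
        f"Received value: {json.dumps(truncated)}"
    )


# Usage mirroring the diff:
# summary = ensure_str(llm_response["summary"], "summary")
# final_summary = ensure_str(llm_response["final_summary"], "final_summary")
```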

@@ -2,7 +2,13 @@ import operator
from enum import Enum
from typing import Any
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
@@ -15,7 +21,7 @@ class Operation(Enum):
class CalculatorBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
operation: Operation = SchemaField(
description="Choose the math operation you want to perform",
placeholder="Select an operation",
@@ -31,7 +37,7 @@ class CalculatorBlock(Block):
default=False,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: float = SchemaField(description="The result of your calculation")
def __init__(self):
@@ -85,13 +91,13 @@ class CalculatorBlock(Block):
class CountItemsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
collection: Any = SchemaField(
description="Enter the collection you want to count. This can be a list, dictionary, string, or any other iterable.",
placeholder="For example: [1, 2, 3] or {'a': 1, 'b': 2} or 'hello'",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
count: int = SchemaField(description="The number of items in the collection")
def __init__(self):

View File

@@ -6,14 +6,20 @@ from moviepy.audio.io.AudioFileClip import AudioFileClip
from moviepy.video.fx.Loop import Loop
from moviepy.video.io.VideoFileClip import VideoFileClip
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.file import MediaFileType, get_exec_file_path, store_media_file
class MediaDurationBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
media_in: MediaFileType = SchemaField(
description="Media input (URL, data URI, or local path)."
)
@@ -22,13 +28,10 @@ class MediaDurationBlock(Block):
default=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
duration: float = SchemaField(
description="Duration of the media file (in seconds)."
)
error: str = SchemaField(
description="Error message if something fails.", default=""
)
def __init__(self):
super().__init__(
@@ -70,7 +73,7 @@ class LoopVideoBlock(Block):
Block for looping (repeating) a video clip until a given duration or number of loops.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
video_in: MediaFileType = SchemaField(
description="The input video (can be a URL, data URI, or local path)."
)
@@ -90,13 +93,10 @@ class LoopVideoBlock(Block):
default="file_path",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
video_out: str = SchemaField(
description="Looped video returned either as a relative path or a data URI."
)
error: str = SchemaField(
description="Error message if something fails.", default=""
)
def __init__(self):
super().__init__(
@@ -166,7 +166,7 @@ class AddAudioToVideoBlock(Block):
Optionally scale the volume of the new track.
"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
video_in: MediaFileType = SchemaField(
description="Video input (URL, data URI, or local path)."
)
@@ -182,13 +182,10 @@ class AddAudioToVideoBlock(Block):
default="file_path",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
video_out: MediaFileType = SchemaField(
description="Final video (with attached audio), as a path or data URI."
)
error: str = SchemaField(
description="Error message if something fails.", default=""
)
def __init__(self):
super().__init__(

View File

@@ -3,7 +3,13 @@ from typing import List, Literal
from pydantic import SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
APIKeyCredentials,
BlockSecret,
@@ -37,7 +43,7 @@ class PublishToMediumStatus(str, Enum):
class PublishToMediumBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
author_id: BlockSecret = SecretField(
key="medium_author_id",
description="""The Medium AuthorID of the user. You can get this by calling the /me endpoint of the Medium API.\n\ncurl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" https://api.medium.com/v1/me" the response will contain the authorId field.""",
@@ -84,7 +90,7 @@ class PublishToMediumBlock(Block):
description="The Medium integration can be used with any API key with sufficient permissions for the blocks it is used on.",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post_id: str = SchemaField(description="The ID of the created Medium post")
post_url: str = SchemaField(description="The URL of the created Medium post")
published_at: int = SchemaField(

View File

@@ -3,7 +3,7 @@ from typing import Any, Literal, Optional, Union
from mem0 import MemoryClient
from pydantic import BaseModel, SecretStr
from backend.data.block import Block, BlockOutput, BlockSchema
from backend.data.block import Block, BlockOutput, BlockSchemaInput, BlockSchemaOutput
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
@@ -55,7 +55,7 @@ class AddMemoryBlock(Block, Mem0Base):
Always limited by user_id and optional graph_id and graph_exec_id"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.MEM0], Literal["api_key"]
] = CredentialsField(description="Mem0 API key credentials")
@@ -74,13 +74,12 @@ class AddMemoryBlock(Block, Mem0Base):
description="Limit the memory to the agent", default=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
action: str = SchemaField(description="Action of the operation")
memory: str = SchemaField(description="Memory created")
results: list[dict[str, str]] = SchemaField(
description="List of all results from the operation"
)
error: str = SchemaField(description="Error message if operation fails")
def __init__(self):
super().__init__(
@@ -172,7 +171,7 @@ class AddMemoryBlock(Block, Mem0Base):
class SearchMemoryBlock(Block, Mem0Base):
"""Block for searching memories in Mem0"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.MEM0], Literal["api_key"]
] = CredentialsField(description="Mem0 API key credentials")
@@ -201,9 +200,8 @@ class SearchMemoryBlock(Block, Mem0Base):
description="Limit the memory to the agent", default=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
memories: Any = SchemaField(description="List of matching memories")
error: str = SchemaField(description="Error message if operation fails")
def __init__(self):
super().__init__(
@@ -266,7 +264,7 @@ class SearchMemoryBlock(Block, Mem0Base):
class GetAllMemoriesBlock(Block, Mem0Base):
"""Block for retrieving all memories from Mem0"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.MEM0], Literal["api_key"]
] = CredentialsField(description="Mem0 API key credentials")
@@ -289,9 +287,8 @@ class GetAllMemoriesBlock(Block, Mem0Base):
description="Limit the memory to the agent", default=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
memories: Any = SchemaField(description="List of memories")
error: str = SchemaField(description="Error message if operation fails")
def __init__(self):
super().__init__(
@@ -353,7 +350,7 @@ class GetAllMemoriesBlock(Block, Mem0Base):
class GetLatestMemoryBlock(Block, Mem0Base):
"""Block for retrieving the latest memory from Mem0"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: CredentialsMetaInput[
Literal[ProviderName.MEM0], Literal["api_key"]
] = CredentialsField(description="Mem0 API key credentials")
@@ -380,12 +377,11 @@ class GetLatestMemoryBlock(Block, Mem0Base):
description="Limit the memory to the agent", default=True
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
memory: Optional[dict[str, Any]] = SchemaField(
description="Latest memory if found"
)
found: bool = SchemaField(description="Whether a memory was found")
error: str = SchemaField(description="Error message if operation fails")
def __init__(self):
super().__init__(

View File

@@ -4,7 +4,13 @@ from typing import Any, Dict, List, Optional
from pydantic import model_validator
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import OAuth2Credentials, SchemaField
from ._api import NotionClient
@@ -20,7 +26,7 @@ from ._auth import (
class NotionCreatePageBlock(Block):
"""Create a new page in Notion with content."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: NotionCredentialsInput = NotionCredentialsField()
parent_page_id: Optional[str] = SchemaField(
description="Parent page ID to create the page under. Either this OR parent_database_id is required.",
@@ -58,10 +64,9 @@ class NotionCreatePageBlock(Block):
)
return self
class Output(BlockSchema):
class Output(BlockSchemaOutput):
page_id: str = SchemaField(description="ID of the created page.")
page_url: str = SchemaField(description="URL of the created page.")
error: str = SchemaField(description="Error message if the operation failed.")
def __init__(self):
super().__init__(

View File

@@ -2,7 +2,13 @@ from __future__ import annotations
from typing import Any, Dict, List, Optional
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import OAuth2Credentials, SchemaField
from ._api import NotionClient, parse_rich_text
@@ -18,7 +24,7 @@ from ._auth import (
class NotionReadDatabaseBlock(Block):
"""Query a Notion database and retrieve entries with their properties."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: NotionCredentialsInput = NotionCredentialsField()
database_id: str = SchemaField(
description="Notion database ID. Must be accessible by the connected integration.",
@@ -44,7 +50,7 @@ class NotionReadDatabaseBlock(Block):
le=100,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
entries: List[Dict[str, Any]] = SchemaField(
description="List of database entries with their properties."
)
@@ -59,7 +65,6 @@ class NotionReadDatabaseBlock(Block):
)
count: int = SchemaField(description="Number of entries retrieved.")
database_title: str = SchemaField(description="Title of the database.")
error: str = SchemaField(description="Error message if the operation failed.")
def __init__(self):
super().__init__(

View File

@@ -1,6 +1,12 @@
from __future__ import annotations
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import OAuth2Credentials, SchemaField
from ._api import NotionClient
@@ -16,15 +22,14 @@ from ._auth import (
class NotionReadPageBlock(Block):
"""Read a Notion page by ID and return its raw JSON."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: NotionCredentialsInput = NotionCredentialsField()
page_id: str = SchemaField(
description="Notion page ID. Must be accessible by the connected integration. You can get this from the page URL notion.so/A-Page-586edd711467478da59fe3ce29a1ffab would be 586edd711467478da59fe35e29a1ffab",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
page: dict = SchemaField(description="Raw Notion page JSON.")
error: str = SchemaField(description="Error message if the operation failed.")
def __init__(self):
super().__init__(

View File

@@ -1,6 +1,12 @@
from __future__ import annotations
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import OAuth2Credentials, SchemaField
from ._api import NotionClient, blocks_to_markdown, extract_page_title
@@ -16,7 +22,7 @@ from ._auth import (
class NotionReadPageMarkdownBlock(Block):
"""Read a Notion page and convert it to clean Markdown format."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: NotionCredentialsInput = NotionCredentialsField()
page_id: str = SchemaField(
description="Notion page ID. Must be accessible by the connected integration. You can get this from the page URL notion.so/A-Page-586edd711467478da59fe35e29a1ffab would be 586edd711467478da59fe35e29a1ffab",
@@ -26,10 +32,9 @@ class NotionReadPageMarkdownBlock(Block):
default=True,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
markdown: str = SchemaField(description="Page content in Markdown format.")
title: str = SchemaField(description="Page title.")
error: str = SchemaField(description="Error message if the operation failed.")
def __init__(self):
super().__init__(

View File

@@ -4,7 +4,13 @@ from typing import List, Optional
from pydantic import BaseModel
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import OAuth2Credentials, SchemaField
from ._api import NotionClient, extract_page_title, parse_rich_text
@@ -35,7 +41,7 @@ class NotionSearchResult(BaseModel):
class NotionSearchBlock(Block):
"""Search across your Notion workspace for pages and databases."""
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: NotionCredentialsInput = NotionCredentialsField()
query: str = SchemaField(
description="Search query text. Leave empty to get all accessible pages/databases.",
@@ -49,7 +55,7 @@ class NotionSearchBlock(Block):
description="Maximum number of results to return", default=20, ge=1, le=100
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
results: List[NotionSearchResult] = SchemaField(
description="List of search results with title, type, URL, and metadata."
)
@@ -60,7 +66,6 @@ class NotionSearchBlock(Block):
description="List of IDs from search results for batch operations."
)
count: int = SchemaField(description="Number of results found.")
error: str = SchemaField(description="Error message if the operation failed.")
def __init__(self):
super().__init__(

View File

@@ -3,14 +3,20 @@ from backend.blocks.nvidia._auth import (
NvidiaCredentialsField,
NvidiaCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.request import Requests
from backend.util.type import MediaFileType
class NvidiaDeepfakeDetectBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: NvidiaCredentialsInput = NvidiaCredentialsField()
image_base64: MediaFileType = SchemaField(
description="Image to analyze for deepfakes",
@@ -20,7 +26,7 @@ class NvidiaDeepfakeDetectBlock(Block):
default=False,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
status: str = SchemaField(
description="Detection status (SUCCESS, ERROR, CONTENT_FILTERED)",
)

View File

@@ -6,7 +6,13 @@ from typing import Any, Literal
import openai
from pydantic import SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
@@ -54,7 +60,7 @@ def PerplexityCredentialsField() -> PerplexityCredentials:
class PerplexityBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
prompt: str = SchemaField(
description="The query to send to the Perplexity model.",
placeholder="Enter your query here...",
@@ -78,14 +84,13 @@ class PerplexityBlock(Block):
description="The maximum number of tokens to generate.",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
response: str = SchemaField(
description="The response from the Perplexity model."
)
annotations: list[dict[str, Any]] = SchemaField(
description="List of URL citations and annotations from the response."
)
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
super().__init__(

View File

@@ -1,7 +1,13 @@
import logging
from typing import Any, Literal
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import SchemaField
from backend.util.clients import get_database_manager_async_client
@@ -22,7 +28,7 @@ def get_storage_key(key: str, scope: StorageScope, graph_id: str) -> str:
class PersistInformationBlock(Block):
"""Block for persisting key-value data for the current user with configurable scope"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
key: str = SchemaField(description="Key to store the information under")
value: Any = SchemaField(description="Value to store")
scope: StorageScope = SchemaField(
@@ -30,7 +36,7 @@ class PersistInformationBlock(Block):
default="within_agent",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
value: Any = SchemaField(description="Value that was stored")
def __init__(self):
@@ -90,7 +96,7 @@ class PersistInformationBlock(Block):
class RetrieveInformationBlock(Block):
"""Block for retrieving key-value data for the current user with configurable scope"""
class Input(BlockSchema):
class Input(BlockSchemaInput):
key: str = SchemaField(description="Key to retrieve the information for")
scope: StorageScope = SchemaField(
description="Scope of persistence: within_agent (shared across all runs of this agent) or across_agents (shared across all agents for this user)",
@@ -100,7 +106,7 @@ class RetrieveInformationBlock(Block):
description="Default value to return if key is not found", default=None
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
value: Any = SchemaField(description="Retrieved value or default value")
def __init__(self):

View File

@@ -3,7 +3,13 @@ from typing import Any, Literal
from pinecone import Pinecone, ServerlessSpec
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
@@ -27,7 +33,7 @@ def PineconeCredentialsField() -> PineconeCredentialsInput:
class PineconeInitBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: PineconeCredentialsInput = PineconeCredentialsField()
index_name: str = SchemaField(description="Name of the Pinecone index")
dimension: int = SchemaField(
@@ -43,7 +49,7 @@ class PineconeInitBlock(Block):
description="Region for serverless", default="us-east-1"
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
index: str = SchemaField(description="Name of the initialized Pinecone index")
message: str = SchemaField(description="Status message")
@@ -83,7 +89,7 @@ class PineconeInitBlock(Block):
class PineconeQueryBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: PineconeCredentialsInput = PineconeCredentialsField()
query_vector: list = SchemaField(description="Query vector")
namespace: str = SchemaField(
@@ -102,7 +108,7 @@ class PineconeQueryBlock(Block):
host: str = SchemaField(description="Host for pinecone", default="")
idx_name: str = SchemaField(description="Index name for pinecone")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
results: Any = SchemaField(description="Query results from Pinecone")
combined_results: Any = SchemaField(
description="Combined results from Pinecone"
@@ -166,7 +172,7 @@ class PineconeQueryBlock(Block):
class PineconeInsertBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: PineconeCredentialsInput = PineconeCredentialsField()
index: str = SchemaField(description="Initialized Pinecone index")
chunks: list = SchemaField(description="List of text chunks to ingest")
@@ -181,7 +187,7 @@ class PineconeInsertBlock(Block):
default_factory=dict,
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
upsert_response: str = SchemaField(
description="Response from Pinecone upsert operation"
)

View File

@@ -4,7 +4,13 @@ from typing import Iterator, Literal
import praw
from pydantic import BaseModel, SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import (
CredentialsField,
CredentialsMetaInput,
@@ -76,7 +82,7 @@ def get_praw(creds: RedditCredentials) -> praw.Reddit:
class GetRedditPostsBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
subreddit: str = SchemaField(
description="Subreddit name, excluding the /r/ prefix",
default="writingprompts",
@@ -94,7 +100,7 @@ class GetRedditPostsBlock(Block):
description="Number of posts to fetch", default=10
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
post: RedditPost = SchemaField(description="Reddit post")
posts: list[RedditPost] = SchemaField(description="List of all Reddit posts")
@@ -194,11 +200,11 @@ class GetRedditPostsBlock(Block):
class PostRedditCommentBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: RedditCredentialsInput = RedditCredentialsField()
data: RedditComment = SchemaField(description="Reddit comment")
class Output(BlockSchema):
class Output(BlockSchemaOutput):
comment_id: str = SchemaField(description="Posted comment ID")
def __init__(self):

View File

@@ -10,7 +10,13 @@ from backend.blocks.replicate._auth import (
ReplicateCredentialsInput,
)
from backend.blocks.replicate._helper import ReplicateOutputs, extract_result
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.block import (
Block,
BlockCategory,
BlockOutput,
BlockSchemaInput,
BlockSchemaOutput,
)
from backend.data.model import APIKeyCredentials, CredentialsField, SchemaField
@@ -38,7 +44,7 @@ class ImageType(str, Enum):
class ReplicateFluxAdvancedModelBlock(Block):
class Input(BlockSchema):
class Input(BlockSchemaInput):
credentials: ReplicateCredentialsInput = CredentialsField(
description="The Replicate integration can be used with "
"any API key with sufficient permissions for the blocks it is used on.",
@@ -105,9 +111,8 @@ class ReplicateFluxAdvancedModelBlock(Block):
title="Safety Tolerance",
)
class Output(BlockSchema):
class Output(BlockSchemaOutput):
result: str = SchemaField(description="Generated output")
error: str = SchemaField(description="Error message if the model run failed")
def __init__(self):
super().__init__(

Some files were not shown because too many files have changed in this diff.