Compare commits


13 Commits

Author SHA1 Message Date
Otto-AGPT
6a40a29f13 fix: Use standalone devcontainer instead of extending platform compose
The platform's docker-compose.yml has complex networks and dependencies
that caused 'docker compose config' to fail during container creation.

Now the devcontainer is standalone - platform services (db, redis, etc.)
are started from WITHIN the container using docker compose commands in
the lifecycle scripts. This is cleaner and avoids compose conflicts.
2026-02-11 18:25:07 +00:00
Otto-AGPT
fa8930da4c fix: Address CodeRabbit review feedback
- Node version 21 → 22 (matches frontend package.json engines)
- Add language specifier to README code block (MD040)
- Add error handling to cd command in poststart.sh (SC2164)
- Add PYTHONPATH to seed test data launch config
- Use docker compose pull instead of hardcoded image tags
2026-02-11 17:38:47 +00:00
Otto-AGPT
8c79e170e7 fix: Add git.openRepositoryInParentFolders setting
Workspace opens at autogpt_platform/ but git root is parent AutoGPT/
2026-02-11 17:28:04 +00:00
Otto-AGPT
698dc45146 fix: More defensive tool installation with fallbacks
- Source nvm.sh to get node/pnpm in PATH
- Add fallback pip install for poetry if pipx missing
- Add fallback npm install for pnpm if not found
- Better error messages if tools fail to install
- Check Docker availability before pulling images
2026-02-11 17:27:18 +00:00
Otto-AGPT
214ab25b3c fix: Install poetry via pipx (Python feature provides pipx, not poetry)
The Python devcontainer feature installs pipx but not poetry by default.
Updated to:
1. Add poetry to toolsToInstall in devcontainer.json
2. Use pipx to install poetry in oncreate.sh
3. Ensure PATH includes /usr/local/py-utils/bin where pipx installs tools
2026-02-11 17:26:35 +00:00
Otto-AGPT
4daa25e3dc fix: Move devcontainer to repo root for Codespaces detection
GitHub Codespaces only looks for devcontainer.json in:
- .devcontainer/devcontainer.json
- .devcontainer/<subfolder>/devcontainer.json
- .devcontainer.json

It does NOT look inside project subfolders like autogpt_platform/.devcontainer/

Moved to .devcontainer/platform/ which:
1. Will be detected by Codespaces
2. Allows multiple configs (platform vs classic)
3. Updated all path references accordingly
2026-02-11 16:51:03 +00:00
Otto-AGPT
7195f7e298 fix: Run backend/frontend natively to avoid stale prebuild code
The previous approach would build backend/frontend Docker images during
prebuild, but those images would contain the prebuild branch's code (dev),
not the PR branch being reviewed.

Now:
- Prebuild caches: dependency images, Python/Node packages
- Prebuild does NOT cache: backend/frontend code
- Backend/Frontend run natively with hot-reload
- Always uses current branch's code
2026-02-11 16:45:50 +00:00
Otto-AGPT
582754256e feat(platform): Add GitHub Codespaces devcontainer for PR reviews
SECRT-1933

This adds a complete devcontainer configuration for the AutoGPT Platform,
optimized for PR reviews in GitHub Codespaces.

Features:
- Docker-in-Docker for full compose support
- Pre-installed Python 3.13, Node 21, pnpm, Poetry
- Auto-start of all platform services (Supabase, Redis, RabbitMQ, etc.)
- Pre-seeded test data with ready-to-use accounts
- VS Code extensions for Python, TypeScript, Prisma, Playwright
- Debug launch configs for backend and frontend
- Optimized for prebuilds (~60s spinup vs 5-10 min)

Test account: test123@gmail.com / testpassword123

Files:
- .devcontainer/devcontainer.json - Main configuration
- .devcontainer/docker-compose.devcontainer.yml - Compose overlay
- .devcontainer/scripts/ - Lifecycle scripts (oncreate, postcreate, poststart)
- .devcontainer/vscode-templates/ - Optional VS Code debug configs
- .devcontainer/README.md - Documentation
2026-02-11 16:37:30 +00:00
Otto
36aeb0b2b3 docs(blocks): clarify HumanInTheLoop output descriptions for agent builder (#12069)
## Problem

The agent builder (LLM) misinterprets the HumanInTheLoop block outputs.
It thinks `approved_data` and `rejected_data` will yield status strings
like "APPROVED" or "REJECTED" instead of understanding that the actual
input data passes through.

This leads to unnecessary complexity - the agent builder adds comparison
blocks to check for status strings that don't exist.

## Solution

Enriched the block docstring and all input/output field descriptions to
make it explicit that:
1. The output is the actual data itself, not a status string
2. The routing is determined by which output pin fires
3. How to use the block correctly (connect downstream blocks to
appropriate output pins)
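
To make the routing concrete, here is a minimal sketch of the behavior these descriptions document (names and shapes are illustrative, not the block's actual schema):

```python
# Illustrative sketch only - not the real HumanInTheLoopBlock code.
# Each output pin re-emits the ORIGINAL input data; routing is decided
# by WHICH pin fires, never by comparing the output to a status string.

def human_in_the_loop(data: dict, approved: bool):
    """Yield (pin_name, payload) pairs the way the block's outputs fire."""
    if approved:
        yield "approved_data", data  # the input data itself, not "APPROVED"
    else:
        yield "rejected_data", data  # the input data itself, not "REJECTED"

# Downstream blocks connect to one pin or the other, so no comparison
# block checking for a status string is ever needed.
for pin, payload in human_in_the_loop({"order_id": 42}, approved=True):
    print(pin, payload)  # -> approved_data {'order_id': 42}
```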

## Changes

- Updated block docstring with clear "How it works" and "Example usage"
sections
- Enhanced `data` input description to explain data flow
- Enhanced `name` input description for reviewer context
- Enhanced `approved_data` output to explicitly state it's NOT a status
string
- Enhanced `rejected_data` output to explicitly state it's NOT a status
string
- Enhanced `review_message` output for clarity

## Testing

Documentation-only change to schema descriptions. No functional changes.

Fixes SECRT-1930

## Greptile Overview

### Greptile Summary

Enhanced documentation for the `HumanInTheLoopBlock` to clarify how
output pins work. The key improvement explicitly states that output pins
(`approved_data` and `rejected_data`) yield the actual input data, not
status strings like "APPROVED" or "REJECTED". This prevents the agent
builder (LLM) from misinterpreting the block's behavior and adding
unnecessary comparison blocks.

**Key changes:**
- Added "How it works" and "Example usage" sections to the block
docstring
- Clarified that routing is determined by which output pin fires, not by
comparing output values
- Enhanced all input/output field descriptions with explicit data flow
explanations
- Emphasized that downstream blocks should be connected to the
appropriate output pin based on desired workflow path

This is a documentation-only change with no functional modifications to
the code logic.
### Confidence Score: 5/5

- This PR is safe to merge with no risk
- Documentation-only change that accurately reflects the existing code
behavior. No functional changes, no runtime impact, and the enhanced
descriptions correctly explain how the block outputs work based on
verification of the implementation code.
- No files require special attention

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2026-02-11 15:43:58 +00:00
Ubbe
2a189c44c4 fix(frontend): API stream issues leaking into prompt (#12063)
## Changes 🏗️

![Screenshot 2026-02-11 at 19 32 39](https://github.com/user-attachments/assets/e97be1a7-972e-4ae0-8dfa-6ade63cf287b)

When the backend API returns an error, prevent it from leaking into the chat
stream and instead handle it gracefully via a toast.

## Checklist 📋

### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Run the app locally and trust the changes

## Greptile Overview

### Greptile Summary

This PR fixes an issue where backend API stream errors were leaking into
the chat prompt instead of being handled gracefully. The fix involves
both backend and frontend changes to ensure error events conform to the
AI SDK's strict schema.

**Key Changes:**
- **Backend (`response_model.py`)**: Added custom `to_sse()` method for
`StreamError` that only emits `type` and `errorText` fields, stripping
extra fields like `code` and `details` that cause AI SDK validation
failures
- **Backend (`prompt.py`)**: Added validation step after context
compression to remove orphaned tool responses without matching tool
calls, preventing "unexpected tool_use_id" API errors
- **Frontend (`route.ts`)**: Implemented SSE stream normalization with
`normalizeSSEStream()` and `normalizeSSEEvent()` functions to strip
non-conforming fields from error events before they reach the AI SDK
- **Frontend (`ChatMessagesContainer.tsx`)**: Added toast notifications
for errors and improved error display UI with deduplication logic

The changes ensure a clean separation between internal error metadata
(useful for logging/debugging) and the strict schema required by the AI
SDK on the frontend.
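
As a rough illustration of the backend half of this filtering (field and class names are assumptions, not the actual `response_model.py` code):

```python
import json
from dataclasses import dataclass
from typing import Optional

@dataclass
class StreamError:
    error_text: str
    code: Optional[str] = None      # internal-only metadata
    details: Optional[dict] = None  # internal-only metadata

    def to_sse(self) -> str:
        # Emit ONLY the fields the AI SDK's error schema accepts;
        # internal metadata stays server-side for logging.
        payload = {"type": "error", "errorText": self.error_text}
        return f"data: {json.dumps(payload)}\n\n"

print(StreamError("upstream timeout", code="504").to_sse())
# data: {"type": "error", "errorText": "upstream timeout"}
```

The frontend normalization applies the same idea one layer later, stripping any non-conforming fields that still reach the proxy.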
### Confidence Score: 4/5

- This PR is safe to merge with low risk
- The changes are well-structured and address a specific bug with proper
error handling. The dual-layer approach (backend filtering in `to_sse()`
+ frontend normalization) provides defense-in-depth. However, the lack
of automated tests for the new error normalization logic and the
potential for edge cases in SSE parsing prevent a perfect score.
- Pay close attention to
`autogpt_platform/frontend/src/app/api/chat/sessions/[sessionId]/stream/route.ts`
- the SSE normalization logic should be tested with various error
scenarios
### Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant Frontend as ChatMessagesContainer
    participant Proxy as /api/chat/.../stream
    participant Backend as Backend API
    participant AISDK as AI SDK

    User->>Frontend: Send message
    Frontend->>Proxy: POST with message
    Proxy->>Backend: Forward request with auth
    Backend->>Backend: Process message
    
    alt Success Path
        Backend->>Proxy: SSE stream (text-delta, etc.)
        Proxy->>Proxy: normalizeSSEStream (pass through)
        Proxy->>AISDK: Forward SSE events
        AISDK->>Frontend: Update messages
        Frontend->>User: Display response
    else Error Path
        Backend->>Backend: StreamError.to_sse()
        Note over Backend: Only emit {type, errorText}
        Backend->>Proxy: SSE error event
        Proxy->>Proxy: normalizeSSEEvent()
        Note over Proxy: Strip extra fields (code, details)
        Proxy->>AISDK: {type: "error", errorText: "..."}
        AISDK->>Frontend: error state updated
        Frontend->>Frontend: Toast notification (deduplicated)
        Frontend->>User: Show error UI + toast
    end
```

---------

Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
Co-authored-by: Otto-AGPT <otto@agpt.co>
2026-02-11 22:46:37 +08:00
Abhimanyu Yadav
508759610f fix(frontend): add min-width-0 to ContentCard to prevent overflow (#12060)
### Changes 🏗️

Added `min-w-0` class to the ContentCard component in the ToolAccordion
to prevent content overflow issues. This CSS fix ensures that the card
properly respects its container width constraints and allows text
truncation to work correctly when content is too wide.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Verified that tool content displays correctly in the accordion
  - [x] Confirmed that long content properly truncates instead of overflowing
  - [x] Tested with various screen sizes to ensure responsive behavior

#### For configuration changes:

- [x] `.env.default` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my
changes

## Greptile Overview

### Greptile Summary

Added `min-w-0` class to `ContentCard` component to fix text truncation
overflow in grid layouts. This is a standard CSS fix that allows grid
items to shrink below their content size, enabling `truncate` classes on
child elements (`ContentCardTitle`, `ContentCardSubtitle`) to work
correctly. The fix follows the same pattern already used in
`ContentCardHeader` (line 54) and `ToolAccordion` (line 54).
### Confidence Score: 5/5

- Safe to merge with no risk
- Single-line CSS fix that addresses a well-known flexbox/grid layout
issue. The change follows existing patterns in the codebase and is
thoroughly tested. No logic changes, no breaking changes, no side
effects.
- No files require special attention
2026-02-11 21:09:21 +08:00
Otto
062fe1aa70 fix(security): enforce disabled flag on blocks in graph validation (#12059)
## Summary
Blocks marked `disabled=True` (like BlockInstallationBlock) were not
being checked during graph validation, allowing them to be used via
direct API calls despite being hidden from the UI.

This adds a security check in `_validate_graph_get_errors()` to reject
any graph containing disabled blocks.

## Security Advisory
GHSA-4crw-9p35-9x54

## Linear
SECRT-1927

## Changes
- Added `block.disabled` check in graph validation (6 lines)

## Testing
- Graphs with disabled blocks → rejected with clear error message
- Graphs with valid blocks → unchanged behavior

## Greptile Overview

### Greptile Summary

Adds critical security validation to prevent execution of disabled
blocks (like `BlockInstallationBlock`) via direct API calls. The fix
validates that `block.disabled` is `False` during graph validation in
`_validate_graph_get_errors()` on line 747-750, ensuring disabled blocks
are rejected before graph creation or execution. This closes a
vulnerability where blocks marked disabled in the UI could still be used
through API endpoints.
### Confidence Score: 5/5

- This PR is safe to merge and addresses a critical security
vulnerability
- The fix is minimal (6 lines), correctly placed in the validation flow,
includes clear security context (GHSA reference), and follows existing
validation patterns. The check is positioned after block existence
validation and before input validation, ensuring disabled blocks are
caught early in both graph creation and execution paths.
- No files require special attention

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-11 03:28:19 +00:00
dependabot[bot]
2cd0d4fe0f chore(deps): bump actions/checkout from 4 to 6 (#12034)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4 to
6.
### Release notes

*Sourced from [actions/checkout's releases](https://github.com/actions/checkout/releases).*

#### v6.0.0

- Update README to include Node.js 24 support details and requirements by [@salmanmkc](https://github.com/salmanmkc) in [actions/checkout#2248](https://redirect.github.com/actions/checkout/pull/2248)
- Persist creds to a separate file by [@ericsciple](https://github.com/ericsciple) in [actions/checkout#2286](https://redirect.github.com/actions/checkout/pull/2286)
- v6-beta by [@ericsciple](https://github.com/ericsciple) in [actions/checkout#2298](https://redirect.github.com/actions/checkout/pull/2298)
- update readme/changelog for v6 by [@ericsciple](https://github.com/ericsciple) in [actions/checkout#2311](https://redirect.github.com/actions/checkout/pull/2311)

**Full Changelog**: https://github.com/actions/checkout/compare/v5.0.0...v6.0.0

#### v6-beta

Updated persist-credentials to store the credentials under `$RUNNER_TEMP` instead of directly in the local git config.

This requires a minimum Actions Runner version of [v2.329.0](https://github.com/actions/runner/releases/tag/v2.329.0) to access the persisted credentials for [Docker container action](https://docs.github.com/en/actions/tutorials/use-containerized-services/create-a-docker-container-action) scenarios.

#### v5.0.1

- Port v6 cleanup to v5 by [@ericsciple](https://github.com/ericsciple) in [actions/checkout#2301](https://redirect.github.com/actions/checkout/pull/2301)

**Full Changelog**: https://github.com/actions/checkout/compare/v5...v5.0.1

#### v5.0.0

- Update actions checkout to use node 24 by [@salmanmkc](https://github.com/salmanmkc) in [actions/checkout#2226](https://redirect.github.com/actions/checkout/pull/2226)
- Prepare v5.0.0 release by [@salmanmkc](https://github.com/salmanmkc) in [actions/checkout#2238](https://redirect.github.com/actions/checkout/pull/2238)

⚠️ Minimum compatible runner version: **v2.327.1** ([release notes](https://github.com/actions/runner/releases/tag/v2.327.1)). Make sure your runner is updated to this version or newer to use this release.

**Full Changelog**: https://github.com/actions/checkout/compare/v4...v5.0.0

#### v4.3.1

- Port v6 cleanup to v4 by [@ericsciple](https://github.com/ericsciple) in [actions/checkout#2305](https://redirect.github.com/actions/checkout/pull/2305)

**Full Changelog**: https://github.com/actions/checkout/compare/v4...v4.3.1

#### v4.3.0

- docs: update README.md by [@motss](https://github.com/motss) in [actions/checkout#1971](https://redirect.github.com/actions/checkout/pull/1971)
- Add internal repos for checking out multiple repositories by [@mouismail](https://github.com/mouismail) in [actions/checkout#1977](https://redirect.github.com/actions/checkout/pull/1977)
- Documentation update - add recommended permissions to Readme by [@benwells](https://github.com/benwells) in [actions/checkout#2043](https://redirect.github.com/actions/checkout/pull/2043)

... (truncated)
### Changelog

*Sourced from [actions/checkout's changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md).*

#### v6.0.2

- Fix tag handling: preserve annotations and explicit fetch-tags by [@ericsciple](https://github.com/ericsciple) in [actions/checkout#2356](https://redirect.github.com/actions/checkout/pull/2356)

#### v6.0.1

- Add worktree support for persist-credentials includeIf by [@ericsciple](https://github.com/ericsciple) in [actions/checkout#2327](https://redirect.github.com/actions/checkout/pull/2327)

#### v6.0.0

- Persist creds to a separate file by [@ericsciple](https://github.com/ericsciple) in [actions/checkout#2286](https://redirect.github.com/actions/checkout/pull/2286)
- Update README to include Node.js 24 support details and requirements by [@salmanmkc](https://github.com/salmanmkc) in [actions/checkout#2248](https://redirect.github.com/actions/checkout/pull/2248)

#### v5.0.1

- Port v6 cleanup to v5 by [@ericsciple](https://github.com/ericsciple) in [actions/checkout#2301](https://redirect.github.com/actions/checkout/pull/2301)

#### v5.0.0

- Update actions checkout to use node 24 by [@salmanmkc](https://github.com/salmanmkc) in [actions/checkout#2226](https://redirect.github.com/actions/checkout/pull/2226)

#### v4.3.1

- Port v6 cleanup to v4 by [@ericsciple](https://github.com/ericsciple) in [actions/checkout#2305](https://redirect.github.com/actions/checkout/pull/2305)

#### v4.3.0

- docs: update README.md by [@motss](https://github.com/motss) in [actions/checkout#1971](https://redirect.github.com/actions/checkout/pull/1971)
- Add internal repos for checking out multiple repositories by [@mouismail](https://github.com/mouismail) in [actions/checkout#1977](https://redirect.github.com/actions/checkout/pull/1977)
- Documentation update - add recommended permissions to Readme by [@benwells](https://github.com/benwells) in [actions/checkout#2043](https://redirect.github.com/actions/checkout/pull/2043)
- Adjust positioning of user email note and permissions heading by [@joshmgross](https://github.com/joshmgross) in [actions/checkout#2044](https://redirect.github.com/actions/checkout/pull/2044)
- Update README.md by [@nebuk89](https://github.com/nebuk89) in [actions/checkout#2194](https://redirect.github.com/actions/checkout/pull/2194)
- Update CODEOWNERS for actions by [@TingluoHuang](https://github.com/TingluoHuang) in [actions/checkout#2224](https://redirect.github.com/actions/checkout/pull/2224)
- Update package dependencies by [@salmanmkc](https://github.com/salmanmkc) in [actions/checkout#2236](https://redirect.github.com/actions/checkout/pull/2236)

#### v4.2.2

- `url-helper.ts` now leverages well-known environment variables by [@jww3](https://github.com/jww3) in [actions/checkout#1941](https://redirect.github.com/actions/checkout/pull/1941)
- Expand unit test coverage for `isGhes` by [@jww3](https://github.com/jww3) in [actions/checkout#1946](https://redirect.github.com/actions/checkout/pull/1946)

#### v4.2.1

- Check out other refs/* by commit if provided, fall back to ref by [@orhantoy](https://github.com/orhantoy) in [actions/checkout#1924](https://redirect.github.com/actions/checkout/pull/1924)

#### v4.2.0

- Add Ref and Commit outputs by [@lucacome](https://github.com/lucacome) in [actions/checkout#1180](https://redirect.github.com/actions/checkout/pull/1180)
- Dependency updates by [@dependabot](https://github.com/dependabot) - [actions/checkout#1777](https://redirect.github.com/actions/checkout/pull/1777), [actions/checkout#1872](https://redirect.github.com/actions/checkout/pull/1872)

#### v4.1.7

- Bump the minor-npm-dependencies group across 1 directory with 4 updates by [@dependabot](https://github.com/dependabot) in [actions/checkout#1739](https://redirect.github.com/actions/checkout/pull/1739)
- Bump actions/checkout from 3 to 4 by [@dependabot](https://github.com/dependabot) in [actions/checkout#1697](https://redirect.github.com/actions/checkout/pull/1697)
- Check out other refs/* by commit by [@orhantoy](https://github.com/orhantoy) in [actions/checkout#1774](https://redirect.github.com/actions/checkout/pull/1774)
- Pin actions/checkout's own workflows to a known, good, stable version. by [@jww3](https://github.com/jww3) in [actions/checkout#1776](https://redirect.github.com/actions/checkout/pull/1776)

#### v4.1.6

- Check platform to set archive extension appropriately by [@cory-miller](https://github.com/cory-miller) in [actions/checkout#1732](https://redirect.github.com/actions/checkout/pull/1732)

... (truncated)
### Commits

- `de0fac2` Fix tag handling: preserve annotations and explicit fetch-tags ([#2356](https://redirect.github.com/actions/checkout/issues/2356))
- `064fe7f` Add orchestration_id to git user-agent when ACTIONS_ORCHESTRATION_ID is set (...
- `8e8c483` Clarify v6 README ([#2328](https://redirect.github.com/actions/checkout/issues/2328))
- `033fa0d` Add worktree support for persist-credentials includeIf ([#2327](https://redirect.github.com/actions/checkout/issues/2327))
- `c2d88d3` Update all references from v5 and v4 to v6 ([#2314](https://redirect.github.com/actions/checkout/issues/2314))
- `1af3b93` update readme/changelog for v6 ([#2311](https://redirect.github.com/actions/checkout/issues/2311))
- `71cf226` v6-beta ([#2298](https://redirect.github.com/actions/checkout/issues/2298))
- `069c695` Persist creds to a separate file ([#2286](https://redirect.github.com/actions/checkout/issues/2286))
- `ff7abcd` Update README to include Node.js 24 support details and requirements ([#2248](https://redirect.github.com/actions/checkout/issues/2248))
- `08c6903` Prepare v5.0.0 release ([#2238](https://redirect.github.com/actions/checkout/issues/2238))
- Additional commits viewable in [compare view](https://github.com/actions/checkout/compare/v4...v6)


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/checkout&package-manager=github_actions&previous-version=4&new-version=6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.


---

### Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)



---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Otto <otto@agpt.co>
2026-02-11 02:25:51 +00:00
41 changed files with 1779 additions and 392 deletions

View File

@@ -0,0 +1,206 @@
# GitHub Codespaces for AutoGPT Platform
This dev container provides a complete development environment for the AutoGPT Platform, optimized for PR reviews.
## 🚀 Quick Start
1. **Open in Codespaces:**
- Go to the repository on GitHub
- Click **Code** → **Codespaces** → **Create codespace on dev**
- Or click the badge: [![Open in GitHub Codespaces](https://github.com/codespaces/badge.svg)](https://codespaces.new/Significant-Gravitas/AutoGPT?quickstart=1)
2. **Wait for setup** (~60 seconds with prebuild, ~5-10 min without)
3. **Start the servers:**
```bash
# Terminal 1
make run-backend
# Terminal 2
make run-frontend
```
4. **Start developing!**
- Frontend: `http://localhost:3000`
- Login with: `test123@gmail.com` / `testpassword123`
## 🏗️ Architecture
**Dependencies run in Docker** (cached by prebuild):
- PostgreSQL, Redis, RabbitMQ, Supabase Auth
**Backend & Frontend run natively** (not cached):
- This ensures you're always running the current branch's code
- Enables hot-reload during development
- VS Code debugger can attach directly
## 📍 Available Services
| Service | URL | Notes |
|---------|-----|-------|
| Frontend | http://localhost:3000 | Next.js app |
| REST API | http://localhost:8006 | FastAPI backend |
| WebSocket | ws://localhost:8001 | Real-time updates |
| Supabase | http://localhost:8000 | Auth & API gateway |
| Supabase Studio | http://localhost:5555 | Database admin |
| RabbitMQ | http://localhost:15672 | Queue management |
## 🔑 Test Accounts
| Email | Password | Role |
|-------|----------|------|
| test123@gmail.com | testpassword123 | Featured Creator |
The test account has:
- Pre-created agents and workflows
- Published store listings
- Active agent executions
- Reviews and ratings
## 🛠️ Development Commands
```bash
# Navigate to platform directory (terminal starts here by default)
cd autogpt_platform
# Start all services
docker compose up -d
# Or just core services (DB, Redis, RabbitMQ)
make start-core
# Run backend in dev mode (hot reload)
make run-backend
# Run frontend in dev mode (hot reload)
make run-frontend
# Run both backend and frontend
# (Use VS Code's "Full Stack" launch config for debugging)
# Format code
make format
# Run tests
make test-data # Regenerate test data
poetry run test # Backend tests (from backend/)
pnpm test:e2e # E2E tests (from frontend/)
```
## 🐛 Debugging
### VS Code Launch Configs
> **Note:** Launch and task configs are in `.devcontainer/vscode-templates/`.
> To use them locally, copy to `.vscode/`:
> ```bash
> cp .devcontainer/vscode-templates/*.json .vscode/
> ```
> In Codespaces, core settings are auto-applied via devcontainer.json.
Press `F5` or use the Run and Debug panel:
- **Backend: Debug FastAPI** - Debug the REST API server
- **Backend: Debug Executor** - Debug the agent executor
- **Frontend: Debug Next.js** - Debug with browser DevTools
- **Full Stack: Backend + Frontend** - Debug both simultaneously
- **Tests: Run E2E Tests** - Run Playwright tests
### VS Code Tasks
Press `Ctrl+Shift+P` → "Tasks: Run Task":
- Start/Stop All Services
- Run Migrations
- Seed Test Data
- View Docker Logs
- Reset Database
## 📁 Project Structure
```text
autogpt_platform/          # This folder
├── .devcontainer/         # Codespaces/devcontainer config
├── .vscode/               # VS Code settings
├── backend/               # Python FastAPI backend
│   ├── backend/           # Application code
│   ├── test/              # Test files + data seeders
│   └── migrations/        # Prisma migrations
├── frontend/              # Next.js frontend
│   ├── src/               # Application code
│   └── e2e/               # Playwright E2E tests
├── db/                    # Supabase configuration
├── docker-compose.yml     # Service orchestration
└── Makefile               # Common commands
```
## 🔧 Troubleshooting
### Services not starting?
```bash
# Check service status
docker compose ps
# View logs
docker compose logs -f
# Restart everything
docker compose down && docker compose up -d
```
### Database issues?
```bash
# Reset database (destroys all data)
make reset-db
# Re-run migrations
make migrate
# Re-seed test data
make test-data
```
### Port already in use?
```bash
# Check what's using the port
lsof -i :3000
# Kill process (if safe)
kill -9 <PID>
```
### Can't login?
- Ensure all services are running: `docker compose ps`
- Check auth service: `docker compose logs auth`
- Try seeding data again: `make test-data`
## 📝 Making Changes
### Backend Changes
1. Edit files in `backend/backend/`
2. If using `make run-backend`, changes auto-reload
3. Run `poetry run format` before committing
### Frontend Changes
1. Edit files in `frontend/src/`
2. If using `make run-frontend`, changes auto-reload
3. Run `pnpm format` before committing
### Database Schema Changes
1. Edit `backend/schema.prisma`
2. Run `poetry run prisma migrate dev --name your_migration`
3. Run `poetry run prisma generate`
## 🔒 Environment Variables
Default environment variables are configured for local development. For production secrets, use GitHub Codespaces Secrets:
1. Go to GitHub Settings → Codespaces → Secrets
2. Add secrets with names matching `.env` variables
3. They'll be automatically available in your codespace
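For example, backend code can then read such a secret like any other environment variable (a minimal sketch; the variable name is only an example, not one the platform requires):
```python
import os

# "OPENAI_API_KEY" is an illustrative name - use whatever secret names
# you configured; Codespaces injects them as plain environment variables.
api_key = os.environ.get("OPENAI_API_KEY")
if api_key is None:
    raise RuntimeError(
        "OPENAI_API_KEY is not set - add it under "
        "GitHub Settings → Codespaces → Secrets"
    )
```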
## 📚 More Resources
- [AutoGPT Platform Docs](https://docs.agpt.co)
- [Codespaces Documentation](https://docs.github.com/en/codespaces)
- [Dev Containers Spec](https://containers.dev)

View File

@@ -0,0 +1,152 @@
{
  "name": "AutoGPT Platform",
  "dockerComposeFile": "docker-compose.devcontainer.yml",
  "service": "devcontainer",
  "workspaceFolder": "/workspaces/AutoGPT/autogpt_platform",
  "shutdownAction": "stopCompose",
  // Features - Docker-in-Docker for full compose support
  "features": {
    "ghcr.io/devcontainers/features/docker-in-docker:2": {
      "dockerDashComposeVersion": "v2"
    },
    "ghcr.io/devcontainers/features/github-cli:1": {},
    "ghcr.io/devcontainers/features/node:1": {
      "version": "22",
      "nodeGypDependencies": true
    },
    "ghcr.io/devcontainers/features/python:1": {
      "version": "3.13",
      "installTools": true,
      "toolsToInstall": "flake8,autopep8,black,mypy,pytest,poetry"
    }
  },
  // Lifecycle scripts - paths relative to repo root
  "onCreateCommand": "bash .devcontainer/platform/scripts/oncreate.sh",
  "postCreateCommand": "bash .devcontainer/platform/scripts/postcreate.sh",
  "postStartCommand": "bash .devcontainer/platform/scripts/poststart.sh",
  // Port forwarding
  "forwardPorts": [
    3000,  // Frontend
    8006,  // REST API
    8001,  // WebSocket
    8000,  // Supabase Kong
    5432,  // PostgreSQL
    6379,  // Redis
    15672, // RabbitMQ Management
    5555   // Supabase Studio
  ],
  "portsAttributes": {
    "3000": { "label": "Frontend", "onAutoForward": "openBrowser" },
    "8006": { "label": "REST API", "onAutoForward": "notify" },
    "8001": { "label": "WebSocket", "onAutoForward": "silent" },
    "8000": { "label": "Supabase", "onAutoForward": "silent" },
    "5432": { "label": "PostgreSQL", "onAutoForward": "silent" },
    "6379": { "label": "Redis", "onAutoForward": "silent" },
    "15672": { "label": "RabbitMQ", "onAutoForward": "silent" },
    "5555": { "label": "Supabase Studio", "onAutoForward": "silent" }
  },
  // VS Code customizations
  "customizations": {
    "vscode": {
      "settings": {
        // Python
        "python.defaultInterpreterPath": "${workspaceFolder}/backend/.venv/bin/python",
        "python.analysis.typeCheckingMode": "basic",
        "python.testing.pytestEnabled": true,
        "python.testing.pytestArgs": ["backend"],
        // Formatting
        "[python]": {
          "editor.defaultFormatter": "ms-python.black-formatter",
          "editor.formatOnSave": true
        },
        "[typescript]": {
          "editor.defaultFormatter": "esbenp.prettier-vscode",
          "editor.formatOnSave": true
        },
        "[typescriptreact]": {
          "editor.defaultFormatter": "esbenp.prettier-vscode",
          "editor.formatOnSave": true
        },
        "[javascript]": {
          "editor.defaultFormatter": "esbenp.prettier-vscode",
          "editor.formatOnSave": true
        },
        // Editor
        "editor.rulers": [88, 120],
        "editor.tabSize": 2,
        "files.trimTrailingWhitespace": true,
        // Terminal
        "terminal.integrated.defaultProfile.linux": "bash",
        "terminal.integrated.cwd": "${workspaceFolder}",
        // Git
        "git.autofetch": true,
        "git.enableSmartCommit": true,
        "git.openRepositoryInParentFolders": "always",
        // Prisma
        "prisma.showPrismaDataPlatformNotification": false
      },
      "extensions": [
        // Python
        "ms-python.python",
        "ms-python.vscode-pylance",
        "ms-python.black-formatter",
        "ms-python.isort",
        // JavaScript/TypeScript
        "dbaeumer.vscode-eslint",
        "esbenp.prettier-vscode",
        "bradlc.vscode-tailwindcss",
        // Database
        "Prisma.prisma",
        // Testing
        "ms-playwright.playwright",
        // GitHub
        "GitHub.vscode-pull-request-github",
        "GitHub.copilot",
        "github.vscode-github-actions",
        // Utilities
        "eamodio.gitlens",
        "usernamehw.errorlens",
        "christian-kohler.path-intellisense",
        "mikestead.dotenv"
      ]
    },
    "codespaces": {
      "openFiles": [
        "README.md"
      ]
    }
  },
  // Environment variables
  "containerEnv": {
    "CODESPACES": "true",
    "POETRY_VIRTUALENVS_IN_PROJECT": "true"
  },
  // Run as non-root for security
  "remoteUser": "vscode",
  // Host requirements for performance
  "hostRequirements": {
    "cpus": 4,
    "memory": "8gb",
    "storage": "32gb"
  }
}

View File

@@ -0,0 +1,21 @@
# Standalone devcontainer service
# The platform services (db, redis, etc.) are started from within
# the container using docker compose commands in the lifecycle scripts.
services:
  devcontainer:
    image: mcr.microsoft.com/devcontainers/base:ubuntu-24.04
    volumes:
      # Mount the entire AutoGPT repo
      - ../..:/workspaces/AutoGPT:cached
    # Keep container running
    command: sleep infinity
    # Environment for development
    environment:
      - CODESPACES=true
      - POETRY_VIRTUALENVS_IN_PROJECT=true
      - POETRY_NO_INTERACTION=1
      - NEXT_TELEMETRY_DISABLED=1
      - DO_NOT_TRACK=1

View File

@@ -0,0 +1,142 @@
#!/bin/bash
# =============================================================================
# ONCREATE SCRIPT - Runs during prebuild
# =============================================================================
# This script runs during the prebuild phase (GitHub Actions).
# It caches everything that's safe to cache:
# ✅ Dependency Docker images (postgres, redis, rabbitmq, etc.)
# ✅ Python packages (poetry install)
# ✅ Node packages (pnpm install)
#
# It does NOT build backend/frontend Docker images because those would
# contain stale code from the prebuild branch, not the PR being reviewed.
# =============================================================================
set -e # Exit on error
set -x # Print commands for debugging
echo "🚀 Starting prebuild setup..."
# =============================================================================
# Setup PATH for tools installed by devcontainer features
# =============================================================================
# Python feature installs pipx at /usr/local/py-utils/bin
# Node feature installs nvm, node, pnpm at various locations
export PATH="/usr/local/py-utils/bin:$PATH"
# Source nvm if available (Node feature uses nvm)
export NVM_DIR="${NVM_DIR:-/usr/local/share/nvm}"
if [ -s "$NVM_DIR/nvm.sh" ]; then
    . "$NVM_DIR/nvm.sh"
fi
# =============================================================================
# Verify and Install Poetry
# =============================================================================
echo "📦 Setting up Poetry..."
if command -v poetry &> /dev/null; then
    echo " Poetry already installed: $(poetry --version)"
else
    echo " Installing Poetry via pipx..."
    if command -v pipx &> /dev/null; then
        pipx install poetry
    else
        echo " pipx not found, installing poetry via pip..."
        pip install --user poetry
        export PATH="$HOME/.local/bin:$PATH"
    fi
fi
poetry --version || { echo "❌ Poetry installation failed"; exit 1; }
# =============================================================================
# Verify and Install pnpm
# =============================================================================
echo "📦 Setting up pnpm..."
if command -v pnpm &> /dev/null; then
    echo " pnpm already installed: $(pnpm --version)"
else
    echo " Installing pnpm via npm..."
    npm install -g pnpm
fi
pnpm --version || { echo "❌ pnpm installation failed"; exit 1; }
# =============================================================================
# Navigate to workspace
# =============================================================================
cd /workspaces/AutoGPT/autogpt_platform
# =============================================================================
# Install Backend Dependencies
# =============================================================================
echo "📦 Installing backend dependencies..."
cd backend
poetry install --no-interaction --no-ansi
# Generate Prisma client (schema only, no DB needed)
echo "🔧 Generating Prisma client..."
poetry run prisma generate || true
poetry run gen-prisma-stub || true
cd ..
# =============================================================================
# Install Frontend Dependencies
# =============================================================================
echo "📦 Installing frontend dependencies..."
cd frontend
pnpm install --frozen-lockfile
cd ..
# =============================================================================
# Pull Dependency Docker Images
# =============================================================================
# Use docker compose pull to get exact versions from compose files
# (single source of truth, no version drift)
# =============================================================================
echo "🐳 Pulling dependency Docker images..."
# Start Docker daemon if using docker-in-docker
if [ -e /var/run/docker-host.sock ]; then
    sudo ln -sf /var/run/docker-host.sock /var/run/docker.sock || true
fi
# Check if Docker is available
if command -v docker &> /dev/null && docker info &> /dev/null; then
    # Pull images defined in docker-compose.yml (single source of truth)
    docker compose pull db redis rabbitmq kong auth || echo "⚠️ Some images may not have pulled"
    echo "✅ Dependency images pulled"
else
    echo "⚠️ Docker not available during prebuild, images will be pulled on first start"
fi
# =============================================================================
# Copy environment files
# =============================================================================
echo "📄 Setting up environment files..."
cd /workspaces/AutoGPT/autogpt_platform
[ ! -f backend/.env ] && cp backend/.env.default backend/.env
[ ! -f frontend/.env ] && cp frontend/.env.default frontend/.env
[ ! -f .env ] && cp .env.default .env
# =============================================================================
# Done!
# =============================================================================
echo ""
echo "=============================================="
echo "✅ PREBUILD COMPLETE"
echo "=============================================="
echo ""
echo "Cached:"
echo " ✅ Poetry $(poetry --version 2>/dev/null || echo '(check path)')"
echo " ✅ pnpm $(pnpm --version 2>/dev/null || echo '(check path)')"
echo " ✅ Python packages"
echo " ✅ Node packages"
echo ""

View File

@@ -0,0 +1,131 @@
#!/bin/bash
# =============================================================================
# POSTCREATE SCRIPT - Runs after container creation
# =============================================================================
# This script runs once when a codespace is first created. It starts the
# dependency services and prepares the environment for development.
#
# NOTE: Backend and Frontend run NATIVELY (not in Docker) to ensure you're
# always running the current branch's code, not stale prebuild code.
# =============================================================================
set -e # Exit on error
echo "🚀 Setting up your development environment..."
# Ensure PATH includes pipx binaries (where poetry is installed)
export PATH="/usr/local/py-utils/bin:$PATH"
cd /workspaces/AutoGPT/autogpt_platform
# =============================================================================
# Ensure Docker is available
# =============================================================================
if [ -e /var/run/docker-host.sock ]; then
    sudo ln -sf /var/run/docker-host.sock /var/run/docker.sock 2>/dev/null || true
fi
# Wait for Docker to be ready
echo "⏳ Waiting for Docker..."
timeout 60 bash -c 'until docker info &>/dev/null; do sleep 1; done'
echo "✅ Docker is ready"
# =============================================================================
# Start Dependency Services ONLY
# =============================================================================
# We only start infrastructure deps in Docker.
# Backend/Frontend run natively to use the current branch's code.
# =============================================================================
echo "🐳 Starting dependency services..."
# Start core dependencies (DB, Auth, Redis, RabbitMQ)
docker compose up -d db redis rabbitmq kong auth
# Wait for PostgreSQL to be healthy
echo "⏳ Waiting for PostgreSQL..."
timeout 120 bash -c '
    until docker compose exec -T db pg_isready -U postgres &>/dev/null; do
        sleep 2
        echo " Waiting for database..."
    done
'
echo "✅ PostgreSQL is ready"
# Wait for Redis
echo "⏳ Waiting for Redis..."
timeout 60 bash -c 'until docker compose exec -T redis redis-cli ping &>/dev/null; do sleep 1; done'
echo "✅ Redis is ready"
# Wait for RabbitMQ
echo "⏳ Waiting for RabbitMQ..."
timeout 90 bash -c 'until docker compose exec -T rabbitmq rabbitmq-diagnostics -q ping &>/dev/null; do sleep 2; done'
echo "✅ RabbitMQ is ready"
# =============================================================================
# Run Database Migrations
# =============================================================================
echo "🔄 Running database migrations..."
cd backend
# Run migrations
poetry run prisma migrate deploy
poetry run prisma generate
poetry run gen-prisma-stub || true
cd ..
# =============================================================================
# Seed Test Data (Minimal)
# =============================================================================
echo "🌱 Checking test data..."
cd backend
# Check if test data already exists (idempotent)
if poetry run python -c "
import asyncio
from backend.data.db import prisma

async def check():
    await prisma.connect()
    count = await prisma.user.count()
    await prisma.disconnect()
    return count > 0

print('exists' if asyncio.run(check()) else 'empty')
" 2>/dev/null | grep -q "exists"; then
    echo " Test data already exists, skipping seed"
else
    echo " Running E2E test data creator..."
    poetry run python test/e2e_test_data.py || echo "⚠️ Test data seeding had issues (may be partial)"
fi
cd ..
# =============================================================================
# Print Welcome Message
# =============================================================================
echo ""
echo "=============================================="
echo "🎉 CODESPACE READY!"
echo "=============================================="
echo ""
echo "📍 Services Running (Docker):"
echo " PostgreSQL: localhost:5432"
echo " Redis: localhost:6379"
echo " RabbitMQ: localhost:5672 (mgmt: 15672)"
echo " Supabase: localhost:8000"
echo ""
echo "🚀 Start Development:"
echo " make run-backend # Start backend (localhost:8006)"
echo " make run-frontend # Start frontend (localhost:3000)"
echo ""
echo " Or run both in separate terminals!"
echo ""
echo "🔑 Test Account:"
echo " Email: test123@gmail.com"
echo " Password: testpassword123"
echo ""
echo "📚 Full docs: .devcontainer/platform/README.md"
echo ""

View File

@@ -0,0 +1,44 @@
#!/bin/bash
# =============================================================================
# POSTSTART SCRIPT - Runs every time the codespace starts
# =============================================================================
# This script runs when:
# 1. Codespace is first created (after postcreate)
# 2. Codespace resumes from stopped state
# 3. Codespace rebuilds
#
# It ensures dependency services are running. Backend/Frontend are run
# manually by the developer for hot-reload during development.
# =============================================================================
echo "🔄 Starting dependency services..."
cd /workspaces/AutoGPT/autogpt_platform || { echo "❌ Failed to cd to workspace"; exit 1; }
# Ensure Docker socket is available
if [ -e /var/run/docker-host.sock ]; then
    sudo ln -sf /var/run/docker-host.sock /var/run/docker.sock 2>/dev/null || true
fi
# Wait for Docker
timeout 30 bash -c 'until docker info &>/dev/null; do sleep 1; done' || {
    echo "⚠️ Docker not available, services may need manual start"
    exit 0
}
# Start only dependency services (not backend/frontend)
docker compose up -d db redis rabbitmq kong auth
# Quick health check
echo "⏳ Waiting for services..."
sleep 5
if docker compose ps | grep -q "running"; then
    echo "✅ Dependency services are running"
    echo ""
    echo "🚀 Start development with:"
    echo " make run-backend # Terminal 1"
    echo " make run-frontend # Terminal 2"
else
    echo "⚠️ Some services may not be running. Try: docker compose up -d"
fi

View File

@@ -0,0 +1,110 @@
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Backend: Debug FastAPI",
      "type": "debugpy",
      "request": "launch",
      "module": "uvicorn",
      "args": [
        "backend.rest:app",
        "--reload",
        "--host", "0.0.0.0",
        "--port", "8006"
      ],
      "cwd": "${workspaceFolder}/backend",
      "env": {
        "PYTHONPATH": "${workspaceFolder}/backend"
      },
      "console": "integratedTerminal",
      "justMyCode": false
    },
    {
      "name": "Backend: Debug Executor",
      "type": "debugpy",
      "request": "launch",
      "module": "backend.exec",
      "cwd": "${workspaceFolder}/backend",
      "env": {
        "PYTHONPATH": "${workspaceFolder}/backend"
      },
      "console": "integratedTerminal",
      "justMyCode": false
    },
    {
      "name": "Backend: Debug WebSocket",
      "type": "debugpy",
      "request": "launch",
      "module": "backend.ws",
      "cwd": "${workspaceFolder}/backend",
      "env": {
        "PYTHONPATH": "${workspaceFolder}/backend"
      },
      "console": "integratedTerminal",
      "justMyCode": false
    },
    {
      "name": "Frontend: Debug Next.js",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "pnpm",
      "runtimeArgs": ["dev"],
      "cwd": "${workspaceFolder}/frontend",
      "console": "integratedTerminal",
      "serverReadyAction": {
        "pattern": "- Local:.+(https?://\\S+)",
        "uriFormat": "%s",
        "action": "openExternally"
      }
    },
    {
      "name": "Frontend: Debug Next.js (Server-side)",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "pnpm",
      "runtimeArgs": ["dev"],
      "cwd": "${workspaceFolder}/frontend",
      "env": {
        "NODE_OPTIONS": "--inspect"
      },
      "console": "integratedTerminal"
    },
    {
      "name": "Tests: Run Backend Tests",
      "type": "debugpy",
      "request": "launch",
      "module": "pytest",
      "args": ["-v", "--tb=short"],
      "cwd": "${workspaceFolder}/backend",
      "console": "integratedTerminal",
      "justMyCode": false
    },
    {
      "name": "Tests: Run E2E Tests (Playwright)",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "pnpm",
      "runtimeArgs": ["test:e2e"],
      "cwd": "${workspaceFolder}/frontend",
      "console": "integratedTerminal"
    },
    {
      "name": "Scripts: Seed Test Data",
      "type": "debugpy",
      "request": "launch",
      "program": "${workspaceFolder}/backend/test/e2e_test_data.py",
      "cwd": "${workspaceFolder}/backend",
      "env": {
        "PYTHONPATH": "${workspaceFolder}/backend"
      },
      "console": "integratedTerminal"
    }
  ],
  "compounds": [
    {
      "name": "Full Stack: Backend + Frontend",
      "configurations": ["Backend: Debug FastAPI", "Frontend: Debug Next.js"],
      "stopAll": true
    }
  ]
}

View File

@@ -0,0 +1,147 @@
{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "Start All Services",
      "type": "shell",
      "command": "docker compose up -d",
      "options": {
        "cwd": "${workspaceFolder}"
      },
      "problemMatcher": [],
      "group": "build"
    },
    {
      "label": "Stop All Services",
      "type": "shell",
      "command": "docker compose stop",
      "options": {
        "cwd": "${workspaceFolder}"
      },
      "problemMatcher": []
    },
    {
      "label": "Start Core Services",
      "type": "shell",
      "command": "make start-core",
      "options": {
        "cwd": "${workspaceFolder}"
      },
      "problemMatcher": [],
      "group": "build"
    },
    {
      "label": "Run Backend (Dev Mode)",
      "type": "shell",
      "command": "poetry run app",
      "options": {
        "cwd": "${workspaceFolder}/backend"
      },
      "problemMatcher": [],
      "isBackground": true,
      "group": {
        "kind": "build",
        "isDefault": false
      }
    },
    {
      "label": "Run Frontend (Dev Mode)",
      "type": "shell",
      "command": "pnpm dev",
      "options": {
        "cwd": "${workspaceFolder}/frontend"
      },
      "problemMatcher": [],
      "isBackground": true,
      "group": {
        "kind": "build",
        "isDefault": true
      }
    },
    {
      "label": "Run Migrations",
      "type": "shell",
      "command": "make migrate",
      "options": {
        "cwd": "${workspaceFolder}"
      },
      "problemMatcher": []
    },
    {
      "label": "Seed Test Data",
      "type": "shell",
      "command": "make test-data",
      "options": {
        "cwd": "${workspaceFolder}"
      },
      "problemMatcher": []
    },
    {
      "label": "Format Code",
      "type": "shell",
      "command": "make format",
      "options": {
        "cwd": "${workspaceFolder}"
      },
      "problemMatcher": []
    },
    {
      "label": "Backend: Run Tests",
      "type": "shell",
      "command": "poetry run test",
      "options": {
        "cwd": "${workspaceFolder}/backend"
      },
      "problemMatcher": [],
      "group": "test"
    },
    {
      "label": "Frontend: Run Tests",
      "type": "shell",
      "command": "pnpm test",
      "options": {
        "cwd": "${workspaceFolder}/frontend"
      },
      "problemMatcher": [],
      "group": "test"
    },
    {
      "label": "Frontend: Run E2E Tests",
      "type": "shell",
      "command": "pnpm test:e2e",
      "options": {
        "cwd": "${workspaceFolder}/frontend"
      },
      "problemMatcher": [],
      "group": "test"
    },
    {
      "label": "Generate API Client",
      "type": "shell",
      "command": "pnpm generate:api",
      "options": {
        "cwd": "${workspaceFolder}/frontend"
      },
      "problemMatcher": []
    },
    {
      "label": "View Docker Logs",
      "type": "shell",
      "command": "docker compose logs -f",
      "options": {
        "cwd": "${workspaceFolder}"
      },
      "problemMatcher": [],
      "isBackground": true
    },
    {
      "label": "Reset Database",
      "type": "shell",
      "command": "make reset-db",
      "options": {
        "cwd": "${workspaceFolder}"
      },
      "problemMatcher": []
    }
  ]
}

View File

@@ -22,7 +22,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
ref: ${{ github.event.workflow_run.head_branch }}
fetch-depth: 0

View File

@@ -30,7 +30,7 @@ jobs:
actions: read # Required for CI access
steps:
- name: Checkout code
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
fetch-depth: 1

View File

@@ -40,7 +40,7 @@ jobs:
actions: read # Required for CI access
steps:
- name: Checkout code
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
fetch-depth: 1

View File

@@ -58,7 +58,7 @@ jobs:
# your codebase is analyzed, see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@v6
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL

View File

@@ -27,7 +27,7 @@ jobs:
# If you do not check out your code, Copilot will do this for you.
steps:
- name: Checkout code
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
fetch-depth: 0
submodules: true

View File

@@ -23,7 +23,7 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
fetch-depth: 1

View File

@@ -23,7 +23,7 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
fetch-depth: 0

View File

@@ -28,7 +28,7 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
fetch-depth: 1

View File

@@ -25,7 +25,7 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
ref: ${{ github.event.inputs.git_ref || github.ref_name }}

View File

@@ -17,7 +17,7 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
ref: ${{ github.ref_name || 'master' }}

View File

@@ -68,7 +68,7 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
fetch-depth: 0
submodules: true

View File

@@ -31,7 +31,7 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@v6
- name: Check for component changes
uses: dorny/paths-filter@v3
@@ -71,7 +71,7 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@v6
- name: Set up Node.js
uses: actions/setup-node@v6
@@ -107,7 +107,7 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
fetch-depth: 0
@@ -148,7 +148,7 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
submodules: recursive
@@ -277,7 +277,7 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@v6
with:
submodules: recursive

View File

@@ -29,7 +29,7 @@ jobs:
steps:
- name: Checkout repository
- uses: actions/checkout@v4
+ uses: actions/checkout@v6
- name: Set up Node.js
uses: actions/setup-node@v6
@@ -63,7 +63,7 @@ jobs:
steps:
- name: Checkout repository
- uses: actions/checkout@v4
+ uses: actions/checkout@v6
with:
submodules: recursive

View File

@@ -11,7 +11,7 @@ jobs:
steps:
# - name: Wait some time for all actions to start
# run: sleep 30
- - uses: actions/checkout@v4
+ - uses: actions/checkout@v6
# with:
# fetch-depth: 0
- name: Set up Python

View File

@@ -1062,14 +1062,14 @@ urllib3 = ">=1.26.0,<3"
[[package]]
name = "launchdarkly-server-sdk"
version = "9.15.0"
version = "9.14.1"
description = "LaunchDarkly SDK for Python"
optional = false
python-versions = ">=3.10"
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "launchdarkly_server_sdk-9.15.0-py3-none-any.whl", hash = "sha256:c267e29bfa3fb5e2a06a208448ada6ed5557a2924979b8d79c970b45d227c668"},
{file = "launchdarkly_server_sdk-9.15.0.tar.gz", hash = "sha256:f31441b74bc1a69c381db57c33116509e407a2612628ad6dff0a7dbb39d5020b"},
{file = "launchdarkly_server_sdk-9.14.1-py3-none-any.whl", hash = "sha256:a9e2bd9ecdef845cd631ae0d4334a1115e5b44257c42eb2349492be4bac7815c"},
{file = "launchdarkly_server_sdk-9.14.1.tar.gz", hash = "sha256:1df44baf0a0efa74d8c1dad7a00592b98bce7d19edded7f770da8dbc49922213"},
]
[package.dependencies]
@@ -1478,14 +1478,14 @@ testing = ["coverage", "pytest", "pytest-benchmark"]
[[package]]
name = "postgrest"
version = "2.28.0"
version = "2.27.2"
description = "PostgREST client for Python. This library provides an ORM interface to PostgREST."
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "postgrest-2.28.0-py3-none-any.whl", hash = "sha256:7bca2f24dd1a1bf8a3d586c7482aba6cd41662da6733045fad585b63b7f7df75"},
{file = "postgrest-2.28.0.tar.gz", hash = "sha256:c36b38646d25ea4255321d3d924ce70f8d20ec7799cb42c1221d6a818d4f6515"},
{file = "postgrest-2.27.2-py3-none-any.whl", hash = "sha256:1666fef3de05ca097a314433dd5ae2f2d71c613cb7b233d0f468c4ffe37277da"},
{file = "postgrest-2.27.2.tar.gz", hash = "sha256:55407d530b5af3d64e883a71fec1f345d369958f723ce4a8ab0b7d169e313242"},
]
[package.dependencies]
@@ -2135,21 +2135,21 @@ files = [
[[package]]
name = "pytest"
version = "9.0.2"
version = "8.4.1"
description = "pytest: simple powerful testing with Python"
optional = false
python-versions = ">=3.10"
python-versions = ">=3.9"
groups = ["dev"]
files = [
{file = "pytest-9.0.2-py3-none-any.whl", hash = "sha256:711ffd45bf766d5264d487b917733b453d917afd2b0ad65223959f59089f875b"},
{file = "pytest-9.0.2.tar.gz", hash = "sha256:75186651a92bd89611d1d9fc20f0b4345fd827c41ccd5c299a868a05d70edf11"},
{file = "pytest-8.4.1-py3-none-any.whl", hash = "sha256:539c70ba6fcead8e78eebbf1115e8b589e7565830d7d006a8723f19ac8a0afb7"},
{file = "pytest-8.4.1.tar.gz", hash = "sha256:7c67fd69174877359ed9371ec3af8a3d2b04741818c51e5e99cc1742251fa93c"},
]
[package.dependencies]
colorama = {version = ">=0.4", markers = "sys_platform == \"win32\""}
exceptiongroup = {version = ">=1", markers = "python_version < \"3.11\""}
iniconfig = ">=1.0.1"
packaging = ">=22"
iniconfig = ">=1"
packaging = ">=20"
pluggy = ">=1.5,<2"
pygments = ">=2.7.2"
tomli = {version = ">=1", markers = "python_version < \"3.11\""}
@@ -2248,14 +2248,14 @@ cli = ["click (>=5.0)"]
[[package]]
name = "realtime"
version = "2.28.0"
version = "2.27.2"
description = ""
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "realtime-2.28.0-py3-none-any.whl", hash = "sha256:db1bd59bab9b1fcc9f9d3b1a073bed35bf4994d720e6751f10031a58d57a3836"},
{file = "realtime-2.28.0.tar.gz", hash = "sha256:d18cedcebd6a8f22fcd509bc767f639761eb218b7b2b6f14fc4205b6259b50fc"},
{file = "realtime-2.27.2-py3-none-any.whl", hash = "sha256:34a9cbb26a274e707e8fc9e3ee0a66de944beac0fe604dc336d1e985db2c830f"},
{file = "realtime-2.27.2.tar.gz", hash = "sha256:b960a90294d2cea1b3f1275ecb89204304728e08fff1c393cc1b3150739556b3"},
]
[package.dependencies]
@@ -2265,21 +2265,20 @@ websockets = ">=11,<16"
[[package]]
name = "redis"
version = "7.1.1"
version = "6.2.0"
description = "Python client for Redis database and key-value store"
optional = false
python-versions = ">=3.10"
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "redis-7.1.1-py3-none-any.whl", hash = "sha256:f77817f16071c2950492c67d40b771fa493eb3fccc630a424a10976dbb794b7a"},
{file = "redis-7.1.1.tar.gz", hash = "sha256:a2814b2bda15b39dad11391cc48edac4697214a8a5a4bd10abe936ab4892eb43"},
{file = "redis-6.2.0-py3-none-any.whl", hash = "sha256:c8ddf316ee0aab65f04a11229e94a64b2618451dab7a67cb2f77eb799d872d5e"},
{file = "redis-6.2.0.tar.gz", hash = "sha256:e821f129b75dde6cb99dd35e5c76e8c49512a5a0d8dfdc560b2fbd44b85ca977"},
]
[package.dependencies]
async-timeout = {version = ">=4.0.3", markers = "python_full_version < \"3.11.3\""}
[package.extras]
- circuit-breaker = ["pybreaker (>=1.4.0)"]
hiredis = ["hiredis (>=3.2.0)"]
jwt = ["pyjwt (>=2.9.0)"]
ocsp = ["cryptography (>=36.0.1)", "pyopenssl (>=20.0.1)", "requests (>=2.31.0)"]
@@ -2437,14 +2436,14 @@ full = ["httpx (>=0.27.0,<0.29.0)", "itsdangerous", "jinja2", "python-multipart
[[package]]
name = "storage3"
version = "2.28.0"
version = "2.27.2"
description = "Supabase Storage client for Python."
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "storage3-2.28.0-py3-none-any.whl", hash = "sha256:ecb50efd2ac71dabbdf97e99ad346eafa630c4c627a8e5a138ceb5fbbadae716"},
{file = "storage3-2.28.0.tar.gz", hash = "sha256:bc1d008aff67de7a0f2bd867baee7aadbcdb6f78f5a310b4f7a38e8c13c19865"},
{file = "storage3-2.27.2-py3-none-any.whl", hash = "sha256:e6f16e7a260729e7b1f46e9bf61746805a02e30f5e419ee1291007c432e3ec63"},
{file = "storage3-2.27.2.tar.gz", hash = "sha256:cb4807b7f86b4bb1272ac6fdd2f3cfd8ba577297046fa5f88557425200275af5"},
]
[package.dependencies]
@@ -2488,35 +2487,35 @@ python-dateutil = ">=2.6.0"
[[package]]
name = "supabase"
version = "2.28.0"
version = "2.27.2"
description = "Supabase client for Python."
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "supabase-2.28.0-py3-none-any.whl", hash = "sha256:42776971c7d0ccca16034df1ab96a31c50228eb1eb19da4249ad2f756fc20272"},
{file = "supabase-2.28.0.tar.gz", hash = "sha256:aea299aaab2a2eed3c57e0be7fc035c6807214194cce795a3575add20268ece1"},
{file = "supabase-2.27.2-py3-none-any.whl", hash = "sha256:d4dce00b3a418ee578017ec577c0e5be47a9a636355009c76f20ed2faa15bc54"},
{file = "supabase-2.27.2.tar.gz", hash = "sha256:2aed40e4f3454438822442a1e94a47be6694c2c70392e7ae99b51a226d4293f7"},
]
[package.dependencies]
httpx = ">=0.26,<0.29"
postgrest = "2.28.0"
realtime = "2.28.0"
storage3 = "2.28.0"
supabase-auth = "2.28.0"
supabase-functions = "2.28.0"
postgrest = "2.27.2"
realtime = "2.27.2"
storage3 = "2.27.2"
supabase-auth = "2.27.2"
supabase-functions = "2.27.2"
yarl = ">=1.22.0"
[[package]]
name = "supabase-auth"
version = "2.28.0"
version = "2.27.2"
description = "Python Client Library for Supabase Auth"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "supabase_auth-2.28.0-py3-none-any.whl", hash = "sha256:2ac85026cc285054c7fa6d41924f3a333e9ec298c013e5b5e1754039ba7caec9"},
{file = "supabase_auth-2.28.0.tar.gz", hash = "sha256:2bb8f18ff39934e44b28f10918db965659f3735cd6fbfcc022fe0b82dbf8233e"},
{file = "supabase_auth-2.27.2-py3-none-any.whl", hash = "sha256:78ec25b11314d0a9527a7205f3b1c72560dccdc11b38392f80297ef98664ee91"},
{file = "supabase_auth-2.27.2.tar.gz", hash = "sha256:0f5bcc79b3677cb42e9d321f3c559070cfa40d6a29a67672cc8382fb7dc2fe97"},
]
[package.dependencies]
@@ -2526,14 +2525,14 @@ pyjwt = {version = ">=2.10.1", extras = ["crypto"]}
[[package]]
name = "supabase-functions"
version = "2.28.0"
version = "2.27.2"
description = "Library for Supabase Functions"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "supabase_functions-2.28.0-py3-none-any.whl", hash = "sha256:30bf2d586f8df285faf0621bb5d5bb3ec3157234fc820553ca156f009475e4ae"},
{file = "supabase_functions-2.28.0.tar.gz", hash = "sha256:db3dddfc37aca5858819eb461130968473bd8c75bd284581013958526dac718b"},
{file = "supabase_functions-2.27.2-py3-none-any.whl", hash = "sha256:db480efc669d0bca07605b9b6f167312af43121adcc842a111f79bea416ef754"},
{file = "supabase_functions-2.27.2.tar.gz", hash = "sha256:d0c8266207a94371cb3fd35ad3c7f025b78a97cf026861e04ccd35ac1775f80b"},
]
[package.dependencies]
@@ -2912,4 +2911,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.1"
python-versions = ">=3.10,<4.0"
content-hash = "3f738dbf158a0b9319387d7251cd557e8e143d4dec809c5ab720321d2b53e368"
content-hash = "40eae94995dc0a388fa832ed4af9b6137f28d5b5ced3aaea70d5f91d4d9a179d"

View File

@@ -13,17 +13,17 @@ cryptography = "^46.0"
expiringdict = "^1.2.2"
fastapi = "^0.128.0"
google-cloud-logging = "^3.13.0"
- launchdarkly-server-sdk = "^9.15.0"
+ launchdarkly-server-sdk = "^9.14.1"
pydantic = "^2.12.5"
pydantic-settings = "^2.12.0"
pyjwt = { version = "^2.11.0", extras = ["crypto"] }
redis = "^7.1.1"
supabase = "^2.28.0"
redis = "^6.2.0"
supabase = "^2.27.2"
uvicorn = "^0.40.0"
[tool.poetry.group.dev.dependencies]
pyright = "^1.1.408"
pytest = "^9.0.2"
pytest = "^8.4.1"
pytest-asyncio = "^1.3.0"
pytest-mock = "^3.15.1"
pytest-cov = "^7.0.0"

View File

@@ -10,6 +10,8 @@ from typing import Any
from pydantic import BaseModel, Field
+ from backend.util.json import dumps as json_dumps
class ResponseType(str, Enum):
"""Types of streaming responses following AI SDK protocol."""
@@ -193,6 +195,18 @@ class StreamError(StreamBaseResponse):
default=None, description="Additional error details"
)
+ def to_sse(self) -> str:
+     """Convert to SSE format, only emitting fields required by AI SDK protocol.
+     The AI SDK uses z.strictObject({type, errorText}) which rejects
+     any extra fields like `code` or `details`.
+     """
+     data = {
+         "type": self.type.value,
+         "errorText": self.errorText,
+     }
+     return f"data: {json_dumps(data)}\n\n"
class StreamHeartbeat(StreamBaseResponse):
"""Heartbeat to keep SSE connection alive during long-running operations.

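For context, a minimal sketch of the wire format the new to_sse produces and the client-side constraint it satisfies. The validator below is an illustrative stand-in for the AI SDK's z.strictObject check, not code from this repository.

import json

def to_sse_error(error_text: str) -> str:
    # Emit only the two fields the AI SDK's strict schema allows.
    data = {"type": "error", "errorText": error_text}
    return f"data: {json.dumps(data)}\n\n"

ALLOWED_KEYS = {"type", "errorText"}

def client_would_accept(frame: str) -> bool:
    # Mimics z.strictObject({type, errorText}): unknown keys are rejected.
    payload = json.loads(frame.removeprefix("data: ").strip())
    return set(payload) == ALLOWED_KEYS

assert client_would_accept(to_sse_error("boom"))
assert not client_would_accept('data: {"type": "error", "errorText": "x", "code": 500}\n\n')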
View File

@@ -21,43 +21,71 @@ logger = logging.getLogger(__name__)
class HumanInTheLoopBlock(Block):
"""
- This block pauses execution and waits for human approval or modification of the data.
+ Pauses execution and waits for human approval or rejection of the data.
- When executed, it creates a pending review entry and sets the node execution status
- to REVIEW. The execution will remain paused until a human user either:
- - Approves the data (with or without modifications)
- - Rejects the data
+ When executed, this block creates a pending review entry and sets the node execution
+ status to REVIEW. The execution remains paused until a human user either approves
+ or rejects the data.
- This is useful for workflows that require human validation or intervention before
- proceeding to the next steps.
+ **How it works:**
+ - The input data is presented to a human reviewer
+ - The reviewer can approve or reject (and optionally modify the data if editable)
+ - On approval: the data flows out through the `approved_data` output pin
+ - On rejection: the data flows out through the `rejected_data` output pin
+ **Important:** The output pins yield the actual data itself, NOT status strings.
+ The approval/rejection decision determines WHICH output pin fires, not the value.
+ You do NOT need to compare the output to "APPROVED" or "REJECTED" - simply connect
+ downstream blocks to the appropriate output pin for each case.
+ **Example usage:**
+ - Connect `approved_data` → next step in your workflow (data was approved)
+ - Connect `rejected_data` → error handling or notification (data was rejected)
"""
class Input(BlockSchemaInput):
- data: Any = SchemaField(description="The data to be reviewed by a human user")
+ data: Any = SchemaField(
+     description="The data to be reviewed by a human user. "
+     "This exact data will be passed through to either approved_data or "
+     "rejected_data output based on the reviewer's decision."
+ )
name: str = SchemaField(
description="A descriptive name for what this data represents",
description="A descriptive name for what this data represents. "
"This helps the reviewer understand what they are reviewing.",
)
editable: bool = SchemaField(
description="Whether the human reviewer can edit the data",
description="Whether the human reviewer can edit the data before "
"approving or rejecting it",
default=True,
advanced=True,
)
class Output(BlockSchemaOutput):
approved_data: Any = SchemaField(
description="The data when approved (may be modified by reviewer)"
description="Outputs the input data when the reviewer APPROVES it. "
"The value is the actual data itself (not a status string like 'APPROVED'). "
"If the reviewer edited the data, this contains the modified version. "
"Connect downstream blocks here for the 'approved' workflow path."
)
rejected_data: Any = SchemaField(
description="The data when rejected (may be modified by reviewer)"
description="Outputs the input data when the reviewer REJECTS it. "
"The value is the actual data itself (not a status string like 'REJECTED'). "
"If the reviewer edited the data, this contains the modified version. "
"Connect downstream blocks here for the 'rejected' workflow path."
)
review_message: str = SchemaField(
description="Any message provided by the reviewer", default=""
description="Optional message provided by the reviewer explaining their "
"decision. Only outputs when the reviewer provides a message; "
"this pin does not fire if no message was given.",
default="",
)
def __init__(self):
super().__init__(
id="8b2a7b3c-6e9d-4a5f-8c1b-2e3f4a5b6c7d",
description="Pause execution and wait for human approval or modification of data",
description="Pause execution for human review. Data flows through "
"approved_data or rejected_data output based on the reviewer's decision. "
"Outputs contain the actual data, not status strings.",
categories={BlockCategory.BASIC},
input_schema=HumanInTheLoopBlock.Input,
output_schema=HumanInTheLoopBlock.Output,

View File
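To make the pin semantics described in the new docstring concrete, here is a simplified sketch (illustrative names only, not the block's actual implementation): the reviewer's decision selects which output fires, and the value on that pin is always the (possibly edited) data itself.

def resolve_review(decision: str, data, message: str = ""):
    # The decision picks WHICH pin fires; the payload is the data itself.
    if decision == "approve":
        yield "approved_data", data      # the data, never the string "APPROVED"
    else:
        yield "rejected_data", data      # the data, never the string "REJECTED"
    if message:
        yield "review_message", message  # fires only when a message was given

outputs = dict(resolve_review("approve", {"amount": 42}))
assert outputs == {"approved_data": {"amount": 42}}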

@@ -743,6 +743,11 @@ class GraphModel(Graph, GraphMeta):
# For invalid blocks, we still raise immediately as this is a structural issue
raise ValueError(f"Invalid block {node.block_id} for node #{node.id}")
+ if block.disabled:
+     raise ValueError(
+         f"Block {node.block_id} is disabled and cannot be used in graphs"
+     )
node_input_mask = (
nodes_input_masks.get(node.id, {}) if nodes_input_masks else {}
)

View File

@@ -213,6 +213,9 @@ async def execute_node(
block_name=node_block.name,
)
+ if node_block.disabled:
+     raise ValueError(f"Block {node_block.id} is disabled and cannot be executed")
# Sanity check: validate the execution input.
input_data, error = validate_exec(node, data.inputs, resolve_input=False)
if input_data is None:

View File

@@ -364,6 +364,44 @@ def _remove_orphan_tool_responses(
return result
+ def validate_and_remove_orphan_tool_responses(
+     messages: list[dict],
+     log_warning: bool = True,
+ ) -> list[dict]:
+     """
+     Validate tool_call/tool_response pairs and remove orphaned responses.
+     Scans messages in order, tracking all tool_call IDs. Any tool response
+     referencing an ID not seen in a preceding message is considered orphaned
+     and removed. This prevents API errors like Anthropic's "unexpected tool_use_id".
+     Args:
+         messages: List of messages to validate (OpenAI or Anthropic format)
+         log_warning: Whether to log a warning when orphans are found
+     Returns:
+         A new list with orphaned tool responses removed
+     """
+     available_ids: set[str] = set()
+     orphan_ids: set[str] = set()
+     for msg in messages:
+         available_ids |= _extract_tool_call_ids_from_message(msg)
+         for resp_id in _extract_tool_response_ids_from_message(msg):
+             if resp_id not in available_ids:
+                 orphan_ids.add(resp_id)
+     if not orphan_ids:
+         return messages
+     if log_warning:
+         logger.warning(
+             f"Removing {len(orphan_ids)} orphan tool response(s): {orphan_ids}"
+         )
+     return _remove_orphan_tool_responses(messages, orphan_ids)
def _ensure_tool_pairs_intact(
recent_messages: list[dict],
all_messages: list[dict],
@@ -723,6 +761,13 @@ async def compress_context(
# Filter out any None values that may have been introduced
final_msgs: list[dict] = [m for m in msgs if m is not None]
+ # ---- STEP 6: Final tool-pair validation ---------------------------------
+ # After all compression steps, verify that every tool response has a
+ # matching tool_call in a preceding assistant message. Remove orphans
+ # to prevent API errors (e.g., Anthropic's "unexpected tool_use_id").
+ final_msgs = validate_and_remove_orphan_tool_responses(final_msgs)
final_count = sum(_msg_tokens(m, enc) for m in final_msgs)
error = None
if final_count + reserve > target_tokens:

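A toy walk-through of the orphan rule that validate_and_remove_orphan_tool_responses implements, using simplified OpenAI-style message dicts (the real helpers above also handle the Anthropic format):

messages = [
    {"role": "assistant", "tool_calls": [{"id": "call_1"}]},
    {"role": "tool", "tool_call_id": "call_1", "content": "ok"},     # paired
    {"role": "tool", "tool_call_id": "call_9", "content": "stale"},  # orphan
]

seen: set[str] = set()
kept = []
for msg in messages:
    seen |= {c["id"] for c in msg.get("tool_calls") or []}
    if msg.get("role") == "tool" and msg.get("tool_call_id") not in seen:
        continue  # drop it: no preceding message declared call_9
    kept.append(msg)

assert [m["tool_call_id"] for m in kept if m.get("role") == "tool"] == ["call_1"]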
File diff suppressed because it is too large

View File

@@ -11,7 +11,7 @@ packages = [{ include = "backend", format = "sdist" }]
python = ">=3.10,<3.14"
aio-pika = "^9.5.5"
aiohttp = "^3.10.0"
aiodns = "^4.0.0"
aiodns = "^3.5.0"
anthropic = "^0.79.0"
apscheduler = "^3.11.1"
autogpt-libs = { path = "../autogpt_libs", develop = true }
@@ -19,7 +19,7 @@ bleach = { extras = ["css"], version = "^6.2.0" }
click = "^8.2.0"
cryptography = "^46.0"
discord-py = "^2.5.2"
- e2b-code-interpreter = "^2.4.1"
+ e2b-code-interpreter = "^1.5.2"
elevenlabs = "^1.50.0"
fastapi = "^0.128.6"
feedparser = "^6.0.11"
@@ -29,7 +29,7 @@ google-auth-oauthlib = "^1.2.2"
google-cloud-storage = "^3.2.0"
googlemaps = "^4.10.0"
gravitasml = "^0.1.4"
groq = "^1.0.0"
groq = "^0.30.0"
html2text = "^2024.2.26"
jinja2 = "^3.1.6"
jsonref = "^1.1.0"
@@ -58,21 +58,21 @@ pytest = "^8.4.1"
pytest-asyncio = "^1.1.0"
python-dotenv = "^1.1.1"
python-multipart = "^0.0.22"
redis = "^7.1.1"
redis = "^6.2.0"
regex = "^2025.9.18"
replicate = "^1.0.6"
sentry-sdk = {extras = ["anthropic", "fastapi", "launchdarkly", "openai", "sqlalchemy"], version = "^2.44.0"}
sqlalchemy = "^2.0.40"
strenum = "^0.4.9"
stripe = "^11.5.0"
supabase = "2.28.0"
supabase = "2.27.3"
tenacity = "^9.1.4"
- todoist-api-python = "^3.2.1"
+ todoist-api-python = "^2.1.7"
tweepy = "^4.16.0"
uvicorn = { extras = ["standard"], version = "^0.40.0" }
websockets = "^15.0"
youtube-transcript-api = "^1.2.1"
yt-dlp = "2026.2.4"
yt-dlp = "2025.12.08"
zerobouncesdk = "^1.1.2"
# NOTE: please insert new dependencies in their alphabetical location
pytest-snapshot = "^0.9.0"
@@ -85,7 +85,7 @@ pandas = "^2.3.1"
firecrawl-py = "^4.3.6"
exa-py = "^1.14.20"
croniter = "^6.0.0"
stagehand = "^3.5.0"
stagehand = "^0.5.1"
gravitas-md2gdocs = "^0.1.0"
posthog = "^7.6.0"
@@ -94,7 +94,7 @@ aiohappyeyeballs = "^2.6.1"
black = "^24.10.0"
faker = "^38.2.0"
httpx = "^0.28.1"
isort = "^7.0.0"
isort = "^5.13.2"
poethepoet = "^0.41.0"
pre-commit = "^4.4.0"
pyright = "^1.1.407"

View File

@@ -10,8 +10,9 @@ import {
MessageResponse,
} from "@/components/ai-elements/message";
import { LoadingSpinner } from "@/components/atoms/LoadingSpinner/LoadingSpinner";
+ import { toast } from "@/components/molecules/Toast/use-toast";
import { ToolUIPart, UIDataTypes, UIMessage, UITools } from "ai";
- import { useEffect, useState } from "react";
+ import { useEffect, useRef, useState } from "react";
import { CreateAgentTool } from "../../tools/CreateAgent/CreateAgent";
import { EditAgentTool } from "../../tools/EditAgent/EditAgent";
import { FindAgentsTool } from "../../tools/FindAgents/FindAgents";
@@ -121,6 +122,7 @@ export const ChatMessagesContainer = ({
isLoading,
}: ChatMessagesContainerProps) => {
const [thinkingPhrase, setThinkingPhrase] = useState(getRandomPhrase);
+ const lastToastTimeRef = useRef(0);
useEffect(() => {
if (status === "submitted") {
@@ -128,6 +130,20 @@ export const ChatMessagesContainer = ({
}
}, [status]);
+ // Show a toast when a new error occurs, debounced to avoid spam
+ useEffect(() => {
+   if (!error) return;
+   const now = Date.now();
+   if (now - lastToastTimeRef.current < 3_000) return;
+   lastToastTimeRef.current = now;
+   toast({
+     variant: "destructive",
+     title: "Something went wrong",
+     description:
+       "The assistant encountered an error. Please try sending your message again.",
+   });
+ }, [error]);
const lastMessage = messages[messages.length - 1];
const lastAssistantHasVisibleContent =
lastMessage?.role === "assistant" &&
@@ -263,8 +279,12 @@ export const ChatMessagesContainer = ({
</Message>
)}
{error && (
<div className="rounded-lg bg-red-50 p-3 text-red-600">
Error: {error.message}
<div className="rounded-lg bg-red-50 p-4 text-sm text-red-700">
<p className="font-medium">Something went wrong</p>
<p className="mt-1 text-red-600">
The assistant encountered an error. Please try sending your
message again.
</p>
</div>
)}
</ConversationContent>

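The toast logic above debounces by recording the last fire time; the same idea in schematic form, assuming only that repeats within the 3-second window should be suppressed (a sketch, not shared code):

import time

class Debouncer:
    def __init__(self, window_s: float = 3.0):
        self.window_s = window_s   # mirrors the 3_000 ms window in the effect
        self.last = float("-inf")

    def should_fire(self) -> bool:
        now = time.monotonic()
        if now - self.last < self.window_s:
            return False  # too soon after the last toast; suppress
        self.last = now
        return True

d = Debouncer()
assert d.should_fire() is True    # first error shows a toast
assert d.should_fire() is False   # an immediate repeat is swallowed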
View File

@@ -30,7 +30,7 @@ export function ContentCard({
return (
<div
className={cn(
"rounded-lg bg-gradient-to-r from-purple-500/30 to-blue-500/30 p-[1px]",
"min-w-0 rounded-lg bg-gradient-to-r from-purple-500/30 to-blue-500/30 p-[1px]",
className,
)}
>

View File

@@ -4,7 +4,6 @@ import { WarningDiamondIcon } from "@phosphor-icons/react";
import type { ToolUIPart } from "ai";
import { useCopilotChatActions } from "../../components/CopilotChatActionsProvider/useCopilotChatActions";
import { MorphingTextAnimation } from "../../components/MorphingTextAnimation/MorphingTextAnimation";
- import { OrbitLoader } from "../../components/OrbitLoader/OrbitLoader";
import { ProgressBar } from "../../components/ProgressBar/ProgressBar";
import {
ContentCardDescription,
@@ -77,7 +76,7 @@ function getAccordionMeta(output: CreateAgentToolOutput) {
isOperationInProgressOutput(output)
) {
return {
- icon: <OrbitLoader size={32} />,
+ icon,
title: "Creating agent, this may take a few minutes. Sit back and relax.",
};
}

View File

@@ -203,7 +203,7 @@ export function getAccordionMeta(output: RunAgentToolOutput): {
? output.status.trim()
: "started";
return {
- icon: <OrbitLoader size={28} className="text-neutral-700" />,
+ icon,
title: output.graph_name,
description: `Status: ${statusText}`,
};

View File

@@ -149,7 +149,7 @@ export function getAccordionMeta(output: RunBlockToolOutput): {
if (isRunBlockBlockOutput(output)) {
const keys = Object.keys(output.outputs ?? {});
return {
- icon: <OrbitLoader size={24} className="text-neutral-700" />,
+ icon,
title: output.block_name,
description:
keys.length > 0

View File

@@ -1,11 +1,8 @@
import { environment } from "@/services/environment";
import { getServerAuthToken } from "@/lib/autogpt-server-api/helpers";
import { NextRequest } from "next/server";
+ import { normalizeSSEStream, SSE_HEADERS } from "../../../sse-helpers";
- /**
-  * SSE Proxy for chat streaming.
-  * Supports POST with context (page content + URL) in the request body.
-  */
export async function POST(
request: NextRequest,
{ params }: { params: Promise<{ sessionId: string }> },
@@ -23,17 +20,14 @@ export async function POST(
);
}
- // Get auth token from server-side session
const token = await getServerAuthToken();
- // Build backend URL
const backendUrl = environment.getAGPTServerBaseUrl();
const streamUrl = new URL(
`/api/chat/sessions/${sessionId}/stream`,
backendUrl,
);
- // Forward request to backend with auth header
const headers: Record<string, string> = {
"Content-Type": "application/json",
Accept: "text/event-stream",
@@ -63,14 +57,15 @@ export async function POST(
});
}
- // Return the SSE stream directly
- return new Response(response.body, {
-   headers: {
-     "Content-Type": "text/event-stream",
-     "Cache-Control": "no-cache, no-transform",
-     Connection: "keep-alive",
-     "X-Accel-Buffering": "no",
-   },
+ if (!response.body) {
+   return new Response(
+     JSON.stringify({ error: "Empty response from chat service" }),
+     { status: 502, headers: { "Content-Type": "application/json" } },
+   );
+ }
+ return new Response(normalizeSSEStream(response.body), {
+   headers: SSE_HEADERS,
});
} catch (error) {
console.error("SSE proxy error:", error);
@@ -87,13 +82,6 @@ export async function POST(
}
}
- /**
-  * Resume an active stream for a session.
-  *
-  * Called by the AI SDK's `useChat(resume: true)` on page load.
-  * Proxies to the backend which checks for an active stream and either
-  * replays it (200 + SSE) or returns 204 No Content.
-  */
export async function GET(
_request: NextRequest,
{ params }: { params: Promise<{ sessionId: string }> },
@@ -124,7 +112,6 @@ export async function GET(
headers,
});
- // 204 = no active stream to resume
if (response.status === 204) {
return new Response(null, { status: 204 });
}
@@ -137,12 +124,13 @@ export async function GET(
});
}
- return new Response(response.body, {
+ if (!response.body) {
+   return new Response(null, { status: 204 });
+ }
+ return new Response(normalizeSSEStream(response.body), {
headers: {
-   "Content-Type": "text/event-stream",
-   "Cache-Control": "no-cache, no-transform",
-   Connection: "keep-alive",
-   "X-Accel-Buffering": "no",
+   ...SSE_HEADERS,
+   "x-vercel-ai-ui-message-stream": "v1",
},
});

View File

@@ -0,0 +1,72 @@
export const SSE_HEADERS = {
"Content-Type": "text/event-stream",
"Cache-Control": "no-cache, no-transform",
Connection: "keep-alive",
"X-Accel-Buffering": "no",
} as const;
export function normalizeSSEStream(
input: ReadableStream<Uint8Array>,
): ReadableStream<Uint8Array> {
const decoder = new TextDecoder();
const encoder = new TextEncoder();
let buffer = "";
return input.pipeThrough(
new TransformStream<Uint8Array, Uint8Array>({
transform(chunk, controller) {
buffer += decoder.decode(chunk, { stream: true });
const parts = buffer.split("\n\n");
buffer = parts.pop() ?? "";
for (const part of parts) {
const normalized = normalizeSSEEvent(part);
controller.enqueue(encoder.encode(normalized + "\n\n"));
}
},
flush(controller) {
if (buffer.trim()) {
const normalized = normalizeSSEEvent(buffer);
controller.enqueue(encoder.encode(normalized + "\n\n"));
}
},
}),
);
}
function normalizeSSEEvent(event: string): string {
const lines = event.split("\n");
const dataLines: string[] = [];
const otherLines: string[] = [];
for (const line of lines) {
if (line.startsWith("data: ")) {
dataLines.push(line.slice(6));
} else {
otherLines.push(line);
}
}
if (dataLines.length === 0) return event;
const dataStr = dataLines.join("\n");
try {
const parsed = JSON.parse(dataStr) as Record<string, unknown>;
if (parsed.type === "error") {
const normalized = {
type: "error",
errorText:
typeof parsed.errorText === "string"
? parsed.errorText
: "An unexpected error occurred",
};
const newData = `data: ${JSON.stringify(normalized)}`;
return [...otherLines.filter((l) => l.length > 0), newData].join("\n");
}
} catch {
// Not valid JSON — pass through as-is
}
return event;
}

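The invariant normalizeSSEStream relies on: SSE events are delimited by a blank line, while network chunks may split an event anywhere, so the trailing partial must be buffered until its terminator arrives. A compact sketch of that buffering rule (illustrative only):

chunks = ['data: {"type":"text"', ',"value":"hi"}\n\ndata: {"ty', 'pe":"finish"}\n\n']

buffer, events = "", []
for chunk in chunks:
    buffer += chunk
    *complete, buffer = buffer.split("\n\n")  # last piece may be incomplete
    events.extend(complete)

assert events == ['data: {"type":"text","value":"hi"}', 'data: {"type":"finish"}']
assert buffer == ""  # nothing left dangling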
View File

@@ -1,20 +1,8 @@
import { environment } from "@/services/environment";
import { getServerAuthToken } from "@/lib/autogpt-server-api/helpers";
import { NextRequest } from "next/server";
+ import { normalizeSSEStream, SSE_HEADERS } from "../../../sse-helpers";
- /**
-  * SSE Proxy for task stream reconnection.
-  *
-  * This endpoint allows clients to reconnect to an ongoing or recently completed
-  * background task's stream. It replays missed messages from Redis Streams and
-  * subscribes to live updates if the task is still running.
-  *
-  * Client contract:
-  * 1. When receiving an operation_started event, store the task_id
-  * 2. To reconnect: GET /api/chat/tasks/{taskId}/stream?last_message_id={idx}
-  * 3. Messages are replayed from the last_message_id position
-  * 4. Stream ends when "finish" event is received
-  */
export async function GET(
request: NextRequest,
{ params }: { params: Promise<{ taskId: string }> },
@@ -24,15 +12,12 @@ export async function GET(
const lastMessageId = searchParams.get("last_message_id") || "0-0";
try {
- // Get auth token from server-side session
const token = await getServerAuthToken();
- // Build backend URL
const backendUrl = environment.getAGPTServerBaseUrl();
const streamUrl = new URL(`/api/chat/tasks/${taskId}/stream`, backendUrl);
streamUrl.searchParams.set("last_message_id", lastMessageId);
- // Forward request to backend with auth header
const headers: Record<string, string> = {
Accept: "text/event-stream",
"Cache-Control": "no-cache",
@@ -56,14 +41,12 @@ export async function GET(
});
}
- // Return the SSE stream directly
- return new Response(response.body, {
-   headers: {
-     "Content-Type": "text/event-stream",
-     "Cache-Control": "no-cache, no-transform",
-     Connection: "keep-alive",
-     "X-Accel-Buffering": "no",
-   },
+ if (!response.body) {
+   return new Response(null, { status: 204 });
+ }
+ return new Response(normalizeSSEStream(response.body), {
+   headers: SSE_HEADERS,
});
} catch (error) {
console.error("Task stream proxy error:", error);

View File

@@ -61,7 +61,7 @@ Below is a comprehensive list of all available blocks, categorized by their prim
| [Get List Item](block-integrations/basic.md#get-list-item) | Returns the element at the given index |
| [Get Store Agent Details](block-integrations/system/store_operations.md#get-store-agent-details) | Get detailed information about an agent from the store |
| [Get Weather Information](block-integrations/basic.md#get-weather-information) | Retrieves weather information for a specified location using OpenWeatherMap API |
- | [Human In The Loop](block-integrations/basic.md#human-in-the-loop) | Pause execution and wait for human approval or modification of data |
+ | [Human In The Loop](block-integrations/basic.md#human-in-the-loop) | Pause execution for human review |
| [List Is Empty](block-integrations/basic.md#list-is-empty) | Checks if a list is empty |
| [List Library Agents](block-integrations/system/library_operations.md#list-library-agents) | List all agents in your personal library |
| [Note](block-integrations/basic.md#note) | A visual annotation block that displays a sticky note in the workflow editor for documentation and organization purposes |

View File

@@ -975,7 +975,7 @@ A travel planning application could use this block to provide users with current
## Human In The Loop
### What it is
- Pause execution and wait for human approval or modification of data
+ Pause execution for human review. Data flows through approved_data or rejected_data output based on the reviewer's decision. Outputs contain the actual data, not status strings.
### How it works
<!-- MANUAL: how_it_works -->
@@ -988,18 +988,18 @@ This enables human oversight at critical points in automated workflows, ensuring
| Input | Description | Type | Required |
|-------|-------------|------|----------|
- | data | The data to be reviewed by a human user | Data | Yes |
- | name | A descriptive name for what this data represents | str | Yes |
- | editable | Whether the human reviewer can edit the data | bool | No |
+ | data | The data to be reviewed by a human user. This exact data will be passed through to either approved_data or rejected_data output based on the reviewer's decision. | Data | Yes |
+ | name | A descriptive name for what this data represents. This helps the reviewer understand what they are reviewing. | str | Yes |
+ | editable | Whether the human reviewer can edit the data before approving or rejecting it | bool | No |
### Outputs
| Output | Description | Type |
|--------|-------------|------|
| error | Error message if the operation failed | str |
- | approved_data | The data when approved (may be modified by reviewer) | Approved Data |
- | rejected_data | The data when rejected (may be modified by reviewer) | Rejected Data |
- | review_message | Any message provided by the reviewer | str |
+ | approved_data | Outputs the input data when the reviewer APPROVES it. The value is the actual data itself (not a status string like 'APPROVED'). If the reviewer edited the data, this contains the modified version. Connect downstream blocks here for the 'approved' workflow path. | Approved Data |
+ | rejected_data | Outputs the input data when the reviewer REJECTS it. The value is the actual data itself (not a status string like 'REJECTED'). If the reviewer edited the data, this contains the modified version. Connect downstream blocks here for the 'rejected' workflow path. | Rejected Data |
+ | review_message | Optional message provided by the reviewer explaining their decision. Only outputs when the reviewer provides a message; this pin does not fire if no message was given. | str |
### Possible use case
<!-- MANUAL: use_case -->