Compare commits

..

52 Commits

Author SHA1 Message Date
Nicholas Tindle
e688f4003e fix(backend): handle malformed emails in PII masking
Prevents IndexError when email has empty local part (e.g., "@example.com").
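A minimal sketch of the guarded masking logic (the helper name `mask_email` is hypothetical; the real code lives in the backend's PII-masking utility):

```python
def mask_email(email: str) -> str:
    """Mask an email to first char + domain: 'alice@example.com' -> 'a***@example.com'."""
    local, sep, domain = email.partition("@")
    if not sep or not local:
        # Malformed address (no '@', or empty local part like '@example.com'):
        # masking the whole value avoids the IndexError from local[0].
        return "***"
    return f"{local[0]}***@{domain}"
```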

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-08 21:56:49 -06:00
Nicholas Tindle
3b6f1a4591 fix(frontend): check response status in delete mutation
Consistent with other mutations, now checks response.status === 200
before showing success toast.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-08 20:55:10 -06:00
Nicholas Tindle
6b1432d59e fix(backend): add null fallback for categories field
Prevents validation error when DB returns None for categories.
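A sketch of the kind of fallback described, assuming a Pydantic model (the model and field shape here are illustrative, not the actual backend schema):

```python
from pydantic import BaseModel, field_validator


class Listing(BaseModel):  # illustrative model, not the real schema
    categories: list[str] = []

    @field_validator("categories", mode="before")
    @classmethod
    def _none_to_empty_list(cls, v):
        # DB rows may hold NULL for categories; coerce to [] so
        # validation doesn't fail on None.
        return [] if v is None else v
```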

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-08 20:42:00 -06:00
Nicholas Tindle
f91edde32a fix(backend): mask email PII in waitlist logging
Avoid logging raw email addresses by masking to first char + domain.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-08 20:35:13 -06:00
Nicholas Tindle
4ba6c44f61 fix(frontend): regenerate openapi.json with correct structure
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-08 20:22:11 -06:00
Nicholas Tindle
b4e16e7246 Merge branch 'dev' into ntindle/waitlist 2026-02-08 20:00:02 -06:00
Nicholas Tindle
adeeba76d1 fix(backend): remove unused pytest import
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-08 19:07:00 -06:00
Nicholas Tindle
c88918af4f Merge remote-tracking branch 'origin/dev' into ntindle/waitlist 2026-02-08 18:42:10 -06:00
Otto
a329831b0b feat(backend): Add ClamAV scanning for local file paths (#11988)
## Context

From PR #11796 review discussion. Files processed by the video blocks
(downloads, uploads, generated videos) should be scanned through ClamAV
for malware detection.

## Problem

`store_media_file()` in `backend/util/file.py` already scans:
- `workspace://` references
- Cloud storage paths  
- Data URIs (`data:...`)
- HTTP/HTTPS URLs

**But local file paths were NOT scanned.** The `else` branch only
verified the file exists.

This gap affected video processing blocks (e.g., `LoopVideoBlock`,
`AddAudioToVideoBlock`) that:
1. Download/receive input media
2. Process it locally (loop, add audio, etc.)
3. Write output to temp directory
4. Call `store_media_file(output_filename, ...)` with a local path →
**skipped virus scanning**

## Solution

Added virus scanning to the local file path branch:
```python
# Virus scan the local file before any further processing
local_content = target_path.read_bytes()
if len(local_content) > MAX_FILE_SIZE_BYTES:
    raise ValueError(...)
await scan_content_safe(local_content, filename=sanitized_file)
```

## Changes

- `backend/util/file.py` - Added ~7 lines to scan local files
(consistent with other input types)
- `backend/util/file_test.py` - Added 2 test cases for local file
scanning
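A sketch of what one of those tests might look like (the patched attribute path and the simplified `store_media_file` call are assumptions; the real tests live in `backend/util/file_test.py`):

```python
import pytest

from backend.util.file import store_media_file


@pytest.mark.asyncio
async def test_local_file_path_is_virus_scanned(tmp_path, monkeypatch):
    scanned: list[bytes] = []

    async def fake_scan(content: bytes, filename: str | None = None) -> None:
        # Stand-in for scan_content_safe: record what would be scanned.
        scanned.append(content)

    monkeypatch.setattr("backend.util.file.scan_content_safe", fake_scan)

    media = tmp_path / "clip.mp4"
    media.write_bytes(b"fake video bytes")

    # Call simplified for illustration; the real signature takes more args.
    await store_media_file(str(media))

    assert scanned == [b"fake video bytes"]
```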

## Risk Assessment

- **Low risk:** Single point of change, follows existing pattern
- **Backwards compatible:** No API changes
- **Fail-safe:** If scanning fails, file is rejected (existing behavior)

Closes SECRT-1904

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2026-02-09 00:24:18 +00:00
dependabot[bot]
98dd1a9480 chore(libs/deps): Bump cryptography from 45.0.6 to 46.0.1 in /autogpt_platform/autogpt_libs (#10968)
Bumps [cryptography](https://github.com/pyca/cryptography) from 45.0.6
to 46.0.1.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pyca/cryptography/blob/main/CHANGELOG.rst">cryptography's
changelog</a>.</em></p>
<blockquote>
<p>46.0.1 - 2025-09-16</p>
<ul>
<li>Fixed an issue where users installing via <code>pip</code> on Python 3.14
development versions would not properly install a dependency.</li>
<li>Fixed an issue building the free-threaded macOS 3.14 wheels.</li>
</ul>
<p>46.0.0 - 2025-09-16</p>
<ul>
<li><strong>BACKWARDS INCOMPATIBLE:</strong> Support for Python 3.7 has
been removed.</li>
<li>Support for OpenSSL &lt; 3.0 is deprecated and will be removed in
the next
release.</li>
<li>Support for <code>x86_64</code> macOS (including publishing wheels)
is deprecated
and will be removed in two releases. We will switch to publishing an
<code>arm64</code> only wheel for macOS.</li>
<li>Support for 32-bit Windows (including publishing wheels) is
deprecated
and will be removed in two releases. Users should move to a 64-bit
Python installation.</li>
<li>Updated Windows, macOS, and Linux wheels to be compiled with OpenSSL
3.5.3.</li>
<li>We now build <code>ppc64le</code> <code>manylinux</code> wheels and
publish them to PyPI.</li>
<li>We now build <code>win_arm64</code> (Windows on Arm) wheels and
publish them to PyPI.</li>
<li>Added support for free-threaded Python 3.14.</li>
<li>Removed the deprecated <code>get_attribute_for_oid</code> method on
:class:<code>~cryptography.x509.CertificateSigningRequest</code>. Users
should use
:meth:<code>~cryptography.x509.Attributes.get_attribute_for_oid</code>
instead.</li>
<li>Removed the deprecated <code>CAST5</code>, <code>SEED</code>,
<code>IDEA</code>, and <code>Blowfish</code>
classes from the cipher module. These are still available in
:doc:<code>/hazmat/decrepit/index</code>.</li>
<li>In X.509, when performing a PSS signature with a SHA-3 hash, it is
now
encoded with the official NIST SHA3 OID.</li>
</ul>
<p>45.0.7 - 2025-09-01</p>
<ul>
<li>Added a function to support an upcoming <code>pyOpenSSL</code> release.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="e735cfc275"><code>e735cfc</code></a>
release 46.0.1 (<a
href="https://redirect.github.com/pyca/cryptography/issues/13450">#13450</a>)</li>
<li><a
href="4e457ffba4"><code>4e457ff</code></a>
Explicitly specify python in mac uv build invocation (<a
href="https://redirect.github.com/pyca/cryptography/issues/13447">#13447</a>)</li>
<li><a
href="2726efdb6d"><code>2726efd</code></a>
Depend on CFFI 2.0.0 or newer on Python &gt; 3.8 (<a
href="https://redirect.github.com/pyca/cryptography/issues/13448">#13448</a>)</li>
<li><a
href="62230623d1"><code>6223062</code></a>
release 46.0.0 (<a
href="https://redirect.github.com/pyca/cryptography/issues/13446">#13446</a>)</li>
<li><a
href="563c4915b0"><code>563c491</code></a>
Update comment for pyopenssl-release tag (<a
href="https://redirect.github.com/pyca/cryptography/issues/13445">#13445</a>)</li>
<li><a
href="d2f6f7face"><code>d2f6f7f</code></a>
Bump downstream dependencies in CI (<a
href="https://redirect.github.com/pyca/cryptography/issues/13439">#13439</a>)</li>
<li><a
href="e7ab02bd67"><code>e7ab02b</code></a>
we'll ship this with 3.5.3 why not (<a
href="https://redirect.github.com/pyca/cryptography/issues/13442">#13442</a>)</li>
<li><a
href="0b68a4bffb"><code>0b68a4b</code></a>
Another pair of bump dependencies fix (<a
href="https://redirect.github.com/pyca/cryptography/issues/13444">#13444</a>)</li>
<li><a
href="e076d08ee4"><code>e076d08</code></a>
Attempt to fix commit message for bump downstreams (<a
href="https://redirect.github.com/pyca/cryptography/issues/13440">#13440</a>)</li>
<li><a
href="6835ce899e"><code>6835ce8</code></a>
Put correct version bounds for pyenchant in pins (<a
href="https://redirect.github.com/pyca/cryptography/issues/13441">#13441</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pyca/cryptography/compare/45.0.6...46.0.1">compare
view</a></li>
</ul>
</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nick Tindle <nick@ntindle.com>
2026-02-08 23:40:15 +00:00
dependabot[bot]
9c7c598c7d chore(deps): bump peter-evans/create-pull-request from 7 to 8 (#11663)
Bumps
[peter-evans/create-pull-request](https://github.com/peter-evans/create-pull-request)
from 7 to 8.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/peter-evans/create-pull-request/releases">peter-evans/create-pull-request's
releases</a>.</em></p>
<blockquote>
<h2>Create Pull Request v8.0.0</h2>
<h2>What's new in v8</h2>
<ul>
<li>Requires <a
href="https://github.com/actions/runner/releases/tag/v2.327.1">Actions
Runner v2.327.1</a> or later if you are using a self-hosted runner for
Node 24 support.</li>
</ul>
<h2>What's Changed</h2>
<ul>
<li>chore: Update checkout action version to v6 by <a
href="https://github.com/yonas"><code>@​yonas</code></a> in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4258">peter-evans/create-pull-request#4258</a></li>
<li>Update actions/checkout references to <a
href="https://github.com/v6"><code>@​v6</code></a> in docs by <a
href="https://github.com/Copilot"><code>@​Copilot</code></a> in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4259">peter-evans/create-pull-request#4259</a></li>
<li>feat: v8 by <a
href="https://github.com/peter-evans"><code>@​peter-evans</code></a> in
<a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4260">peter-evans/create-pull-request#4260</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/yonas"><code>@​yonas</code></a> made
their first contribution in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4258">peter-evans/create-pull-request#4258</a></li>
<li><a href="https://github.com/Copilot"><code>@​Copilot</code></a> made
their first contribution in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4259">peter-evans/create-pull-request#4259</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/peter-evans/create-pull-request/compare/v7.0.11...v8.0.0">https://github.com/peter-evans/create-pull-request/compare/v7.0.11...v8.0.0</a></p>
<h2>Create Pull Request v7.0.11</h2>
<h2>What's Changed</h2>
<ul>
<li>fix: restrict remote prune to self-hosted runners by <a
href="https://github.com/peter-evans"><code>@​peter-evans</code></a> in
<a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4250">peter-evans/create-pull-request#4250</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/peter-evans/create-pull-request/compare/v7.0.10...v7.0.11">https://github.com/peter-evans/create-pull-request/compare/v7.0.10...v7.0.11</a></p>
<h2>Create Pull Request v7.0.10</h2>
<p>⚙️ Fixes an issue where updating a pull request failed when targeting
a forked repository with the same owner as its parent.</p>
<h2>What's Changed</h2>
<ul>
<li>build(deps): bump the github-actions group with 2 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4235">peter-evans/create-pull-request#4235</a></li>
<li>build(deps-dev): bump prettier from 3.6.2 to 3.7.3 in the npm group
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4240">peter-evans/create-pull-request#4240</a></li>
<li>fix: provider list pulls fallback for multi fork same owner by <a
href="https://github.com/peter-evans"><code>@​peter-evans</code></a> in
<a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4245">peter-evans/create-pull-request#4245</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/obnyis"><code>@​obnyis</code></a> made
their first contribution in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4064">peter-evans/create-pull-request#4064</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/peter-evans/create-pull-request/compare/v7.0.9...v7.0.10">https://github.com/peter-evans/create-pull-request/compare/v7.0.9...v7.0.10</a></p>
<h2>Create Pull Request v7.0.9</h2>
<p>⚙️ Fixes an <a
href="https://redirect.github.com/peter-evans/create-pull-request/issues/4228">incompatibility</a>
with the recently released <code>actions/checkout@v6</code>.</p>
<h2>What's Changed</h2>
<ul>
<li>~70 dependency updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
<li>docs: fix workaround description about <code>ready_for_review</code>
by <a href="https://github.com/ybiquitous"><code>@​ybiquitous</code></a>
in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/3939">peter-evans/create-pull-request#3939</a></li>
<li>Docs: <code>add-paths</code> default behavior by <a
href="https://github.com/joeflack4"><code>@​joeflack4</code></a> in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/3928">peter-evans/create-pull-request#3928</a></li>
<li>docs: update to create-github-app-token v2 by <a
href="https://github.com/Goooler"><code>@​Goooler</code></a> in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4063">peter-evans/create-pull-request#4063</a></li>
<li>Fix compatibility with actions/checkout@v6 by <a
href="https://github.com/ericsciple"><code>@​ericsciple</code></a> in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4230">peter-evans/create-pull-request#4230</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/joeflack4"><code>@​joeflack4</code></a>
made their first contribution in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/3928">peter-evans/create-pull-request#3928</a></li>
<li><a href="https://github.com/Goooler"><code>@​Goooler</code></a> made
their first contribution in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4063">peter-evans/create-pull-request#4063</a></li>
<li><a
href="https://github.com/ericsciple"><code>@​ericsciple</code></a> made
their first contribution in <a
href="https://redirect.github.com/peter-evans/create-pull-request/pull/4230">peter-evans/create-pull-request#4230</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="98357b18bf"><code>98357b1</code></a>
feat: v8 (<a
href="https://redirect.github.com/peter-evans/create-pull-request/issues/4260">#4260</a>)</li>
<li><a
href="41c0e4b789"><code>41c0e4b</code></a>
Update actions/checkout references to <a
href="https://github.com/v6"><code>@​v6</code></a> in docs (<a
href="https://redirect.github.com/peter-evans/create-pull-request/issues/4259">#4259</a>)</li>
<li><a
href="994332de4c"><code>994332d</code></a>
chore: Update checkout action version to v6 (<a
href="https://redirect.github.com/peter-evans/create-pull-request/issues/4258">#4258</a>)</li>
<li>See full diff in <a
href="https://github.com/peter-evans/create-pull-request/compare/v7...v8">compare
view</a></li>
</ul>
</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nick Tindle <nick@ntindle.com>
2026-02-08 23:06:40 +00:00
Nikhil Bhagat
728c40def5 fix(backend): replace multiprocessing queue with thread-safe queue in ExecutionQueue (#11618)

The `ExecutionQueue` class was using `multiprocessing.Manager().Queue()`
which spawns a subprocess for inter-process communication. However,
analysis showed that `ExecutionQueue` is only accessed from threads
within the same process, not across processes. This caused:
- Unnecessary subprocess spawning per graph execution
- IPC overhead for every queue operation
- Potential resource leaks if Manager processes weren't properly cleaned
up
- Limited scalability when many graphs execute concurrently

### Changes 

- Replaced `multiprocessing.Manager().Queue()` with `queue.Queue()` in
`ExecutionQueue` class
- Updated imports: removed `from multiprocessing import Manager` and
`from queue import Empty`, added `import queue`
- Updated exception handling from `except Empty:` to `except
queue.Empty:`
- Added comprehensive docstring explaining the bug and fix

**File changed:** `autogpt_platform/backend/backend/data/execution.py`
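
For reference, a minimal sketch of the class after this change (method names are taken from the test plan below; the real implementation in `backend/data/execution.py` may differ in signatures and generics):

```python
import queue
from typing import Generic, Optional, TypeVar

T = TypeVar("T")


class ExecutionQueue(Generic[T]):
    """Thread-safe, in-process FIFO queue.

    Previously backed by multiprocessing.Manager().Queue(), which spawned a
    Manager subprocess and paid IPC costs on every operation even though the
    queue is only shared between threads of a single process. queue.Queue
    gives the same semantics with plain in-process locking.
    """

    def __init__(self) -> None:
        self._queue: "queue.Queue[T]" = queue.Queue()

    def add(self, item: T) -> T:
        self._queue.put(item)
        return item

    def get(self) -> T:
        # Blocks until an item is available.
        return self._queue.get()

    def empty(self) -> bool:
        return self._queue.empty()

    def get_or_none(self) -> Optional[T]:
        # Non-blocking variant: returns None instead of raising queue.Empty.
        try:
            return self._queue.get_nowait()
        except queue.Empty:
            return None
```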

### Checklist 

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Verified `ExecutionQueue` uses `queue.Queue` (not `multiprocessing.Manager().Queue()`)
  - [x] Tested all queue operations: `add()`, `get()`, `empty()`, `get_or_none()`
  - [x] Verified thread-safety with concurrent producer/consumer threads (100 items)
  - [x] Verified multi-producer/consumer scenario (3 producers, 2 consumers, 150 items; see the sketch after this list)
  - [x] Confirmed no subprocess spawning when creating multiple queues
  - [x] Code passes Black formatting check
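
A sketch of the multi-producer/consumer check described above, using the `ExecutionQueue` sketch from the **Changes** section (counts mirror the test plan; the exact test code is an assumption):

```python
import threading


def test_concurrent_producers_and_consumers():
    # Per the test plan: 3 producers x 50 items = 150 items, drained by
    # 2 consumers. Asserts nothing is lost or duplicated.
    q = ExecutionQueue()  # the sketch above; real class lives in execution.py
    results: list[int] = []
    lock = threading.Lock()

    def produce(base: int) -> None:
        for i in range(50):
            q.add(base + i)

    def consume() -> None:
        while True:
            item = q.get()
            if item is None:  # sentinel: stop this consumer
                return
            with lock:
                results.append(item)

    producers = [threading.Thread(target=produce, args=(n * 50,)) for n in range(3)]
    consumers = [threading.Thread(target=consume) for _ in range(2)]
    for t in producers + consumers:
        t.start()
    for t in producers:
        t.join()
    for _ in consumers:
        q.add(None)  # one stop sentinel per consumer
    for t in consumers:
        t.join()

    assert sorted(results) == list(range(150))
```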

#### For configuration changes:

- [x] `.env.default` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my
changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)

> No configuration changes required - this is a code-only fix with no
external API changes.

---------

Co-authored-by: Otto <otto@agpt.co>
Co-authored-by: Zamil Majdy <majdyz@users.noreply.github.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2026-02-08 16:28:04 +00:00
dependabot[bot]
cd64562e1b chore(libs/deps): bump the production-dependencies group across 1 directory with 8 updates (#11934)
Bumps the production-dependencies group with 8 updates in the
/autogpt_platform/autogpt_libs directory:

| Package | From | To |
| --- | --- | --- |
| [fastapi](https://github.com/fastapi/fastapi) | `0.116.1` | `0.128.0` |
| [google-cloud-logging](https://github.com/googleapis/python-logging) | `3.12.1` | `3.13.0` |
| [launchdarkly-server-sdk](https://github.com/launchdarkly/python-server-sdk) | `9.12.0` | `9.14.1` |
| [pydantic](https://github.com/pydantic/pydantic) | `2.11.7` | `2.12.5` |
| [pydantic-settings](https://github.com/pydantic/pydantic-settings) | `2.10.1` | `2.12.0` |
| [pyjwt](https://github.com/jpadilla/pyjwt) | `2.10.1` | `2.11.0` |
| [supabase](https://github.com/supabase/supabase-py) | `2.16.0` | `2.27.2` |
| [uvicorn](https://github.com/Kludex/uvicorn) | `0.35.0` | `0.40.0` |


Updates `fastapi` from 0.116.1 to 0.128.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/fastapi/fastapi/releases">fastapi's
releases</a>.</em></p>
<blockquote>
<h2>0.128.0</h2>
<h3>Breaking Changes</h3>
<ul>
<li> Drop support for <code>pydantic.v1</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14609">#14609</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h3>Internal</h3>
<ul>
<li> Run performance tests only on Pydantic v2. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14608">#14608</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h2>0.127.1</h2>
<h3>Refactors</h3>
<ul>
<li>🔊 Add a custom <code>FastAPIDeprecationWarning</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14605">#14605</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h3>Docs</h3>
<ul>
<li>📝 Add documentary to website. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14600">#14600</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h3>Translations</h3>
<ul>
<li>🌐 Update translations for de (update-outdated). PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14602">#14602</a>
by <a
href="https://github.com/nilslindemann"><code>@​nilslindemann</code></a>.</li>
<li>🌐 Update translations for de (update-outdated). PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14581">#14581</a>
by <a
href="https://github.com/nilslindemann"><code>@​nilslindemann</code></a>.</li>
</ul>
<h3>Internal</h3>
<ul>
<li>🔧 Update pre-commit to use local Ruff instead of hook. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14604">#14604</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li> Add missing tests for code examples. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14569">#14569</a>
by <a
href="https://github.com/YuriiMotov"><code>@​YuriiMotov</code></a>.</li>
<li>👷 Remove <code>lint</code> job from <code>test</code> CI workflow.
PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14593">#14593</a>
by <a
href="https://github.com/YuriiMotov"><code>@​YuriiMotov</code></a>.</li>
<li>👷 Update secrets check. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14592">#14592</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li>👷 Run CodSpeed tests in parallel to other tests to speed up CI. PR
<a
href="https://redirect.github.com/fastapi/fastapi/pull/14586">#14586</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li>🔨 Update scripts and pre-commit to autofix files. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14585">#14585</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h2>0.127.0</h2>
<h3>Breaking Changes</h3>
<ul>
<li>🔊 Add deprecation warnings when using <code>pydantic.v1</code>. PR
<a
href="https://redirect.github.com/fastapi/fastapi/pull/14583">#14583</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h3>Translations</h3>
<ul>
<li>🔧 Add LLM prompt file for Korean, generated from the existing
translations. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14546">#14546</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li>🔧 Add LLM prompt file for Japanese, generated from the existing
translations. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14545">#14545</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h3>Internal</h3>
<ul>
<li>⬆️ Upgrade OpenAI model for translations to gpt-5.2. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14579">#14579</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h2>0.126.0</h2>
<h3>Upgrades</h3>
<ul>
<li> Drop support for Pydantic v1, keeping short temporary support for
Pydantic v2's <code>pydantic.v1</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/14575">#14575</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="8322a4445a"><code>8322a44</code></a>
🔖 Release version 0.128.0</li>
<li><a
href="4b2cfcfd34"><code>4b2cfcf</code></a>
📝 Update release notes</li>
<li><a
href="e300630551"><code>e300630</code></a>
 Drop support for <code>pydantic.v1</code> (<a
href="https://redirect.github.com/fastapi/fastapi/issues/14609">#14609</a>)</li>
<li><a
href="1b3bea8b6b"><code>1b3bea8</code></a>
📝 Update release notes</li>
<li><a
href="34e884156f"><code>34e8841</code></a>
 Run performance tests only on Pydantic v2 (<a
href="https://redirect.github.com/fastapi/fastapi/issues/14608">#14608</a>)</li>
<li><a
href="cd90c78391"><code>cd90c78</code></a>
🔖 Release version 0.127.1</li>
<li><a
href="93f4dfd88b"><code>93f4dfd</code></a>
📝 Update release notes</li>
<li><a
href="535b5daa31"><code>535b5da</code></a>
🔊 Add a custom <code>FastAPIDeprecationWarning</code> (<a
href="https://redirect.github.com/fastapi/fastapi/issues/14605">#14605</a>)</li>
<li><a
href="6b53786f62"><code>6b53786</code></a>
📝 Update release notes</li>
<li><a
href="d98f4eb56e"><code>d98f4eb</code></a>
🔧 Update pre-commit to use local Ruff instead of hook (<a
href="https://redirect.github.com/fastapi/fastapi/issues/14604">#14604</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/fastapi/fastapi/compare/0.116.1...0.128.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `google-cloud-logging` from 3.12.1 to 3.13.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/python-logging/releases">google-cloud-logging's
releases</a>.</em></p>
<blockquote>
<h2>google-cloud-logging 3.13.0</h2>
<h2><a
href="https://github.com/googleapis/python-logging/compare/v3.12.1...v3.13.0">3.13.0</a>
(2025-12-15)</h2>
<h3>Features</h3>
<ul>
<li>Add support for python 3.14 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/1065">#1065</a>)
(<a
href="https://github.com/googleapis/python-logging/commit/6be3df6a">6be3df6a</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>remove setup.cfg configuration for creating universal wheels (<a
href="https://redirect.github.com/googleapis/python-logging/issues/981">#981</a>)
(<a
href="https://github.com/googleapis/python-logging/commit/70f612c3">70f612c3</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/python-logging/blob/main/CHANGELOG.md">google-cloud-logging's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/googleapis/python-logging/compare/v3.12.1...v3.13.0">3.13.0</a>
(2025-12-15)</h2>
<h3>Features</h3>
<ul>
<li>Add support for python 3.14 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/1065">#1065</a>)
(<a
href="6be3df6aa9">6be3df6aa94539cd2ab22a4fac55b343862228b2</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>remove setup.cfg configuration for creating universal wheels (<a
href="https://redirect.github.com/googleapis/python-logging/issues/981">#981</a>)
(<a
href="70f612c328">70f612c3281f1df13f3aba6b19bc4e9397297f3d</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="1415883be0"><code>1415883</code></a>
chore: librarian release pull request: 20251215T134006Z (<a
href="https://redirect.github.com/googleapis/python-logging/issues/1066">#1066</a>)</li>
<li><a
href="6be3df6aa9"><code>6be3df6</code></a>
feat: Add support for python 3.14 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/1065">#1065</a>)</li>
<li><a
href="36fb4270b3"><code>36fb427</code></a>
chore(librarian): onboard to librarian (<a
href="https://redirect.github.com/googleapis/python-logging/issues/1061">#1061</a>)</li>
<li><a
href="eb189bf712"><code>eb189bf</code></a>
chore: update Python generator version to 1.25.1 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/1003">#1003</a>)</li>
<li><a
href="a7a28d1b93"><code>a7a28d1</code></a>
test: ignore DeprecationWarning for <code>credentials_file</code>
argument and Python ve...</li>
<li><a
href="70f612c328"><code>70f612c</code></a>
fix: remove setup.cfg configuration for creating universal wheels (<a
href="https://redirect.github.com/googleapis/python-logging/issues/981">#981</a>)</li>
<li><a
href="e4c445a856"><code>e4c445a</code></a>
chore: Update gapic-generator-python to 1.25.0 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/985">#985</a>)</li>
<li><a
href="14364a534a"><code>14364a5</code></a>
test: Added cleanup of old sink storage buckets (<a
href="https://redirect.github.com/googleapis/python-logging/issues/991">#991</a>)</li>
<li>See full diff in <a
href="https://github.com/googleapis/python-logging/compare/v3.12.1...v3.13.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `launchdarkly-server-sdk` from 9.12.0 to 9.14.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/launchdarkly/python-server-sdk/releases">launchdarkly-server-sdk's
releases</a>.</em></p>
<blockquote>
<h2>v9.14.1</h2>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.14.0...9.14.1">9.14.1</a>
(2025-12-15)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Remove all synchronizers in daemon mode (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/388">#388</a>)
(<a
href="441a5ecb3d">441a5ec</a>)</li>
</ul>
<hr />
<p>This PR was generated with <a
href="https://github.com/googleapis/release-please">Release Please</a>.
See <a
href="https://github.com/googleapis/release-please#release-please">documentation</a>.</p>
<!-- raw HTML omitted -->
<h2>v9.14.0</h2>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.13.1...9.14.0">9.14.0</a>
(2025-12-04)</h2>
<h3>Features</h3>
<ul>
<li>adding data system option to create file datasource initializer (<a
href="e5b121f92a">e5b121f</a>)</li>
<li>adding file data source as an initializer (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/381">#381</a>)
(<a
href="3700d1ddd9">3700d1d</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>Add warning if relying on Redis <code>max_connections</code>
parameter (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/387">#387</a>)
(<a
href="e6395fa531">e6395fa</a>),
closes <a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/386">#386</a></li>
<li>modified initializer behavior to spec (<a
href="064f65c761">064f65c</a>)</li>
</ul>
<hr />
<p>This PR was generated with <a
href="https://github.com/googleapis/release-please">Release Please</a>.
See <a
href="https://github.com/googleapis/release-please#release-please">documentation</a>.</p>
<!-- raw HTML omitted -->
<h2>v9.13.1</h2>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.13.0...9.13.1">9.13.1</a>
(2025-11-19)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Include ldclient.datasystem in docs (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/379">#379</a>)
(<a
href="318c6fea07">318c6fe</a>)</li>
</ul>
<hr />
<p>This PR was generated with <a
href="https://github.com/googleapis/release-please">Release Please</a>.
See <a
href="https://github.com/googleapis/release-please#release-please">documentation</a>.</p>
<!-- raw HTML omitted -->
<h2>v9.13.0</h2>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.12.3...9.13.0">9.13.0</a>
(2025-11-19)</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/launchdarkly/python-server-sdk/blob/main/CHANGELOG.md">launchdarkly-server-sdk's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.14.0...9.14.1">9.14.1</a>
(2025-12-15)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Remove all synchronizers in daemon mode (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/388">#388</a>)
(<a
href="441a5ecb3d">441a5ec</a>)</li>
</ul>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.13.1...9.14.0">9.14.0</a>
(2025-12-04)</h2>
<h3>Features</h3>
<ul>
<li>adding data system option to create file datasource initializer (<a
href="e5b121f92a">e5b121f</a>)</li>
<li>adding file data source as an initializer (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/381">#381</a>)
(<a
href="3700d1ddd9">3700d1d</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>Add warning if relying on Redis <code>max_connections</code>
parameter (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/387">#387</a>)
(<a
href="e6395fa531">e6395fa</a>),
closes <a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/386">#386</a></li>
<li>modified initializer behavior to spec (<a
href="064f65c761">064f65c</a>)</li>
</ul>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.13.0...9.13.1">9.13.1</a>
(2025-11-19)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Include ldclient.datasystem in docs (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/379">#379</a>)
(<a
href="318c6fea07">318c6fe</a>)</li>
</ul>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.12.3...9.13.0">9.13.0</a>
(2025-11-19)</h2>
<h3>Features</h3>
<ul>
<li><strong>experimental:</strong> Release EAP support for FDv2 data
system (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/376">#376</a>)
(<a
href="0e7c32b4df">0e7c32b</a>)</li>
</ul>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.12.2...9.12.3">9.12.3</a>
(2025-10-30)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Fix overly generic type hint on File data source (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/365">#365</a>)
(<a
href="52a7499f7c">52a7499</a>),
closes <a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/364">#364</a></li>
</ul>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.12.1...9.12.2">9.12.2</a>
(2025-10-27)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Fix incorrect event count in failure message (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/359">#359</a>)
(<a
href="91f416329b">91f4163</a>)</li>
</ul>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.12.0...9.12.1">9.12.1</a>
(2025-09-30)</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="54e62cc706"><code>54e62cc</code></a>
chore(main): release 9.14.1 (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/389">#389</a>)</li>
<li><a
href="441a5ecb3d"><code>441a5ec</code></a>
fix: Remove all synchronizers in daemon mode (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/388">#388</a>)</li>
<li><a
href="7bb537827f"><code>7bb5378</code></a>
chore(main): release 9.14.0 (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/382">#382</a>)</li>
<li><a
href="e6395fa531"><code>e6395fa</code></a>
fix: Add warning if relying on Redis <code>max_connections</code>
parameter (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/387">#387</a>)</li>
<li><a
href="45786a9a7e"><code>45786a9</code></a>
chore: Expose flag change listeners from data system (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/384">#384</a>)</li>
<li><a
href="2b7eedc836"><code>2b7eedc</code></a>
chore: Clean up unused _data_availability (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/383">#383</a>)</li>
<li><a
href="3700d1ddd9"><code>3700d1d</code></a>
feat: adding file data source as an initializer (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/381">#381</a>)</li>
<li><a
href="04a2c538e5"><code>04a2c53</code></a>
chore: PR comments</li>
<li><a
href="064f65c761"><code>064f65c</code></a>
fix: modified initializer behavior to spec</li>
<li><a
href="e5b121f92a"><code>e5b121f</code></a>
feat: adding data system option to create file datasource
initializer</li>
<li>Additional commits viewable in <a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.12.0...9.14.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `pydantic` from 2.11.7 to 2.12.5
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/releases">pydantic's
releases</a>.</em></p>
<blockquote>
<h2>v2.12.5 2025-11-26</h2>
<h2>v2.12.5 (2025-11-26)</h2>
<p>This is the fifth 2.12 patch release, addressing an issue with the
<code>MISSING</code> sentinel and providing several documentation
improvements.</p>
<p>The next 2.13 minor release will be published in a couple weeks, and
will include a new <em>polymorphic serialization</em> feature addressing
the remaining unexpected changes to the <em>serialize as any</em>
behavior.</p>
<ul>
<li>Fix pickle error when using <code>model_construct()</code> on a
model with <code>MISSING</code> as a default value by <a
href="https://github.com/ornariece"><code>@​ornariece</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/12522">#12522</a>.</li>
<li>Several updates to the documentation by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a>.</li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic/compare/v2.12.4...v2.12.5">https://github.com/pydantic/pydantic/compare/v2.12.4...v2.12.5</a></p>
<h2>v2.12.4 2025-11-05</h2>
<h2>v2.12.4 (2025-11-05)</h2>
<p>This is the fourth 2.12 patch release, fixing more regressions, and
reverting a change in the <code>build()</code> method
of the <a
href="https://docs.pydantic.dev/latest/api/networks/"><code>AnyUrl</code>
and Dsn types</a>.</p>
<p>This patch release also fixes an issue with the serialization of IP
address types, when <code>serialize_as_any</code> is used. The next
patch release
will try to address the remaining issues with <em>serialize as any</em>
behavior by introducing a new <em>polymorphic serialization</em>
feature, that
should be used in most cases in place of <em>serialize as any</em>.</p>
<ul>
<li>
<p>Fix issue with forward references in parent <code>TypedDict</code>
classes by <a href="https://github.com/Viicos"><code>@​Viicos</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/12427">#12427</a>.</p>
<p>This issue is only relevant on Python 3.14 and greater.</p>
</li>
<li>
<p>Exclude fields with <code>exclude_if</code> from JSON Schema required
fields by <a href="https://github.com/Viicos"><code>@​Viicos</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/12430">#12430</a></p>
</li>
<li>
<p>Revert URL percent-encoding of credentials in the
<code>build()</code> method of the <a
href="https://docs.pydantic.dev/latest/api/networks/"><code>AnyUrl</code>
and Dsn types</a> by <a
href="https://github.com/davidhewitt"><code>@​davidhewitt</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1833">pydantic-core#1833</a>.</p>
<p>This was initially considered as a bugfix, but caused regressions and
as such was fully reverted. The next release will include
an opt-in option to percent-encode components of the URL.</p>
</li>
<li>
<p>Add type inference for IP address types by <a
href="https://github.com/davidhewitt"><code>@​davidhewitt</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1868">pydantic-core#1868</a>.</p>
<p>The 2.12 changes to the <code>serialize_as_any</code> behavior made
it so that IP address types could not properly serialize to JSON.</p>
</li>
<li>
<p>Avoid getting default values from defaultdict by <a
href="https://github.com/davidhewitt"><code>@​davidhewitt</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1853">pydantic-core#1853</a>.</p>
<p>This fixes a subtle regression in the validation behavior of the <a
href="https://docs.python.org/3/library/collections.html#collections.defaultdict"><code>collections.defaultdict</code></a>
type.</p>
</li>
<li>
<p>Fix issue with field serializers on nested typed dictionaries by <a
href="https://github.com/davidhewitt"><code>@​davidhewitt</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1879">pydantic-core#1879</a>.</p>
</li>
<li>
<p>Add more <code>pydantic-core</code> builds for the free-threaded
version of Python 3.14 by <a
href="https://github.com/davidhewitt"><code>@​davidhewitt</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1864">pydantic-core#1864</a>.</p>
</li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic/compare/v2.12.3...v2.12.4">https://github.com/pydantic/pydantic/compare/v2.12.3...v2.12.4</a></p>
<h2>v2.12.3 2025-10-17</h2>
<h2>v2.12.3 (2025-10-17)</h2>
<h3>What's Changed</h3>
<p>This is the third 2.12 patch release, fixing issues related to the
<code>FieldInfo</code> class, and reverting a change to the supported <a
href="https://docs.pydantic.dev/latest/concepts/validators/#model-validators"><em>after</em>
model validator</a> function signatures.</p>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/blob/main/HISTORY.md">pydantic's
changelog</a>.</em></p>
<blockquote>
<h2>v2.12.5 (2025-11-26)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.12.5">GitHub
release</a></p>
<p>This is the fifth 2.12 patch release, addressing an issue with the
<code>MISSING</code> sentinel and providing several documentation
improvements.</p>
<p>The next 2.13 minor release will be published in a couple weeks, and
will include a new <em>polymorphic serialization</em> feature addressing
the remaining unexpected changes to the <em>serialize as any</em>
behavior.</p>
<ul>
<li>Fix pickle error when using <code>model_construct()</code> on a
model with <code>MISSING</code> as a default value by <a
href="https://github.com/ornariece"><code>@​ornariece</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/12522">#12522</a>.</li>
<li>Several updates to the documentation by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a>.</li>
</ul>
<h2>v2.12.4 (2025-11-05)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.12.4">GitHub
release</a></p>
<p>This is the fourth 2.12 patch release, fixing more regressions, and
reverting a change in the <code>build()</code> method
of the <a
href="https://docs.pydantic.dev/latest/api/networks/"><code>AnyUrl</code>
and Dsn types</a>.</p>
<p>This patch release also fixes an issue with the serialization of IP
address types, when <code>serialize_as_any</code> is used. The next
patch release
will try to address the remaining issues with <em>serialize as any</em>
behavior by introducing a new <em>polymorphic serialization</em>
feature, that
should be used in most cases in place of <em>serialize as any</em>.</p>
<ul>
<li>
<p>Fix issue with forward references in parent <code>TypedDict</code>
classes by <a href="https://github.com/Viicos"><code>@​Viicos</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/12427">#12427</a>.</p>
<p>This issue is only relevant on Python 3.14 and greater.</p>
</li>
<li>
<p>Exclude fields with <code>exclude_if</code> from JSON Schema required
fields by <a href="https://github.com/Viicos"><code>@​Viicos</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/12430">#12430</a></p>
</li>
<li>
<p>Revert URL percent-encoding of credentials in the
<code>build()</code> method
of the <a
href="https://docs.pydantic.dev/latest/api/networks/"><code>AnyUrl</code>
and Dsn types</a> by <a
href="https://github.com/davidhewitt"><code>@​davidhewitt</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1833">pydantic-core#1833</a>.</p>
<p>This was initially considered as a bugfix, but caused regressions and
as such was fully reverted. The next release will include
an opt-in option to percent-encode components of the URL.</p>
</li>
<li>
<p>Add type inference for IP address types by <a
href="https://github.com/davidhewitt"><code>@​davidhewitt</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1868">pydantic-core#1868</a>.</p>
<p>The 2.12 changes to the <code>serialize_as_any</code> behavior made
it so that IP address types could not properly serialize to JSON.</p>
</li>
<li>
<p>Avoid getting default values from defaultdict by <a
href="https://github.com/davidhewitt"><code>@​davidhewitt</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1853">pydantic-core#1853</a>.</p>
<p>This fixes a subtle regression in the validation behavior of the <a
href="https://docs.python.org/3/library/collections.html#collections.defaultdict"><code>collections.defaultdict</code></a>
type.</p>
</li>
<li>
<p>Fix issue with field serializers on nested typed dictionaries by <a
href="https://github.com/davidhewitt"><code>@​davidhewitt</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1879">pydantic-core#1879</a>.</p>
</li>
<li>
<p>Add more <code>pydantic-core</code> builds for the free-threaded
version of Python 3.14 by <a
href="https://github.com/davidhewitt"><code>@​davidhewitt</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic-core/pull/1864">pydantic-core#1864</a>.</p>
</li>
</ul>
<h2>v2.12.3 (2025-10-17)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.12.3">GitHub
release</a></p>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="bd2d0dd013"><code>bd2d0dd</code></a>
Prepare release v2.12.5</li>
<li><a
href="7d0302ec7e"><code>7d0302e</code></a>
Document security implications when using
<code>create_model()</code></li>
<li><a
href="e9ef980def"><code>e9ef980</code></a>
Fix typo in Standard Library Types documentation</li>
<li><a
href="f2c20c00c2"><code>f2c20c0</code></a>
Add <code>pydantic-docs</code> dev dependency, make use of versioning
blocks</li>
<li><a
href="a76c1aa26f"><code>a76c1aa</code></a>
Update documentation about JSON Schema</li>
<li><a
href="8cbc72ca48"><code>8cbc72c</code></a>
Add documentation about custom <code>__init__()</code></li>
<li><a
href="99eba59906"><code>99eba59</code></a>
Add additional test for <code>FieldInfo.get_default()</code></li>
<li><a
href="c71076988e"><code>c710769</code></a>
Special case <code>MISSING</code> sentinel in
<code>smart_deepcopy()</code></li>
<li><a
href="20a9d771c2"><code>20a9d77</code></a>
Do not delete mock validator/serializer in
<code>rebuild_dataclass()</code></li>
<li><a
href="c86515a3a8"><code>c86515a</code></a>
Update parts of the model and <code>revalidate_instances</code>
documentation</li>
<li>Additional commits viewable in <a
href="https://github.com/pydantic/pydantic/compare/v2.11.7...v2.12.5">compare
view</a></li>
</ul>
</details>
<br />

Updates `pydantic-settings` from 2.10.1 to 2.12.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic-settings/releases">pydantic-settings's
releases</a>.</em></p>
<blockquote>
<h2>v2.12.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Support for enum kebab case. by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/686">pydantic/pydantic-settings#686</a></li>
<li>Apply source order: init &gt; env &gt; dotenv &gt; secrets &gt;
defaults and pres… by <a
href="https://github.com/chbndrhnns"><code>@​chbndrhnns</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/688">pydantic/pydantic-settings#688</a></li>
<li>Add NestedSecretsSettings source by <a
href="https://github.com/makukha"><code>@​makukha</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/690">pydantic/pydantic-settings#690</a></li>
<li>Strip non-explicit default values. by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/692">pydantic/pydantic-settings#692</a></li>
<li>Coerce env vars if strict is True. by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/693">pydantic/pydantic-settings#693</a></li>
<li>Restore init kwarg names before returning final state dictionary. by
<a href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/700">pydantic/pydantic-settings#700</a></li>
<li>Drop Python3.9 support by <a
href="https://github.com/hramezani"><code>@​hramezani</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/699">pydantic/pydantic-settings#699</a></li>
<li>Adapt test_protected_namespace_defaults for dev. Pydantic by <a
href="https://github.com/musicinmybrain"><code>@​musicinmybrain</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/637">pydantic/pydantic-settings#637</a></li>
<li>Add Python 3.14 by <a
href="https://github.com/hramezani"><code>@​hramezani</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/704">pydantic/pydantic-settings#704</a></li>
<li>Prepare release 2.12 by <a
href="https://github.com/hramezani"><code>@​hramezani</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/705">pydantic/pydantic-settings#705</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/chbndrhnns"><code>@​chbndrhnns</code></a> made
their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/688">pydantic/pydantic-settings#688</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic-settings/compare/v2.11.0...v2.12.0">https://github.com/pydantic/pydantic-settings/compare/v2.11.0...v2.12.0</a></p>
<h2>v2.11.0</h2>
<h2>What's Changed</h2>
<ul>
<li>CLI Serialize Support by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/643">pydantic/pydantic-settings#643</a></li>
<li>Inspect type aliases to determine if an annotation is complex by <a
href="https://github.com/tselepakis"><code>@​tselepakis</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/644">pydantic/pydantic-settings#644</a></li>
<li>Revert &quot;fix: Respect 'cli_parse_args' from model_config with
settings_customise_sources (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/611">#611</a>)&quot;
by <a href="https://github.com/hramezani"><code>@​hramezani</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/655">pydantic/pydantic-settings#655</a></li>
<li>Remove parsing of command line arguments from
<code>CliSettingsSource.__init__</code>. by <a
href="https://github.com/trygve-baerland"><code>@​trygve-baerland</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/656">pydantic/pydantic-settings#656</a></li>
<li>turn off allow_abbrev on subparsers by <a
href="https://github.com/mroch"><code>@​mroch</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/658">pydantic/pydantic-settings#658</a></li>
<li>CLI Serialization Fixes by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/649">pydantic/pydantic-settings#649</a></li>
<li>Fix PydanticModel type checking. by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/659">pydantic/pydantic-settings#659</a></li>
<li>Avoid env_prefix falling back to env vars without prefix by <a
href="https://github.com/tselepakis"><code>@​tselepakis</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/648">pydantic/pydantic-settings#648</a></li>
<li>Warn if model_config sets unused keys for missing settings sources
by <a href="https://github.com/HomerusJa"><code>@​HomerusJa</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/663">pydantic/pydantic-settings#663</a></li>
<li>Included endpoint_url kwarg in AWSSecretsManagerSettingsSource class
by <a href="https://github.com/adrianohrl"><code>@​adrianohrl</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/664">pydantic/pydantic-settings#664</a></li>
<li>Fix typo (&quot;Accesing&quot;) in the &quot;Adding sources&quot;
docs by <a
href="https://github.com/deepyaman"><code>@​deepyaman</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/668">pydantic/pydantic-settings#668</a></li>
<li>CLI Windows Path Fix by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/669">pydantic/pydantic-settings#669</a></li>
<li>Cli root model support by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/677">pydantic/pydantic-settings#677</a></li>
<li>Snake case conversion in Azure Key Vault by <a
href="https://github.com/AndreuCodina"><code>@​AndreuCodina</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/680">pydantic/pydantic-settings#680</a></li>
<li>Make <code>InitSettingsSource</code> resolution deterministic by <a
href="https://github.com/enrico-stauss"><code>@​enrico-stauss</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/681">pydantic/pydantic-settings#681</a></li>
<li>Update deps by <a
href="https://github.com/hramezani"><code>@​hramezani</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/683">pydantic/pydantic-settings#683</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/tselepakis"><code>@​tselepakis</code></a> made
their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/644">pydantic/pydantic-settings#644</a></li>
<li><a
href="https://github.com/trygve-baerland"><code>@​trygve-baerland</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/656">pydantic/pydantic-settings#656</a></li>
<li><a href="https://github.com/mroch"><code>@​mroch</code></a> made
their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/658">pydantic/pydantic-settings#658</a></li>
<li><a href="https://github.com/HomerusJa"><code>@​HomerusJa</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/663">pydantic/pydantic-settings#663</a></li>
<li><a
href="https://github.com/adrianohrl"><code>@​adrianohrl</code></a> made
their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/664">pydantic/pydantic-settings#664</a></li>
<li><a href="https://github.com/deepyaman"><code>@​deepyaman</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/668">pydantic/pydantic-settings#668</a></li>
<li><a
href="https://github.com/enrico-stauss"><code>@​enrico-stauss</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/681">pydantic/pydantic-settings#681</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic-settings/compare/2.10.1...v2.11.0">https://github.com/pydantic/pydantic-settings/compare/2.10.1...v2.11.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="584983d253"><code>584983d</code></a>
Prepare release 2.12 (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/705">#705</a>)</li>
<li><a
href="6b4d87e776"><code>6b4d87e</code></a>
Add Python 3.14 (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/704">#704</a>)</li>
<li><a
href="02de5b622b"><code>02de5b6</code></a>
Adapt test_protected_namespace_defaults for dev. Pydantic (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/637">#637</a>)</li>
<li><a
href="4239ea460a"><code>4239ea4</code></a>
Drop Python3.9 support (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/699">#699</a>)</li>
<li><a
href="5008c694f6"><code>5008c69</code></a>
Restore init kwarg names before returning final state dictionary. (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/700">#700</a>)</li>
<li><a
href="4433101fef"><code>4433101</code></a>
Coerce env vars if strict is True. (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/693">#693</a>)</li>
<li><a
href="4d2ebfd543"><code>4d2ebfd</code></a>
Strip non-explicit default values. (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/692">#692</a>)</li>
<li><a
href="4a6ffcaeae"><code>4a6ffca</code></a>
Add NestedSecretsSettings source (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/690">#690</a>)</li>
<li><a
href="7a6e96ebfc"><code>7a6e96e</code></a>
Apply source order: init &gt; env &gt; dotenv &gt; secrets &gt; defaults
and pres… (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/688">#688</a>)</li>
<li><a
href="68563eddc0"><code>68563ed</code></a>
Support for enum kebab case. (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/686">#686</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pydantic/pydantic-settings/compare/2.10.1...v2.12.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `pyjwt` from 2.10.1 to 2.11.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/jpadilla/pyjwt/releases">pyjwt's
releases</a>.</em></p>
<blockquote>
<h2>2.11.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Fixed type error in comment by <a
href="https://github.com/shuhaib-aot"><code>@​shuhaib-aot</code></a> in
<a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1026">jpadilla/pyjwt#1026</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1018">jpadilla/pyjwt#1018</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1033">jpadilla/pyjwt#1033</a></li>
<li>Make note of use of leeway with nbf by <a
href="https://github.com/djw8605"><code>@​djw8605</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1034">jpadilla/pyjwt#1034</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1035">jpadilla/pyjwt#1035</a></li>
<li>Fixes <a
href="https://redirect.github.com/jpadilla/pyjwt/issues/964">#964</a>:
Validate key against allowed types for Algorithm family by <a
href="https://github.com/pachewise"><code>@​pachewise</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/985">jpadilla/pyjwt#985</a></li>
<li>Feat <a
href="https://redirect.github.com/jpadilla/pyjwt/issues/1024">#1024</a>:
Add iterator for PyJWKSet by <a
href="https://github.com/pachewise"><code>@​pachewise</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1041">jpadilla/pyjwt#1041</a></li>
<li>Fixes <a
href="https://redirect.github.com/jpadilla/pyjwt/issues/1039">#1039</a>:
Add iss, issuer type checks by <a
href="https://github.com/pachewise"><code>@​pachewise</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1040">jpadilla/pyjwt#1040</a></li>
<li>Fixes <a
href="https://redirect.github.com/jpadilla/pyjwt/issues/660">#660</a>:
Improve typing/logic for <code>options</code> in decode,
decode_complete; Improve docs by <a
href="https://github.com/pachewise"><code>@​pachewise</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1045">jpadilla/pyjwt#1045</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1042">jpadilla/pyjwt#1042</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1052">jpadilla/pyjwt#1052</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1053">jpadilla/pyjwt#1053</a></li>
<li>Fix <a
href="https://redirect.github.com/jpadilla/pyjwt/issues/1022">#1022</a>:
Map <code>algorithm=None</code> to &quot;none&quot; by <a
href="https://github.com/qqii"><code>@​qqii</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1056">jpadilla/pyjwt#1056</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1055">jpadilla/pyjwt#1055</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1058">jpadilla/pyjwt#1058</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1060">jpadilla/pyjwt#1060</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1061">jpadilla/pyjwt#1061</a></li>
<li>Fixes <a
href="https://redirect.github.com/jpadilla/pyjwt/issues/1047">#1047</a>:
Correct <code>PyJWKClient.get_signing_key_from_jwt</code> annotation by
<a href="https://github.com/khvn26"><code>@​khvn26</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1048">jpadilla/pyjwt#1048</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1062">jpadilla/pyjwt#1062</a></li>
<li>Fixed doc string typo in _validate_jti() function <a
href="https://redirect.github.com/jpadilla/pyjwt/issues/1063">#1063</a>
by <a
href="https://github.com/kuldeepkhatke"><code>@​kuldeepkhatke</code></a>
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1064">jpadilla/pyjwt#1064</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1065">jpadilla/pyjwt#1065</a></li>
<li>Update SECURITY.md by <a
href="https://github.com/auvipy"><code>@​auvipy</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1057">jpadilla/pyjwt#1057</a></li>
<li>Typing fix: use <code>float</code> instead of <code>int</code> for
<code>lifespan</code> and <code>timeout</code> by <a
href="https://github.com/nikitagashkov"><code>@​nikitagashkov</code></a>
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1068">jpadilla/pyjwt#1068</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1067">jpadilla/pyjwt#1067</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1071">jpadilla/pyjwt#1071</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1076">jpadilla/pyjwt#1076</a></li>
<li>Fix TYP header documentation by <a
href="https://github.com/fobiasmog"><code>@​fobiasmog</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1046">jpadilla/pyjwt#1046</a></li>
<li>doc: Document claims sub and jti by <a
href="https://github.com/cleder"><code>@​cleder</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1088">jpadilla/pyjwt#1088</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1077">jpadilla/pyjwt#1077</a></li>
<li>Bump actions/setup-python from 5 to 6 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1089">jpadilla/pyjwt#1089</a></li>
<li>Bump actions/stale from 8 to 10 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1090">jpadilla/pyjwt#1090</a></li>
<li>Bump actions/checkout from 4 to 5 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1083">jpadilla/pyjwt#1083</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1091">jpadilla/pyjwt#1091</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1093">jpadilla/pyjwt#1093</a></li>
<li>[pre-commit.ci] pre-commit autoupdate by <a
href="https://github.com/pre-commit-ci"><code>@​pre-commit-ci</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1096">jpadilla/pyjwt#1096</a></li>
<li>Resolve package build warnings by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1105">jpadilla/pyjwt#1105</a></li>
<li>Support Python 3.14, and test against PyPy 3.10+ by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1104">jpadilla/pyjwt#1104</a></li>
<li>Fix a <code>SyntaxWarning</code> caused by invalid escape sequences
by <a href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a>
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1103">jpadilla/pyjwt#1103</a></li>
<li>Standardize CHANGELOG links to PRs by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1110">jpadilla/pyjwt#1110</a></li>
<li>Migrate from <code>pep517</code>, which is deprecated, to
<code>build</code> by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1108">jpadilla/pyjwt#1108</a></li>
<li>Fix incorrectly-named test suite function by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1116">jpadilla/pyjwt#1116</a></li>
<li>Fix Read the Docs builds by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1111">jpadilla/pyjwt#1111</a></li>
<li>Bump actions/download-artifact from 4 to 6 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1118">jpadilla/pyjwt#1118</a></li>
<li>Escalate test suite warnings to errors by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1107">jpadilla/pyjwt#1107</a></li>
<li>Add pyupgrade as a pre-commit hook by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1109">jpadilla/pyjwt#1109</a></li>
<li>Simplify the test suite decorators by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1113">jpadilla/pyjwt#1113</a></li>
<li>Improve coverage config and eliminate unused test suite code by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1115">jpadilla/pyjwt#1115</a></li>
<li>Build a shared wheel once in the test suite by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in <a
href="https://redirect.github.com/jpadilla/pyjwt/pull/1114">jpadilla/pyjwt#1114</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/jpadilla/pyjwt/blob/master/CHANGELOG.rst">pyjwt's
changelog</a>.</em></p>
<blockquote>
<h2><code>v2.11.0
&lt;https://github.com/jpadilla/pyjwt/compare/2.10.1...2.11.0&gt;</code>__</h2>
<p>Fixed</p>
<pre><code>
- Enforce ECDSA curve validation per RFC 7518 Section 3.4.
- Fix build system warnings by @kurtmckee in
`[#1105](https://github.com/jpadilla/pyjwt/issues/1105)
&lt;https://github.com/jpadilla/pyjwt/pull/1105&gt;`__
- Validate key against allowed types for Algorithm family in
`[#964](https://github.com/jpadilla/pyjwt/issues/964)
&lt;https://github.com/jpadilla/pyjwt/pull/964&gt;`__
- Add iterator for JWKSet in
`[#1041](https://github.com/jpadilla/pyjwt/issues/1041)
&lt;https://github.com/jpadilla/pyjwt/pull/1041&gt;`__
- Validate `iss` claim is a string during encoding and decoding by
@pachewise in `[#1040](https://github.com/jpadilla/pyjwt/issues/1040)
&lt;https://github.com/jpadilla/pyjwt/pull/1040&gt;`__
- Improve typing/logic for `options` in decode, decode_complete by
@pachewise in `[#1045](https://github.com/jpadilla/pyjwt/issues/1045)
&lt;https://github.com/jpadilla/pyjwt/pull/1045&gt;`__
- Declare float supported type for lifespan and timeout by
@nikitagashkov in
`[#1068](https://github.com/jpadilla/pyjwt/issues/1068)
&lt;https://github.com/jpadilla/pyjwt/pull/1068&gt;`__
- Fix ``SyntaxWarning``\s/``DeprecationWarning``\s caused by invalid
escape sequences by @kurtmckee in
`[#1103](https://github.com/jpadilla/pyjwt/issues/1103)
&lt;https://github.com/jpadilla/pyjwt/pull/1103&gt;`__
- Development: Build a shared wheel once to speed up test suite setup
times by @kurtmckee in
`[#1114](https://github.com/jpadilla/pyjwt/issues/1114)
&lt;https://github.com/jpadilla/pyjwt/pull/1114&gt;`__
- Development: Test type annotations across all supported Python
versions,
increase the strictness of the type checking, and remove the mypy
pre-commit hook
by @kurtmckee in `[#1112](https://github.com/jpadilla/pyjwt/issues/1112)
&lt;https://github.com/jpadilla/pyjwt/pull/1112&gt;`__
</code></pre>
<p>Added</p>
<ul>
<li>Support Python 3.14, and test against PyPy 3.10 and 3.11 by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in
<code>[#1104](https://github.com/jpadilla/pyjwt/issues/1104)
&lt;https://github.com/jpadilla/pyjwt/pull/1104&gt;</code>__</li>
<li>Development: Migrate to <code>build</code> to test package building
in CI by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in
<code>[#1108](https://github.com/jpadilla/pyjwt/issues/1108)
&lt;https://github.com/jpadilla/pyjwt/pull/1108&gt;</code>__</li>
<li>Development: Improve coverage config and eliminate unused test suite
code by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in
<code>[#1115](https://github.com/jpadilla/pyjwt/issues/1115)
&lt;https://github.com/jpadilla/pyjwt/pull/1115&gt;</code>__</li>
<li>Docs: Standardize CHANGELOG links to PRs by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in
<code>[#1110](https://github.com/jpadilla/pyjwt/issues/1110)
&lt;https://github.com/jpadilla/pyjwt/pull/1110&gt;</code>__</li>
<li>Docs: Fix Read the Docs builds by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in
<code>[#1111](https://github.com/jpadilla/pyjwt/issues/1111)
&lt;https://github.com/jpadilla/pyjwt/pull/1111&gt;</code>__</li>
<li>Docs: Add example of using leeway with nbf by <a
href="https://github.com/djw8605"><code>@​djw8605</code></a> in
<code>[#1034](https://github.com/jpadilla/pyjwt/issues/1034)
&lt;https://github.com/jpadilla/pyjwt/pull/1034&gt;</code>__</li>
<li>Docs: Refactored docs with <code>autodoc</code>; added
<code>PyJWS</code> and <code>jwt.algorithms</code> docs by <a
href="https://github.com/pachewise"><code>@​pachewise</code></a> in
<code>[#1045](https://github.com/jpadilla/pyjwt/issues/1045)
&lt;https://github.com/jpadilla/pyjwt/pull/1045&gt;</code>__</li>
<li>Docs: Documentation improvements for &quot;sub&quot; and
&quot;jti&quot; claims by <a
href="https://github.com/cleder"><code>@​cleder</code></a> in
<code>[#1088](https://github.com/jpadilla/pyjwt/issues/1088)
&lt;https://github.com/jpadilla/pyjwt/pull/1088&gt;</code>__</li>
<li>Development: Add pyupgrade as a pre-commit hook by <a
href="https://github.com/kurtmckee"><code>@​kurtmckee</code></a> in
<code>[#1109](https://github.com/jpadilla/pyjwt/issues/1109)
&lt;https://github.com/jpadilla/pyjwt/pull/1109&gt;</code>__</li>
<li>Add minimum key length validation for HMAC and RSA keys (CWE-326).
Warns by default via <code>InsecureKeyLengthWarning</code> when keys are
below
minimum recommended lengths per RFC 7518 Section 3.2 (HMAC) and
NIST SP 800-131A (RSA). Pass
<code>enforce_minimum_key_length=True</code> in
options to <code>PyJWT</code> or <code>PyJWS</code> to raise
<code>InvalidKeyError</code> instead.</li>
<li>Refactor <code>PyJWT</code> to own an internal <code>PyJWS</code>
instance instead of
calling global <code>api_jws</code> functions.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="697344d259"><code>697344d</code></a>
bump up version</li>
<li><a
href="e4d0aec024"><code>e4d0aec</code></a>
fix: pre-commit</li>
<li><a
href="df9a6a0c44"><code>df9a6a0</code></a>
fix: failing test</li>
<li><a
href="2b2e53cd23"><code>2b2e53c</code></a>
fix: docs</li>
<li><a
href="635c8d89dd"><code>635c8d8</code></a>
fix: failing mypy</li>
<li><a
href="96ae3563b9"><code>96ae356</code></a>
feat: add minimum key length validation for HMAC and RSA</li>
<li><a
href="5b86227733"><code>5b86227</code></a>
fix: enforce ECDSA curve validation per RFC 7518 Section 3.4</li>
<li><a
href="04947d75dc"><code>04947d7</code></a>
Bump actions/download-artifact from 6 to 7 (<a
href="https://redirect.github.com/jpadilla/pyjwt/issues/1125">#1125</a>)</li>
<li><a
href="dd448344c3"><code>dd44834</code></a>
Fix leeway value in usage documentation (<a
href="https://redirect.github.com/jpadilla/pyjwt/issues/1124">#1124</a>)</li>
<li><a
href="407f0bde99"><code>407f0bd</code></a>
Thoroughly test type annotations, and resolve errors (<a
href="https://redirect.github.com/jpadilla/pyjwt/issues/1112">#1112</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/jpadilla/pyjwt/compare/2.10.1...2.11.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `supabase` from 2.16.0 to 2.27.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-py/releases">supabase's
releases</a>.</em></p>
<blockquote>
<h2>v2.27.2</h2>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.27.1...v2.27.2">2.27.2</a>
(2026-01-14)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>ci:</strong> generate new token for release-please (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1348">#1348</a>)
(<a
href="c2ad37f9dc">c2ad37f</a>)</li>
<li><strong>ci:</strong> run CI when .github files change (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1349">#1349</a>)
(<a
href="a221aac029">a221aac</a>)</li>
<li><strong>realtime:</strong> ammend reconnect logic to not unsubscribe
(<a
href="https://redirect.github.com/supabase/supabase-py/issues/1346">#1346</a>)
(<a
href="cfbe5943cb">cfbe594</a>)</li>
</ul>
<h2>v2.27.1</h2>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.27.0...v2.27.1">2.27.1</a>
(2026-01-06)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>realtime:</strong> use 'event' instead of 'events' in
postgres_changes protocol (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1339">#1339</a>)
(<a
href="c1e7986c5e">c1e7986</a>)</li>
<li><strong>storage:</strong> catch bad responses from server (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1344">#1344</a>)
(<a
href="ddb50547db">ddb5054</a>)</li>
</ul>
<h2>v2.27.0</h2>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.26.0...v2.27.0">2.27.0</a>
(2025-12-16)</h2>
<h3>Features</h3>
<ul>
<li><strong>auth:</strong> add X (OAuth 2.0) provider (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1335">#1335</a>)
(<a
href="f600f96b52">f600f96</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>storage:</strong> replace deprecated pydantic Extra with
literal values (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1334">#1334</a>)
(<a
href="6df3545785">6df3545</a>)</li>
</ul>
<h2>v2.26....

_Description has been truncated_

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: claude[bot] <41898282+claude[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <ntindle@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Nick Tindle <nick@ntindle.com>
2026-02-07 02:17:38 +00:00
Reinier van der Leer
8fddc9d71f fix(backend): Reduce GET /api/graphs expense + latency (#11986)
[SECRT-1896: Fix crazy `GET /api/graphs` latency (P95 =
107s)](https://linear.app/autogpt/issue/SECRT-1896)

These changes should decrease latency of this endpoint by ~~60-65%~~ a
lot.

### Changes 🏗️

- Make `Graph.credentials_input_schema` cheaper by avoiding constructing a new `BlockSchema` subclass (see the sketch after this list)
- Strip down `GraphMeta` - drop all computed fields
- Replace with either `GraphModel` or `GraphModelWithoutNodes` wherever
those computed fields are used
- Simplify usage in `list_graphs_paginated` and
`fetch_graph_from_store_slug`
- Refactor and clarify relationships between the different graph models
  - Split `BaseGraph` into `GraphBaseMeta` + `BaseGraph`
- Strip down `Graph` - move `credentials_input_schema` and
`aggregate_credentials_inputs` to `GraphModel`
- Refactor to eliminate double `aggregate_credentials_inputs()` call in
`credentials_input_schema` call tree
  - Add `GraphModelWithoutNodes` (similar to current `GraphMeta`)
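
A rough sketch of the first bullet's idea: assemble the credentials input schema as a plain JSON-schema dict instead of dynamically creating a `BlockSchema` subclass. The field data here is hypothetical and this is not the actual `GraphModel` code; the point is that dynamic class creation triggers a full pydantic model build on every call, while a plain dict does not.

```python
from typing import Any


def credentials_input_schema(credential_fields: dict[str, dict[str, Any]]) -> dict[str, Any]:
    """Assemble a JSON-schema-style dict directly from aggregated credential fields."""
    return {
        "type": "object",
        "properties": dict(credential_fields),
        "required": sorted(credential_fields),
    }


# Hypothetical aggregated credential inputs for a graph
schema = credentials_input_schema(
    {
        "openai": {"credentials_provider": ["openai"], "credentials_types": ["api_key"]},
        "github": {"credentials_provider": ["github"], "credentials_types": ["oauth2"]},
    }
)
print(schema["required"])  # ['github', 'openai']
```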

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] `GET /api/graphs` works as it should
  - [x] Running a graph succeeds
  - [x] Adding a sub-agent in the Builder works as it should
2026-02-06 19:13:21 +00:00
Nicholas Tindle
69618a5e05 Merge branch 'dev' into ntindle/waitlist 2026-02-04 19:02:00 -06:00
Nicholas Tindle
3610be3e83 Merge branch 'dev' into ntindle/waitlist 2026-01-20 17:47:02 -06:00
Nicholas Tindle
9e1f7c9415 Merge branch 'dev' into ntindle/waitlist 2026-01-19 01:12:14 -06:00
Nicholas Tindle
0d03ebb43c fix: lint 2026-01-16 11:34:00 -06:00
Nicholas Tindle
1b37bd6da9 Merge branch 'dev' into ntindle/waitlist 2026-01-16 11:32:05 -06:00
Nicholas Tindle
db989a5eed fix: lint 2026-01-15 15:58:33 -06:00
Nicholas Tindle
e3a8c57a35 Merge branch 'dev' into ntindle/waitlist
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 15:54:38 -06:00
Nicholas Tindle
dfc8e53386 fix(backend): add assertions to fix type errors in waitlist admin functions
Prisma's update() returns T | None but we verify existence before updating,
so assert the result is not None to satisfy the type checker.
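
A minimal sketch of the pattern (the function name here is hypothetical; the Prisma calls match those used elsewhere in this PR):

```python
import prisma.models


async def rename_waitlist(waitlist_id: str, new_name: str) -> prisma.models.WaitlistEntry:
    existing = await prisma.models.WaitlistEntry.prisma().find_unique(
        where={"id": waitlist_id}
    )
    if not existing:
        raise ValueError(f"Waitlist {waitlist_id} not found")
    updated = await prisma.models.WaitlistEntry.prisma().update(
        where={"id": waitlist_id},
        data={"name": new_name},
    )
    # update() is typed `WaitlistEntry | None`; existence was verified above,
    # so the assert narrows the Optional for the type checker.
    assert updated is not None
    return updated
```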

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 15:48:30 -06:00
Nicholas Tindle
b5b7e5da92 fix(backend): don't mark waitlist DONE if email-only users pending
The notify_waitlist_users_on_launch function was marking waitlists as
DONE after notifying registered users, but ignoring unaffiliatedEmailUsers
who haven't been notified yet. Since DONE waitlists are excluded from
future notification queries, those email users would never receive
notifications when that functionality is implemented.

Now the waitlist remains in an active state if there are pending
email-only signups that still need notifications.
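
A minimal sketch of the gating condition, assuming the `unaffiliatedEmailUsers` field from this PR:

```python
import prisma.models


def may_mark_done(waitlist: prisma.models.WaitlistEntry) -> bool:
    """Only mark a waitlist DONE once no email-only signups still await notification."""
    pending_emails = waitlist.unaffiliatedEmailUsers or []
    return len(pending_emails) == 0
```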

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 15:41:31 -06:00
Nicholas Tindle
07ea2c2ab7 fix(backend): check waitlist existence before update in update_waitlist_admin
Added find_unique check before update() call to properly return 404 when
waitlist doesn't exist, following the established pattern used in other
waitlist admin functions.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 15:37:19 -06:00
Nicholas Tindle
9c873a0158 fix(backend): add exception handling to add_self_to_waitlist route
The public waitlist join route was missing exception handling, causing
500 errors for all failures. Now properly returns:
- 404 for waitlist not found
- 400 for closed/unavailable waitlists
- 500 for unexpected errors
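
A minimal FastAPI sketch of that mapping, assuming the db layer raises `ValueError` for both "not found" and "closed" (as `add_user_to_waitlist` does) and that the route decides the status code from the failure; the dispatch-by-message below is illustrative only:

```python
import logging

import fastapi
import fastapi.responses

import backend.api.features.store.db as store_db  # as imported in this PR

logger = logging.getLogger(__name__)
router = fastapi.APIRouter()


@router.post("/waitlist/{waitlist_id}/join", summary="Join Waitlist")
async def add_self_to_waitlist(waitlist_id: str, email: str | None = None):
    try:
        return await store_db.add_user_to_waitlist(
            waitlist_id, user_id=None, email=email
        )
    except ValueError as e:
        # 404 when the waitlist doesn't exist, 400 when it is closed/unavailable
        if "not found" in str(e):
            return fastapi.responses.JSONResponse(
                status_code=404, content={"detail": "Waitlist not found"}
            )
        return fastapi.responses.JSONResponse(
            status_code=400, content={"detail": "Waitlist is closed or unavailable"}
        )
    except Exception as e:
        logger.exception("Error joining waitlist: %s", e)
        return fastapi.responses.JSONResponse(
            status_code=500,
            content={"detail": "An error occurred while joining the waitlist"},
        )
```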

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 15:30:54 -06:00
Nicholas Tindle
ed634db8f7 fix(backend): validate waitlist status enum at API boundary
Changed WaitlistUpdateRequest.status from str to the actual enum type.
Pydantic now validates the status value, returning 422 for invalid
values instead of a misleading 404 "Waitlist not found" error.
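
A sketch of the request-model change (the other fields are assumed): with the enum type, pydantic rejects bad values with a 422 before any database lookup happens.

```python
import prisma.enums
from pydantic import BaseModel


class WaitlistUpdateRequest(BaseModel):
    name: str | None = None
    description: str | None = None
    status: prisma.enums.WaitlistExternalStatus | None = None  # was: str | None
```

A request body like `{"status": "NOT_A_STATUS"}` now fails validation at the API boundary instead of falling through to a confusing 404.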

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 15:26:26 -06:00
Nicholas Tindle
398197f3ea fix(frontend): add title attribute to YouTube iframe for accessibility
Screen readers need a title attribute on iframes to describe their
content. Added "YouTube video player" title to the embedded video.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 15:23:52 -06:00
Nicholas Tindle
b7df4cfdbf fix(backend): align migration FK with schema (SET NULL not CASCADE)
The migration had ON DELETE CASCADE for WaitlistEntry.storeListingId,
but the Prisma schema specifies onDelete: SetNull. This mismatch would
cause waitlist entries and all signup data to be deleted when a store
listing is removed, instead of just unlinking them.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 15:18:03 -06:00
Nicholas Tindle
5d8dd46759 fix(backend): align waitlist admin functions with established patterns
- delete_waitlist_admin: add find_unique check before update, raise
  ValueError if not found, add except ValueError: raise
- link_waitlist_to_listing_admin: add find_unique check for waitlist
  before update, remove dead code
- delete_waitlist route: add except ValueError: → 404, remove dead
  code bool check pattern

All waitlist admin functions now follow the consistent pattern:
1. find_unique to check existence
2. raise ValueError if not found
3. except ValueError: raise to bubble up
4. except Exception: raise DatabaseError
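
A sketch of that four-step pattern applied to a soft delete (the db calls mirror those in this PR; the exact function body is assumed):

```python
import prisma.models

from backend.util.exceptions import DatabaseError  # as imported in this PR


async def delete_waitlist_admin(waitlist_id: str) -> None:
    try:
        # 1. find_unique to check existence
        existing = await prisma.models.WaitlistEntry.prisma().find_unique(
            where={"id": waitlist_id}
        )
        # 2. raise ValueError if not found
        if not existing:
            raise ValueError(f"Waitlist {waitlist_id} not found")
        await prisma.models.WaitlistEntry.prisma().update(
            where={"id": waitlist_id}, data={"isDeleted": True}
        )
    except ValueError:
        # 3. bubble up so the route can map it to a 404
        raise
    except Exception as e:
        # 4. wrap anything else in DatabaseError
        raise DatabaseError("Failed to delete waitlist") from e
```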

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 14:54:53 -06:00
Nicholas Tindle
f9518b6f8b fix(frontend): use generated query key for waitlist cache invalidation
The hardcoded query key string didn't match the actual generated key,
causing cache invalidation to fail after joining a waitlist. Now uses
the generated getGetV2GetWaitlistIdsTheCurrentUserHasJoinedQueryKey()
function for correct cache invalidation.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 14:44:35 -06:00
Nicholas Tindle
205b220e90 fix(backend): filter out DONE/CANCELED waitlists before sending notifications
The notify_waitlist_users_on_launch function was not filtering by
waitlist status, which could cause duplicate notifications when an
agent is re-approved. Now excludes DONE and CANCELED waitlists,
consistent with get_waitlist() and add_user_to_waitlist().

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 14:41:37 -06:00
Nicholas Tindle
29a232fcb4 fix(frontend): add URL validation and sandbox to video player
- Add getYouTubeVideoId() to extract video IDs from YouTube URLs
- Add isValidVideoUrl() to validate video URLs before rendering
- Create VideoPlayer component that:
  - Embeds YouTube videos via iframe with safe embed URL
  - Adds sandbox attribute to restrict iframe capabilities
  - Adds proper allow attributes for media playback
  - Falls back to native video element for valid non-YouTube URLs
  - Shows error state for invalid URLs

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 14:29:10 -06:00
Nicholas Tindle
a53f261812 feat(frontend): add TODO warning for email-only waitlist notifications
Adds a warning banner on the admin waitlist page indicating that
notifications for email-only signups (non-logged-in users) have not
been implemented yet.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 14:05:21 -06:00
Nicholas Tindle
00a20f77be feat(backend): add waitlist_launch email notification template
The WAITLIST_LAUNCH notification type was referencing a template that
didn't exist, causing FileNotFoundError when trying to notify users
that an agent they waitlisted has launched.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 16:38:04 -06:00
Nicholas Tindle
4d49536a40 Discard changes to autogpt_platform/frontend/src/lib/autogpt-server-api/types.ts 2026-01-12 15:28:37 -07:00
Nicholas Tindle
6028a2528c refactor(frontend): consolidate waitlist modals and align with Figma design
- Merge JoinWaitlistModal into WaitlistDetailModal for unified experience
- Add MediaCarousel component supporting videos and images with play overlay
- Update WaitlistCard styling to match Figma (rounded-large, line-clamp-5, zinc-800 button)
- Update success state with party emoji and Close button per Figma design
- Add sticky footer for buttons during modal scroll
- Support email input for non-logged-in users

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 16:27:09 -06:00
Nicholas Tindle
b31cd05675 fix(backend): correct typo in unaffiliatedEmailUsers field name
- Rename unafilliatedEmailUsers -> unaffiliatedEmailUsers in schema.prisma
- Update migration SQL to use correct column name
- Update all references in db.py and model.py

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 15:33:38 -06:00
Nicholas Tindle
128366772f refactor(backend): remove apscheduler tables from prisma schema
- Remove apscheduler_jobs and apscheduler_jobs_batched_notifications models
- Delete migration 20260107000001_add_apscheduler_tables
- Remove index rename statements from waitlist migration

APScheduler tables are managed at runtime by APScheduler itself and
should not be part of the Prisma schema.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 15:29:17 -06:00
Nicholas Tindle
764cdf17fe refactor(frontend): migrate waitlist admin components to generated API hooks
- Convert WaitlistTable to use generated React Query hooks directly
- Convert CreateWaitlistButton to use generated hooks
- Update WaitlistDetailModal to use generated types and design system Dialog
- Remove deprecated waitlist types from types.ts
- Remove deprecated waitlist methods from BackendAPI client
- Delete actions.ts server actions (no longer needed)
- Replace lucide-react icons with Phosphor icons

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 15:26:34 -06:00
Nicholas Tindle
1dd83b4cf8 fix(frontend): add text color to status badge fallback in WaitlistTable
Ensures unknown status values have readable text contrast by adding
text-gray-700 to the fallback className.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 15:09:44 -06:00
Nicholas Tindle
24a34f7ce5 Merge branch 'dev' into ntindle/waitlist 2026-01-12 14:08:48 -07:00
Nicholas Tindle
20fe2c3877 fix(backend): remove PII-exposing fields from public waitlist model
Remove `owner` (User type) and `storeListing` (StoreListingWithVersions)
fields from StoreWaitlistEntry. These fields were never populated but
exposed PII types (email, stripe_customer_id, etc.) in the OpenAPI schema.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 14:52:51 -06:00
Nicholas Tindle
738c7e2bef fix(platform): address remaining PR review feedback for waitlist
Backend fixes:
- Fix optional field clearing by using model_fields_set (see the sketch at the end of this message)
- Re-fetch waitlist data after join operation
- Only mark waitlist as DONE if all notifications succeed
- Fix race condition in email removal with transaction
- Rename waitlist_id to waitlistId for naming consistency

Frontend fixes:
- Migrate useWaitlistSection to generated API hooks
- Migrate JoinWaitlistModal to design system + generated hooks
- Migrate WaitlistSignupsDialog to design system + generated hooks
- Replace lucide-react icons with Phosphor in WaitlistTable
- Add proper error state in WaitlistSignupsDialog
- Update waitlistId naming across components
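
The `model_fields_set` fix from the first backend bullet, as a minimal self-contained sketch (the field names are illustrative): it distinguishes "field omitted" from "field explicitly set to null", so a PUT can clear an optional column without clobbering fields the client never sent.

```python
from pydantic import BaseModel


class WaitlistUpdate(BaseModel):
    videoUrl: str | None = None
    description: str | None = None


def to_update_data(request: WaitlistUpdate) -> dict:
    # Only fields the client actually sent: an explicit null clears the
    # column, while an omitted field is left untouched.
    return {field: getattr(request, field) for field in request.model_fields_set}


req = WaitlistUpdate.model_validate({"videoUrl": None})
print(to_update_data(req))  # {'videoUrl': None} -> clears videoUrl only
```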

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-12 14:43:10 -06:00
Nicholas Tindle
9edfe0fb97 refactor(frontend): migrate EditWaitlistDialog to design system and generated API
- Replace legacy Dialog components with molecules/Dialog
- Replace legacy Input/Label/Textarea with atoms/Input
- Replace legacy Select with atoms/Select
- Replace @/lib/autogpt-server-api/types with @/app/api/__generated__/models
- Replace updateWaitlist action with usePutV2UpdateWaitlist hook
- Remove dependency on BackendAPI in favor of generated React Query hooks

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-09 16:49:35 -07:00
Nicholas Tindle
4aabe71001 fix(platform): address PR review feedback for waitlist feature
Backend fixes:
- Fix creator_username null check in store URL construction
- Add embed=True to link_waitlist_to_listing endpoint body param
- Fix race condition in email list with transaction wrapper
- Replace str(e) with generic error messages in admin ValueError handlers
- Add validation requiring user_id or email in waitlist join
- Configure WAITLIST_LAUNCH in notification system (data type, queue, template, subject)
- Change StoreListing cascade delete to SetNull to preserve waitlist data

Frontend fixes:
- Escape internal quotes in CSV export for proper RFC 4180 compliance
- Remove incorrect 'use server' directive from page.tsx
- Replace lucide-react Check icon with Phosphor Icons

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-09 16:40:35 -07:00
Nicholas Tindle
b3999669f2 refactor(platform): simplify waitlist code and remove type duplication
- Backend: Extract _waitlist_to_store_entry helper to reduce duplication
- Backend: Use dict comprehension in update_waitlist_admin for cleaner code
- Frontend: Import types directly from shared types file instead of re-exporting
- Frontend: Remove redundant isMember check in WaitlistCard handleJoinClick

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-09 16:25:27 -07:00
Swifty
8c45a5ee98 Merge branch 'dev' into ntindle/waitlist 2026-01-08 12:38:46 +01:00
Nicholas Tindle
4b654c7e9f fix(frontend): Fix lint and type errors in waitlist admin components
- Remove unused WaitlistSignup import
- Change button size from "sm" to "small"

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-07 22:48:53 -07:00
Nicholas Tindle
8d82e3b633 fix(backend): Use Prisma connect pattern for waitlist-listing relation
Use StoreListing relation with connect pattern instead of directly
setting storeListingId, which doesn't work with Prisma's typed update.
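
A minimal sketch of the connect pattern (relation field name assumed from this PR's schema):

```python
import prisma.models


async def link_waitlist(waitlist_id: str, store_listing_id: str) -> None:
    # Relation updates go through `connect` on the relation field; assigning
    # the raw `storeListingId` foreign key is rejected by Prisma's typed update.
    await prisma.models.WaitlistEntry.prisma().update(
        where={"id": waitlist_id},
        data={"StoreListing": {"connect": {"id": store_listing_id}}},
    )
```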

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-07 22:01:18 -07:00
Nicholas Tindle
d4ecdb64ed feat(platform): Show "On the waitlist" status for joined users
- Add GET /api/store/waitlist/my-memberships endpoint to fetch user's joined waitlists
- Add get_user_waitlist_memberships() db function
- Update useWaitlistSection hook to fetch memberships when logged in
- Update WaitlistCard to show green "On the waitlist" button for members
- Update WaitlistDetailModal to show member status
- Add onSuccess callback to JoinWaitlistModal for optimistic UI updates

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-07 21:15:03 -07:00
Nicholas Tindle
a73fb8f114 feat(platform): Add waitlist feature with admin management and user notifications
Backend:
- Add waitlist admin API routes for CRUD operations
- Add admin functions for waitlist management (create, update, delete, list)
- Add WaitlistLaunchData notification type for user notifications
- Integrate waitlist notifications into store submission approval flow
- Auto-notify waitlist users when linked agent is approved

Frontend:
- Add admin waitlist management page with table, create/edit dialogs
- Add WaitlistSection component to marketplace homepage
- Add WaitlistCard, WaitlistDetailModal, JoinWaitlistModal components
- Add API client methods and types for waitlist operations

Database:
- Add WAITLIST_LAUNCH notification type enum
- Add baseline migration for APScheduler tables

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-07 20:38:15 -07:00
Nicholas Tindle
2c60aa64ef wip: adding waitlist 2026-01-06 22:13:35 -07:00
66 changed files with 9386 additions and 4215 deletions

View File

@@ -49,7 +49,7 @@ jobs:
       - name: Create PR ${{ env.BUILD_BRANCH }} -> ${{ github.ref_name }}
         if: github.event_name == 'push'
-        uses: peter-evans/create-pull-request@v7
+        uses: peter-evans/create-pull-request@v8
         with:
           add-paths: classic/frontend/build/web
           base: ${{ github.ref_name }}

View File

@@ -309,6 +309,7 @@ jobs:
         uses: anthropics/claude-code-action@v1
         with:
           claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
+          allowed_bots: "dependabot[bot]"
           claude_args: |
             --allowedTools "Bash(npm:*),Bash(pnpm:*),Bash(poetry:*),Bash(git:*),Edit,Replace,NotebookEditCell,mcp__github_inline_comment__create_inline_comment,Bash(gh pr comment:*), Bash(gh pr diff:*), Bash(gh pr view:*)"
           prompt: |

File diff suppressed because it is too large

View File

@@ -9,17 +9,17 @@ packages = [{ include = "autogpt_libs" }]
 [tool.poetry.dependencies]
 python = ">=3.10,<4.0"
 colorama = "^0.4.6"
-cryptography = "^45.0"
+cryptography = "^46.0"
 expiringdict = "^1.2.2"
-fastapi = "^0.116.1"
+fastapi = "^0.128.0"
-google-cloud-logging = "^3.12.1"
+google-cloud-logging = "^3.13.0"
-launchdarkly-server-sdk = "^9.12.0"
+launchdarkly-server-sdk = "^9.14.1"
-pydantic = "^2.11.7"
+pydantic = "^2.12.5"
-pydantic-settings = "^2.10.1"
+pydantic-settings = "^2.12.0"
-pyjwt = { version = "^2.10.1", extras = ["crypto"] }
+pyjwt = { version = "^2.11.0", extras = ["crypto"] }
 redis = "^6.2.0"
-supabase = "^2.16.0"
+supabase = "^2.27.2"
-uvicorn = "^0.35.0"
+uvicorn = "^0.40.0"

 [tool.poetry.group.dev.dependencies]
 pyright = "^1.1.404"

View File

@@ -157,16 +157,6 @@ yield "image_url", result_url
 3. Write tests alongside the route file
 4. Run `poetry run test` to verify

-## Workspace & Media Files
-
-**Read [Workspace & Media Architecture](../../docs/platform/workspace-media-architecture.md) when:**
-- Working on CoPilot file upload/download features
-- Building blocks that handle `MediaFileType` inputs/outputs
-- Modifying `WorkspaceManager` or `store_media_file()`
-- Debugging file persistence or virus scanning issues
-
-Covers: `WorkspaceManager` (persistent storage with session scoping), `store_media_file()` (media normalization pipeline), and responsibility boundaries for virus scanning and persistence.
-
 ## Security Implementation

 ### Cache Protection Middleware

View File

@@ -0,0 +1,251 @@
import logging

import autogpt_libs.auth
import fastapi
import fastapi.responses

import backend.api.features.store.db as store_db
import backend.api.features.store.model as store_model

logger = logging.getLogger(__name__)

router = fastapi.APIRouter(
    prefix="/admin/waitlist",
    tags=["store", "admin", "waitlist"],
    dependencies=[fastapi.Security(autogpt_libs.auth.requires_admin_user)],
)


@router.post(
    "",
    summary="Create Waitlist",
    response_model=store_model.WaitlistAdminResponse,
)
async def create_waitlist(
    request: store_model.WaitlistCreateRequest,
    user_id: str = fastapi.Security(autogpt_libs.auth.get_user_id),
):
    """
    Create a new waitlist (admin only).

    Args:
        request: Waitlist creation details
        user_id: Authenticated admin user creating the waitlist

    Returns:
        WaitlistAdminResponse with the created waitlist details
    """
    try:
        waitlist = await store_db.create_waitlist_admin(
            admin_user_id=user_id,
            data=request,
        )
        return waitlist
    except Exception as e:
        logger.exception("Error creating waitlist: %s", e)
        return fastapi.responses.JSONResponse(
            status_code=500,
            content={"detail": "An error occurred while creating the waitlist"},
        )


@router.get(
    "",
    summary="List All Waitlists",
    response_model=store_model.WaitlistAdminListResponse,
)
async def list_waitlists():
    """
    Get all waitlists with admin details (admin only).

    Returns:
        WaitlistAdminListResponse with all waitlists
    """
    try:
        return await store_db.get_waitlists_admin()
    except Exception as e:
        logger.exception("Error listing waitlists: %s", e)
        return fastapi.responses.JSONResponse(
            status_code=500,
            content={"detail": "An error occurred while fetching waitlists"},
        )


@router.get(
    "/{waitlist_id}",
    summary="Get Waitlist Details",
    response_model=store_model.WaitlistAdminResponse,
)
async def get_waitlist(
    waitlist_id: str = fastapi.Path(..., description="The ID of the waitlist"),
):
    """
    Get a single waitlist with admin details (admin only).

    Args:
        waitlist_id: ID of the waitlist to retrieve

    Returns:
        WaitlistAdminResponse with waitlist details
    """
    try:
        return await store_db.get_waitlist_admin(waitlist_id)
    except ValueError:
        logger.warning("Waitlist not found: %s", waitlist_id)
        return fastapi.responses.JSONResponse(
            status_code=404,
            content={"detail": "Waitlist not found"},
        )
    except Exception as e:
        logger.exception("Error fetching waitlist: %s", e)
        return fastapi.responses.JSONResponse(
            status_code=500,
            content={"detail": "An error occurred while fetching the waitlist"},
        )


@router.put(
    "/{waitlist_id}",
    summary="Update Waitlist",
    response_model=store_model.WaitlistAdminResponse,
)
async def update_waitlist(
    request: store_model.WaitlistUpdateRequest,
    waitlist_id: str = fastapi.Path(..., description="The ID of the waitlist"),
):
    """
    Update a waitlist (admin only).

    Args:
        waitlist_id: ID of the waitlist to update
        request: Fields to update

    Returns:
        WaitlistAdminResponse with updated waitlist details
    """
    try:
        return await store_db.update_waitlist_admin(waitlist_id, request)
    except ValueError:
        logger.warning("Waitlist not found for update: %s", waitlist_id)
        return fastapi.responses.JSONResponse(
            status_code=404,
            content={"detail": "Waitlist not found"},
        )
    except Exception as e:
        logger.exception("Error updating waitlist: %s", e)
        return fastapi.responses.JSONResponse(
            status_code=500,
            content={"detail": "An error occurred while updating the waitlist"},
        )


@router.delete(
    "/{waitlist_id}",
    summary="Delete Waitlist",
)
async def delete_waitlist(
    waitlist_id: str = fastapi.Path(..., description="The ID of the waitlist"),
):
    """
    Soft delete a waitlist (admin only).

    Args:
        waitlist_id: ID of the waitlist to delete

    Returns:
        Success message
    """
    try:
        await store_db.delete_waitlist_admin(waitlist_id)
        return {"message": "Waitlist deleted successfully"}
    except ValueError:
        logger.warning(f"Waitlist not found for deletion: {waitlist_id}")
        return fastapi.responses.JSONResponse(
            status_code=404,
            content={"detail": "Waitlist not found"},
        )
    except Exception as e:
        logger.exception("Error deleting waitlist: %s", e)
        return fastapi.responses.JSONResponse(
            status_code=500,
            content={"detail": "An error occurred while deleting the waitlist"},
        )


@router.get(
    "/{waitlist_id}/signups",
    summary="Get Waitlist Signups",
    response_model=store_model.WaitlistSignupListResponse,
)
async def get_waitlist_signups(
    waitlist_id: str = fastapi.Path(..., description="The ID of the waitlist"),
):
    """
    Get all signups for a waitlist (admin only).

    Args:
        waitlist_id: ID of the waitlist

    Returns:
        WaitlistSignupListResponse with all signups
    """
    try:
        return await store_db.get_waitlist_signups_admin(waitlist_id)
    except ValueError:
        logger.warning("Waitlist not found for signups: %s", waitlist_id)
        return fastapi.responses.JSONResponse(
            status_code=404,
            content={"detail": "Waitlist not found"},
        )
    except Exception as e:
        logger.exception("Error fetching waitlist signups: %s", e)
        return fastapi.responses.JSONResponse(
            status_code=500,
            content={"detail": "An error occurred while fetching waitlist signups"},
        )


@router.post(
    "/{waitlist_id}/link",
    summary="Link Waitlist to Store Listing",
    response_model=store_model.WaitlistAdminResponse,
)
async def link_waitlist_to_listing(
    waitlist_id: str = fastapi.Path(..., description="The ID of the waitlist"),
    store_listing_id: str = fastapi.Body(
        ..., embed=True, description="The ID of the store listing"
    ),
):
    """
    Link a waitlist to a store listing (admin only).

    When the linked store listing is approved/published, waitlist users
    will be automatically notified.

    Args:
        waitlist_id: ID of the waitlist
        store_listing_id: ID of the store listing to link

    Returns:
        WaitlistAdminResponse with updated waitlist details
    """
    try:
        return await store_db.link_waitlist_to_listing_admin(
            waitlist_id, store_listing_id
        )
    except ValueError:
        logger.warning(
            "Link failed - waitlist or listing not found: %s, %s",
            waitlist_id,
            store_listing_id,
        )
        return fastapi.responses.JSONResponse(
            status_code=404,
            content={"detail": "Waitlist or store listing not found"},
        )
    except Exception as e:
        logger.exception("Error linking waitlist to listing: %s", e)
        return fastapi.responses.JSONResponse(
            status_code=500,
            content={"detail": "An error occurred while linking the waitlist"},
        )

View File

@@ -6,7 +6,6 @@ from typing import Any
 from backend.api.features.library import db as library_db
 from backend.api.features.library import model as library_model
 from backend.api.features.store import db as store_db
-from backend.data import graph as graph_db
 from backend.data.graph import GraphModel
 from backend.data.model import (
     CredentialsFieldInfo,
@@ -44,14 +43,8 @@ async def fetch_graph_from_store_slug(
         return None, None

     # Get the graph from store listing version
-    graph_meta = await store_db.get_available_graph(
-        store_agent.store_listing_version_id
-    )
-    graph = await graph_db.get_graph(
-        graph_id=graph_meta.id,
-        version=graph_meta.version,
-        user_id=None,  # Public access
-        include_subgraphs=True,
+    graph = await store_db.get_available_graph(
+        store_agent.store_listing_version_id, hide_nodes=False
     )

     return graph, store_agent
@@ -128,7 +121,7 @@ def build_missing_credentials_from_graph(
     return {
         field_key: _serialize_missing_credential(field_key, field_info)
-        for field_key, (field_info, _node_fields) in aggregated_fields.items()
+        for field_key, (field_info, _, _) in aggregated_fields.items()
         if field_key not in matched_keys
     }
@@ -269,7 +262,8 @@ async def match_user_credentials_to_graph(
     # provider is in the set of acceptable providers.
     for credential_field_name, (
         credential_requirements,
-        _node_fields,
+        _,
+        _,
     ) in aggregated_creds.items():
         # Find first matching credential by provider, type, and scopes
         matching_cred = next(

View File

@@ -9,6 +9,7 @@ from pydantic import BaseModel
 from backend.api.features.chat.model import ChatSession
 from backend.data.workspace import get_or_create_workspace
 from backend.util.settings import Config
+from backend.util.virus_scanner import scan_content_safe
 from backend.util.workspace import WorkspaceManager

 from .base import BaseTool
@@ -474,6 +475,9 @@ class WriteWorkspaceFileTool(BaseTool):
         )

         try:
+            # Virus scan
+            await scan_content_safe(content, filename=filename)
+
             workspace = await get_or_create_workspace(user_id)
             # Pass session_id for session-scoped file access
             manager = WorkspaceManager(user_id, workspace.id, session_id)

View File

@@ -374,7 +374,7 @@ async def get_library_agent_by_graph_id(
 async def add_generated_agent_image(
-    graph: graph_db.BaseGraph,
+    graph: graph_db.GraphBaseMeta,
     user_id: str,
     library_agent_id: str,
 ) -> Optional[prisma.models.LibraryAgent]:

View File

@@ -1,7 +1,7 @@
 import asyncio
 import logging
 from datetime import datetime, timezone
-from typing import Any, Literal
+from typing import Any, Literal, overload

 import fastapi
 import prisma.enums
@@ -11,8 +11,8 @@ import prisma.types
 from backend.data.db import transaction
 from backend.data.graph import (
-    GraphMeta,
     GraphModel,
+    GraphModelWithoutNodes,
     get_graph,
     get_graph_as_admin,
     get_sub_graphs,
@@ -22,6 +22,7 @@ from backend.data.notifications import (
     AgentApprovalData,
     AgentRejectionData,
     NotificationEventModel,
+    WaitlistLaunchData,
 )
 from backend.notifications.notifications import queue_notification_async
 from backend.util.exceptions import DatabaseError
@@ -334,7 +335,22 @@ async def get_store_agent_details(
         raise DatabaseError("Failed to fetch agent details") from e


-async def get_available_graph(store_listing_version_id: str) -> GraphMeta:
+@overload
+async def get_available_graph(
+    store_listing_version_id: str, hide_nodes: Literal[False]
+) -> GraphModel: ...
+
+
+@overload
+async def get_available_graph(
+    store_listing_version_id: str, hide_nodes: Literal[True] = True
+) -> GraphModelWithoutNodes: ...
+
+
+async def get_available_graph(
+    store_listing_version_id: str,
+    hide_nodes: bool = True,
+) -> GraphModelWithoutNodes | GraphModel:
     try:
         # Get avaialble, non-deleted store listing version
         store_listing_version = (
@@ -344,7 +360,7 @@ async def get_available_graph(store_listing_version_id: str) -> GraphMeta:
                     "isAvailable": True,
                     "isDeleted": False,
                 },
-                include={"AgentGraph": {"include": {"Nodes": True}}},
+                include={"AgentGraph": {"include": AGENT_GRAPH_INCLUDE}},
             )
         )
@@ -354,7 +370,9 @@ async def get_available_graph(store_listing_version_id: str) -> GraphMeta:
                 detail=f"Store listing version {store_listing_version_id} not found",
             )

-        return GraphModel.from_db(store_listing_version.AgentGraph).meta()
+        return (GraphModelWithoutNodes if hide_nodes else GraphModel).from_db(
+            store_listing_version.AgentGraph
+        )

     except Exception as e:
         logger.error(f"Error getting agent: {e}")
@@ -1713,6 +1731,29 @@ async def review_store_submission(
                 # Don't fail the review process if email sending fails
                 pass

+        # Notify waitlist users if this is an approval and has a linked waitlist
+        if is_approved and submission.StoreListing:
+            try:
+                frontend_base_url = (
+                    settings.config.frontend_base_url
+                    or settings.config.platform_base_url
+                )
+                store_agent = (
+                    await prisma.models.StoreAgent.prisma().find_first_or_raise(
+                        where={"storeListingVersionId": submission.id}
+                    )
+                )
+                creator_username = store_agent.creator_username or "unknown"
+                store_url = f"{frontend_base_url}/marketplace/agent/{creator_username}/{store_agent.slug}"
+                await notify_waitlist_users_on_launch(
+                    store_listing_id=submission.StoreListing.id,
+                    agent_name=submission.name,
+                    store_url=store_url,
+                )
+            except Exception as e:
+                logger.error(f"Failed to notify waitlist users on agent approval: {e}")
+                # Don't fail the approval process
+
         # Convert to Pydantic model for consistency
         return store_model.StoreSubmission(
             listing_id=(submission.StoreListing.id if submission.StoreListing else ""),
@@ -1960,3 +2001,557 @@ async def get_agent_as_admin(
) )
return graph return graph
def _waitlist_to_store_entry(
waitlist: prisma.models.WaitlistEntry,
) -> store_model.StoreWaitlistEntry:
"""Convert a WaitlistEntry to StoreWaitlistEntry for public display."""
return store_model.StoreWaitlistEntry(
waitlistId=waitlist.id,
slug=waitlist.slug,
name=waitlist.name,
subHeading=waitlist.subHeading,
videoUrl=waitlist.videoUrl,
agentOutputDemoUrl=waitlist.agentOutputDemoUrl,
imageUrls=waitlist.imageUrls or [],
description=waitlist.description,
categories=waitlist.categories or [],
)
async def get_waitlist() -> list[store_model.StoreWaitlistEntry]:
"""Get all active waitlists for public display."""
try:
waitlists = await prisma.models.WaitlistEntry.prisma().find_many(
where=prisma.types.WaitlistEntryWhereInput(isDeleted=False),
)
# Filter out closed/done waitlists and sort by votes (descending)
excluded_statuses = {
prisma.enums.WaitlistExternalStatus.CANCELED,
prisma.enums.WaitlistExternalStatus.DONE,
}
active_waitlists = [w for w in waitlists if w.status not in excluded_statuses]
sorted_list = sorted(active_waitlists, key=lambda x: x.votes, reverse=True)
return [_waitlist_to_store_entry(w) for w in sorted_list]
except Exception as e:
logger.error(f"Error fetching waitlists: {e}")
raise DatabaseError("Failed to fetch waitlists") from e
async def get_user_waitlist_memberships(user_id: str) -> list[str]:
"""Get all waitlist IDs that a user has joined."""
try:
user = await prisma.models.User.prisma().find_unique(
where={"id": user_id},
include={"joinedWaitlists": True},
)
if not user or not user.joinedWaitlists:
return []
return [w.id for w in user.joinedWaitlists]
except Exception as e:
logger.error(f"Error fetching user waitlist memberships: {e}")
raise DatabaseError("Failed to fetch waitlist memberships") from e
async def add_user_to_waitlist(
waitlist_id: str, user_id: str | None, email: str | None
) -> store_model.StoreWaitlistEntry:
"""
Add a user to a waitlist.
For logged-in users: connects via joinedUsers relation
For anonymous users: adds email to unaffiliatedEmailUsers array
"""
if not user_id and not email:
raise ValueError("Either user_id or email must be provided")
try:
# Find the waitlist
waitlist = await prisma.models.WaitlistEntry.prisma().find_unique(
where={"id": waitlist_id},
include={"joinedUsers": True},
)
if not waitlist:
raise ValueError(f"Waitlist {waitlist_id} not found")
if waitlist.isDeleted:
raise ValueError(f"Waitlist {waitlist_id} is no longer available")
if waitlist.status in [
prisma.enums.WaitlistExternalStatus.CANCELED,
prisma.enums.WaitlistExternalStatus.DONE,
]:
raise ValueError(f"Waitlist {waitlist_id} is closed")
if user_id:
# Check if user already joined
joined_user_ids = [u.id for u in (waitlist.joinedUsers or [])]
if user_id in joined_user_ids:
# Already joined - return waitlist info
logger.debug(f"User {user_id} already joined waitlist {waitlist_id}")
else:
# Connect user to waitlist
await prisma.models.WaitlistEntry.prisma().update(
where={"id": waitlist_id},
data={"joinedUsers": {"connect": [{"id": user_id}]}},
)
logger.info(f"User {user_id} joined waitlist {waitlist_id}")
# If user was previously in email list, remove them
# Use transaction to prevent race conditions
if email:
async with transaction() as tx:
current_waitlist = await tx.waitlistentry.find_unique(
where={"id": waitlist_id}
)
if current_waitlist and email in (
current_waitlist.unaffiliatedEmailUsers or []
):
updated_emails: list[str] = [
e
for e in (current_waitlist.unaffiliatedEmailUsers or [])
if e != email
]
await tx.waitlistentry.update(
where={"id": waitlist_id},
data={"unaffiliatedEmailUsers": updated_emails},
)
elif email:
# Add email to unaffiliated list if not already present
# Use transaction to prevent race conditions with concurrent signups
async with transaction() as tx:
# Re-fetch within transaction to get latest state
current_waitlist = await tx.waitlistentry.find_unique(
where={"id": waitlist_id}
)
if current_waitlist:
current_emails: list[str] = list(
current_waitlist.unaffiliatedEmailUsers or []
)
if email not in current_emails:
current_emails.append(email)
await tx.waitlistentry.update(
where={"id": waitlist_id},
data={"unaffiliatedEmailUsers": current_emails},
)
# Mask email for logging to avoid PII exposure
parts = email.split("@") if "@" in email else []
local = parts[0] if len(parts) > 0 else ""
domain = parts[1] if len(parts) > 1 else "unknown"
masked = (local[0] if local else "?") + "***@" + domain
logger.info(f"Email {masked} added to waitlist {waitlist_id}")
else:
logger.debug(f"Email already exists on waitlist {waitlist_id}")
# Re-fetch to return updated data
updated_waitlist = await prisma.models.WaitlistEntry.prisma().find_unique(
where={"id": waitlist_id}
)
return _waitlist_to_store_entry(updated_waitlist or waitlist)
except ValueError:
raise
except Exception as e:
logger.error(f"Error adding user to waitlist: {e}")
raise DatabaseError("Failed to add user to waitlist") from e
# ============== Admin Waitlist Functions ==============
def _waitlist_to_admin_response(
waitlist: prisma.models.WaitlistEntry,
) -> store_model.WaitlistAdminResponse:
"""Convert a WaitlistEntry to WaitlistAdminResponse."""
joined_count = len(waitlist.joinedUsers) if waitlist.joinedUsers else 0
email_count = (
len(waitlist.unaffiliatedEmailUsers) if waitlist.unaffiliatedEmailUsers else 0
)
return store_model.WaitlistAdminResponse(
id=waitlist.id,
createdAt=waitlist.createdAt.isoformat() if waitlist.createdAt else "",
updatedAt=waitlist.updatedAt.isoformat() if waitlist.updatedAt else "",
slug=waitlist.slug,
name=waitlist.name,
subHeading=waitlist.subHeading,
description=waitlist.description,
categories=waitlist.categories or [],
imageUrls=waitlist.imageUrls or [],
videoUrl=waitlist.videoUrl,
agentOutputDemoUrl=waitlist.agentOutputDemoUrl,
status=waitlist.status or prisma.enums.WaitlistExternalStatus.NOT_STARTED,
votes=waitlist.votes,
signupCount=joined_count + email_count,
storeListingId=waitlist.storeListingId,
owningUserId=waitlist.owningUserId,
)
async def create_waitlist_admin(
admin_user_id: str,
data: store_model.WaitlistCreateRequest,
) -> store_model.WaitlistAdminResponse:
"""Create a new waitlist (admin only)."""
logger.info(f"Admin {admin_user_id} creating waitlist: {data.name}")
try:
waitlist = await prisma.models.WaitlistEntry.prisma().create(
data=prisma.types.WaitlistEntryCreateInput(
name=data.name,
slug=data.slug,
subHeading=data.subHeading,
description=data.description,
categories=data.categories,
imageUrls=data.imageUrls,
videoUrl=data.videoUrl,
agentOutputDemoUrl=data.agentOutputDemoUrl,
owningUserId=admin_user_id,
status=prisma.enums.WaitlistExternalStatus.NOT_STARTED,
),
include={"joinedUsers": True},
)
return _waitlist_to_admin_response(waitlist)
except Exception as e:
logger.error(f"Error creating waitlist: {e}")
raise DatabaseError("Failed to create waitlist") from e
async def get_waitlists_admin() -> store_model.WaitlistAdminListResponse:
"""Get all waitlists with admin details."""
try:
waitlists = await prisma.models.WaitlistEntry.prisma().find_many(
where=prisma.types.WaitlistEntryWhereInput(isDeleted=False),
include={"joinedUsers": True},
order={"createdAt": "desc"},
)
return store_model.WaitlistAdminListResponse(
waitlists=[_waitlist_to_admin_response(w) for w in waitlists],
totalCount=len(waitlists),
)
except Exception as e:
logger.error(f"Error fetching waitlists for admin: {e}")
raise DatabaseError("Failed to fetch waitlists") from e
async def get_waitlist_admin(
waitlist_id: str,
) -> store_model.WaitlistAdminResponse:
"""Get a single waitlist with admin details."""
try:
waitlist = await prisma.models.WaitlistEntry.prisma().find_unique(
where={"id": waitlist_id},
include={"joinedUsers": True},
)
if not waitlist:
raise ValueError(f"Waitlist {waitlist_id} not found")
if waitlist.isDeleted:
raise ValueError(f"Waitlist {waitlist_id} has been deleted")
return _waitlist_to_admin_response(waitlist)
except ValueError:
raise
except Exception as e:
logger.error(f"Error fetching waitlist {waitlist_id}: {e}")
raise DatabaseError("Failed to fetch waitlist") from e
async def update_waitlist_admin(
waitlist_id: str,
data: store_model.WaitlistUpdateRequest,
) -> store_model.WaitlistAdminResponse:
"""Update a waitlist (admin only)."""
logger.info(f"Updating waitlist {waitlist_id}")
try:
# Check if waitlist exists first
existing = await prisma.models.WaitlistEntry.prisma().find_unique(
where={"id": waitlist_id}
)
if not existing:
raise ValueError(f"Waitlist {waitlist_id} not found")
if existing.isDeleted:
raise ValueError(f"Waitlist {waitlist_id} has been deleted")
# Build update data from explicitly provided fields
# Use model_fields_set to allow clearing fields by setting them to None
field_mappings = {
"name": data.name,
"slug": data.slug,
"subHeading": data.subHeading,
"description": data.description,
"categories": data.categories,
"imageUrls": data.imageUrls,
"videoUrl": data.videoUrl,
"agentOutputDemoUrl": data.agentOutputDemoUrl,
"storeListingId": data.storeListingId,
}
update_data: dict[str, Any] = {
k: v for k, v in field_mappings.items() if k in data.model_fields_set
}
# Add status if provided (already validated as enum by Pydantic)
if "status" in data.model_fields_set and data.status is not None:
update_data["status"] = data.status
if not update_data:
# No updates, just return current data
return await get_waitlist_admin(waitlist_id)
waitlist = await prisma.models.WaitlistEntry.prisma().update(
where={"id": waitlist_id},
data=prisma.types.WaitlistEntryUpdateInput(**update_data),
include={"joinedUsers": True},
)
# We already verified existence above, so this should never be None
assert waitlist is not None
return _waitlist_to_admin_response(waitlist)
except ValueError:
raise
except Exception as e:
logger.error(f"Error updating waitlist {waitlist_id}: {e}")
raise DatabaseError("Failed to update waitlist") from e
async def delete_waitlist_admin(waitlist_id: str) -> None:
"""Soft delete a waitlist (admin only)."""
logger.info(f"Soft deleting waitlist {waitlist_id}")
try:
# Check if waitlist exists first
waitlist = await prisma.models.WaitlistEntry.prisma().find_unique(
where={"id": waitlist_id},
)
if not waitlist:
raise ValueError(f"Waitlist {waitlist_id} not found")
if waitlist.isDeleted:
raise ValueError(f"Waitlist {waitlist_id} has already been deleted")
await prisma.models.WaitlistEntry.prisma().update(
where={"id": waitlist_id},
data={"isDeleted": True},
)
except ValueError:
raise
except Exception as e:
logger.error(f"Error deleting waitlist {waitlist_id}: {e}")
raise DatabaseError("Failed to delete waitlist") from e
async def get_waitlist_signups_admin(
waitlist_id: str,
) -> store_model.WaitlistSignupListResponse:
"""Get all signups for a waitlist (admin only)."""
try:
waitlist = await prisma.models.WaitlistEntry.prisma().find_unique(
where={"id": waitlist_id},
include={"joinedUsers": True},
)
if not waitlist:
raise ValueError(f"Waitlist {waitlist_id} not found")
signups: list[store_model.WaitlistSignup] = []
# Add user signups
for user in waitlist.joinedUsers or []:
signups.append(
store_model.WaitlistSignup(
type="user",
userId=user.id,
email=user.email,
username=user.name,
)
)
# Add email signups
for email in waitlist.unaffiliatedEmailUsers or []:
signups.append(
store_model.WaitlistSignup(
type="email",
email=email,
)
)
return store_model.WaitlistSignupListResponse(
waitlistId=waitlist_id,
signups=signups,
totalCount=len(signups),
)
except ValueError:
raise
except Exception as e:
logger.error(f"Error fetching signups for waitlist {waitlist_id}: {e}")
raise DatabaseError("Failed to fetch waitlist signups") from e
async def link_waitlist_to_listing_admin(
waitlist_id: str,
store_listing_id: str,
) -> store_model.WaitlistAdminResponse:
"""Link a waitlist to a store listing (admin only)."""
logger.info(f"Linking waitlist {waitlist_id} to listing {store_listing_id}")
try:
# Verify the waitlist exists
waitlist = await prisma.models.WaitlistEntry.prisma().find_unique(
where={"id": waitlist_id}
)
if not waitlist:
raise ValueError(f"Waitlist {waitlist_id} not found")
if waitlist.isDeleted:
raise ValueError(f"Waitlist {waitlist_id} has been deleted")
# Verify the store listing exists
listing = await prisma.models.StoreListing.prisma().find_unique(
where={"id": store_listing_id}
)
if not listing:
raise ValueError(f"Store listing {store_listing_id} not found")
updated_waitlist = await prisma.models.WaitlistEntry.prisma().update(
where={"id": waitlist_id},
data={"StoreListing": {"connect": {"id": store_listing_id}}},
include={"joinedUsers": True},
)
# We already verified existence above, so this should never be None
assert updated_waitlist is not None
return _waitlist_to_admin_response(updated_waitlist)
except ValueError:
raise
except Exception as e:
logger.error(f"Error linking waitlist to listing: {e}")
raise DatabaseError("Failed to link waitlist to listing") from e
async def notify_waitlist_users_on_launch(
store_listing_id: str,
agent_name: str,
store_url: str,
) -> int:
"""
Notify all users on waitlists linked to a store listing when the agent is launched.
Args:
store_listing_id: The ID of the store listing that was approved
agent_name: The name of the approved agent
store_url: The URL to the agent's store page
Returns:
The number of notifications sent
"""
logger.info(f"Notifying waitlist users for store listing {store_listing_id}")
try:
# Find all active waitlists linked to this store listing
# Exclude DONE and CANCELED to prevent duplicate notifications on re-approval
waitlists = await prisma.models.WaitlistEntry.prisma().find_many(
where={
"storeListingId": store_listing_id,
"isDeleted": False,
"status": {
"not_in": [
prisma.enums.WaitlistExternalStatus.DONE,
prisma.enums.WaitlistExternalStatus.CANCELED,
]
},
},
include={"joinedUsers": True},
)
if not waitlists:
logger.info(
f"No active waitlists found for store listing {store_listing_id}"
)
return 0
notification_count = 0
launched_at = datetime.now(tz=timezone.utc)
for waitlist in waitlists:
# Track notification results for this waitlist
users_to_notify = waitlist.joinedUsers or []
failed_user_ids: list[str] = []
# Notify registered users
for user in users_to_notify:
try:
notification_data = WaitlistLaunchData(
agent_name=agent_name,
waitlist_name=waitlist.name,
store_url=store_url,
launched_at=launched_at,
)
notification_event = NotificationEventModel[WaitlistLaunchData](
user_id=user.id,
type=prisma.enums.NotificationType.WAITLIST_LAUNCH,
data=notification_data,
)
await queue_notification_async(notification_event)
notification_count += 1
except Exception as e:
logger.error(
f"Failed to send waitlist launch notification to user {user.id}: {e}"
)
failed_user_ids.append(user.id)
# Note: For unaffiliated email users, you would need to send emails directly
# since they don't have user IDs for the notification system.
# This could be done via a separate email service.
# For now, we log these for potential manual follow-up or future implementation.
has_pending_email_users = bool(waitlist.unaffiliatedEmailUsers)
if has_pending_email_users:
logger.info(
f"Waitlist {waitlist.id} has {len(waitlist.unaffiliatedEmailUsers)} "
f"unaffiliated email users that need email notifications"
)
# Only mark waitlist as DONE if all registered user notifications succeeded
# AND there are no unaffiliated email users still waiting for notifications
if not failed_user_ids and not has_pending_email_users:
await prisma.models.WaitlistEntry.prisma().update(
where={"id": waitlist.id},
data={"status": prisma.enums.WaitlistExternalStatus.DONE},
)
logger.info(f"Updated waitlist {waitlist.id} status to DONE")
elif failed_user_ids:
logger.warning(
f"Waitlist {waitlist.id} not marked as DONE due to "
f"{len(failed_user_ids)} failed notifications"
)
elif has_pending_email_users:
logger.warning(
f"Waitlist {waitlist.id} not marked as DONE due to "
f"{len(waitlist.unaffiliatedEmailUsers)} pending email-only users"
)
logger.info(
f"Sent {notification_count} waitlist launch notifications for store listing {store_listing_id}"
)
return notification_count
except Exception as e:
logger.error(
f"Error notifying waitlist users for store listing {store_listing_id}: {e}"
)
# Don't raise - we don't want to fail the approval process
return 0
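The masking logic inside `add_user_to_waitlist` logs only the first character of the local part plus the domain. Extracted into a standalone sketch (the helper name is hypothetical), including its fallbacks for malformed addresses:

```python
def mask_email(email: str) -> str:
    """Mask an email for logs: first char of the local part + domain."""
    parts = email.split("@") if "@" in email else []
    local = parts[0] if len(parts) > 0 else ""
    domain = parts[1] if len(parts) > 1 else "unknown"
    return (local[0] if local else "?") + "***@" + domain

assert mask_email("alice@example.com") == "a***@example.com"
assert mask_email("@example.com") == "?***@example.com"  # empty local part
assert mask_email("not-an-email") == "?***@unknown"      # no "@" at all
```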


@@ -16,7 +16,7 @@ from backend.blocks.ideogram import (
     StyleType,
     UpscaleOption,
 )
-from backend.data.graph import BaseGraph
+from backend.data.graph import GraphBaseMeta
 from backend.data.model import CredentialsMetaInput, ProviderName
 from backend.integrations.credentials_store import ideogram_credentials
 from backend.util.request import Requests
@@ -34,14 +34,14 @@ class ImageStyle(str, Enum):
     DIGITAL_ART = "digital art"


-async def generate_agent_image(agent: BaseGraph | AgentGraph) -> io.BytesIO:
+async def generate_agent_image(agent: GraphBaseMeta | AgentGraph) -> io.BytesIO:
     if settings.config.use_agent_image_generation_v2:
         return await generate_agent_image_v2(graph=agent)
     else:
         return await generate_agent_image_v1(agent=agent)


-async def generate_agent_image_v2(graph: BaseGraph | AgentGraph) -> io.BytesIO:
+async def generate_agent_image_v2(graph: GraphBaseMeta | AgentGraph) -> io.BytesIO:
     """
     Generate an image for an agent using Ideogram model.

     Returns:
@@ -54,14 +54,17 @@ async def generate_agent_image_v2(graph: BaseGraph | AgentGraph) -> io.BytesIO:
     description = f"{name} ({graph.description})" if graph.description else name

     prompt = (
-        f"Create a visually striking retro-futuristic vector pop art illustration prominently featuring "
-        f'"{name}" in bold typography. The image clearly and literally depicts a {description}, '
-        f"along with recognizable objects directly associated with the primary function of a {name}. "
-        f"Ensure the imagery is concrete, intuitive, and immediately understandable, clearly conveying the "
-        f"purpose of a {name}. Maintain vibrant, limited-palette colors, sharp vector lines, geometric "
-        f"shapes, flat illustration techniques, and solid colors without gradients or shading. Preserve a "
-        f"retro-futuristic aesthetic influenced by mid-century futurism and 1960s psychedelia, "
-        f"prioritizing clear visual storytelling and thematic clarity above all else."
+        "Create a visually striking retro-futuristic vector pop art illustration "
+        f'prominently featuring "{name}" in bold typography. The image clearly and '
+        f"literally depicts a {description}, along with recognizable objects directly "
+        f"associated with the primary function of a {name}. "
+        f"Ensure the imagery is concrete, intuitive, and immediately understandable, "
+        f"clearly conveying the purpose of a {name}. "
+        "Maintain vibrant, limited-palette colors, sharp vector lines, "
+        "geometric shapes, flat illustration techniques, and solid colors "
+        "without gradients or shading. Preserve a retro-futuristic aesthetic "
+        "influenced by mid-century futurism and 1960s psychedelia, "
+        "prioritizing clear visual storytelling and thematic clarity above all else."
     )

     custom_colors = [
@@ -99,12 +102,12 @@ async def generate_agent_image_v2(graph: BaseGraph | AgentGraph) -> io.BytesIO:
     return io.BytesIO(response.content)


-async def generate_agent_image_v1(agent: BaseGraph | AgentGraph) -> io.BytesIO:
+async def generate_agent_image_v1(agent: GraphBaseMeta | AgentGraph) -> io.BytesIO:
     """
     Generate an image for an agent using Flux model via Replicate API.

     Args:
-        agent (Graph): The agent to generate an image for
+        agent (GraphBaseMeta | AgentGraph): The agent to generate an image for

     Returns:
         io.BytesIO: The generated image as bytes
@@ -114,7 +117,13 @@ async def generate_agent_image_v1(agent: BaseGraph | AgentGraph) -> io.BytesIO:
         raise ValueError("Missing Replicate API key in settings")

     # Construct prompt from agent details
-    prompt = f"Create a visually engaging app store thumbnail for the AI agent that highlights what it does in a clear and captivating way:\n- **Name**: {agent.name}\n- **Description**: {agent.description}\nFocus on showcasing its core functionality with an appealing design."
+    prompt = (
+        "Create a visually engaging app store thumbnail for the AI agent "
+        "that highlights what it does in a clear and captivating way:\n"
+        f"- **Name**: {agent.name}\n"
+        f"- **Description**: {agent.description}\n"
+        f"Focus on showcasing its core functionality with an appealing design."
+    )

     # Set up Replicate client
     client = ReplicateClient(api_token=settings.secrets.replicate_api_key)


@@ -224,6 +224,102 @@ class ReviewSubmissionRequest(pydantic.BaseModel):
    internal_comments: str | None = None  # Private admin notes
class StoreWaitlistEntry(pydantic.BaseModel):
"""Public waitlist entry - no PII fields exposed."""
waitlistId: str
slug: str
# Content fields
name: str
subHeading: str
videoUrl: str | None = None
agentOutputDemoUrl: str | None = None
imageUrls: list[str]
description: str
categories: list[str]
class StoreWaitlistsAllResponse(pydantic.BaseModel):
listings: list[StoreWaitlistEntry]
# Admin Waitlist Models
class WaitlistCreateRequest(pydantic.BaseModel):
"""Request model for creating a new waitlist."""
name: str
slug: str
subHeading: str
description: str
categories: list[str] = []
imageUrls: list[str] = []
videoUrl: str | None = None
agentOutputDemoUrl: str | None = None
class WaitlistUpdateRequest(pydantic.BaseModel):
"""Request model for updating a waitlist."""
name: str | None = None
slug: str | None = None
subHeading: str | None = None
description: str | None = None
categories: list[str] | None = None
imageUrls: list[str] | None = None
videoUrl: str | None = None
agentOutputDemoUrl: str | None = None
status: prisma.enums.WaitlistExternalStatus | None = None
storeListingId: str | None = None # Link to a store listing
class WaitlistAdminResponse(pydantic.BaseModel):
"""Admin response model with full waitlist details including internal data."""
id: str
createdAt: str
updatedAt: str
slug: str
name: str
subHeading: str
description: str
categories: list[str]
imageUrls: list[str]
videoUrl: str | None = None
agentOutputDemoUrl: str | None = None
status: prisma.enums.WaitlistExternalStatus
votes: int
signupCount: int # Total count of joinedUsers + unaffiliatedEmailUsers
storeListingId: str | None = None
owningUserId: str
class WaitlistSignup(pydantic.BaseModel):
"""Individual signup entry for a waitlist."""
type: str # "user" or "email"
userId: str | None = None
email: str | None = None
username: str | None = None # For user signups
class WaitlistSignupListResponse(pydantic.BaseModel):
"""Response model for listing waitlist signups."""
waitlistId: str
signups: list[WaitlistSignup]
totalCount: int
class WaitlistAdminListResponse(pydantic.BaseModel):
"""Response model for listing all waitlists (admin view)."""
waitlists: list[WaitlistAdminResponse]
totalCount: int
class UnifiedSearchResult(pydantic.BaseModel):
    """A single result from unified hybrid search across all content types."""


@@ -8,6 +8,7 @@ import autogpt_libs.auth
 import fastapi
 import fastapi.responses
 import prisma.enums
+from autogpt_libs.auth.dependencies import get_optional_user_id

 import backend.data.graph
 import backend.util.json
@@ -81,6 +82,74 @@ async def update_or_create_profile(
     return updated_profile
##############################################
############## Waitlist Endpoints ############
##############################################
@router.get(
"/waitlist",
summary="Get the agent waitlist",
tags=["store", "public"],
response_model=store_model.StoreWaitlistsAllResponse,
)
async def get_waitlist():
"""
Get all active waitlists for public display.
"""
waitlists = await store_db.get_waitlist()
return store_model.StoreWaitlistsAllResponse(listings=waitlists)
@router.get(
"/waitlist/my-memberships",
summary="Get waitlist IDs the current user has joined",
tags=["store", "private"],
)
async def get_my_waitlist_memberships(
user_id: str = fastapi.Security(autogpt_libs.auth.get_user_id),
) -> list[str]:
"""Returns list of waitlist IDs the authenticated user has joined."""
return await store_db.get_user_waitlist_memberships(user_id)
@router.post(
path="/waitlist/{waitlist_id}/join",
summary="Add self to the agent waitlist",
tags=["store", "public"],
response_model=store_model.StoreWaitlistEntry,
)
async def add_self_to_waitlist(
user_id: str | None = fastapi.Security(get_optional_user_id),
waitlist_id: str = fastapi.Path(..., description="The ID of the waitlist to join"),
email: str | None = fastapi.Body(
default=None, embed=True, description="Email address for unauthenticated users"
),
):
"""
Add the current user to the agent waitlist.
"""
if not user_id and not email:
raise fastapi.HTTPException(
status_code=400,
detail="Either user authentication or email address is required",
)
try:
waitlist_entry = await store_db.add_user_to_waitlist(
waitlist_id=waitlist_id, user_id=user_id, email=email
)
return waitlist_entry
except ValueError as e:
error_msg = str(e)
if "not found" in error_msg:
raise fastapi.HTTPException(status_code=404, detail="Waitlist not found")
# Waitlist exists but is closed or unavailable
raise fastapi.HTTPException(status_code=400, detail=error_msg)
except Exception:
raise fastapi.HTTPException(
status_code=500, detail="An error occurred while joining the waitlist"
)
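For reference, an anonymous signup against the join route would look roughly like the following; the base URL is an assumption (it depends on where this router is mounted), and the waitlist ID is a placeholder:

```python
import httpx

BASE_URL = "http://localhost:8000"  # assumed deployment address

# Unauthenticated callers must supply an email in the request body.
resp = httpx.post(
    f"{BASE_URL}/waitlist/some-waitlist-id/join",
    json={"email": "alice@example.com"},
)
resp.raise_for_status()
entry = resp.json()  # StoreWaitlistEntry: waitlistId, slug, name, ...
```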
 ##############################################
 ############### Agent Endpoints ##############
 ##############################################
@@ -278,7 +347,7 @@ async def get_agent(
 )
 async def get_graph_meta_by_store_listing_version_id(
     store_listing_version_id: str,
-) -> backend.data.graph.GraphMeta:
+) -> backend.data.graph.GraphModelWithoutNodes:
     """
     Get Agent Graph from Store Listing Version ID.
     """


@@ -19,6 +19,7 @@ from prisma.errors import PrismaError
 import backend.api.features.admin.credit_admin_routes
 import backend.api.features.admin.execution_analytics_routes
 import backend.api.features.admin.store_admin_routes
+import backend.api.features.admin.waitlist_admin_routes
 import backend.api.features.builder
 import backend.api.features.builder.routes
 import backend.api.features.chat.routes as chat_routes
@@ -306,6 +307,11 @@ app.include_router(
     tags=["v2", "admin"],
     prefix="/api/store",
 )
+app.include_router(
+    backend.api.features.admin.waitlist_admin_routes.router,
+    tags=["v2", "admin"],
+    prefix="/api/store",
+)
 app.include_router(
     backend.api.features.admin.credit_admin_routes.router,
     tags=["v2", "admin"],


@@ -478,7 +478,7 @@ class ExaCreateOrFindWebsetBlock(Block):
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
         try:
-            webset = aexa.websets.get(id=input_data.external_id)
+            webset = await aexa.websets.get(id=input_data.external_id)
             webset_result = Webset.model_validate(webset.model_dump(by_alias=True))
             yield "webset", webset_result
@@ -494,7 +494,7 @@ class ExaCreateOrFindWebsetBlock(Block):
                 count=input_data.search_count,
             )
-            webset = aexa.websets.create(
+            webset = await aexa.websets.create(
                 params=CreateWebsetParameters(
                     search=search_params,
                     external_id=input_data.external_id,
@@ -554,7 +554,7 @@ class ExaUpdateWebsetBlock(Block):
         if input_data.metadata is not None:
             payload["metadata"] = input_data.metadata
-        sdk_webset = aexa.websets.update(id=input_data.webset_id, params=payload)
+        sdk_webset = await aexa.websets.update(id=input_data.webset_id, params=payload)
         status_str = (
             sdk_webset.status.value
@@ -617,7 +617,7 @@ class ExaListWebsetsBlock(Block):
     ) -> BlockOutput:
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        response = aexa.websets.list(
+        response = await aexa.websets.list(
             cursor=input_data.cursor,
             limit=input_data.limit,
         )
@@ -678,7 +678,7 @@ class ExaGetWebsetBlock(Block):
     ) -> BlockOutput:
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        sdk_webset = aexa.websets.get(id=input_data.webset_id)
+        sdk_webset = await aexa.websets.get(id=input_data.webset_id)
         status_str = (
             sdk_webset.status.value
@@ -748,7 +748,7 @@ class ExaDeleteWebsetBlock(Block):
     ) -> BlockOutput:
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        deleted_webset = aexa.websets.delete(id=input_data.webset_id)
+        deleted_webset = await aexa.websets.delete(id=input_data.webset_id)
         status_str = (
             deleted_webset.status.value
@@ -798,7 +798,7 @@ class ExaCancelWebsetBlock(Block):
     ) -> BlockOutput:
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        canceled_webset = aexa.websets.cancel(id=input_data.webset_id)
+        canceled_webset = await aexa.websets.cancel(id=input_data.webset_id)
         status_str = (
             canceled_webset.status.value
@@ -968,7 +968,7 @@ class ExaPreviewWebsetBlock(Block):
                 entity["description"] = input_data.entity_description
             payload["entity"] = entity
-        sdk_preview = aexa.websets.preview(params=payload)
+        sdk_preview = await aexa.websets.preview(params=payload)
         preview = PreviewWebsetModel.from_sdk(sdk_preview)
@@ -1051,7 +1051,7 @@ class ExaWebsetStatusBlock(Block):
     ) -> BlockOutput:
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        webset = aexa.websets.get(id=input_data.webset_id)
+        webset = await aexa.websets.get(id=input_data.webset_id)
         status = (
             webset.status.value
@@ -1185,7 +1185,7 @@ class ExaWebsetSummaryBlock(Block):
     ) -> BlockOutput:
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        webset = aexa.websets.get(id=input_data.webset_id)
+        webset = await aexa.websets.get(id=input_data.webset_id)
         # Extract basic info
         webset_id = webset.id
@@ -1211,7 +1211,7 @@ class ExaWebsetSummaryBlock(Block):
         total_items = 0
         if input_data.include_sample_items and input_data.sample_size > 0:
-            items_response = aexa.websets.items.list(
+            items_response = await aexa.websets.items.list(
                 webset_id=input_data.webset_id, limit=input_data.sample_size
             )
             sample_items_data = [
@@ -1362,7 +1362,7 @@ class ExaWebsetReadyCheckBlock(Block):
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
         # Get webset details
-        webset = aexa.websets.get(id=input_data.webset_id)
+        webset = await aexa.websets.get(id=input_data.webset_id)
         status = (
             webset.status.value


@@ -202,7 +202,7 @@ class ExaCreateEnrichmentBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        sdk_enrichment = aexa.websets.enrichments.create(
+        sdk_enrichment = await aexa.websets.enrichments.create(
             webset_id=input_data.webset_id, params=payload
         )
@@ -223,7 +223,7 @@ class ExaCreateEnrichmentBlock(Block):
             items_enriched = 0
             while time.time() - poll_start < input_data.polling_timeout:
-                current_enrich = aexa.websets.enrichments.get(
+                current_enrich = await aexa.websets.enrichments.get(
                     webset_id=input_data.webset_id, id=enrichment_id
                 )
                 current_status = (
@@ -234,7 +234,7 @@ class ExaCreateEnrichmentBlock(Block):
                 if current_status in ["completed", "failed", "cancelled"]:
                     # Estimate items from webset searches
-                    webset = aexa.websets.get(id=input_data.webset_id)
+                    webset = await aexa.websets.get(id=input_data.webset_id)
                     if webset.searches:
                         for search in webset.searches:
                             if search.progress:
@@ -329,7 +329,7 @@ class ExaGetEnrichmentBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        sdk_enrichment = aexa.websets.enrichments.get(
+        sdk_enrichment = await aexa.websets.enrichments.get(
             webset_id=input_data.webset_id, id=input_data.enrichment_id
         )
@@ -474,7 +474,7 @@ class ExaDeleteEnrichmentBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        deleted_enrichment = aexa.websets.enrichments.delete(
+        deleted_enrichment = await aexa.websets.enrichments.delete(
             webset_id=input_data.webset_id, id=input_data.enrichment_id
         )
@@ -525,13 +525,13 @@ class ExaCancelEnrichmentBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        canceled_enrichment = aexa.websets.enrichments.cancel(
+        canceled_enrichment = await aexa.websets.enrichments.cancel(
             webset_id=input_data.webset_id, id=input_data.enrichment_id
         )
         # Try to estimate how many items were enriched before cancellation
         items_enriched = 0
-        items_response = aexa.websets.items.list(
+        items_response = await aexa.websets.items.list(
             webset_id=input_data.webset_id, limit=100
         )


@@ -222,7 +222,7 @@ class ExaCreateImportBlock(Block):
     def _create_test_mock():
         """Create test mocks for the AsyncExa SDK."""
         from datetime import datetime
-        from unittest.mock import MagicMock
+        from unittest.mock import AsyncMock, MagicMock
         # Create mock SDK import object
         mock_import = MagicMock()
@@ -247,7 +247,7 @@ class ExaCreateImportBlock(Block):
         return {
             "_get_client": lambda *args, **kwargs: MagicMock(
                 websets=MagicMock(
-                    imports=MagicMock(create=lambda *args, **kwargs: mock_import)
+                    imports=MagicMock(create=AsyncMock(return_value=mock_import))
                 )
             )
         }
@@ -294,7 +294,7 @@ class ExaCreateImportBlock(Block):
         if input_data.metadata:
             payload["metadata"] = input_data.metadata
-        sdk_import = aexa.websets.imports.create(
+        sdk_import = await aexa.websets.imports.create(
             params=payload, csv_data=input_data.csv_data
         )
@@ -360,7 +360,7 @@ class ExaGetImportBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        sdk_import = aexa.websets.imports.get(import_id=input_data.import_id)
+        sdk_import = await aexa.websets.imports.get(import_id=input_data.import_id)
         import_obj = ImportModel.from_sdk(sdk_import)
@@ -426,7 +426,7 @@ class ExaListImportsBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        response = aexa.websets.imports.list(
+        response = await aexa.websets.imports.list(
             cursor=input_data.cursor,
             limit=input_data.limit,
         )
@@ -474,7 +474,9 @@ class ExaDeleteImportBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        deleted_import = aexa.websets.imports.delete(import_id=input_data.import_id)
+        deleted_import = await aexa.websets.imports.delete(
+            import_id=input_data.import_id
+        )
         yield "import_id", deleted_import.id
         yield "success", "true"
@@ -573,14 +575,14 @@ class ExaExportWebsetBlock(Block):
             }
         )
-        # Create mock iterator
-        mock_items = [mock_item1, mock_item2]
+        # Create async iterator for list_all
+        async def async_item_iterator(*args, **kwargs):
+            for item in [mock_item1, mock_item2]:
+                yield item
         return {
             "_get_client": lambda *args, **kwargs: MagicMock(
-                websets=MagicMock(
-                    items=MagicMock(list_all=lambda *args, **kwargs: iter(mock_items))
-                )
+                websets=MagicMock(items=MagicMock(list_all=async_item_iterator))
             )
         }
@@ -602,7 +604,7 @@ class ExaExportWebsetBlock(Block):
             webset_id=input_data.webset_id, limit=input_data.max_items
         )
-        for sdk_item in item_iterator:
+        async for sdk_item in item_iterator:
             if len(all_items) >= input_data.max_items:
                 break
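Two mock patterns recur in these test changes: methods that are now awaited get `AsyncMock(return_value=...)`, while `list_all`, which is consumed with `async for`, needs a real async generator (a plain `AsyncMock` is awaitable but not async-iterable). A condensed, self-contained illustration of both:

```python
import asyncio
from unittest.mock import AsyncMock, MagicMock

async def fake_list_all(*args, **kwargs):
    # Async generator: supports `async for`, unlike a bare AsyncMock.
    for item in ["a", "b"]:
        yield item

client = MagicMock(
    websets=MagicMock(
        imports=MagicMock(create=AsyncMock(return_value={"id": "import_1"})),
        items=MagicMock(list_all=fake_list_all),
    )
)

async def main():
    created = await client.websets.imports.create(params={})  # awaitable via AsyncMock
    assert created == {"id": "import_1"}
    items = [i async for i in client.websets.items.list_all()]
    assert items == ["a", "b"]

asyncio.run(main())
```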


@@ -178,7 +178,7 @@ class ExaGetWebsetItemBlock(Block):
     ) -> BlockOutput:
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        sdk_item = aexa.websets.items.get(
+        sdk_item = await aexa.websets.items.get(
             webset_id=input_data.webset_id, id=input_data.item_id
         )
@@ -269,7 +269,7 @@ class ExaListWebsetItemsBlock(Block):
             response = None
             while time.time() - start_time < input_data.wait_timeout:
-                response = aexa.websets.items.list(
+                response = await aexa.websets.items.list(
                     webset_id=input_data.webset_id,
                     cursor=input_data.cursor,
                     limit=input_data.limit,
@@ -282,13 +282,13 @@ class ExaListWebsetItemsBlock(Block):
                 interval = min(interval * 1.2, 10)
             if not response:
-                response = aexa.websets.items.list(
+                response = await aexa.websets.items.list(
                     webset_id=input_data.webset_id,
                     cursor=input_data.cursor,
                     limit=input_data.limit,
                 )
         else:
-            response = aexa.websets.items.list(
+            response = await aexa.websets.items.list(
                 webset_id=input_data.webset_id,
                 cursor=input_data.cursor,
                 limit=input_data.limit,
@@ -340,7 +340,7 @@ class ExaDeleteWebsetItemBlock(Block):
     ) -> BlockOutput:
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        deleted_item = aexa.websets.items.delete(
+        deleted_item = await aexa.websets.items.delete(
             webset_id=input_data.webset_id, id=input_data.item_id
         )
@@ -408,7 +408,7 @@ class ExaBulkWebsetItemsBlock(Block):
             webset_id=input_data.webset_id, limit=input_data.max_items
         )
-        for sdk_item in item_iterator:
+        async for sdk_item in item_iterator:
             if len(all_items) >= input_data.max_items:
                 break
@@ -475,7 +475,7 @@ class ExaWebsetItemsSummaryBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        webset = aexa.websets.get(id=input_data.webset_id)
+        webset = await aexa.websets.get(id=input_data.webset_id)
         entity_type = "unknown"
         if webset.searches:
@@ -495,7 +495,7 @@ class ExaWebsetItemsSummaryBlock(Block):
         # Get sample items if requested
         sample_items: List[WebsetItemModel] = []
         if input_data.sample_size > 0:
-            items_response = aexa.websets.items.list(
+            items_response = await aexa.websets.items.list(
                 webset_id=input_data.webset_id, limit=input_data.sample_size
             )
             # Convert to our stable models
@@ -569,7 +569,7 @@ class ExaGetNewItemsBlock(Block):
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
         # Get items starting from cursor
-        response = aexa.websets.items.list(
+        response = await aexa.websets.items.list(
             webset_id=input_data.webset_id,
             cursor=input_data.since_cursor,
             limit=input_data.max_items,
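The polling loop in `ExaListWebsetItemsBlock` grows its sleep interval by 20% per attempt, capped at 10 seconds. The general shape of that capped exponential backoff, as a standalone sketch:

```python
import asyncio
import time

async def poll_until(check, timeout: float = 60.0, interval: float = 1.0):
    """Call `check()` until it returns a truthy value or the timeout elapses."""
    start = time.time()
    while time.time() - start < timeout:
        result = await check()
        if result:
            return result
        await asyncio.sleep(interval)
        interval = min(interval * 1.2, 10)  # grow 20% per attempt, capped at 10s
    return None
```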


@@ -233,7 +233,7 @@ class ExaCreateMonitorBlock(Block):
     def _create_test_mock():
         """Create test mocks for the AsyncExa SDK."""
         from datetime import datetime
-        from unittest.mock import MagicMock
+        from unittest.mock import AsyncMock, MagicMock
         # Create mock SDK monitor object
         mock_monitor = MagicMock()
@@ -263,7 +263,7 @@ class ExaCreateMonitorBlock(Block):
         return {
             "_get_client": lambda *args, **kwargs: MagicMock(
                 websets=MagicMock(
-                    monitors=MagicMock(create=lambda *args, **kwargs: mock_monitor)
+                    monitors=MagicMock(create=AsyncMock(return_value=mock_monitor))
                 )
             )
         }
@@ -320,7 +320,7 @@ class ExaCreateMonitorBlock(Block):
         if input_data.metadata:
             payload["metadata"] = input_data.metadata
-        sdk_monitor = aexa.websets.monitors.create(params=payload)
+        sdk_monitor = await aexa.websets.monitors.create(params=payload)
         monitor = MonitorModel.from_sdk(sdk_monitor)
@@ -384,7 +384,7 @@ class ExaGetMonitorBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        sdk_monitor = aexa.websets.monitors.get(monitor_id=input_data.monitor_id)
+        sdk_monitor = await aexa.websets.monitors.get(monitor_id=input_data.monitor_id)
         monitor = MonitorModel.from_sdk(sdk_monitor)
@@ -476,7 +476,7 @@ class ExaUpdateMonitorBlock(Block):
         if input_data.metadata is not None:
             payload["metadata"] = input_data.metadata
-        sdk_monitor = aexa.websets.monitors.update(
+        sdk_monitor = await aexa.websets.monitors.update(
             monitor_id=input_data.monitor_id, params=payload
         )
@@ -522,7 +522,9 @@ class ExaDeleteMonitorBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        deleted_monitor = aexa.websets.monitors.delete(monitor_id=input_data.monitor_id)
+        deleted_monitor = await aexa.websets.monitors.delete(
+            monitor_id=input_data.monitor_id
+        )
         yield "monitor_id", deleted_monitor.id
         yield "success", "true"
@@ -579,7 +581,7 @@ class ExaListMonitorsBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        response = aexa.websets.monitors.list(
+        response = await aexa.websets.monitors.list(
             cursor=input_data.cursor,
             limit=input_data.limit,
             webset_id=input_data.webset_id,


@@ -121,7 +121,7 @@ class ExaWaitForWebsetBlock(Block):
             WebsetTargetStatus.IDLE,
             WebsetTargetStatus.ANY_COMPLETE,
         ]:
-            final_webset = aexa.websets.wait_until_idle(
+            final_webset = await aexa.websets.wait_until_idle(
                 id=input_data.webset_id,
                 timeout=input_data.timeout,
                 poll_interval=input_data.check_interval,
@@ -164,7 +164,7 @@ class ExaWaitForWebsetBlock(Block):
         interval = input_data.check_interval
         while time.time() - start_time < input_data.timeout:
             # Get current webset status
-            webset = aexa.websets.get(id=input_data.webset_id)
+            webset = await aexa.websets.get(id=input_data.webset_id)
             current_status = (
                 webset.status.value
                 if hasattr(webset.status, "value")
@@ -209,7 +209,7 @@ class ExaWaitForWebsetBlock(Block):
         # Timeout reached
         elapsed = time.time() - start_time
-        webset = aexa.websets.get(id=input_data.webset_id)
+        webset = await aexa.websets.get(id=input_data.webset_id)
         final_status = (
             webset.status.value
             if hasattr(webset.status, "value")
@@ -345,7 +345,7 @@ class ExaWaitForSearchBlock(Block):
         try:
             while time.time() - start_time < input_data.timeout:
                 # Get current search status using SDK
-                search = aexa.websets.searches.get(
+                search = await aexa.websets.searches.get(
                     webset_id=input_data.webset_id, id=input_data.search_id
                 )
@@ -401,7 +401,7 @@ class ExaWaitForSearchBlock(Block):
             elapsed = time.time() - start_time
             # Get last known status
-            search = aexa.websets.searches.get(
+            search = await aexa.websets.searches.get(
                 webset_id=input_data.webset_id, id=input_data.search_id
             )
             final_status = (
@@ -503,7 +503,7 @@ class ExaWaitForEnrichmentBlock(Block):
         try:
             while time.time() - start_time < input_data.timeout:
                 # Get current enrichment status using SDK
-                enrichment = aexa.websets.enrichments.get(
+                enrichment = await aexa.websets.enrichments.get(
                     webset_id=input_data.webset_id, id=input_data.enrichment_id
                 )
@@ -548,7 +548,7 @@ class ExaWaitForEnrichmentBlock(Block):
             elapsed = time.time() - start_time
             # Get last known status
-            enrichment = aexa.websets.enrichments.get(
+            enrichment = await aexa.websets.enrichments.get(
                 webset_id=input_data.webset_id, id=input_data.enrichment_id
             )
             final_status = (
@@ -575,7 +575,7 @@ class ExaWaitForEnrichmentBlock(Block):
         ) -> tuple[list[SampleEnrichmentModel], int]:
             """Get sample enriched data and count."""
             # Get a few items to see enrichment results using SDK
-            response = aexa.websets.items.list(webset_id=webset_id, limit=5)
+            response = await aexa.websets.items.list(webset_id=webset_id, limit=5)
             sample_data: list[SampleEnrichmentModel] = []
             enriched_count = 0


@@ -317,7 +317,7 @@ class ExaCreateWebsetSearchBlock(Block):
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        sdk_search = aexa.websets.searches.create(
+        sdk_search = await aexa.websets.searches.create(
             webset_id=input_data.webset_id, params=payload
         )
@@ -350,7 +350,7 @@ class ExaCreateWebsetSearchBlock(Block):
             poll_start = time.time()
             while time.time() - poll_start < input_data.polling_timeout:
-                current_search = aexa.websets.searches.get(
+                current_search = await aexa.websets.searches.get(
                     webset_id=input_data.webset_id, id=search_id
                 )
                 current_status = (
@@ -442,7 +442,7 @@ class ExaGetWebsetSearchBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        sdk_search = aexa.websets.searches.get(
+        sdk_search = await aexa.websets.searches.get(
             webset_id=input_data.webset_id, id=input_data.search_id
         )
@@ -523,7 +523,7 @@ class ExaCancelWebsetSearchBlock(Block):
         # Use AsyncExa SDK
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
-        canceled_search = aexa.websets.searches.cancel(
+        canceled_search = await aexa.websets.searches.cancel(
             webset_id=input_data.webset_id, id=input_data.search_id
         )
@@ -604,7 +604,7 @@ class ExaFindOrCreateSearchBlock(Block):
         aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())
         # Get webset to check existing searches
-        webset = aexa.websets.get(id=input_data.webset_id)
+        webset = await aexa.websets.get(id=input_data.webset_id)
         # Look for existing search with same query
         existing_search = None
@@ -636,7 +636,7 @@ class ExaFindOrCreateSearchBlock(Block):
         if input_data.entity_type != SearchEntityType.AUTO:
             payload["entity"] = {"type": input_data.entity_type.value}
-        sdk_search = aexa.websets.searches.create(
+        sdk_search = await aexa.websets.searches.create(
             webset_id=input_data.webset_id, params=payload
         )
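Taken together, the Exa block changes are a mechanical sync-to-async migration: every `AsyncExa` call gains `await`, and `list_all` iteration becomes `async for`. The distinction matters because calling an async method without `await` returns a coroutine object, so any downstream attribute access fails at runtime. Schematically (a sketch around the SDK calls shown above, not code from the repo):

```python
async def webset_status(aexa, webset_id: str) -> str:
    # Without `await`, `webset` would be a coroutine object and the
    # `webset.status` access below would raise AttributeError at runtime.
    webset = await aexa.websets.get(id=webset_id)
    return webset.status.value if hasattr(webset.status, "value") else str(webset.status)
```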


@@ -596,10 +596,10 @@ def extract_openai_tool_calls(response) -> list[ToolContentBlock] | None:
 def get_parallel_tool_calls_param(
     llm_model: LlmModel, parallel_tool_calls: bool | None
-):
+) -> bool | openai.Omit:
     """Get the appropriate parallel_tool_calls parameter for OpenAI-compatible APIs."""
     if llm_model.startswith("o") or parallel_tool_calls is None:
-        return openai.NOT_GIVEN
+        return openai.omit
     return parallel_tool_calls
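The move from `openai.NOT_GIVEN` to `openai.omit` also lets the helper carry a precise `bool | openai.Omit` annotation. A hedged usage sketch; the model identifiers are hypothetical, and only `str.startswith` behavior is relied on:

```python
import openai

# "o"-series reasoning models don't accept parallel_tool_calls, so the
# sentinel tells the SDK to leave the field out of the request entirely.
assert get_parallel_tool_calls_param("o3-mini", True) is openai.omit  # hypothetical id
assert get_parallel_tool_calls_param("gpt-4o", True) is True          # passed through
assert get_parallel_tool_calls_param("gpt-4o", None) is openai.omit   # unset: omitted
```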


@@ -246,7 +246,9 @@ class BlockSchema(BaseModel):
                         f"is not of type {CredentialsMetaInput.__name__}"
                     )
-                credentials_fields[field_name].validate_credentials_field_schema(cls)
+                CredentialsMetaInput.validate_credentials_field_schema(
+                    cls.get_field_schema(field_name), field_name
+                )
             elif field_name in credentials_fields:
                 raise KeyError(


@@ -1,9 +1,8 @@
 import logging
+import queue
 from collections import defaultdict
 from datetime import datetime, timedelta, timezone
 from enum import Enum
-from multiprocessing import Manager
-from queue import Empty
 from typing import (
     TYPE_CHECKING,
     Annotated,
@@ -1200,12 +1199,16 @@ class NodeExecutionEntry(BaseModel):
 class ExecutionQueue(Generic[T]):
     """
-    Queue for managing the execution of agents.
-    This will be shared between different processes
+    Thread-safe queue for managing node execution within a single graph execution.
+
+    Note: Uses queue.Queue (not multiprocessing.Queue) since all access is from
+    threads within the same process. If migrating back to ProcessPoolExecutor,
+    replace with multiprocessing.Manager().Queue() for cross-process safety.
     """

     def __init__(self):
-        self.queue = Manager().Queue()
+        # Thread-safe queue (not multiprocessing) — see class docstring
+        self.queue: queue.Queue[T] = queue.Queue()

     def add(self, execution: T) -> T:
         self.queue.put(execution)
@@ -1220,7 +1223,7 @@ class ExecutionQueue(Generic[T]):
     def get_or_none(self) -> T | None:
         try:
             return self.queue.get_nowait()
-        except Empty:
+        except queue.Empty:
             return None


@@ -0,0 +1,58 @@
"""Tests for ExecutionQueue thread-safety."""

import queue
import threading

from backend.data.execution import ExecutionQueue


def test_execution_queue_uses_stdlib_queue():
    """Verify ExecutionQueue uses queue.Queue (not multiprocessing)."""
    q = ExecutionQueue()
    assert isinstance(q.queue, queue.Queue)


def test_basic_operations():
    """Test add, get, empty, and get_or_none."""
    q = ExecutionQueue()
    assert q.empty() is True
    assert q.get_or_none() is None

    result = q.add("item1")
    assert result == "item1"
    assert q.empty() is False

    item = q.get()
    assert item == "item1"
    assert q.empty() is True


def test_thread_safety():
    """Test concurrent access from multiple threads."""
    q = ExecutionQueue()
    results = []
    num_items = 100

    def producer():
        for i in range(num_items):
            q.add(f"item_{i}")

    def consumer():
        count = 0
        while count < num_items:
            item = q.get_or_none()
            if item is not None:
                results.append(item)
                count += 1

    producer_thread = threading.Thread(target=producer)
    consumer_thread = threading.Thread(target=consumer)
    producer_thread.start()
    consumer_thread.start()
    producer_thread.join(timeout=5)
    consumer_thread.join(timeout=5)

    assert len(results) == num_items


@@ -3,7 +3,7 @@ import logging
 import uuid
 from collections import defaultdict
 from datetime import datetime, timezone
-from typing import TYPE_CHECKING, Annotated, Any, Literal, Optional, cast
+from typing import TYPE_CHECKING, Annotated, Any, Literal, Optional, Self, cast

 from prisma.enums import SubmissionStatus
 from prisma.models import (
@@ -20,7 +20,7 @@
     AgentNodeLinkCreateInput,
     StoreListingVersionWhereInput,
 )
-from pydantic import BaseModel, BeforeValidator, Field, create_model
+from pydantic import BaseModel, BeforeValidator, Field
 from pydantic.fields import computed_field

 from backend.blocks.agent import AgentExecutorBlock
@@ -30,7 +30,6 @@
 from backend.data.db import prisma as db
 from backend.data.dynamic_fields import is_tool_pin, sanitize_pin_name
 from backend.data.includes import MAX_GRAPH_VERSIONS_FETCH
 from backend.data.model import (
-    CredentialsField,
     CredentialsFieldInfo,
     CredentialsMetaInput,
     is_credentials_field_name,
@@ -45,7 +44,6 @@ from .block import (
     AnyBlockSchema,
     Block,
     BlockInput,
-    BlockSchema,
     BlockType,
     EmptySchema,
     get_block,
@@ -113,10 +111,12 @@ class Link(BaseDbModel):
 class Node(BaseDbModel):
     block_id: str
-    input_default: BlockInput = {}  # dict[input_name, default_value]
-    metadata: dict[str, Any] = {}
-    input_links: list[Link] = []
-    output_links: list[Link] = []
+    input_default: BlockInput = Field(  # dict[input_name, default_value]
+        default_factory=dict
+    )
+    metadata: dict[str, Any] = Field(default_factory=dict)
+    input_links: list[Link] = Field(default_factory=list)
+    output_links: list[Link] = Field(default_factory=list)

     @property
     def credentials_optional(self) -> bool:
@@ -221,18 +221,33 @@ class NodeModel(Node):
         return result


-class BaseGraph(BaseDbModel):
+class GraphBaseMeta(BaseDbModel):
+    """
+    Shared base for `GraphMeta` and `BaseGraph`, with core graph metadata fields.
+    """
+
     version: int = 1
     is_active: bool = True
     name: str
     description: str
     instructions: str | None = None
     recommended_schedule_cron: str | None = None
-    nodes: list[Node] = []
-    links: list[Link] = []
     forked_from_id: str | None = None
     forked_from_version: int | None = None

+
+class BaseGraph(GraphBaseMeta):
+    """
+    Graph with nodes, links, and computed I/O schema fields.
+
+    Used to represent sub-graphs within a `Graph`. Contains the full graph
+    structure including nodes and links, plus computed fields for schemas
+    and trigger info. Does NOT include user_id or created_at (see GraphModel).
+    """
+
+    nodes: list[Node] = Field(default_factory=list)
+    links: list[Link] = Field(default_factory=list)
+
     @computed_field
     @property
     def input_schema(self) -> dict[str, Any]:
@@ -361,44 +376,79 @@ class GraphTriggerInfo(BaseModel):

 class Graph(BaseGraph):
-    sub_graphs: list[BaseGraph] = []  # Flattened sub-graphs
+    """Creatable graph model used in API create/update endpoints."""
+
+    sub_graphs: list[BaseGraph] = Field(default_factory=list)  # Flattened sub-graphs
+
+
+class GraphMeta(GraphBaseMeta):
+    """
+    Lightweight graph metadata model representing an existing graph from the database,
+    for use in listings and summaries.
+
+    Lacks `GraphModel`'s nodes, links, and expensive computed fields.
+    Use for list endpoints where full graph data is not needed and performance matters.
+    """
+
+    id: str  # type: ignore
+    version: int  # type: ignore
+    user_id: str
+    created_at: datetime
+
+    @classmethod
+    def from_db(cls, graph: "AgentGraph") -> Self:
+        return cls(
+            id=graph.id,
+            version=graph.version,
+            is_active=graph.isActive,
+            name=graph.name or "",
+            description=graph.description or "",
+            instructions=graph.instructions,
+            recommended_schedule_cron=graph.recommendedScheduleCron,
+            forked_from_id=graph.forkedFromId,
+            forked_from_version=graph.forkedFromVersion,
+            user_id=graph.userId,
+            created_at=graph.createdAt,
+        )
+
+
+class GraphModel(Graph, GraphMeta):
+    """
+    Full graph model representing an existing graph from the database.
+
+    This is the primary model for working with persisted graphs. Includes all
+    graph data (nodes, links, sub_graphs) plus user ownership and timestamps.
+    Provides computed fields (input_schema, output_schema, etc.) used during
+    set-up (frontend) and execution (backend).
+
+    Inherits from:
+    - `Graph`: provides structure (nodes, links, sub_graphs) and computed schemas
+    - `GraphMeta`: provides user_id, created_at for database records
+    """
+
+    nodes: list[NodeModel] = Field(default_factory=list)  # type: ignore
+
+    @property
+    def starting_nodes(self) -> list[NodeModel]:
+        outbound_nodes = {link.sink_id for link in self.links}
+        input_nodes = {
+            node.id for node in self.nodes if node.block.block_type == BlockType.INPUT
+        }
+        return [
+            node
+            for node in self.nodes
+            if node.id not in outbound_nodes or node.id in input_nodes
+        ]
+
+    @property
+    def webhook_input_node(self) -> NodeModel | None:  # type: ignore
+        return cast(NodeModel, super().webhook_input_node)

     @computed_field
     @property
     def credentials_input_schema(self) -> dict[str, Any]:
-        schema = self._credentials_input_schema.jsonschema()
-
-        # Determine which credential fields are required based on credentials_optional metadata
         graph_credentials_inputs = self.aggregate_credentials_inputs()
-        required_fields = []
-
-        # Build a map of node_id -> node for quick lookup
-        all_nodes = {node.id: node for node in self.nodes}
-        for sub_graph in self.sub_graphs:
-            for node in sub_graph.nodes:
-                all_nodes[node.id] = node
-
-        for field_key, (
-            _field_info,
-            node_field_pairs,
-        ) in graph_credentials_inputs.items():
-            # A field is required if ANY node using it has credentials_optional=False
-            is_required = False
-            for node_id, _field_name in node_field_pairs:
-                node = all_nodes.get(node_id)
-                if node and not node.credentials_optional:
-                    is_required = True
-                    break
-            if is_required:
-                required_fields.append(field_key)
-
-        schema["required"] = required_fields
-        return schema
-
-    @property
-    def _credentials_input_schema(self) -> type[BlockSchema]:
-        graph_credentials_inputs = self.aggregate_credentials_inputs()
         logger.debug(
             f"Combined credentials input fields for graph #{self.id} ({self.name}): "
             f"{graph_credentials_inputs}"
@@ -406,8 +456,8 @@
         # Warn if same-provider credentials inputs can't be combined (= bad UX)
         graph_cred_fields = list(graph_credentials_inputs.values())
-        for i, (field, keys) in enumerate(graph_cred_fields):
-            for other_field, other_keys in list(graph_cred_fields)[i + 1 :]:
+        for i, (field, keys, _) in enumerate(graph_cred_fields):
+            for other_field, other_keys, _ in list(graph_cred_fields)[i + 1 :]:
                 if field.provider != other_field.provider:
                     continue
                 if ProviderName.HTTP in field.provider:
@@ -423,31 +473,78 @@
                         f"keys: {keys} <> {other_keys}."
                     )

-        fields: dict[str, tuple[type[CredentialsMetaInput], CredentialsMetaInput]] = {
-            agg_field_key: (
-                CredentialsMetaInput[
-                    Literal[tuple(field_info.provider)],  # type: ignore
-                    Literal[tuple(field_info.supported_types)],  # type: ignore
-                ],
-                CredentialsField(
-                    required_scopes=set(field_info.required_scopes or []),
-                    discriminator=field_info.discriminator,
-                    discriminator_mapping=field_info.discriminator_mapping,
-                    discriminator_values=field_info.discriminator_values,
-                ),
-            )
-            for agg_field_key, (field_info, _) in graph_credentials_inputs.items()
-        }
-
-        return create_model(
-            self.name.replace(" ", "") + "CredentialsInputSchema",
-            __base__=BlockSchema,
-            **fields,  # type: ignore
-        )
+        # Build JSON schema directly to avoid expensive create_model + validation overhead
+        properties = {}
+        required_fields = []
+
+        for agg_field_key, (
+            field_info,
+            _,
+            is_required,
+        ) in graph_credentials_inputs.items():
+            providers = list(field_info.provider)
+            cred_types = list(field_info.supported_types)
+            field_schema: dict[str, Any] = {
+                "credentials_provider": providers,
+                "credentials_types": cred_types,
+                "type": "object",
+                "properties": {
+                    "id": {"title": "Id", "type": "string"},
+                    "title": {
+                        "anyOf": [{"type": "string"}, {"type": "null"}],
+                        "default": None,
+                        "title": "Title",
+                    },
+                    "provider": {
+                        "title": "Provider",
+                        "type": "string",
+                        **(
+                            {"enum": providers}
+                            if len(providers) > 1
+                            else {"const": providers[0]}
+                        ),
+                    },
+                    "type": {
+                        "title": "Type",
+                        "type": "string",
+                        **(
+                            {"enum": cred_types}
+                            if len(cred_types) > 1
+                            else {"const": cred_types[0]}
+                        ),
+                    },
+                },
+                "required": ["id", "provider", "type"],
+            }
+            # Add other (optional) field info items
+            field_schema.update(
+                field_info.model_dump(
+                    by_alias=True,
+                    exclude_defaults=True,
+                    exclude={"provider", "supported_types"},  # already included above
+                )
+            )
+            # Ensure field schema is well-formed
+            CredentialsMetaInput.validate_credentials_field_schema(
+                field_schema, agg_field_key
+            )
+            properties[agg_field_key] = field_schema
+            if is_required:
+                required_fields.append(agg_field_key)
+
+        return {
+            "type": "object",
+            "properties": properties,
+            "required": required_fields,
+        }

     def aggregate_credentials_inputs(
         self,
-    ) -> dict[str, tuple[CredentialsFieldInfo, set[tuple[str, str]]]]:
+    ) -> dict[str, tuple[CredentialsFieldInfo, set[tuple[str, str]], bool]]:
         """
         Returns:
             dict[aggregated_field_key, tuple(
@@ -455,13 +552,19 @@
                 (now includes discriminator_values from matching nodes)
                 set[(node_id, field_name)]: Node credentials fields that are
                     compatible with this aggregated field spec
+                bool: True if the field is required (any node has credentials_optional=False)
             )]
         """
         # First collect all credential field data with input defaults
-        node_credential_data = []
+        # Track (field_info, (node_id, field_name), is_required) for each credential field
+        node_credential_data: list[tuple[CredentialsFieldInfo, tuple[str, str]]] = []
+        node_required_map: dict[str, bool] = {}  # node_id -> is_required
         for graph in [self] + self.sub_graphs:
             for node in graph.nodes:
+                # Track if this node requires credentials (credentials_optional=False means required)
+                node_required_map[node.id] = not node.credentials_optional
                 for (
                     field_name,
                     field_info,
@@ -485,37 +588,21 @@
                 )

         # Combine credential field info (this will merge discriminator_values automatically)
-        return CredentialsFieldInfo.combine(*node_credential_data)
-
-
-class GraphModel(Graph):
-    user_id: str
-    nodes: list[NodeModel] = []  # type: ignore
-
-    created_at: datetime
-
-    @property
-    def starting_nodes(self) -> list[NodeModel]:
-        outbound_nodes = {link.sink_id for link in self.links}
-        input_nodes = {
-            node.id for node in self.nodes if node.block.block_type == BlockType.INPUT
-        }
-        return [
-            node
-            for node in self.nodes
-            if node.id not in outbound_nodes or node.id in input_nodes
-        ]
-
-    @property
-    def webhook_input_node(self) -> NodeModel | None:  # type: ignore
-        return cast(NodeModel, super().webhook_input_node)
-
-    def meta(self) -> "GraphMeta":
-        """
-        Returns a GraphMeta object with metadata about the graph.
-        This is used to return metadata about the graph without exposing nodes and links.
-        """
-        return GraphMeta.from_graph(self)
+        combined = CredentialsFieldInfo.combine(*node_credential_data)
+
+        # Add is_required flag to each aggregated field
+        # A field is required if ANY node using it has credentials_optional=False
+        return {
+            key: (
+                field_info,
+                node_field_pairs,
+                any(
+                    node_required_map.get(node_id, True)
+                    for node_id, _ in node_field_pairs
+                ),
+            )
+            for key, (field_info, node_field_pairs) in combined.items()
+        }

     def reassign_ids(self, user_id: str, reassign_graph_id: bool = False):
         """
@@ -799,13 +886,14 @@ class GraphModel(Graph):
             if is_static_output_block(link.source_id):
                 link.is_static = True  # Each value block output should be static.

-    @staticmethod
-    def from_db(
+    @classmethod
+    def from_db(  # type: ignore[reportIncompatibleMethodOverride]
+        cls,
         graph: AgentGraph,
         for_export: bool = False,
         sub_graphs: list[AgentGraph] | None = None,
-    ) -> "GraphModel":
-        return GraphModel(
+    ) -> Self:
+        return cls(
             id=graph.id,
             user_id=graph.userId if not for_export else "",
             version=graph.version,
@@ -831,17 +919,28 @@ class GraphModel(Graph):
         ],
     )

-
-class GraphMeta(Graph):
-    user_id: str
-
-    # Easy work-around to prevent exposing nodes and links in the API response
-    nodes: list[NodeModel] = Field(default=[], exclude=True)  # type: ignore
-    links: list[Link] = Field(default=[], exclude=True)
-
-    @staticmethod
-    def from_graph(graph: GraphModel) -> "GraphMeta":
-        return GraphMeta(**graph.model_dump())
+    def hide_nodes(self) -> "GraphModelWithoutNodes":
+        """
+        Returns a copy of the `GraphModel` with nodes, links, and sub-graphs hidden
+        (excluded from serialization). They are still present in the model instance
+        so all computed fields (e.g. `credentials_input_schema`) still work.
+        """
+        return GraphModelWithoutNodes.model_validate(self, from_attributes=True)
+
+
+class GraphModelWithoutNodes(GraphModel):
+    """
+    GraphModel variant that excludes nodes, links, and sub-graphs from serialization.
+
+    Used in contexts like the store where exposing internal graph structure
+    is not desired. Inherits all computed fields from GraphModel but marks
+    nodes and links as excluded from JSON output.
+    """
+
+    nodes: list[NodeModel] = Field(default_factory=list, exclude=True)
+    links: list[Link] = Field(default_factory=list, exclude=True)
+    sub_graphs: list[BaseGraph] = Field(default_factory=list, exclude=True)


 class GraphsPaginated(BaseModel):
@@ -912,21 +1011,11 @@ async def list_graphs_paginated(
         where=where_clause,
         distinct=["id"],
         order={"version": "desc"},
-        include=AGENT_GRAPH_INCLUDE,
         skip=offset,
         take=page_size,
     )
-    graph_models: list[GraphMeta] = []
-    for graph in graphs:
-        try:
-            graph_meta = GraphModel.from_db(graph).meta()
-            # Trigger serialization to validate that the graph is well formed
-            graph_meta.model_dump()
-            graph_models.append(graph_meta)
-        except Exception as e:
-            logger.error(f"Error processing graph {graph.id}: {e}")
-            continue
+    graph_models = [GraphMeta.from_db(graph) for graph in graphs]

     return GraphsPaginated(
         graphs=graph_models,
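For reference, the dict returned by the new `credentials_input_schema` has the same shape the old `create_model`-based version produced via `jsonschema()`. A hand-written illustration with a single assumed provider and credential type (names invented, not taken from this diff):

```python
example_credentials_input_schema = {
    "type": "object",
    "properties": {
        "github_credentials": {
            "credentials_provider": ["github"],
            "credentials_types": ["api_key"],
            "type": "object",
            "properties": {
                "id": {"title": "Id", "type": "string"},
                "title": {
                    "anyOf": [{"type": "string"}, {"type": "null"}],
                    "default": None,
                    "title": "Title",
                },
                # Single provider/type, so "const" is emitted instead of "enum"
                "provider": {"title": "Provider", "type": "string", "const": "github"},
                "type": {"title": "Type", "type": "string", "const": "api_key"},
            },
            "required": ["id", "provider", "type"],
        }
    },
    # Listed because some node using this field has credentials_optional=False
    "required": ["github_credentials"],
}
```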


@@ -163,7 +163,6 @@ class User(BaseModel):
 if TYPE_CHECKING:
     from prisma.models import User as PrismaUser

-    from backend.data.block import BlockSchema

 T = TypeVar("T")

 logger = logging.getLogger(__name__)
@@ -508,15 +507,13 @@ class CredentialsMetaInput(BaseModel, Generic[CP, CT]):
     def allowed_cred_types(cls) -> tuple[CredentialsType, ...]:
         return get_args(cls.model_fields["type"].annotation)

-    @classmethod
-    def validate_credentials_field_schema(cls, model: type["BlockSchema"]):
+    @staticmethod
+    def validate_credentials_field_schema(
+        field_schema: dict[str, Any], field_name: str
+    ):
         """Validates the schema of a credentials input field"""
-        field_name = next(
-            name for name, type in model.get_credentials_fields().items() if type is cls
-        )
-        field_schema = model.jsonschema()["properties"][field_name]
         try:
-            schema_extra = CredentialsFieldInfo[CP, CT].model_validate(field_schema)
+            field_info = CredentialsFieldInfo[CP, CT].model_validate(field_schema)
         except ValidationError as e:
             if "Field required [type=missing" not in str(e):
                 raise
@@ -526,11 +523,11 @@ class CredentialsMetaInput(BaseModel, Generic[CP, CT]):
                 f"{field_schema}"
             ) from e

-        providers = cls.allowed_providers()
+        providers = field_info.provider
         if (
             providers is not None
             and len(providers) > 1
-            and not schema_extra.discriminator
+            and not field_info.discriminator
         ):
             raise TypeError(
                 f"Multi-provider CredentialsField '{field_name}' "


@@ -211,6 +211,22 @@ class AgentRejectionData(BaseNotificationData):
         return value


+class WaitlistLaunchData(BaseNotificationData):
+    """Notification data for when an agent from a waitlist is launched."""
+
+    agent_name: str
+    waitlist_name: str
+    store_url: str
+    launched_at: datetime
+
+    @field_validator("launched_at")
+    @classmethod
+    def validate_timezone(cls, value: datetime):
+        if value.tzinfo is None:
+            raise ValueError("datetime must have timezone information")
+        return value
+
+
 NotificationData = Annotated[
     Union[
         AgentRunData,
@@ -223,6 +239,7 @@ NotificationData = Annotated[
         DailySummaryData,
         RefundRequestData,
         BaseSummaryData,
+        WaitlistLaunchData,
     ],
     Field(discriminator="type"),
 ]
@@ -273,6 +290,7 @@ def get_notif_data_type(
         NotificationType.REFUND_PROCESSED: RefundRequestData,
         NotificationType.AGENT_APPROVED: AgentApprovalData,
         NotificationType.AGENT_REJECTED: AgentRejectionData,
+        NotificationType.WAITLIST_LAUNCH: WaitlistLaunchData,
     }[notification_type]
@@ -318,6 +336,7 @@ class NotificationTypeOverride:
             NotificationType.REFUND_PROCESSED: QueueType.ADMIN,
             NotificationType.AGENT_APPROVED: QueueType.IMMEDIATE,
             NotificationType.AGENT_REJECTED: QueueType.IMMEDIATE,
+            NotificationType.WAITLIST_LAUNCH: QueueType.IMMEDIATE,
         }
         return BATCHING_RULES.get(self.notification_type, QueueType.IMMEDIATE)
@@ -337,6 +356,7 @@ class NotificationTypeOverride:
             NotificationType.REFUND_PROCESSED: "refund_processed.html",
             NotificationType.AGENT_APPROVED: "agent_approved.html",
             NotificationType.AGENT_REJECTED: "agent_rejected.html",
+            NotificationType.WAITLIST_LAUNCH: "waitlist_launch.html",
         }[self.notification_type]

     @property
@@ -354,6 +374,7 @@ class NotificationTypeOverride:
             NotificationType.REFUND_PROCESSED: "Refund for ${{data.amount / 100}} to {{data.user_name}} has been processed",
             NotificationType.AGENT_APPROVED: "🎉 Your agent '{{data.agent_name}}' has been approved!",
             NotificationType.AGENT_REJECTED: "Your agent '{{data.agent_name}}' needs some updates",
+            NotificationType.WAITLIST_LAUNCH: "🚀 {{data.agent_name}} is now available!",
         }[self.notification_type]
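A construction sketch for the new payload (field values invented; any discriminator plumbing inherited from `BaseNotificationData` is omitted). The validator requires a timezone-aware `launched_at`:

```python
from datetime import datetime, timezone

data = WaitlistLaunchData(
    agent_name="SEO Analysis Agent",
    waitlist_name="SEO Analysis Agent",
    store_url="https://example.com/store/seo-analysis-agent",
    launched_at=datetime.now(timezone.utc),  # a naive datetime raises ValueError
)
```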


@@ -373,7 +373,7 @@ def make_node_credentials_input_map(
     # Get aggregated credentials fields for the graph
     graph_cred_inputs = graph.aggregate_credentials_inputs()

-    for graph_input_name, (_, compatible_node_fields) in graph_cred_inputs.items():
+    for graph_input_name, (_, compatible_node_fields, _) in graph_cred_inputs.items():
         # Best-effort map: skip missing items
         if graph_input_name not in graph_credentials_input:
             continue


@@ -0,0 +1,59 @@
{# Waitlist Launch Notification Email Template #}
{#
Template variables:
data.agent_name: the name of the launched agent
data.waitlist_name: the name of the waitlist the user joined
data.store_url: URL to view the agent in the store
data.launched_at: when the agent was launched
Subject: {{ data.agent_name }} is now available!
#}
{% block content %}
<h1 style="color: #7c3aed; font-size: 32px; font-weight: 700; margin: 0 0 24px 0; text-align: center;">
The wait is over!
</h1>
<p style="color: #586069; font-size: 18px; text-align: center; margin: 0 0 24px 0;">
<strong>'{{ data.agent_name }}'</strong> is now live in the AutoGPT Store!
</p>
<div style="height: 32px; background: transparent;"></div>
<div style="background: #f3e8ff; border: 1px solid #d8b4fe; border-radius: 8px; padding: 20px; margin: 0;">
<h3 style="color: #6b21a8; font-size: 16px; font-weight: 600; margin: 0 0 12px 0;">
You're one of the first to know!
</h3>
<p style="color: #6b21a8; margin: 0; font-size: 16px; line-height: 1.5;">
You signed up for the <strong>{{ data.waitlist_name }}</strong> waitlist, and we're excited to let you know that this agent is now ready for you to use.
</p>
</div>
<div style="height: 32px; background: transparent;"></div>
<div style="text-align: center; margin: 24px 0;">
<a href="{{ data.store_url }}" style="display: inline-block; background: linear-gradient(135deg, #7c3aed 0%, #5b21b6 100%); color: white; text-decoration: none; padding: 14px 28px; border-radius: 6px; font-weight: 600; font-size: 16px;">
Get {{ data.agent_name }} Now
</a>
</div>
<div style="height: 32px; background: transparent;"></div>
<div style="background: #d1ecf1; border: 1px solid #bee5eb; border-radius: 8px; padding: 20px; margin: 0;">
<h3 style="color: #0c5460; font-size: 16px; font-weight: 600; margin: 0 0 12px 0;">
What can you do now?
</h3>
<ul style="color: #0c5460; margin: 0; padding-left: 18px; font-size: 16px; line-height: 1.6;">
<li>Visit the store to learn more about what this agent can do</li>
<li>Install and start using the agent right away</li>
<li>Share it with others who might find it useful</li>
</ul>
</div>
<div style="height: 32px; background: transparent;"></div>
<p style="color: #6a737d; font-size: 14px; text-align: center; margin: 24px 0;">
Thank you for helping us prioritize what to build! Your interest made this happen.
</p>
{% endblock %}


@@ -342,6 +342,14 @@ async def store_media_file(
         if not target_path.is_file():
             raise ValueError(f"Local file does not exist: {target_path}")

+        # Virus scan the local file before any further processing
+        local_content = target_path.read_bytes()
+        if len(local_content) > MAX_FILE_SIZE_BYTES:
+            raise ValueError(
+                f"File too large: {len(local_content)} bytes > {MAX_FILE_SIZE_BYTES} bytes"
+            )
+        await scan_content_safe(local_content, filename=sanitized_file)
+
         # Return based on requested format
         if return_format == "for_local_processing":
             # Use when processing files locally with tools like ffmpeg, MoviePy, PIL


@@ -247,3 +247,100 @@ class TestFileCloudIntegration:
                 execution_context=make_test_context(graph_exec_id=graph_exec_id),
                 return_format="for_local_processing",
             )
+
+    @pytest.mark.asyncio
+    async def test_store_media_file_local_path_scanned(self):
+        """Test that local file paths are scanned for viruses."""
+        graph_exec_id = "test-exec-123"
+        local_file = "test_video.mp4"
+        file_content = b"fake video content"
+
+        with patch(
+            "backend.util.file.get_cloud_storage_handler"
+        ) as mock_handler_getter, patch(
+            "backend.util.file.scan_content_safe"
+        ) as mock_scan, patch(
+            "backend.util.file.Path"
+        ) as mock_path_class:
+            # Mock cloud storage handler - not a cloud path
+            mock_handler = MagicMock()
+            mock_handler.is_cloud_path.return_value = False
+            mock_handler_getter.return_value = mock_handler
+
+            # Mock virus scanner
+            mock_scan.return_value = None
+
+            # Mock file system operations
+            mock_base_path = MagicMock()
+            mock_target_path = MagicMock()
+            mock_resolved_path = MagicMock()
+            mock_path_class.return_value = mock_base_path
+            mock_base_path.mkdir = MagicMock()
+            mock_base_path.__truediv__ = MagicMock(return_value=mock_target_path)
+            mock_target_path.resolve.return_value = mock_resolved_path
+            mock_resolved_path.is_relative_to.return_value = True
+            mock_resolved_path.is_file.return_value = True
+            mock_resolved_path.read_bytes.return_value = file_content
+            mock_resolved_path.relative_to.return_value = Path(local_file)
+            mock_resolved_path.name = local_file
+
+            result = await store_media_file(
+                file=MediaFileType(local_file),
+                execution_context=make_test_context(graph_exec_id=graph_exec_id),
+                return_format="for_local_processing",
+            )
+
+            # Verify virus scan was called for local file
+            mock_scan.assert_called_once_with(file_content, filename=local_file)
+            # Result should be the relative path
+            assert str(result) == local_file
+
+    @pytest.mark.asyncio
+    async def test_store_media_file_local_path_virus_detected(self):
+        """Test that infected local files raise VirusDetectedError."""
+        from backend.api.features.store.exceptions import VirusDetectedError
+
+        graph_exec_id = "test-exec-123"
+        local_file = "infected.exe"
+        file_content = b"malicious content"
+
+        with patch(
+            "backend.util.file.get_cloud_storage_handler"
+        ) as mock_handler_getter, patch(
+            "backend.util.file.scan_content_safe"
+        ) as mock_scan, patch(
+            "backend.util.file.Path"
+        ) as mock_path_class:
+            # Mock cloud storage handler - not a cloud path
+            mock_handler = MagicMock()
+            mock_handler.is_cloud_path.return_value = False
+            mock_handler_getter.return_value = mock_handler
+
+            # Mock virus scanner to detect virus
+            mock_scan.side_effect = VirusDetectedError(
+                "EICAR-Test-File", "File rejected due to virus detection"
+            )
+
+            # Mock file system operations
+            mock_base_path = MagicMock()
+            mock_target_path = MagicMock()
+            mock_resolved_path = MagicMock()
+            mock_path_class.return_value = mock_base_path
+            mock_base_path.mkdir = MagicMock()
+            mock_base_path.__truediv__ = MagicMock(return_value=mock_target_path)
+            mock_target_path.resolve.return_value = mock_resolved_path
+            mock_resolved_path.is_relative_to.return_value = True
+            mock_resolved_path.is_file.return_value = True
+            mock_resolved_path.read_bytes.return_value = file_content
+
+            with pytest.raises(VirusDetectedError):
+                await store_media_file(
+                    file=MediaFileType(local_file),
+                    execution_context=make_test_context(graph_exec_id=graph_exec_id),
+                    return_format="for_local_processing",
+                )


@@ -188,6 +188,7 @@ class WorkspaceManager:
                 f"{Config().max_file_size_mb}MB limit"
             )

+        # Virus scan content before persisting (defense in depth)
         await scan_content_safe(content, filename=filename)

         # Determine path with session scoping


@@ -0,0 +1,53 @@
-- CreateEnum
CREATE TYPE "WaitlistExternalStatus" AS ENUM ('DONE', 'NOT_STARTED', 'CANCELED', 'WORK_IN_PROGRESS');
-- AlterEnum
ALTER TYPE "NotificationType" ADD VALUE 'WAITLIST_LAUNCH';
-- CreateTable
CREATE TABLE "WaitlistEntry" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"storeListingId" TEXT,
"owningUserId" TEXT NOT NULL,
"slug" TEXT NOT NULL,
"search" tsvector DEFAULT ''::tsvector,
"name" TEXT NOT NULL,
"subHeading" TEXT NOT NULL,
"videoUrl" TEXT,
"agentOutputDemoUrl" TEXT,
"imageUrls" TEXT[],
"description" TEXT NOT NULL,
"categories" TEXT[],
"status" "WaitlistExternalStatus" NOT NULL DEFAULT 'NOT_STARTED',
"votes" INTEGER NOT NULL DEFAULT 0,
"unaffiliatedEmailUsers" TEXT[] DEFAULT ARRAY[]::TEXT[],
"isDeleted" BOOLEAN NOT NULL DEFAULT false,
CONSTRAINT "WaitlistEntry_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "_joinedWaitlists" (
"A" TEXT NOT NULL,
"B" TEXT NOT NULL
);
-- CreateIndex
CREATE UNIQUE INDEX "_joinedWaitlists_AB_unique" ON "_joinedWaitlists"("A", "B");
-- CreateIndex
CREATE INDEX "_joinedWaitlists_B_index" ON "_joinedWaitlists"("B");
-- AddForeignKey
ALTER TABLE "WaitlistEntry" ADD CONSTRAINT "WaitlistEntry_storeListingId_fkey" FOREIGN KEY ("storeListingId") REFERENCES "StoreListing"("id") ON DELETE SET NULL ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "WaitlistEntry" ADD CONSTRAINT "WaitlistEntry_owningUserId_fkey" FOREIGN KEY ("owningUserId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "_joinedWaitlists" ADD CONSTRAINT "_joinedWaitlists_A_fkey" FOREIGN KEY ("A") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "_joinedWaitlists" ADD CONSTRAINT "_joinedWaitlists_B_fkey" FOREIGN KEY ("B") REFERENCES "WaitlistEntry"("id") ON DELETE CASCADE ON UPDATE CASCADE;

File diff suppressed because it is too large


@@ -17,11 +17,11 @@ apscheduler = "^3.11.1"
 autogpt-libs = { path = "../autogpt_libs", develop = true }
 bleach = { extras = ["css"], version = "^6.2.0" }
 click = "^8.2.0"
-cryptography = "^45.0"
+cryptography = "^46.0"
 discord-py = "^2.5.2"
 e2b-code-interpreter = "^1.5.2"
 elevenlabs = "^1.50.0"
-fastapi = "^0.116.1"
+fastapi = "^0.128.0"
 feedparser = "^6.0.11"
 flake8 = "^7.3.0"
 google-api-python-client = "^2.177.0"
@@ -35,7 +35,7 @@ jinja2 = "^3.1.6"
 jsonref = "^1.1.0"
 jsonschema = "^4.25.0"
 langfuse = "^3.11.0"
-launchdarkly-server-sdk = "^9.12.0"
+launchdarkly-server-sdk = "^9.14.1"
 mem0ai = "^0.1.115"
 moviepy = "^2.1.2"
 ollama = "^0.5.1"
@@ -52,8 +52,8 @@ prometheus-client = "^0.22.1"
 prometheus-fastapi-instrumentator = "^7.0.0"
 psutil = "^7.0.0"
 psycopg2-binary = "^2.9.10"
-pydantic = { extras = ["email"], version = "^2.11.7" }
-pydantic-settings = "^2.10.1"
+pydantic = { extras = ["email"], version = "^2.12.5" }
+pydantic-settings = "^2.12.0"
 pytest = "^8.4.1"
 pytest-asyncio = "^1.1.0"
 python-dotenv = "^1.1.1"
@@ -65,11 +65,11 @@ sentry-sdk = {extras = ["anthropic", "fastapi", "launchdarkly", "openai", "sqlal
 sqlalchemy = "^2.0.40"
 strenum = "^0.4.9"
 stripe = "^11.5.0"
-supabase = "2.17.0"
+supabase = "2.27.2"
 tenacity = "^9.1.2"
 todoist-api-python = "^2.1.7"
 tweepy = "^4.16.0"
-uvicorn = { extras = ["standard"], version = "^0.35.0" }
+uvicorn = { extras = ["standard"], version = "^0.40.0" }
 websockets = "^15.0"
 youtube-transcript-api = "^1.2.1"
 yt-dlp = "2025.12.08"


@@ -70,6 +70,10 @@ model User {
   OAuthAuthorizationCodes OAuthAuthorizationCode[]
   OAuthAccessTokens       OAuthAccessToken[]
   OAuthRefreshTokens      OAuthRefreshToken[]
+
+  // Waitlist relations
+  waitlistEntries WaitlistEntry[]
+  joinedWaitlists WaitlistEntry[] @relation("joinedWaitlists")
 }

 enum OnboardingStep {
@@ -344,6 +348,7 @@ enum NotificationType {
   REFUND_PROCESSED
   AGENT_APPROVED
   AGENT_REJECTED
+  WAITLIST_LAUNCH
 }

 model NotificationEvent {
@@ -948,7 +953,8 @@ model StoreListing {
   OwningUser User @relation(fields: [owningUserId], references: [id])

   // Relations
   Versions StoreListingVersion[] @relation("ListingVersions")
+  waitlistEntries WaitlistEntry[]

   // Unique index on agentId to ensure only one listing per agent, regardless of number of versions the agent has.
   @@unique([agentGraphId])
@@ -1080,6 +1086,47 @@ model StoreListingReview {
   @@index([reviewByUserId])
 }

+enum WaitlistExternalStatus {
+  DONE
+  NOT_STARTED
+  CANCELED
+  WORK_IN_PROGRESS
+}
+
+model WaitlistEntry {
+  id        String   @id @default(uuid())
+  createdAt DateTime @default(now())
+  updatedAt DateTime @updatedAt
+
+  storeListingId String?
+  StoreListing   StoreListing? @relation(fields: [storeListingId], references: [id], onDelete: SetNull)
+
+  owningUserId String
+  OwningUser   User   @relation(fields: [owningUserId], references: [id])
+
+  slug   String
+  search Unsupported("tsvector")? @default(dbgenerated("''::tsvector"))
+
+  // Content fields
+  name               String
+  subHeading         String
+  videoUrl           String?
+  agentOutputDemoUrl String?
+  imageUrls          String[]
+  description        String
+  categories         String[]
+
+  // Waitlist specific fields
+  status WaitlistExternalStatus @default(NOT_STARTED)
+  votes  Int                    @default(0) // Hide from frontend api
+
+  joinedUsers User[] @relation("joinedWaitlists")
+  // NOTE: DO NOT DOUBLE SEND TO THESE USERS, IF THEY HAVE SIGNED UP SINCE THEY MAY HAVE ALREADY RECEIVED AN EMAIL
+  // DOUBLE CHECK WHEN SENDING THAT THEY ARE NOT IN THE JOINED USERS LIST ALSO
+  unaffiliatedEmailUsers String[] @default([])
+
+  isDeleted Boolean @default(false)
+}
+
 enum SubmissionStatus {
   DRAFT // Being prepared, not yet submitted
   PENDING // Submitted, awaiting review
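A usage sketch against the new model (Prisma Python client; the accessor and query shape are assumptions, not taken from this diff):

```python
from prisma import Prisma


async def count_waitlist_signups(db: Prisma, waitlist_id: str) -> int:
    # Hypothetical helper: joinedUsers comes from the implicit M2M relation,
    # email-only signups live in unaffiliatedEmailUsers.
    entry = await db.waitlistentry.find_unique(
        where={"id": waitlist_id},
        include={"joinedUsers": True},
    )
    if entry is None or entry.isDeleted:
        return 0
    # Per the NOTE in the model, dedupe unaffiliatedEmailUsers against
    # joinedUsers before sending launch emails.
    return len(entry.joinedUsers or []) + len(entry.unaffiliatedEmailUsers)
```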


@@ -3,7 +3,6 @@
   "credentials_input_schema": {
     "properties": {},
     "required": [],
-    "title": "TestGraphCredentialsInputSchema",
     "type": "object"
   },
   "description": "A test graph",


@@ -1,34 +1,14 @@
 [
   {
-    "credentials_input_schema": {
-      "properties": {},
-      "required": [],
-      "title": "TestGraphCredentialsInputSchema",
-      "type": "object"
-    },
+    "created_at": "2025-09-04T13:37:00",
     "description": "A test graph",
     "forked_from_id": null,
     "forked_from_version": null,
-    "has_external_trigger": false,
-    "has_human_in_the_loop": false,
-    "has_sensitive_action": false,
     "id": "graph-123",
-    "input_schema": {
-      "properties": {},
-      "required": [],
-      "type": "object"
-    },
     "instructions": null,
     "is_active": true,
     "name": "Test Graph",
-    "output_schema": {
-      "properties": {},
-      "required": [],
-      "type": "object"
-    },
     "recommended_schedule_cron": null,
-    "sub_graphs": [],
-    "trigger_setup_info": null,
     "user_id": "3e53486c-cf57-477e-ba2a-cb02dc828e1a",
     "version": 1
   }


@@ -1,5 +1,5 @@
 import { CredentialsMetaInput } from "@/app/api/__generated__/models/credentialsMetaInput";
-import { GraphMeta } from "@/app/api/__generated__/models/graphMeta";
+import { GraphModel } from "@/app/api/__generated__/models/graphModel";
 import { CredentialsInput } from "@/components/contextual/CredentialsInput/CredentialsInput";
 import { useState } from "react";
 import { getSchemaDefaultCredentials } from "../../helpers";
@@ -9,7 +9,7 @@ type Credential = CredentialsMetaInput | undefined;
 type Credentials = Record<string, Credential>;

 type Props = {
-  agent: GraphMeta | null;
+  agent: GraphModel | null;
   siblingInputs?: Record<string, any>;
   onCredentialsChange: (
     credentials: Record<string, CredentialsMetaInput>,


@@ -1,9 +1,9 @@
 import { CredentialsMetaInput } from "@/app/api/__generated__/models/credentialsMetaInput";
-import { GraphMeta } from "@/app/api/__generated__/models/graphMeta";
+import { GraphModel } from "@/app/api/__generated__/models/graphModel";
 import { BlockIOCredentialsSubSchema } from "@/lib/autogpt-server-api/types";

 export function getCredentialFields(
-  agent: GraphMeta | null,
+  agent: GraphModel | null,
 ): AgentCredentialsFields {
   if (!agent) return {};


@@ -3,10 +3,10 @@ import type {
   CredentialsMetaInput,
 } from "@/lib/autogpt-server-api/types";
 import type { InputValues } from "./types";
-import { GraphMeta } from "@/app/api/__generated__/models/graphMeta";
+import { GraphModel } from "@/app/api/__generated__/models/graphModel";

 export function computeInitialAgentInputs(
-  agent: GraphMeta | null,
+  agent: GraphModel | null,
   existingInputs?: InputValues | null,
 ): InputValues {
   const properties = agent?.input_schema?.properties || {};
@@ -29,7 +29,7 @@ export function computeInitialAgentInputs(
 }

 type IsRunDisabledParams = {
-  agent: GraphMeta | null;
+  agent: GraphModel | null;
   isRunning: boolean;
   agentInputs: InputValues | null | undefined;
 };


@@ -1,5 +1,5 @@
 import { Sidebar } from "@/components/__legacy__/Sidebar";
-import { Users, DollarSign, UserSearch, FileText } from "lucide-react";
+import { Users, DollarSign, UserSearch, FileText, Clock } from "lucide-react";
 import { IconSliders } from "@/components/__legacy__/ui/icons";
@@ -11,6 +11,11 @@ const sidebarLinkGroups = [
       href: "/admin/marketplace",
       icon: <Users className="h-6 w-6" />,
     },
+    {
+      text: "Waitlist Management",
+      href: "/admin/waitlist",
+      icon: <Clock className="h-6 w-6" />,
+    },
     {
       text: "User Spending",
       href: "/admin/spending",


@@ -0,0 +1,217 @@
"use client";
import { useState } from "react";
import { useQueryClient } from "@tanstack/react-query";
import { Button } from "@/components/atoms/Button/Button";
import { Input } from "@/components/atoms/Input/Input";
import { Dialog } from "@/components/molecules/Dialog/Dialog";
import {
usePostV2CreateWaitlist,
getGetV2ListAllWaitlistsQueryKey,
} from "@/app/api/__generated__/endpoints/admin/admin";
import { useToast } from "@/components/molecules/Toast/use-toast";
import { Plus } from "@phosphor-icons/react";
export function CreateWaitlistButton() {
const [open, setOpen] = useState(false);
const { toast } = useToast();
const queryClient = useQueryClient();
const createWaitlistMutation = usePostV2CreateWaitlist({
mutation: {
onSuccess: (response) => {
if (response.status === 200) {
toast({
title: "Success",
description: "Waitlist created successfully",
});
setOpen(false);
setFormData({
name: "",
slug: "",
subHeading: "",
description: "",
categories: "",
imageUrls: "",
videoUrl: "",
agentOutputDemoUrl: "",
});
queryClient.invalidateQueries({
queryKey: getGetV2ListAllWaitlistsQueryKey(),
});
} else {
toast({
variant: "destructive",
title: "Error",
description: "Failed to create waitlist",
});
}
},
onError: (error) => {
console.error("Error creating waitlist:", error);
toast({
variant: "destructive",
title: "Error",
description: "Failed to create waitlist",
});
},
},
});
const [formData, setFormData] = useState({
name: "",
slug: "",
subHeading: "",
description: "",
categories: "",
imageUrls: "",
videoUrl: "",
agentOutputDemoUrl: "",
});
function handleInputChange(id: string, value: string) {
setFormData((prev) => ({
...prev,
[id]: value,
}));
}
function generateSlug(name: string) {
return name
.toLowerCase()
.replace(/[^a-z0-9]+/g, "-")
.replace(/^-|-$/g, "");
}
function handleSubmit(e: React.FormEvent) {
e.preventDefault();
createWaitlistMutation.mutate({
data: {
name: formData.name,
slug: formData.slug || generateSlug(formData.name),
subHeading: formData.subHeading,
description: formData.description,
categories: formData.categories
? formData.categories.split(",").map((c) => c.trim())
: [],
imageUrls: formData.imageUrls
? formData.imageUrls.split(",").map((u) => u.trim())
: [],
videoUrl: formData.videoUrl || null,
agentOutputDemoUrl: formData.agentOutputDemoUrl || null,
},
});
}
return (
<>
<Button onClick={() => setOpen(true)}>
<Plus size={16} className="mr-2" />
Create Waitlist
</Button>
<Dialog
title="Create New Waitlist"
controlled={{
isOpen: open,
set: async (isOpen) => setOpen(isOpen),
}}
onClose={() => setOpen(false)}
styling={{ maxWidth: "600px" }}
>
<Dialog.Content>
<p className="mb-4 text-sm text-zinc-500">
Create a new waitlist for an upcoming agent. Users can sign up to be
notified when it launches.
</p>
<form onSubmit={handleSubmit} className="flex flex-col gap-2">
<Input
id="name"
label="Name"
value={formData.name}
onChange={(e) => handleInputChange("name", e.target.value)}
placeholder="SEO Analysis Agent"
required
/>
<Input
id="slug"
label="Slug"
value={formData.slug}
onChange={(e) => handleInputChange("slug", e.target.value)}
placeholder="seo-analysis-agent (auto-generated if empty)"
/>
<Input
id="subHeading"
label="Subheading"
value={formData.subHeading}
onChange={(e) => handleInputChange("subHeading", e.target.value)}
placeholder="Analyze your website's SEO in minutes"
required
/>
<Input
id="description"
label="Description"
type="textarea"
value={formData.description}
onChange={(e) => handleInputChange("description", e.target.value)}
placeholder="Detailed description of what this agent does..."
rows={4}
required
/>
<Input
id="categories"
label="Categories (comma-separated)"
value={formData.categories}
onChange={(e) => handleInputChange("categories", e.target.value)}
placeholder="SEO, Marketing, Analysis"
/>
<Input
id="imageUrls"
label="Image URLs (comma-separated)"
value={formData.imageUrls}
onChange={(e) => handleInputChange("imageUrls", e.target.value)}
placeholder="https://example.com/image1.jpg, https://example.com/image2.jpg"
/>
<Input
id="videoUrl"
label="Video URL (optional)"
value={formData.videoUrl}
onChange={(e) => handleInputChange("videoUrl", e.target.value)}
placeholder="https://youtube.com/watch?v=..."
/>
<Input
id="agentOutputDemoUrl"
label="Output Demo URL (optional)"
value={formData.agentOutputDemoUrl}
onChange={(e) =>
handleInputChange("agentOutputDemoUrl", e.target.value)
}
placeholder="https://example.com/demo-output.mp4"
/>
<Dialog.Footer>
<Button
type="button"
variant="secondary"
onClick={() => setOpen(false)}
>
Cancel
</Button>
<Button type="submit" loading={createWaitlistMutation.isPending}>
Create Waitlist
</Button>
</Dialog.Footer>
</form>
</Dialog.Content>
</Dialog>
</>
);
}


@@ -0,0 +1,221 @@
"use client";
import { useState } from "react";
import { Button } from "@/components/atoms/Button/Button";
import { Input } from "@/components/atoms/Input/Input";
import { Select } from "@/components/atoms/Select/Select";
import { Dialog } from "@/components/molecules/Dialog/Dialog";
import { useToast } from "@/components/molecules/Toast/use-toast";
import { usePutV2UpdateWaitlist } from "@/app/api/__generated__/endpoints/admin/admin";
import type { WaitlistAdminResponse } from "@/app/api/__generated__/models/waitlistAdminResponse";
import type { WaitlistUpdateRequest } from "@/app/api/__generated__/models/waitlistUpdateRequest";
import { WaitlistExternalStatus } from "@/app/api/__generated__/models/waitlistExternalStatus";
type EditWaitlistDialogProps = {
waitlist: WaitlistAdminResponse;
onClose: () => void;
onSave: () => void;
};
const STATUS_OPTIONS = [
{ value: WaitlistExternalStatus.NOT_STARTED, label: "Not Started" },
{ value: WaitlistExternalStatus.WORK_IN_PROGRESS, label: "Work In Progress" },
{ value: WaitlistExternalStatus.DONE, label: "Done" },
{ value: WaitlistExternalStatus.CANCELED, label: "Canceled" },
];
export function EditWaitlistDialog({
waitlist,
onClose,
onSave,
}: EditWaitlistDialogProps) {
const { toast } = useToast();
const updateWaitlistMutation = usePutV2UpdateWaitlist();
const [formData, setFormData] = useState({
name: waitlist.name,
slug: waitlist.slug,
subHeading: waitlist.subHeading,
description: waitlist.description,
categories: waitlist.categories.join(", "),
imageUrls: waitlist.imageUrls.join(", "),
videoUrl: waitlist.videoUrl || "",
agentOutputDemoUrl: waitlist.agentOutputDemoUrl || "",
status: waitlist.status,
storeListingId: waitlist.storeListingId || "",
});
function handleInputChange(id: string, value: string) {
setFormData((prev) => ({
...prev,
[id]: value,
}));
}
function handleStatusChange(value: string) {
setFormData((prev) => ({
...prev,
status: value as WaitlistExternalStatus,
}));
}
async function handleSubmit(e: React.FormEvent) {
e.preventDefault();
const updateData: WaitlistUpdateRequest = {
name: formData.name,
slug: formData.slug,
subHeading: formData.subHeading,
description: formData.description,
categories: formData.categories
? formData.categories.split(",").map((c) => c.trim())
: [],
imageUrls: formData.imageUrls
? formData.imageUrls.split(",").map((u) => u.trim())
: [],
videoUrl: formData.videoUrl || null,
agentOutputDemoUrl: formData.agentOutputDemoUrl || null,
status: formData.status,
storeListingId: formData.storeListingId || null,
};
updateWaitlistMutation.mutate(
{ waitlistId: waitlist.id, data: updateData },
{
onSuccess: (response) => {
if (response.status === 200) {
toast({
title: "Success",
description: "Waitlist updated successfully",
});
onSave();
} else {
toast({
variant: "destructive",
title: "Error",
description: "Failed to update waitlist",
});
}
},
onError: () => {
toast({
variant: "destructive",
title: "Error",
description: "Failed to update waitlist",
});
},
},
);
}
return (
<Dialog
title="Edit Waitlist"
controlled={{
isOpen: true,
set: async (open) => {
if (!open) onClose();
},
}}
onClose={onClose}
styling={{ maxWidth: "600px" }}
>
<Dialog.Content>
<p className="mb-4 text-sm text-zinc-500">
Update the waitlist details. Changes will be reflected immediately.
</p>
<form onSubmit={handleSubmit} className="flex flex-col gap-2">
<Input
id="name"
label="Name"
value={formData.name}
onChange={(e) => handleInputChange("name", e.target.value)}
required
/>
<Input
id="slug"
label="Slug"
value={formData.slug}
onChange={(e) => handleInputChange("slug", e.target.value)}
/>
<Input
id="subHeading"
label="Subheading"
value={formData.subHeading}
onChange={(e) => handleInputChange("subHeading", e.target.value)}
required
/>
<Input
id="description"
label="Description"
type="textarea"
value={formData.description}
onChange={(e) => handleInputChange("description", e.target.value)}
rows={4}
required
/>
<Select
id="status"
label="Status"
value={formData.status}
onValueChange={handleStatusChange}
options={STATUS_OPTIONS}
/>
<Input
id="categories"
label="Categories (comma-separated)"
value={formData.categories}
onChange={(e) => handleInputChange("categories", e.target.value)}
/>
<Input
id="imageUrls"
label="Image URLs (comma-separated)"
value={formData.imageUrls}
onChange={(e) => handleInputChange("imageUrls", e.target.value)}
/>
<Input
id="videoUrl"
label="Video URL"
value={formData.videoUrl}
onChange={(e) => handleInputChange("videoUrl", e.target.value)}
/>
<Input
id="agentOutputDemoUrl"
label="Output Demo URL"
value={formData.agentOutputDemoUrl}
onChange={(e) =>
handleInputChange("agentOutputDemoUrl", e.target.value)
}
/>
<Input
id="storeListingId"
label="Store Listing ID (for linking)"
value={formData.storeListingId}
onChange={(e) =>
handleInputChange("storeListingId", e.target.value)
}
placeholder="Leave empty if not linked"
/>
<Dialog.Footer>
<Button type="button" variant="secondary" onClick={onClose}>
Cancel
</Button>
<Button type="submit" loading={updateWaitlistMutation.isPending}>
Save Changes
</Button>
</Dialog.Footer>
</form>
</Dialog.Content>
</Dialog>
);
}


@@ -0,0 +1,156 @@
"use client";
import { Button } from "@/components/atoms/Button/Button";
import { Dialog } from "@/components/molecules/Dialog/Dialog";
import { User, Envelope, DownloadSimple } from "@phosphor-icons/react";
import { useGetV2GetWaitlistSignups } from "@/app/api/__generated__/endpoints/admin/admin";
type WaitlistSignupsDialogProps = {
waitlistId: string;
onClose: () => void;
};
export function WaitlistSignupsDialog({
waitlistId,
onClose,
}: WaitlistSignupsDialogProps) {
const {
data: signupsResponse,
isLoading,
isError,
} = useGetV2GetWaitlistSignups(waitlistId);
const signups = signupsResponse?.status === 200 ? signupsResponse.data : null;
function exportToCSV() {
if (!signups) return;
const headers = ["Type", "Email", "User ID", "Username"];
const rows = signups.signups.map((signup) => [
signup.type,
signup.email || "",
signup.userId || "",
signup.username || "",
]);
const escapeCell = (cell: string) => `"${cell.replace(/"/g, '""')}"`;
const csvContent = [
headers.join(","),
...rows.map((row) => row.map(escapeCell).join(",")),
].join("\n");
const blob = new Blob([csvContent], { type: "text/csv" });
const url = window.URL.createObjectURL(blob);
const a = document.createElement("a");
a.href = url;
a.download = `waitlist-${waitlistId}-signups.csv`;
a.click();
window.URL.revokeObjectURL(url);
}
function renderContent() {
if (isLoading) {
return <div className="py-10 text-center">Loading signups...</div>;
}
if (isError) {
return (
<div className="py-10 text-center text-red-500">
Failed to load signups. Please try again.
</div>
);
}
if (!signups || signups.signups.length === 0) {
return (
<div className="py-10 text-center text-gray-500">
No signups yet for this waitlist.
</div>
);
}
return (
<>
<div className="flex justify-end">
<Button variant="secondary" size="small" onClick={exportToCSV}>
<DownloadSimple className="mr-2 h-4 w-4" size={16} />
Export CSV
</Button>
</div>
<div className="max-h-[400px] overflow-y-auto rounded-md border">
<table className="w-full">
<thead className="bg-gray-50 dark:bg-gray-800">
<tr>
<th className="px-4 py-3 text-left text-sm font-medium">
Type
</th>
<th className="px-4 py-3 text-left text-sm font-medium">
Email / Username
</th>
<th className="px-4 py-3 text-left text-sm font-medium">
User ID
</th>
</tr>
</thead>
<tbody className="divide-y">
{signups.signups.map((signup, index) => (
<tr key={index}>
<td className="px-4 py-3">
{signup.type === "user" ? (
<span className="flex items-center gap-1 text-blue-600">
<User className="h-4 w-4" size={16} /> User
</span>
) : (
<span className="flex items-center gap-1 text-gray-600">
<Envelope className="h-4 w-4" size={16} /> Email
</span>
)}
</td>
<td className="px-4 py-3">
{signup.type === "user"
? signup.username || signup.email
: signup.email}
</td>
<td className="px-4 py-3 font-mono text-sm">
{signup.userId || "-"}
</td>
</tr>
))}
</tbody>
</table>
</div>
</>
);
}
return (
<Dialog
title="Waitlist Signups"
controlled={{
isOpen: true,
set: async (open) => {
if (!open) onClose();
},
}}
onClose={onClose}
styling={{ maxWidth: "700px" }}
>
<Dialog.Content>
<p className="mb-4 text-sm text-zinc-500">
{signups
? `${signups.totalCount} total signups`
: "Loading signups..."}
</p>
{renderContent()}
<Dialog.Footer>
<Button variant="secondary" onClick={onClose}>
Close
</Button>
</Dialog.Footer>
</Dialog.Content>
</Dialog>
);
}


@@ -0,0 +1,214 @@
"use client";
import { useState } from "react";
import { useQueryClient } from "@tanstack/react-query";
import {
Table,
TableBody,
TableCell,
TableHead,
TableHeader,
TableRow,
} from "@/components/__legacy__/ui/table";
import { Button } from "@/components/atoms/Button/Button";
import {
useGetV2ListAllWaitlists,
useDeleteV2DeleteWaitlist,
getGetV2ListAllWaitlistsQueryKey,
} from "@/app/api/__generated__/endpoints/admin/admin";
import type { WaitlistAdminResponse } from "@/app/api/__generated__/models/waitlistAdminResponse";
import { EditWaitlistDialog } from "./EditWaitlistDialog";
import { WaitlistSignupsDialog } from "./WaitlistSignupsDialog";
import { Trash, PencilSimple, Users, Link } from "@phosphor-icons/react";
import { useToast } from "@/components/molecules/Toast/use-toast";
export function WaitlistTable() {
const [editingWaitlist, setEditingWaitlist] =
useState<WaitlistAdminResponse | null>(null);
const [viewingSignups, setViewingSignups] = useState<string | null>(null);
const { toast } = useToast();
const queryClient = useQueryClient();
const { data: response, isLoading, error } = useGetV2ListAllWaitlists();
const deleteWaitlistMutation = useDeleteV2DeleteWaitlist({
mutation: {
onSuccess: (response) => {
if (response.status === 200) {
toast({
title: "Success",
description: "Waitlist deleted successfully",
});
queryClient.invalidateQueries({
queryKey: getGetV2ListAllWaitlistsQueryKey(),
});
} else {
toast({
variant: "destructive",
title: "Error",
description: "Failed to delete waitlist",
});
}
},
onError: (error) => {
console.error("Error deleting waitlist:", error);
toast({
variant: "destructive",
title: "Error",
description: "Failed to delete waitlist",
});
},
},
});
function handleDelete(waitlistId: string) {
if (!confirm("Are you sure you want to delete this waitlist?")) return;
deleteWaitlistMutation.mutate({ waitlistId });
}
function handleWaitlistSaved() {
setEditingWaitlist(null);
queryClient.invalidateQueries({
queryKey: getGetV2ListAllWaitlistsQueryKey(),
});
}
function formatStatus(status: string) {
const statusColors: Record<string, string> = {
NOT_STARTED: "bg-gray-100 text-gray-800",
WORK_IN_PROGRESS: "bg-blue-100 text-blue-800",
DONE: "bg-green-100 text-green-800",
CANCELED: "bg-red-100 text-red-800",
};
return (
<span
className={`rounded-full px-2 py-1 text-xs font-medium ${statusColors[status] || "bg-gray-100 text-gray-700"}`}
>
{status.replace(/_/g, " ")}
</span>
);
}
function formatDate(dateStr: string) {
if (!dateStr) return "-";
return new Intl.DateTimeFormat("en-US", {
month: "short",
day: "numeric",
year: "numeric",
}).format(new Date(dateStr));
}
if (isLoading) {
return <div className="py-10 text-center">Loading waitlists...</div>;
}
if (error) {
return (
<div className="py-10 text-center text-red-500">
Error loading waitlists. Please try again.
</div>
);
}
const waitlists = response?.status === 200 ? response.data.waitlists : [];
if (waitlists.length === 0) {
return (
<div className="py-10 text-center text-gray-500">
No waitlists found. Create one to get started!
</div>
);
}
return (
<>
<div className="rounded-md border bg-white">
<Table>
<TableHeader className="bg-gray-50">
<TableRow>
<TableHead className="font-medium">Name</TableHead>
<TableHead className="font-medium">Status</TableHead>
<TableHead className="font-medium">Signups</TableHead>
<TableHead className="font-medium">Votes</TableHead>
<TableHead className="font-medium">Created</TableHead>
<TableHead className="font-medium">Linked Agent</TableHead>
<TableHead className="font-medium">Actions</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{waitlists.map((waitlist) => (
<TableRow key={waitlist.id}>
<TableCell>
<div>
<div className="font-medium">{waitlist.name}</div>
<div className="text-sm text-gray-500">
{waitlist.subHeading}
</div>
</div>
</TableCell>
<TableCell>{formatStatus(waitlist.status)}</TableCell>
<TableCell>{waitlist.signupCount}</TableCell>
<TableCell>{waitlist.votes}</TableCell>
<TableCell>{formatDate(waitlist.createdAt)}</TableCell>
<TableCell>
{waitlist.storeListingId ? (
<span className="text-green-600">
<Link size={16} className="inline" /> Linked
</span>
) : (
<span className="text-gray-400">Not linked</span>
)}
</TableCell>
<TableCell>
<div className="flex gap-2">
<Button
variant="ghost"
size="small"
onClick={() => setViewingSignups(waitlist.id)}
title="View signups"
>
<Users size={16} />
</Button>
<Button
variant="ghost"
size="small"
onClick={() => setEditingWaitlist(waitlist)}
title="Edit"
>
<PencilSimple size={16} />
</Button>
<Button
variant="ghost"
size="small"
onClick={() => handleDelete(waitlist.id)}
title="Delete"
disabled={deleteWaitlistMutation.isPending}
>
<Trash size={16} className="text-red-500" />
</Button>
</div>
</TableCell>
</TableRow>
))}
</TableBody>
</Table>
</div>
{editingWaitlist && (
<EditWaitlistDialog
waitlist={editingWaitlist}
onClose={() => setEditingWaitlist(null)}
onSave={handleWaitlistSaved}
/>
)}
{viewingSignups && (
<WaitlistSignupsDialog
waitlistId={viewingSignups}
onClose={() => setViewingSignups(null)}
/>
)}
</>
);
}


@@ -0,0 +1,52 @@
import { withRoleAccess } from "@/lib/withRoleAccess";
import { Suspense } from "react";
import { WaitlistTable } from "./components/WaitlistTable";
import { CreateWaitlistButton } from "./components/CreateWaitlistButton";
import { Warning } from "@phosphor-icons/react/dist/ssr";
function WaitlistDashboard() {
return (
<div className="mx-auto p-6">
<div className="flex flex-col gap-4">
<div className="flex items-center justify-between">
<div>
<h1 className="text-3xl font-bold">Waitlist Management</h1>
<p className="text-gray-500">
Manage upcoming agent waitlists and track signups
</p>
</div>
<CreateWaitlistButton />
</div>
<div className="flex items-start gap-3 rounded-lg border border-amber-300 bg-amber-50 p-4 dark:border-amber-700 dark:bg-amber-950">
<Warning
className="mt-0.5 h-5 w-5 flex-shrink-0 text-amber-600 dark:text-amber-400"
weight="fill"
/>
<div className="text-sm text-amber-800 dark:text-amber-200">
<p className="font-medium">TODO: Email-only signup notifications</p>
<p className="mt-1 text-amber-700 dark:text-amber-300">
Notifications for email-only signups (users who weren&apos;t
logged in) have not been implemented yet. Currently only
registered users will receive launch emails.
</p>
</div>
</div>
<Suspense
fallback={
<div className="py-10 text-center">Loading waitlists...</div>
}
>
<WaitlistTable />
</Suspense>
</div>
</div>
);
}
export default async function WaitlistDashboardPage() {
const withAdminAccess = await withRoleAccess(["admin"]);
const ProtectedWaitlistDashboard = await withAdminAccess(WaitlistDashboard);
return <ProtectedWaitlistDashboard />;
}


@@ -30,6 +30,8 @@ import {
 } from "@/components/atoms/Tooltip/BaseTooltip";
 import { GraphMeta } from "@/lib/autogpt-server-api";
 import jaro from "jaro-winkler";
+import { getV1GetSpecificGraph } from "@/app/api/__generated__/endpoints/graphs/graphs";
+import { okData } from "@/app/api/helpers";
 type _Block = Omit<Block, "inputSchema" | "outputSchema"> & {
   uiKey?: string;
@@ -107,6 +109,8 @@ export function BlocksControl({
     .filter((b) => b.uiType !== BlockUIType.AGENT)
     .sort((a, b) => a.name.localeCompare(b.name));
+  // Agent blocks are created from GraphMeta which doesn't include schemas.
+  // Schemas will be fetched on-demand when the block is actually added.
   const agentBlockList = flows
     .map((flow): _Block => {
       return {
@@ -116,8 +120,9 @@ export function BlocksControl({
           `Ver.${flow.version}` +
           (flow.description ? ` | ${flow.description}` : ""),
         categories: [{ category: "AGENT", description: "" }],
-        inputSchema: flow.input_schema,
-        outputSchema: flow.output_schema,
+        // Empty schemas - will be populated when block is added
+        inputSchema: { type: "object", properties: {} },
+        outputSchema: { type: "object", properties: {} },
         staticOutput: false,
         uiType: BlockUIType.AGENT,
         costs: [],
@@ -125,8 +130,7 @@ export function BlocksControl({
         hardcodedValues: {
           graph_id: flow.id,
           graph_version: flow.version,
-          input_schema: flow.input_schema,
-          output_schema: flow.output_schema,
+          // Schemas will be fetched on-demand when block is added
         },
       };
     })
@@ -182,6 +186,37 @@ export function BlocksControl({
     setSelectedCategory(null);
   }, []);
+  // Handler to add a block, fetching graph data on-demand for agent blocks
+  const handleAddBlock = useCallback(
+    async (block: _Block & { notAvailable: string | null }) => {
+      if (block.notAvailable) return;
+      // For agent blocks, fetch the full graph to get schemas
+      if (block.uiType === BlockUIType.AGENT && block.hardcodedValues) {
+        const graphID = block.hardcodedValues.graph_id as string;
+        const graphVersion = block.hardcodedValues.graph_version as number;
+        const graphData = okData(
+          await getV1GetSpecificGraph(graphID, { version: graphVersion }),
+        );
+        if (graphData) {
+          addBlock(block.id, block.name, {
+            ...block.hardcodedValues,
+            input_schema: graphData.input_schema,
+            output_schema: graphData.output_schema,
+          });
+        } else {
+          // Fallback: add without schemas (will be incomplete)
+          console.error("Failed to fetch graph data for agent block");
+          addBlock(block.id, block.name, block.hardcodedValues || {});
+        }
+      } else {
+        addBlock(block.id, block.name, block.hardcodedValues || {});
+      }
+    },
+    [addBlock],
+  );
   // Extract unique categories from blocks
   const categories = useMemo(() => {
     return Array.from(
@@ -303,10 +338,7 @@ export function BlocksControl({
             }),
           );
         }}
-        onClick={() =>
-          !block.notAvailable &&
-          addBlock(block.id, block.name, block?.hardcodedValues || {})
-        }
+        onClick={() => handleAddBlock(block)}
         title={block.notAvailable ?? undefined}
       >
         <div


@@ -29,13 +29,17 @@ import "@xyflow/react/dist/style.css";
 import { ConnectedEdge, CustomNode } from "../CustomNode/CustomNode";
 import "./flow.css";
 import {
+  BlockIORootSchema,
   BlockUIType,
   formatEdgeID,
   GraphExecutionID,
   GraphID,
   GraphMeta,
   LibraryAgent,
+  SpecialBlockID,
 } from "@/lib/autogpt-server-api";
+import { getV1GetSpecificGraph } from "@/app/api/__generated__/endpoints/graphs/graphs";
+import { okData } from "@/app/api/helpers";
 import { IncompatibilityInfo } from "../../../hooks/useSubAgentUpdate/types";
 import { Key, storage } from "@/services/storage/local-storage";
 import { findNewlyAddedBlockCoordinates, getTypeColor } from "@/lib/utils";
@@ -687,8 +691,94 @@ const FlowEditor: React.FC<{
     [getNode, updateNode, nodes],
   );
+  /* Shared helper to create and add a node */
+  const createAndAddNode = useCallback(
+    async (
+      blockID: string,
+      blockName: string,
+      hardcodedValues: Record<string, any>,
+      position: { x: number; y: number },
+    ): Promise<CustomNode | null> => {
+      const nodeSchema = availableBlocks.find((node) => node.id === blockID);
+      if (!nodeSchema) {
+        console.error(`Schema not found for block ID: ${blockID}`);
+        return null;
+      }
+      // For agent blocks, fetch the full graph to get schemas
+      let inputSchema: BlockIORootSchema = nodeSchema.inputSchema;
+      let outputSchema: BlockIORootSchema = nodeSchema.outputSchema;
+      let finalHardcodedValues = hardcodedValues;
+      if (blockID === SpecialBlockID.AGENT) {
+        const graphID = hardcodedValues.graph_id as string;
+        const graphVersion = hardcodedValues.graph_version as number;
+        const graphData = okData(
+          await getV1GetSpecificGraph(graphID, { version: graphVersion }),
+        );
+        if (graphData) {
+          inputSchema = graphData.input_schema as BlockIORootSchema;
+          outputSchema = graphData.output_schema as BlockIORootSchema;
+          finalHardcodedValues = {
+            ...hardcodedValues,
+            input_schema: graphData.input_schema,
+            output_schema: graphData.output_schema,
+          };
+        } else {
+          console.error("Failed to fetch graph data for agent block");
+        }
+      }
+      const newNode: CustomNode = {
+        id: nodeId.toString(),
+        type: "custom",
+        position,
+        data: {
+          blockType: blockName,
+          blockCosts: nodeSchema.costs || [],
+          title: `${blockName} ${nodeId}`,
+          description: nodeSchema.description,
+          categories: nodeSchema.categories,
+          inputSchema: inputSchema,
+          outputSchema: outputSchema,
+          hardcodedValues: finalHardcodedValues,
+          connections: [],
+          isOutputOpen: false,
+          block_id: blockID,
+          isOutputStatic: nodeSchema.staticOutput,
+          uiType: nodeSchema.uiType,
+        },
+      };
+      addNodes(newNode);
+      setNodeId((prevId) => prevId + 1);
+      clearNodesStatusAndOutput();
+      history.push({
+        type: "ADD_NODE",
+        payload: { node: { ...newNode, ...newNode.data } },
+        undo: () => deleteElements({ nodes: [{ id: newNode.id }] }),
+        redo: () => addNodes(newNode),
+      });
+      return newNode;
+    },
+    [
+      availableBlocks,
+      nodeId,
+      addNodes,
+      deleteElements,
+      clearNodesStatusAndOutput,
+    ],
+  );
   const addNode = useCallback(
-    (blockId: string, nodeType: string, hardcodedValues: any = {}) => {
+    async (
+      blockId: string,
+      nodeType: string,
+      hardcodedValues: Record<string, any> = {},
+    ) => {
       const nodeSchema = availableBlocks.find((node) => node.id === blockId);
       if (!nodeSchema) {
         console.error(`Schema not found for block ID: ${blockId}`);
@@ -707,73 +797,42 @@ const FlowEditor: React.FC<{
       // Alternative: We could also use D3 force, Intersection for this (React flow Pro examples)
       const { x, y } = getViewport();
-      const viewportCoordinates =
+      const position =
         nodeDimensions && Object.keys(nodeDimensions).length > 0
-          ? // we will get all the dimension of nodes, then store
-            findNewlyAddedBlockCoordinates(
+          ? findNewlyAddedBlockCoordinates(
               nodeDimensions,
               nodeSchema.uiType == BlockUIType.NOTE ? 300 : 500,
               60,
               1.0,
            )
-          : // we will get all the dimension of nodes, then store
-            {
+          : {
              x: window.innerWidth / 2 - x,
              y: window.innerHeight / 2 - y,
            };
-      const newNode: CustomNode = {
-        id: nodeId.toString(),
-        type: "custom",
-        position: viewportCoordinates, // Set the position to the calculated viewport center
-        data: {
-          blockType: nodeType,
-          blockCosts: nodeSchema.costs,
-          title: `${nodeType} ${nodeId}`,
-          description: nodeSchema.description,
-          categories: nodeSchema.categories,
-          inputSchema: nodeSchema.inputSchema,
-          outputSchema: nodeSchema.outputSchema,
-          hardcodedValues: hardcodedValues,
-          connections: [],
-          isOutputOpen: false,
-          block_id: blockId,
-          isOutputStatic: nodeSchema.staticOutput,
-          uiType: nodeSchema.uiType,
-        },
-      };
-      addNodes(newNode);
-      setNodeId((prevId) => prevId + 1);
-      clearNodesStatusAndOutput(); // Clear status and output when a new node is added
+      const newNode = await createAndAddNode(
+        blockId,
+        nodeType,
+        hardcodedValues,
+        position,
+      );
+      if (!newNode) return;
       setViewport(
         {
-          // Rough estimate of the dimension of the node is: 500x400px.
-          // Though we skip shifting the X, considering the block menu side-bar.
-          x: -viewportCoordinates.x * 0.8 + (window.innerWidth - 0.0) / 2,
-          y: -viewportCoordinates.y * 0.8 + (window.innerHeight - 400) / 2,
+          x: -position.x * 0.8 + (window.innerWidth - 0.0) / 2,
+          y: -position.y * 0.8 + (window.innerHeight - 400) / 2,
           zoom: 0.8,
         },
         { duration: 500 },
       );
-      history.push({
-        type: "ADD_NODE",
-        payload: { node: { ...newNode, ...newNode.data } },
-        undo: () => deleteElements({ nodes: [{ id: newNode.id }] }),
-        redo: () => addNodes(newNode),
-      });
     },
     [
-      nodeId,
       getViewport,
       setViewport,
       availableBlocks,
-      addNodes,
       nodeDimensions,
-      deleteElements,
-      clearNodesStatusAndOutput,
+      createAndAddNode,
    ],
  );
@@ -920,7 +979,7 @@ const FlowEditor: React.FC<{
   }, []);
   const onDrop = useCallback(
-    (event: React.DragEvent) => {
+    async (event: React.DragEvent) => {
       event.preventDefault();
       const blockData = event.dataTransfer.getData("application/reactflow");
@@ -935,62 +994,17 @@ const FlowEditor: React.FC<{
         y: event.clientY,
       });
-      // Find the block schema
-      const nodeSchema = availableBlocks.find((node) => node.id === blockId);
-      if (!nodeSchema) {
-        console.error(`Schema not found for block ID: ${blockId}`);
-        return;
-      }
-      // Create the new node at the drop position
-      const newNode: CustomNode = {
-        id: nodeId.toString(),
-        type: "custom",
+      await createAndAddNode(
+        blockId,
+        blockName,
+        hardcodedValues || {},
         position,
-        data: {
-          blockType: blockName,
-          blockCosts: nodeSchema.costs || [],
-          title: `${blockName} ${nodeId}`,
-          description: nodeSchema.description,
-          categories: nodeSchema.categories,
-          inputSchema: nodeSchema.inputSchema,
-          outputSchema: nodeSchema.outputSchema,
-          hardcodedValues: hardcodedValues,
-          connections: [],
-          isOutputOpen: false,
-          block_id: blockId,
-          uiType: nodeSchema.uiType,
-        },
-      };
-      history.push({
-        type: "ADD_NODE",
-        payload: { node: { ...newNode, ...newNode.data } },
-        undo: () => {
-          deleteElements({ nodes: [{ id: newNode.id } as any], edges: [] });
-        },
-        redo: () => {
-          addNodes([newNode]);
-        },
-      });
-      addNodes([newNode]);
-      clearNodesStatusAndOutput();
-      setNodeId((prevId) => prevId + 1);
+      );
     } catch (error) {
       console.error("Failed to drop block:", error);
     }
   },
-  [
-    nodeId,
-    availableBlocks,
-    nodes,
-    edges,
-    addNodes,
-    screenToFlowPosition,
-    deleteElements,
-    clearNodesStatusAndOutput,
-  ],
+  [screenToFlowPosition, createAndAddNode],
 );
 const buildContextValue: BuilderContextType = useMemo(


@@ -4,13 +4,13 @@ import { AgentRunDraftView } from "@/app/(platform)/library/agents/[id]/componen
 import { Dialog } from "@/components/molecules/Dialog/Dialog";
 import type {
   CredentialsMetaInput,
-  GraphMeta,
+  Graph,
 } from "@/lib/autogpt-server-api/types";
 interface RunInputDialogProps {
   isOpen: boolean;
   doClose: () => void;
-  graph: GraphMeta;
+  graph: Graph;
   doRun?: (
     inputs: Record<string, any>,
     credentialsInputs: Record<string, CredentialsMetaInput>,


@@ -9,13 +9,13 @@ import { CustomNodeData } from "@/app/(platform)/build/components/legacy-builder
 import {
   BlockUIType,
   CredentialsMetaInput,
-  GraphMeta,
+  Graph,
 } from "@/lib/autogpt-server-api/types";
 import RunnerOutputUI, { OutputNodeInfo } from "./RunnerOutputUI";
 import { RunnerInputDialog } from "./RunnerInputUI";
 interface RunnerUIWrapperProps {
-  graph: GraphMeta;
+  graph: Graph;
   nodes: Node<CustomNodeData>[];
   graphExecutionError?: string | null;
   saveAndRun: (


@@ -1,5 +1,5 @@
 import { GraphInputSchema } from "@/lib/autogpt-server-api";
-import { GraphMetaLike, IncompatibilityInfo } from "./types";
+import { GraphLike, IncompatibilityInfo } from "./types";
 // Helper type for schema properties - the generated types are too loose
 type SchemaProperties = Record<string, GraphInputSchema["properties"][string]>;
@@ -36,7 +36,7 @@ export function getSchemaRequired(schema: unknown): SchemaRequired {
  */
 export function createUpdatedAgentNodeInputs(
   currentInputs: Record<string, unknown>,
-  latestSubGraphVersion: GraphMetaLike,
+  latestSubGraphVersion: GraphLike,
 ): Record<string, unknown> {
   return {
     ...currentInputs,


@@ -1,7 +1,11 @@
-import type { GraphMeta as LegacyGraphMeta } from "@/lib/autogpt-server-api";
+import type {
+  Graph as LegacyGraph,
+  GraphMeta as LegacyGraphMeta,
+} from "@/lib/autogpt-server-api";
+import type { GraphModel as GeneratedGraph } from "@/app/api/__generated__/models/graphModel";
 import type { GraphMeta as GeneratedGraphMeta } from "@/app/api/__generated__/models/graphMeta";
-export type SubAgentUpdateInfo<T extends GraphMetaLike = GraphMetaLike> = {
+export type SubAgentUpdateInfo<T extends GraphLike = GraphLike> = {
   hasUpdate: boolean;
   currentVersion: number;
   latestVersion: number;
@@ -10,7 +14,10 @@ export type SubAgentUpdateInfo<T extends GraphMetaLike = GraphMetaLike> = {
   incompatibilities: IncompatibilityInfo | null;
 };
-// Union type for GraphMeta that works with both legacy and new builder
+// Union type for Graph (with schemas) that works with both legacy and new builder
+export type GraphLike = LegacyGraph | GeneratedGraph;
+// Union type for GraphMeta (without schemas) for version detection
 export type GraphMetaLike = LegacyGraphMeta | GeneratedGraphMeta;
 export type IncompatibilityInfo = {


@@ -1,5 +1,11 @@
 import { useMemo } from "react";
-import { GraphInputSchema, GraphOutputSchema } from "@/lib/autogpt-server-api";
+import type {
+  GraphInputSchema,
+  GraphOutputSchema,
+} from "@/lib/autogpt-server-api";
+import type { GraphModel } from "@/app/api/__generated__/models/graphModel";
+import { useGetV1GetSpecificGraph } from "@/app/api/__generated__/endpoints/graphs/graphs";
+import { okData } from "@/app/api/helpers";
 import { getEffectiveType } from "@/lib/utils";
 import { EdgeLike, getSchemaProperties, getSchemaRequired } from "./helpers";
 import {
@@ -11,26 +17,38 @@
 /**
  * Checks if a newer version of a sub-agent is available and determines compatibility
  */
-export function useSubAgentUpdate<T extends GraphMetaLike>(
+export function useSubAgentUpdate(
   nodeID: string,
   graphID: string | undefined,
   graphVersion: number | undefined,
   currentInputSchema: GraphInputSchema | undefined,
   currentOutputSchema: GraphOutputSchema | undefined,
   connections: EdgeLike[],
-  availableGraphs: T[],
-): SubAgentUpdateInfo<T> {
+  availableGraphs: GraphMetaLike[],
+): SubAgentUpdateInfo<GraphModel> {
   // Find the latest version of the same graph
-  const latestGraph = useMemo(() => {
+  const latestGraphInfo = useMemo(() => {
     if (!graphID) return null;
     return availableGraphs.find((graph) => graph.id === graphID) || null;
   }, [graphID, availableGraphs]);
-  // Check if there's an update available
+  // Check if there's a newer version available
   const hasUpdate = useMemo(() => {
-    if (!latestGraph || graphVersion === undefined) return false;
-    return latestGraph.version! > graphVersion;
-  }, [latestGraph, graphVersion]);
+    if (!latestGraphInfo || graphVersion === undefined) return false;
+    return latestGraphInfo.version! > graphVersion;
+  }, [latestGraphInfo, graphVersion]);
+  // Fetch full graph IF an update is detected
+  const { data: latestGraph } = useGetV1GetSpecificGraph(
+    graphID ?? "",
+    { version: latestGraphInfo?.version },
+    {
+      query: {
+        enabled: hasUpdate && !!graphID && !!latestGraphInfo?.version,
+        select: okData,
+      },
+    },
+  );
   // Get connected input and output handles for this specific node
   const connectedHandles = useMemo(() => {
@@ -152,8 +170,8 @@ export function useSubAgentUpdate<T extends GraphMetaLike>(
   return {
     hasUpdate,
     currentVersion: graphVersion || 0,
-    latestVersion: latestGraph?.version || 0,
-    latestGraph,
+    latestVersion: latestGraphInfo?.version || 0,
+    latestGraph: latestGraph || null,
     isCompatible: compatibilityResult.isCompatible,
     incompatibilities: compatibilityResult.incompatibilities,
   };


@@ -18,7 +18,7 @@ interface GraphStore {
   outputSchema: Record<string, any> | null,
 ) => void;
-// Available graphs; used for sub-graph updates
+// Available graphs; used for sub-graph updated version detection
 availableSubGraphs: GraphMeta[];
 setAvailableSubGraphs: (graphs: GraphMeta[]) => void;


@@ -10,8 +10,8 @@ import React, {
 import {
   CredentialsMetaInput,
   CredentialsType,
+  Graph,
   GraphExecutionID,
-  GraphMeta,
   LibraryAgentPreset,
   LibraryAgentPresetID,
   LibraryAgentPresetUpdatable,
@@ -69,7 +69,7 @@ export function AgentRunDraftView({
   className,
   recommendedScheduleCron,
 }: {
-  graph: GraphMeta;
+  graph: Graph;
   agentActions?: ButtonAction[];
   recommendedScheduleCron?: string | null;
   doRun?: (


@@ -2,8 +2,8 @@
 import React, { useCallback, useMemo } from "react";
 import {
+  Graph,
   GraphExecutionID,
-  GraphMeta,
   Schedule,
   ScheduleID,
 } from "@/lib/autogpt-server-api";
@@ -35,7 +35,7 @@ export function AgentScheduleDetailsView({
   onForcedRun,
   doDeleteSchedule,
 }: {
-  graph: GraphMeta;
+  graph: Graph;
   schedule: Schedule;
   agentActions: ButtonAction[];
   onForcedRun: (runID: GraphExecutionID) => void;


@@ -8,6 +8,7 @@ import { useMainMarketplacePage } from "./useMainMarketplacePage";
 import { FeaturedCreators } from "../FeaturedCreators/FeaturedCreators";
 import { MainMarketplacePageLoading } from "../MainMarketplacePageLoading";
 import { ErrorCard } from "@/components/molecules/ErrorCard/ErrorCard";
+import { WaitlistSection } from "../WaitlistSection/WaitlistSection";
 export const MainMarkeplacePage = () => {
   const { featuredAgents, topAgents, featuredCreators, isLoading, hasError } =
@@ -46,6 +47,10 @@ export const MainMarkeplacePage = () => {
       {/* 100px margin because our featured sections button are placed 40px below the container */}
       <Separator className="mb-6 mt-24" />
+      {/* Waitlist Section - "Help Shape What's Next" */}
+      <WaitlistSection />
+      <Separator className="mb-6 mt-12" />
       {topAgents && (
         <AgentsSection sectionTitle="Top Agents" agents={topAgents.agents} />
       )}


@@ -0,0 +1,105 @@
"use client";
import Image from "next/image";
import { Button } from "@/components/atoms/Button/Button";
import { Check } from "@phosphor-icons/react";
interface WaitlistCardProps {
name: string;
subHeading: string;
description: string;
imageUrl: string | null;
isMember?: boolean;
onCardClick: () => void;
onJoinClick: (e: React.MouseEvent) => void;
}
export function WaitlistCard({
name,
subHeading,
description,
imageUrl,
isMember = false,
onCardClick,
onJoinClick,
}: WaitlistCardProps) {
function handleJoinClick(e: React.MouseEvent) {
e.stopPropagation();
onJoinClick(e);
}
return (
<div
className="flex h-[24rem] w-full max-w-md cursor-pointer flex-col items-start rounded-3xl bg-white transition-all duration-300 hover:shadow-lg dark:bg-zinc-900 dark:hover:shadow-gray-700"
onClick={onCardClick}
data-testid="waitlist-card"
role="button"
tabIndex={0}
aria-label={`${name} waitlist card`}
onKeyDown={(e) => {
if (e.key === "Enter" || e.key === " ") {
onCardClick();
}
}}
>
{/* Image Section */}
<div className="relative aspect-[2/1.2] w-full overflow-hidden rounded-large md:aspect-[2.17/1]">
{imageUrl ? (
<Image
src={imageUrl}
alt={`${name} preview image`}
fill
className="object-cover"
/>
) : (
<div className="flex h-full w-full items-center justify-center bg-gradient-to-br from-neutral-200 to-neutral-300 dark:from-neutral-700 dark:to-neutral-800">
<span className="text-4xl font-bold text-neutral-400 dark:text-neutral-500">
{name.charAt(0)}
</span>
</div>
)}
</div>
<div className="mt-3 flex w-full flex-1 flex-col px-4">
{/* Name and Subheading */}
<div className="flex w-full flex-col">
<h3 className="line-clamp-1 font-poppins text-xl font-semibold text-[#272727] dark:text-neutral-100">
{name}
</h3>
<p className="mt-1 line-clamp-1 text-sm text-neutral-500 dark:text-neutral-400">
{subHeading}
</p>
</div>
{/* Description */}
<div className="mt-2 flex w-full flex-col">
<p className="line-clamp-5 text-sm font-normal leading-relaxed text-neutral-600 dark:text-neutral-400">
{description}
</p>
</div>
<div className="flex-grow" />
{/* Join Waitlist Button */}
<div className="mt-4 w-full pb-4">
{isMember ? (
<Button
disabled
className="w-full rounded-full bg-green-600 text-white hover:bg-green-600 dark:bg-green-700 dark:hover:bg-green-700"
>
<Check className="mr-2" size={16} weight="bold" />
On the waitlist
</Button>
) : (
<Button
onClick={handleJoinClick}
className="w-full rounded-full bg-zinc-800 text-white hover:bg-zinc-700 dark:bg-zinc-700 dark:hover:bg-zinc-600"
>
Join waitlist
</Button>
)}
</div>
</div>
</div>
);
}


@@ -0,0 +1,356 @@
"use client";
import { useState } from "react";
import Image from "next/image";
import { Button } from "@/components/atoms/Button/Button";
import { Dialog } from "@/components/molecules/Dialog/Dialog";
import { Input } from "@/components/atoms/Input/Input";
import {
Carousel,
CarouselContent,
CarouselItem,
CarouselNext,
CarouselPrevious,
} from "@/components/__legacy__/ui/carousel";
import type { StoreWaitlistEntry } from "@/app/api/__generated__/models/storeWaitlistEntry";
import { Check, Play } from "@phosphor-icons/react";
import { useSupabaseStore } from "@/lib/supabase/hooks/useSupabaseStore";
import { useToast } from "@/components/molecules/Toast/use-toast";
import { usePostV2AddSelfToTheAgentWaitlist } from "@/app/api/__generated__/endpoints/store/store";
interface MediaItem {
type: "image" | "video";
url: string;
label?: string;
}
// Extract YouTube video ID from various URL formats
function getYouTubeVideoId(url: string): string | null {
const regExp =
/^.*((youtu.be\/)|(v\/)|(\/u\/\w\/)|(embed\/)|(watch\?))\??v?=?([^#&?]*).*/;
const match = url.match(regExp);
return match && match[7].length === 11 ? match[7] : null;
}
// Validate video URL for security
function isValidVideoUrl(url: string): boolean {
if (url.startsWith("data:video")) {
return true;
}
const videoExtensions = /\.(mp4|webm|ogg)$/i;
const youtubeRegex = /^(https?:\/\/)?(www\.)?(youtube\.com|youtu\.?be)\/.+$/;
const validUrl = /^(https?:\/\/)/i;
const cleanedUrl = url.split("?")[0];
return (
(validUrl.test(url) && videoExtensions.test(cleanedUrl)) ||
youtubeRegex.test(url)
);
}
// Video player with YouTube embed support
function VideoPlayer({
url,
autoPlay = false,
className = "",
}: {
url: string;
autoPlay?: boolean;
className?: string;
}) {
const youtubeId = getYouTubeVideoId(url);
if (youtubeId) {
return (
<iframe
src={`https://www.youtube.com/embed/${youtubeId}${autoPlay ? "?autoplay=1" : ""}`}
title="YouTube video player"
className={className}
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
sandbox="allow-same-origin allow-scripts allow-presentation"
allowFullScreen
/>
);
}
if (!isValidVideoUrl(url)) {
return (
<div
className={`flex items-center justify-center bg-zinc-800 ${className}`}
>
<span className="text-sm text-zinc-400">Invalid video URL</span>
</div>
);
}
return <video src={url} controls autoPlay={autoPlay} className={className} />;
}
function MediaCarousel({ waitlist }: { waitlist: StoreWaitlistEntry }) {
const [activeVideo, setActiveVideo] = useState<string | null>(null);
// Build media items array: videos first, then images
const mediaItems: MediaItem[] = [
...(waitlist.videoUrl
? [{ type: "video" as const, url: waitlist.videoUrl, label: "Video" }]
: []),
...(waitlist.agentOutputDemoUrl
? [
{
type: "video" as const,
url: waitlist.agentOutputDemoUrl,
label: "Demo",
},
]
: []),
...waitlist.imageUrls.map((url) => ({ type: "image" as const, url })),
];
if (mediaItems.length === 0) return null;
// Single item - no carousel needed
if (mediaItems.length === 1) {
const item = mediaItems[0];
return (
<div className="relative aspect-[350/196] w-full overflow-hidden rounded-large">
{item.type === "image" ? (
<Image
src={item.url}
alt={`${waitlist.name} preview`}
fill
className="object-cover"
/>
) : (
<VideoPlayer url={item.url} className="h-full w-full object-cover" />
)}
</div>
);
}
// Multiple items - use carousel
return (
<Carousel className="w-full">
<CarouselContent>
{mediaItems.map((item, index) => (
<CarouselItem key={index}>
<div className="relative aspect-[350/196] w-full overflow-hidden rounded-large">
{item.type === "image" ? (
<Image
src={item.url}
alt={`${waitlist.name} preview ${index + 1}`}
fill
className="object-cover"
/>
) : activeVideo === item.url ? (
<VideoPlayer
url={item.url}
autoPlay
className="h-full w-full object-cover"
/>
) : (
<button
onClick={() => setActiveVideo(item.url)}
className="group relative h-full w-full bg-zinc-900"
>
<div className="absolute inset-0 flex items-center justify-center">
<div className="flex h-16 w-16 items-center justify-center rounded-full bg-white/90 transition-transform group-hover:scale-110">
<Play size={32} weight="fill" className="text-zinc-800" />
</div>
</div>
<span className="absolute bottom-3 left-3 text-sm text-white">
{item.label}
</span>
</button>
)}
</div>
</CarouselItem>
))}
</CarouselContent>
<CarouselPrevious className="left-2 top-1/2 -translate-y-1/2" />
<CarouselNext className="right-2 top-1/2 -translate-y-1/2" />
</Carousel>
);
}
interface WaitlistDetailModalProps {
waitlist: StoreWaitlistEntry;
isMember?: boolean;
onClose: () => void;
onJoinSuccess?: (waitlistId: string) => void;
}
export function WaitlistDetailModal({
waitlist,
isMember = false,
onClose,
onJoinSuccess,
}: WaitlistDetailModalProps) {
const { user } = useSupabaseStore();
const [email, setEmail] = useState("");
const [success, setSuccess] = useState(false);
const { toast } = useToast();
const joinWaitlistMutation = usePostV2AddSelfToTheAgentWaitlist();
function handleJoin() {
joinWaitlistMutation.mutate(
{
waitlistId: waitlist.waitlistId,
data: { email: user ? undefined : email },
},
{
onSuccess: (response) => {
if (response.status === 200) {
setSuccess(true);
toast({
title: "You're on the waitlist!",
description: `We'll notify you when ${waitlist.name} goes live.`,
});
onJoinSuccess?.(waitlist.waitlistId);
} else {
toast({
variant: "destructive",
title: "Error",
description: "Failed to join waitlist. Please try again.",
});
}
},
onError: () => {
toast({
variant: "destructive",
title: "Error",
description: "Failed to join waitlist. Please try again.",
});
},
},
);
}
// Success state
if (success) {
return (
<Dialog
title=""
controlled={{
isOpen: true,
set: async (open) => {
if (!open) onClose();
},
}}
onClose={onClose}
styling={{ maxWidth: "500px" }}
>
<Dialog.Content>
<div className="flex flex-col items-center justify-center py-4 text-center">
{/* Party emoji */}
<span className="mb-2 text-5xl">🎉</span>
{/* Title */}
<h2 className="mb-2 font-poppins text-[22px] font-medium leading-7 text-zinc-900 dark:text-zinc-100">
You&apos;re on the waitlist
</h2>
{/* Subtitle */}
<p className="text-base leading-[26px] text-zinc-600 dark:text-zinc-400">
Thanks for helping us prioritize which agents to build next.
We&apos;ll notify you when this agent goes live in the
marketplace.
</p>
</div>
{/* Close button */}
<Dialog.Footer className="flex justify-center pb-2 pt-4">
<Button
variant="secondary"
onClick={onClose}
className="rounded-full border border-zinc-700 bg-white px-4 py-3 text-zinc-900 hover:bg-zinc-100 dark:border-zinc-500 dark:bg-zinc-800 dark:text-zinc-100 dark:hover:bg-zinc-700"
>
Close
</Button>
</Dialog.Footer>
</Dialog.Content>
</Dialog>
);
}
// Main modal - handles both member and non-member states
return (
<Dialog
title="Join the waitlist"
controlled={{
isOpen: true,
set: async (open) => {
if (!open) onClose();
},
}}
onClose={onClose}
styling={{ maxWidth: "500px" }}
>
<Dialog.Content>
{/* Subtitle */}
<p className="mb-6 text-center text-base text-zinc-600 dark:text-zinc-400">
Help us decide what to build next and get notified when this agent
is ready
</p>
{/* Media Carousel */}
<MediaCarousel waitlist={waitlist} />
{/* Agent Name */}
<h3 className="mt-4 font-poppins text-[22px] font-medium leading-7 text-zinc-800 dark:text-zinc-100">
{waitlist.name}
</h3>
{/* Agent Description */}
<p className="mt-2 line-clamp-5 text-sm leading-[22px] text-zinc-500 dark:text-zinc-400">
{waitlist.description}
</p>
{/* Email input for non-logged-in users who haven't joined */}
{!isMember && !user && (
<div className="mt-4 pr-1">
<Input
id="email"
label="Email address"
type="email"
placeholder="you@example.com"
value={email}
onChange={(e) => setEmail(e.target.value)}
required
/>
</div>
)}
{/* Footer buttons */}
<Dialog.Footer className="sticky bottom-0 mt-6 flex justify-center gap-3 bg-white pb-2 pt-4 dark:bg-zinc-900">
{isMember ? (
<Button
disabled
className="rounded-full bg-green-600 px-4 py-3 text-white hover:bg-green-600 dark:bg-green-700 dark:hover:bg-green-700"
>
<Check size={16} className="mr-2" />
You&apos;re on the waitlist
</Button>
) : (
<>
<Button
onClick={handleJoin}
loading={joinWaitlistMutation.isPending}
disabled={!user && !email}
className="rounded-full bg-zinc-800 px-4 py-3 text-white hover:bg-zinc-700 dark:bg-zinc-700 dark:hover:bg-zinc-600"
>
Join waitlist
</Button>
<Button
type="button"
variant="secondary"
onClick={onClose}
className="rounded-full bg-zinc-200 px-4 py-3 text-zinc-900 hover:bg-zinc-300 dark:bg-zinc-700 dark:text-zinc-100 dark:hover:bg-zinc-600"
>
Not now
</Button>
</>
)}
</Dialog.Footer>
</Dialog.Content>
</Dialog>
);
}


@@ -0,0 +1,102 @@
"use client";
import { useState } from "react";
import {
Carousel,
CarouselContent,
CarouselItem,
} from "@/components/__legacy__/ui/carousel";
import { WaitlistCard } from "../WaitlistCard/WaitlistCard";
import { WaitlistDetailModal } from "../WaitlistDetailModal/WaitlistDetailModal";
import type { StoreWaitlistEntry } from "@/app/api/__generated__/models/storeWaitlistEntry";
import { useWaitlistSection } from "./useWaitlistSection";
export function WaitlistSection() {
const { waitlists, joinedWaitlistIds, isLoading, hasError, markAsJoined } =
useWaitlistSection();
const [selectedWaitlist, setSelectedWaitlist] =
useState<StoreWaitlistEntry | null>(null);
function handleOpenModal(waitlist: StoreWaitlistEntry) {
setSelectedWaitlist(waitlist);
}
function handleJoinSuccess(waitlistId: string) {
markAsJoined(waitlistId);
}
// Don't render if loading, error, or no waitlists
if (isLoading || hasError || !waitlists || waitlists.length === 0) {
return null;
}
return (
<div className="flex flex-col items-center justify-center">
<div className="w-full max-w-[1360px]">
{/* Section Header */}
<div className="mb-6">
<h2 className="font-poppins text-2xl font-semibold text-[#282828] dark:text-neutral-200">
Help Shape What&apos;s Next
</h2>
<p className="mt-2 text-base text-neutral-600 dark:text-neutral-400">
These agents are in development. Your interest helps us prioritize
what gets built and we&apos;ll notify you when they&apos;re ready.
</p>
</div>
{/* Mobile Carousel View */}
<Carousel
className="md:hidden"
opts={{
loop: true,
}}
>
<CarouselContent>
{waitlists.map((waitlist) => (
<CarouselItem
key={waitlist.waitlistId}
className="min-w-64 max-w-71"
>
<WaitlistCard
name={waitlist.name}
subHeading={waitlist.subHeading}
description={waitlist.description}
imageUrl={waitlist.imageUrls[0] || null}
isMember={joinedWaitlistIds.has(waitlist.waitlistId)}
onCardClick={() => handleOpenModal(waitlist)}
onJoinClick={() => handleOpenModal(waitlist)}
/>
</CarouselItem>
))}
</CarouselContent>
</Carousel>
{/* Desktop Grid View */}
<div className="hidden grid-cols-1 place-items-center gap-6 md:grid md:grid-cols-2 lg:grid-cols-3">
{waitlists.map((waitlist) => (
<WaitlistCard
key={waitlist.waitlistId}
name={waitlist.name}
subHeading={waitlist.subHeading}
description={waitlist.description}
imageUrl={waitlist.imageUrls[0] || null}
isMember={joinedWaitlistIds.has(waitlist.waitlistId)}
onCardClick={() => handleOpenModal(waitlist)}
onJoinClick={() => handleOpenModal(waitlist)}
/>
))}
</div>
</div>
{/* Single Modal for both viewing and joining */}
{selectedWaitlist && (
<WaitlistDetailModal
waitlist={selectedWaitlist}
isMember={joinedWaitlistIds.has(selectedWaitlist.waitlistId)}
onClose={() => setSelectedWaitlist(null)}
onJoinSuccess={handleJoinSuccess}
/>
)}
</div>
);
}


@@ -0,0 +1,58 @@
"use client";
import { useMemo } from "react";
import { useSupabaseStore } from "@/lib/supabase/hooks/useSupabaseStore";
import {
useGetV2GetTheAgentWaitlist,
useGetV2GetWaitlistIdsTheCurrentUserHasJoined,
getGetV2GetWaitlistIdsTheCurrentUserHasJoinedQueryKey,
} from "@/app/api/__generated__/endpoints/store/store";
import type { StoreWaitlistEntry } from "@/app/api/__generated__/models/storeWaitlistEntry";
import { useQueryClient } from "@tanstack/react-query";
export function useWaitlistSection() {
const { user } = useSupabaseStore();
const queryClient = useQueryClient();
// Fetch waitlists
const {
data: waitlistsResponse,
isLoading: waitlistsLoading,
isError: waitlistsError,
} = useGetV2GetTheAgentWaitlist();
// Fetch memberships if logged in
const { data: membershipsResponse, isLoading: membershipsLoading } =
useGetV2GetWaitlistIdsTheCurrentUserHasJoined({
query: {
enabled: !!user,
},
});
const waitlists: StoreWaitlistEntry[] = useMemo(() => {
if (waitlistsResponse?.status === 200) {
return waitlistsResponse.data.listings;
}
return [];
}, [waitlistsResponse]);
const joinedWaitlistIds: Set<string> = useMemo(() => {
if (membershipsResponse?.status === 200) {
return new Set(membershipsResponse.data);
}
return new Set();
}, [membershipsResponse]);
const isLoading = waitlistsLoading || (!!user && membershipsLoading);
const hasError = waitlistsError;
// Function to add a waitlist ID to joined set (called after successful join)
function markAsJoined(_waitlistId: string) {
// Invalidate the memberships query to refetch
queryClient.invalidateQueries({
queryKey: getGetV2GetWaitlistIdsTheCurrentUserHasJoinedQueryKey(),
});
}
return { waitlists, joinedWaitlistIds, isLoading, hasError, markAsJoined };
}

File diff suppressed because it is too large


@@ -362,25 +362,14 @@ export type GraphMeta = {
   user_id: UserID;
   version: number;
   is_active: boolean;
-  created_at: Date;
   name: string;
   description: string;
   instructions?: string | null;
   recommended_schedule_cron: string | null;
   forked_from_id?: GraphID | null;
   forked_from_version?: number | null;
-  input_schema: GraphInputSchema;
-  output_schema: GraphOutputSchema;
-  credentials_input_schema: CredentialsInputSchema;
-} & (
-  | {
-      has_external_trigger: true;
-      trigger_setup_info: GraphTriggerInfo;
-    }
-  | {
-      has_external_trigger: false;
-      trigger_setup_info: null;
-    }
-);
+};
 export type GraphID = Brand<string, "GraphID">;
@@ -447,11 +436,22 @@ export type GraphTriggerInfo = {
 /* Mirror of backend/data/graph.py:Graph */
 export type Graph = GraphMeta & {
+  created_at: Date;
   nodes: Node[];
   links: Link[];
   sub_graphs: Omit<Graph, "sub_graphs">[]; // Flattened sub-graphs
-};
+  input_schema: GraphInputSchema;
+  output_schema: GraphOutputSchema;
+  credentials_input_schema: CredentialsInputSchema;
+} & (
+  | {
+      has_external_trigger: true;
+      trigger_setup_info: GraphTriggerInfo;
+    }
+  | {
+      has_external_trigger: false;
+      trigger_setup_info: null;
+    }
+);
 export type GraphUpdateable = Omit<
   Graph,


@@ -1,325 +0,0 @@
# Workspace & Media File Architecture
This document describes the architecture for handling user files in AutoGPT Platform, covering persistent user storage (Workspace) and ephemeral media processing pipelines.
## Overview
The platform has two distinct file-handling layers:
| Layer | Purpose | Persistence | Scope |
|-------|---------|-------------|-------|
| **Workspace** | Long-term user file storage | Persistent (DB + GCS/local) | Per-user, session-scoped access |
| **Media Pipeline** | Ephemeral file processing for blocks | Temporary (local disk) | Per-execution |
## Database Models
### UserWorkspace
Represents a user's file storage space. Created on-demand (one per user).
```prisma
model UserWorkspace {
id String @id @default(uuid())
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
userId String @unique
Files UserWorkspaceFile[]
}
```
**Key points:**
- One workspace per user (enforced by `@unique` on `userId`)
- Created lazily via `get_or_create_workspace()`
- Uses upsert to handle race conditions
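For illustration, a minimal sketch of what the lazy creation could look like with prisma-client-py (the real implementation lives in `backend/data/workspace.py`; the function body here is an assumption):

```python
# Hypothetical sketch of get_or_create_workspace(). Upsert lets concurrent
# callers converge on one row instead of racing to insert duplicates.
from prisma import Prisma

async def get_or_create_workspace(db: Prisma, user_id: str):
    return await db.userworkspace.upsert(
        where={"userId": user_id},
        data={
            "create": {"userId": user_id},
            "update": {},  # nothing to change if the workspace already exists
        },
    )
```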
### UserWorkspaceFile
Represents a file stored in a user's workspace.
```prisma
model UserWorkspaceFile {
id String @id @default(uuid())
workspaceId String
name String // User-visible filename
path String // Virtual path (e.g., "/sessions/abc123/image.png")
storagePath String // Actual storage path (gcs://... or local://...)
mimeType String
sizeBytes BigInt
checksum String? // SHA256 for integrity
isDeleted Boolean @default(false)
deletedAt DateTime?
metadata Json @default("{}")
@@unique([workspaceId, path]) // Enforce unique paths within workspace
}
```
**Key points:**
- `path` is a virtual path for organizing files (not an actual filesystem path)
- `storagePath` contains the actual GCS or local storage location
- Soft-delete pattern: `isDeleted` flag with `deletedAt` timestamp
- Path is modified on delete to free up the virtual path for reuse
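To make the soft-delete behavior concrete, here is an illustrative sketch; the exact tombstone format is not documented here, so the `.deleted.{id}` suffix below is an assumption:

```python
# Illustrative soft delete: the row survives for auditing, while renaming the
# virtual path releases it from the @@unique([workspaceId, path]) constraint.
from datetime import datetime, timezone

async def soft_delete_file(db, file_id: str) -> None:
    file = await db.userworkspacefile.find_unique(where={"id": file_id})
    await db.userworkspacefile.update(
        where={"id": file_id},
        data={
            "isDeleted": True,
            "deletedAt": datetime.now(timezone.utc),
            # Assumed tombstone format; frees the old path for a new upload
            "path": f"{file.path}.deleted.{file_id}",
        },
    )
```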
---
## WorkspaceManager
**Location:** `backend/util/workspace.py`
High-level API for workspace file operations. Combines storage backend operations with database record management.
### Initialization
```python
from backend.util.workspace import WorkspaceManager
# Basic usage
manager = WorkspaceManager(user_id="user-123", workspace_id="ws-456")
# With session scoping (CoPilot sessions)
manager = WorkspaceManager(
user_id="user-123",
workspace_id="ws-456",
session_id="session-789"
)
```
### Session Scoping
When `session_id` is provided, files are isolated to `/sessions/{session_id}/`:
```python
# With session_id="abc123":
manager.write_file(content, "image.png")
# → stored at /sessions/abc123/image.png
# Cross-session access is explicit:
manager.read_file("/sessions/other-session/file.txt") # Works
```
**Why session scoping?**
- CoPilot conversations need file isolation
- Prevents file collisions between concurrent sessions
- Allows session cleanup without affecting other sessions
### Core Methods
| Method | Description |
|--------|-------------|
| `write_file(content, filename, path?, mime_type?, overwrite?)` | Write file to workspace |
| `read_file(path)` | Read file by virtual path |
| `read_file_by_id(file_id)` | Read file by ID |
| `list_files(path?, limit?, offset?, include_all_sessions?)` | List files |
| `delete_file(file_id)` | Soft-delete a file |
| `get_download_url(file_id, expires_in?)` | Get signed download URL |
| `get_file_info(file_id)` | Get file metadata |
| `get_file_count(path?, include_all_sessions?)` | Count files |
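A typical round-trip through these methods might look like the sketch below (return values are simplified; the assumption that the returned record exposes `id` and `path` follows the `UserWorkspaceFile` model above):

```python
manager = WorkspaceManager(user_id="user-123", workspace_id="ws-456")

# Write, read back, share, and clean up a file
file = await manager.write_file(content=b"hello", filename="notes.txt")
data = await manager.read_file(file.path)      # read by virtual path
url = await manager.get_download_url(file.id)  # signed URL for browsers
files = await manager.list_files(limit=50)     # paginated listing
await manager.delete_file(file.id)             # soft delete
```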
### Storage Backends
WorkspaceManager delegates to `WorkspaceStorageBackend`:
| Backend | When Used | Storage Path Format |
|---------|-----------|---------------------|
| `GCSWorkspaceStorage` | `media_gcs_bucket_name` is configured | `gcs://bucket/workspaces/{ws_id}/{file_id}/{filename}` |
| `LocalWorkspaceStorage` | No GCS bucket configured | `local://{ws_id}/{file_id}/{filename}` |
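Backend selection is configuration-driven. A minimal sketch, assuming a `settings` object exposing the options from the Configuration section below (constructor shapes are assumptions):

```python
# Illustrative dispatch; the real logic lives in backend/util/workspace_storage.py.
def pick_workspace_backend(settings):
    if settings.media_gcs_bucket_name:
        return GCSWorkspaceStorage(bucket_name=settings.media_gcs_bucket_name)
    return LocalWorkspaceStorage(base_dir=settings.workspace_storage_dir)
```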
---
## store_media_file()
**Location:** `backend/util/file.py`
The media normalization pipeline. Handles various input types and normalizes them for processing or output.
### Purpose
Blocks receive files in many formats (URLs, data URIs, workspace references, local paths). `store_media_file()` normalizes these to a consistent format based on what the block needs.
### Input Types Handled
| Input Format | Example | How It's Processed |
|--------------|---------|-------------------|
| Data URI | `data:image/png;base64,iVBOR...` | Decoded, virus scanned, written locally |
| HTTP(S) URL | `https://example.com/image.png` | Downloaded, virus scanned, written locally |
| Workspace URI | `workspace://abc123` or `workspace:///path/to/file` | Read from workspace, virus scanned, written locally |
| Cloud path | `gcs://bucket/path` | Downloaded, virus scanned, written locally |
| Local path | `image.png` | Verified to exist in exec_file directory |
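Conceptually, the dispatch is a prefix check on the input string. The sketch below is an illustrative reduction; the real branching in `backend/util/file.py` also handles scanning and size limits:

```python
def classify_media_input(file: str) -> str:
    """Rough mirror of the input-type table above."""
    if file.startswith("data:"):
        return "data_uri"
    if file.startswith(("http://", "https://")):
        return "url"
    if file.startswith("workspace://"):  # covers workspace://id and workspace:///path
        return "workspace"
    if file.startswith("gcs://"):
        return "cloud"
    return "local_path"
```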
### Return Formats
The `return_format` parameter determines what you get back:
```python
from backend.util.file import store_media_file
# For local processing (ffmpeg, MoviePy, PIL)
local_path = await store_media_file(
file=input_file,
execution_context=ctx,
return_format="for_local_processing"
)
# Returns: "image.png" (relative path in exec_file dir)
# For external APIs (Replicate, OpenAI, etc.)
data_uri = await store_media_file(
file=input_file,
execution_context=ctx,
return_format="for_external_api"
)
# Returns: "data:image/png;base64,iVBOR..."
# For block output (adapts to execution context)
output = await store_media_file(
file=input_file,
execution_context=ctx,
return_format="for_block_output"
)
# In CoPilot: Returns "workspace://file-id#image/png"
# In graphs: Returns "data:image/png;base64,..."
```
### Execution Context
`store_media_file()` requires an `ExecutionContext` with:
- `graph_exec_id` - Required for temp file location
- `user_id` - Required for workspace access
- `workspace_id` - Optional; enables workspace features
- `session_id` - Optional; for session scoping in CoPilot
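Since the real `ExecutionContext` class is defined elsewhere in the backend, the stand-in below is hypothetical; it only mirrors the four fields listed above to show how a call is wired together:

```python
from dataclasses import dataclass

@dataclass
class ExecutionContext:  # hypothetical stand-in for the real backend class
    graph_exec_id: str
    user_id: str
    workspace_id: str | None = None
    session_id: str | None = None

ctx = ExecutionContext(
    graph_exec_id="exec-abc",   # temp files land under /tmp/exec_file/exec-abc/
    user_id="user-123",
    workspace_id="ws-456",      # optional: enables workspace features
    session_id="session-789",   # optional: CoPilot session scoping
)
```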
---
## Responsibility Boundaries
### Virus Scanning
| Component | Scans? | Notes |
|-----------|--------|-------|
| `store_media_file()` | ✅ Yes | Scans **all** content before writing to local disk |
| `WorkspaceManager.write_file()` | ✅ Yes | Scans content before persisting |
**Scanning happens at:**
1. `store_media_file()` — scans everything it downloads/decodes
2. `WorkspaceManager.write_file()` — scans before persistence
Tools like `WriteWorkspaceFileTool` don't need to scan because `WorkspaceManager.write_file()` handles it.
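Both entry points follow the same scan-before-persist pattern; a sketch (`scan_content_safe()` comes from `backend/util/virus_scanner.py`, while the surrounding function is illustrative):

```python
from backend.util.virus_scanner import scan_content_safe

async def persist_safely(content: bytes, filename: str) -> None:
    # Raises if the scanner flags the content, so nothing is written on failure
    await scan_content_safe(content, filename=filename)
    ...  # only now hand the bytes to the storage backend
```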
### Persistence
| Component | Persists To | Lifecycle |
|-----------|-------------|-----------|
| `store_media_file()` | Temp dir (`/tmp/exec_file/{exec_id}/`) | Cleaned after execution |
| `WorkspaceManager` | GCS or local storage + DB | Persistent until deleted |
**Automatic cleanup:** `clean_exec_files(graph_exec_id)` removes temp files after execution completes.
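In lifecycle terms (the `clean_exec_files` call signature is assumed; blocks themselves never call it):

```python
# Illustrative temp-file lifecycle for one execution
local_path = await store_media_file(
    file=input_file,
    execution_context=ctx,
    return_format="for_local_processing",
)  # written under /tmp/exec_file/{graph_exec_id}/
# ... block processing happens here ...
clean_exec_files(ctx.graph_exec_id)  # run by the executor after the execution ends
```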
---
## Decision Tree: WorkspaceManager vs store_media_file
```
          ┌───────────────────────────────────────┐
          │ What do you need to do with the file? │
          └───────────────────────────────────────┘
                              │
               ┌──────────────┴──────────────┐
               ▼                             ▼
      Process in a block           Store for user access
      (ffmpeg, PIL, etc.)         (CoPilot files, uploads)
               │                             │
               ▼                             ▼
      store_media_file()             WorkspaceManager
      with appropriate
      return_format
               │
        ┌──────┴──────┐
        ▼             ▼
  "for_local_    "for_block_
   processing"    output"
        │             │
        ▼             ▼
    Get local     Auto-saves to
    path for      workspace in
    tools         CoPilot context
```
### Quick Reference
| Scenario | Use |
|----------|-----|
| Block needs to process a file with ffmpeg | `store_media_file(..., return_format="for_local_processing")` |
| Block needs to send file to external API | `store_media_file(..., return_format="for_external_api")` |
| Block returning a generated file | `store_media_file(..., return_format="for_block_output")` |
| API endpoint handling file upload | `WorkspaceManager.write_file()` (after virus scan) |
| API endpoint serving file download | `WorkspaceManager.get_download_url()` |
| Listing user's files | `WorkspaceManager.list_files()` |
---
## Key Files Reference
| File | Purpose |
|------|---------|
| `backend/data/workspace.py` | Database CRUD operations for UserWorkspace and UserWorkspaceFile |
| `backend/util/workspace.py` | `WorkspaceManager` class - high-level workspace API |
| `backend/util/workspace_storage.py` | Storage backends (GCS, local) and `WorkspaceStorageBackend` interface |
| `backend/util/file.py` | `store_media_file()` and media processing utilities |
| `backend/util/virus_scanner.py` | `VirusScannerService` and `scan_content_safe()` |
| `schema.prisma` | Database model definitions |
---
## Common Patterns
### Block Processing a User's File
```python
async def run(self, input_data, *, execution_context, **kwargs):
# Normalize input to local path
local_path = await store_media_file(
file=input_data.video,
execution_context=execution_context,
return_format="for_local_processing",
)
# Process with local tools
output_path = process_video(local_path)
# Return (auto-saves to workspace in CoPilot)
result = await store_media_file(
file=output_path,
execution_context=execution_context,
return_format="for_block_output",
)
yield "output", result
```
### API Upload Endpoint
```python
async def upload_file(file: UploadFile, user_id: str, workspace_id: str):
content = await file.read()
# write_file handles virus scanning
manager = WorkspaceManager(user_id, workspace_id)
workspace_file = await manager.write_file(
content=content,
filename=file.filename,
)
return {"file_id": workspace_file.id}
```
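For symmetry, a sketch of the download side using `get_download_url()` from the quick-reference table (the `expires_in` unit and response shape are assumptions):

```python
async def download_file(file_id: str, user_id: str, workspace_id: str):
    # The signed URL lets the client fetch the bytes directly from storage
    manager = WorkspaceManager(user_id, workspace_id)
    url = await manager.get_download_url(file_id, expires_in=3600)
    return {"download_url": url}
```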
---
## Configuration
| Setting | Purpose | Default |
|---------|---------|---------|
| `media_gcs_bucket_name` | GCS bucket for workspace storage | None (uses local) |
| `workspace_storage_dir` | Local storage directory | `{app_data}/workspaces` |
| `max_file_size_mb` | Maximum file size in MB | 100 |
| `clamav_service_enabled` | Enable virus scanning | true |
| `clamav_service_host` | ClamAV daemon host | localhost |
| `clamav_service_port` | ClamAV daemon port | 3310 |