Compare commits

...

91 Commits

Author SHA1 Message Date
Bently
dc981b52a3 feat(Platform): add claude 4 sonnet and opus models to platform (#10018)
This adds the latest Claude 4 Opus and Sonnet models to the platform

https://www.anthropic.com/news/claude-4
2025-05-22 19:16:25 +00:00
ograce1421
61643e6a47 fix(frontend): Top Agents header spacing (#10002)
Changed the section header for "Top Agents" to include a 24px margin.
I have not tested this; an engineer needs to test and review it.

## Summary
- set `margin` default to 24px in `AgentsSection`
- apply the bottom margin via an inline style

## Testing
- `npm test` *(fails: playwright not found)*
- `npm run lint` *(fails: next not found)*


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Test via deployment to the dev branch and verify by designer

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-05-22 16:12:08 +00:00
Reinier van der Leer
21b4d272ce feat(frontend/library): Replace "Loading..." with loading spinners (#9993)
- Resolves #9992

### Changes 🏗️

- Use `<LoadingBox>` instead of "Loading..." on `/library/agents/[id]`

![2025-05-20 23 26 vivaldi](https://github.com/user-attachments/assets/6fe8ce60-c249-4e4c-b3f1-eea925b003d3)


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Designer approves based on screencapture
2025-05-22 15:58:32 +00:00
Toran Bruce Richards
b8ba572629 Fix AddMemoryBlock JSON serialization error (#10013)
This pull request refines the handling of `input_data.content` and
improves error message formatting in the `run` method of `mem0.py`. The
changes enhance robustness and clarity in the code.

### Handling `input_data.content`:

* Updated the `run` method to handle `Content` objects explicitly, ensuring
messages are formatted properly when `input_data.content` is of type
`Content`. Non-standard types are now converted to strings for consistent
handling (see the sketch below). (`autogpt_platform/backend/backend/blocks/mem0.py`)
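
A minimal sketch of that normalization, assuming a simple `Content` type with a text payload (the real attribute names in `mem0.py` may differ):

```python
from dataclasses import dataclass

@dataclass
class Content:
    """Stand-in for the platform's Content type (an assumption for this sketch)."""
    text: str

def to_message_text(content: object) -> str:
    """Normalize block input to a plain string before it is serialized to JSON."""
    if isinstance(content, Content):
        return content.text  # unwrap the underlying string
    if isinstance(content, str):
        return content
    return str(content)  # coerce non-standard types for consistent handling

print(to_message_text(Content(text="remember this")))  # -> remember this
```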

### Error message formatting:

* Simplified the error message formatting by removing the unnecessary
`object=` keyword in the `str()` conversion of exceptions.
(`autogpt_platform/backend/backend/blocks/mem0.py`)

## Summary
- fix AddMemoryBlock so `Content` input uses the underlying string
- improve error handling in Mem0 AddMemoryBlock

## Testing
- `ruff check autogpt_platform/backend/backend/blocks/mem0.py`
- `pre-commit run --files
autogpt_platform/backend/backend/blocks/mem0.py` *(fails: unable to
fetch remote hooks)*
- `poetry run pytest -k AddMemoryBlock -q` *(fails: Error 111 connecting
to localhost:6379)*

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Payload for webhook-triggered runs is shown on `/library/agents/[id]`
2025-05-22 15:54:18 +00:00
Nicholas Tindle
47deeb53c3 docs(platform): update AGENTS instructions (#10016)
## Summary
- refine contribution instructions in `autogpt_platform/AGENTS.md`

## Testing
- `pre-commit` *(fails to fetch hooks due to no network access)*

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Docs-only change

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-05-22 15:54:02 +00:00
Zamil Majdy
1b81a7c755 fix(blocks): Error messages from SendWebRequestBlock use the translated IP instead of the original URL (#10009)
### Changes 🏗️

Keep the original URL when an HTTP error occurs in
`SendWebRequestBlock`.
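
A rough sketch of the fix, assuming a `requests`-style flow where `pinned_url` stands in for the IP-resolved URL the block actually requests:

```python
import requests

def send_pinned_request(original_url: str, pinned_url: str) -> requests.Response:
    """Request the pinned (IP-resolved) URL, but report errors with the original URL."""
    try:
        response = requests.get(pinned_url, timeout=10)
        response.raise_for_status()
        return response
    except requests.HTTPError as exc:
        # Before the fix, the error surfaced pinned_url (the translated IP);
        # re-raising with original_url keeps the message meaningful to the user.
        raise ValueError(
            f"HTTP {exc.response.status_code} error for {original_url}"
        ) from exc
```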

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Test sending a POST request with `SendWebRequestBlock` to a site that
doesn't support POST requests.
2025-05-22 15:46:01 +00:00
Zamil Majdy
8f1b3eb8ba fix(backend/executor): Make executor continuously running and retrying message consumption (#9999)
The executor can sometimes become dangling: it stops consuming messages, but
the process is not fully killed. This PR avoids such a scenario by simply
retrying message consumption continuously.

### Changes 🏗️

Introduced a `continuous_retry` decorator and applied it to executor message
consumption.
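
A minimal sketch of such a decorator; the actual signature, logging, and backoff in the PR are assumptions here:

```python
import functools
import logging
import time

def continuous_retry(retry_delay: float = 1.0):
    """Re-run the wrapped function indefinitely so the consumer loop never dies silently."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            while True:
                try:
                    return func(*args, **kwargs)
                except Exception:
                    logging.exception("%s crashed; retrying in %ss", func.__name__, retry_delay)
                    time.sleep(retry_delay)
        return wrapper
    return decorator

@continuous_retry(retry_delay=5.0)
def consume_execution_messages():
    ...  # blocking message-consumption loop would go here
```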

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Run executor service and execute some agents.
2025-05-22 12:11:08 +00:00
Reinier van der Leer
73ee6e272a fix(backend): Unbreak UserIntegrations parsing for missing None values (#9994)
Makes all optional fields on `Credentials` models actually optional, and
sets `exclude_none=True` on the corresponding `model_dump`.
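
A sketch of the pattern (field names here are illustrative, not the real `Credentials` schema):

```python
from typing import Optional

from pydantic import BaseModel

class OAuth2Credentials(BaseModel):
    access_token: str
    refresh_token: Optional[str] = None            # now genuinely optional
    access_token_expires_at: Optional[int] = None  # ditto

creds = OAuth2Credentials(access_token="abc")
# exclude_none=True keeps absent fields out of the serialized JSON entirely,
# so partial rows in the DB re-parse cleanly instead of carrying explicit nulls.
print(creds.model_dump(exclude_none=True))  # {'access_token': 'abc'}
```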

This is a hotfix: after running the `aryshare-revid` branch on the dev
deployment, there is some data in the DB that isn't valid for the
`UserIntegrations` model on the `dev` branch (see
[here](https://github.com/Significant-Gravitas/AutoGPT/pull/9946#discussion_r2098428575)).

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] This fix worked on the `aryshare-revid` branch:
52b6d9696b
2025-05-20 23:50:19 +00:00
Reinier van der Leer
f466b010e4 fix(backend): Unbreak URL handling for GitHub blocks (#9989)
- Resolves #9987

### Changes 🏗️

- Split `pin_url(..)` out of `validate_url(..)` and call
`extra_url_validator` in between (sketched below)
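
A sketch of the resulting call order; the wrapper name and stub bodies are assumptions, and only the validate -> extra validator -> pin sequence comes from the PR:

```python
from typing import Callable, Optional

def validate_url(url: str) -> str:
    """Basic scheme validation (stubbed)."""
    if not url.startswith(("http://", "https://")):
        raise ValueError(f"Unsupported URL: {url}")
    return url

def pin_url(url: str) -> str:
    """Resolve the hostname and pin the request to its IP (stubbed)."""
    return url

def prepare_url(url: str, extra_url_validator: Optional[Callable[[str], str]] = None) -> str:
    url = validate_url(url)
    if extra_url_validator is not None:
        # e.g. a GitHub block can rewrite a github.com URL to its API form here,
        # before pinning, so the rewrite still sees a hostname rather than an IP.
        url = extra_url_validator(url)
    return pin_url(url)
```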

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] GitHub Read Pull Request Block works with "Include PR Changes"
enabled
2025-05-20 22:54:39 +00:00
Nicholas Tindle
f8965e530f ref(frontend/admin): fix location of spending page (#9991)
### Changes 🏗️
- Moves the route path for the spending page
- Drops `min`

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] test locally

---------

Co-authored-by: Bently <Github@bentlybro.com>
2025-05-20 21:43:38 +00:00
Zamil Majdy
701d283f69 fix(backend): Disable health check for scheduler service from the api server 2025-05-20 18:04:10 +01:00
Zamil Majdy
47c1a64cc2 fix(backend): Remove cleaner on graph executor exit 2025-05-20 17:45:19 +01:00
Reinier van der Leer
cf9cf4e7dd refactor(frontend): Move OttoChatWidget out of root layout (#9951)
- Resolves #9950

### Changes 🏗️

- Move `<OttoChatWidget>` from root layout into `FlowEditor`
- Pass graph info directly into `OttoChatWidget` instead of using
`useAgentGraph`
- Rearrange z-indices of elements in the builder

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Go to `/build`
  - [x] -> chat widget should show up in the bottom right corner
  - Open the widget and ask Otto something
  - [x] -> should work normally
  - Add a few blocks and save the graph
  - [x] -> "Include graph data" should show up
  - Click "Include graph data" and ask Otto something about your graph
  - [x] -> Otto should be aware of the graph structure and metadata
2025-05-20 13:38:09 +00:00
Reinier van der Leer
0a79e1c5fd feat(frontend/library): Show toast on WebSocket (dis|re)connect (#9949)
- Resolves #9941
- Follow-up to #9935

### Changes 🏗️

- Show toast when WS connection (dis|re)connects (on `/library/agents/[id]`)
  - Implement `BackendAPI.onWebSocketDisconnect`

Related improvements:
- Clean up WebSocket state management & logging in `BackendAPI`
- Clean up & split loading spinner implementation: `Spinner` -> `LoadingBox` + `LoadingSpinner`

Also, unrelated:
- fix(frontend/library): Add 2 second debounce to page refresh logic
  This eliminates triple-firing of 3 API calls on page load (9 -> 3 calls total): `GET /library/agents/{agent_id}`, `GET /graphs/{graph_id}/executions`, and `GET /graphs/{graph_id}/executions/{exec_id}`

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Start the frontend and backend applications (locally)
  - Navigate to `/library/agents/[id]`
  - Kill the backend
  - [x] -> a toast should appear "Connection to server was lost"
  - [x] -> this toast should be shown as long as the server is down
  - Re-start the backend
  - [x] -> toast should change to show "Connection re-established"
  - [x] -> toast should now disappear after 2 seconds

---

Co-authored-by: Krzysztof Czerwinski <kpczerwinski@gmail.com>
2025-05-19 23:17:06 +02:00
Reinier van der Leer
ac532ca4b9 fix(backend): Graph execution update on terminate (#9952)
Resolves #9947

### Changes 🏗️

Backend:
- Send a graph execution update after terminating a run
- Don't wipe the graph execution stats when not passed in to `update_graph_execution_stats` (see the sketch after this list)

Frontend:
- Don't hide the output of stopped runs
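
A sketch of the backend change (the persistence helper is hypothetical; the point is that `stats` is only written when explicitly provided):

```python
from typing import Optional

def db_update(graph_exec_id: str, fields: dict) -> None:
    """Stand-in for the real persistence layer."""
    print(f"UPDATE AgentGraphExecution {graph_exec_id} SET {fields}")

def update_graph_execution_stats(
    graph_exec_id: str,
    status: str,
    stats: Optional[dict] = None,
) -> None:
    fields: dict = {"executionStatus": status}
    if stats is not None:
        fields["stats"] = stats  # otherwise leave previously stored stats untouched
    db_update(graph_exec_id, fields)
```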

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Go to `/library/agents/[id]`
  - Run an agent that takes a while (long enough to click stop and see the effect)
  - Hit stop after it has executed a few nodes
  - [x] -> run status should change to "Stopped"
  - [x] -> run stats (steps, duration, cost) should stay the same, or update only one final time
  - [x] -> output so far should be visible
  - [x] -> shown information should stay the same after refreshing the page

---

Co-authored-by: Krzysztof Czerwinski <34861343+kcze@users.noreply.github.com>
2025-05-19 23:15:39 +02:00
Zamil Majdy
aa2c2c1ad2 fix(backend): Force process exit on execution manager cleanup 2025-05-19 16:47:56 +01:00
dependabot[bot]
bd425331f1 chore(frontend/deps): Update 35 dependencies to latest minor versions (#9953)
Bumps the production-dependencies group with 35 updates in the
/autogpt_platform/frontend directory:

| Package | From | To |
| --- | --- | --- |
| [@faker-js/faker](https://github.com/faker-js/faker) | `9.6.0` | `9.8.0` |
| [@next/third-parties](https://github.com/vercel/next.js/tree/HEAD/packages/third-parties) | `15.2.1` | `15.3.2` |
| [@radix-ui/react-alert-dialog](https://github.com/radix-ui/primitives) | `1.1.6` | `1.1.13` |
| [@radix-ui/react-avatar](https://github.com/radix-ui/primitives) | `1.1.3` | `1.1.9` |
| [@radix-ui/react-checkbox](https://github.com/radix-ui/primitives) | `1.1.4` | `1.3.1` |
| [@radix-ui/react-collapsible](https://github.com/radix-ui/primitives) | `1.1.3` | `1.1.10` |
| [@radix-ui/react-context-menu](https://github.com/radix-ui/primitives) | `2.2.6` | `2.2.14` |
| [@radix-ui/react-dropdown-menu](https://github.com/radix-ui/primitives) | `2.1.6` | `2.1.14` |
| [@radix-ui/react-label](https://github.com/radix-ui/primitives) | `2.1.2` | `2.1.6` |
| [@radix-ui/react-popover](https://github.com/radix-ui/primitives) | `1.1.6` | `1.1.13` |
| [@radix-ui/react-radio-group](https://github.com/radix-ui/primitives) | `1.2.3` | `1.3.6` |
| [@radix-ui/react-scroll-area](https://github.com/radix-ui/primitives) | `1.2.3` | `1.2.8` |
| [@radix-ui/react-select](https://github.com/radix-ui/primitives) | `2.1.6` | `2.2.4` |
| [@radix-ui/react-separator](https://github.com/radix-ui/primitives) | `1.1.2` | `1.1.6` |
| [@radix-ui/react-switch](https://github.com/radix-ui/primitives) | `1.1.3` | `1.2.4` |
| [@radix-ui/react-tabs](https://github.com/radix-ui/primitives) | `1.1.4` | `1.1.11` |
| [@radix-ui/react-toast](https://github.com/radix-ui/primitives) | `1.2.6` | `1.2.13` |
| [@radix-ui/react-tooltip](https://github.com/radix-ui/primitives) | `1.1.8` | `1.2.6` |
| [@sentry/nextjs](https://github.com/getsentry/sentry-javascript) | `9.10.1` | `9.19.0` |
| [@supabase/ssr](https://github.com/supabase/ssr) | `0.5.2` | `0.6.1` |
| [@supabase/supabase-js](https://github.com/supabase/supabase-js) | `2.49.1` | `2.49.4` |
| [@tanstack/react-table](https://github.com/TanStack/table/tree/HEAD/packages/react-table) | `8.21.2` | `8.21.3` |
| [@xyflow/react](https://github.com/xyflow/xyflow/tree/HEAD/packages/react) | `12.4.2` | `12.6.4` |
| [cmdk](https://github.com/pacocoursey/cmdk/tree/HEAD/cmdk) | `1.0.4` | `1.1.1` |
| [dotenv](https://github.com/motdotla/dotenv) | `16.4.7` | `16.5.0` |
| [embla-carousel-react](https://github.com/davidjerleke/embla-carousel) | `8.5.2` | `8.6.0` |
| [framer-motion](https://github.com/motiondivision/motion) | `12.4.11` | `12.12.1` |
| [geist](https://github.com/vercel/geist-font/tree/HEAD/packages/next) | `1.3.1` | `1.4.2` |
| [launchdarkly-react-client-sdk](https://github.com/launchdarkly/react-client-sdk) | `3.6.1` | `3.7.0` |
| [lucide-react](https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react) | `0.479.0` | `0.510.0` |
| [next-themes](https://github.com/pacocoursey/next-themes) | `0.4.5` | `0.4.6` |
| [react-day-picker](https://github.com/gpbl/react-day-picker) | `9.6.1` | `9.7.0` |
| [react-hook-form](https://github.com/react-hook-form/react-hook-form) | `7.54.2` | `7.56.3` |
| [recharts](https://github.com/recharts/recharts) | `2.15.1` | `2.15.3` |
| [zod](https://github.com/colinhacks/zod) | `3.24.2` | `3.24.4` |


Updates `@faker-js/faker` from 9.6.0 to 9.8.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/faker-js/faker/releases"><code>@​faker-js/faker</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v9.8.0</h2>
<h2>What's Changed</h2>
<ul>
<li>feat(locale): add country code for en_CA &amp; fr_CA by <a
href="https://github.com/alixlahuec"><code>@​alixlahuec</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3476">faker-js/faker#3476</a></li>
<li>test: use validator@13.15.0 with isULID, isISO31661Numeric,
isISO15924 by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3482">faker-js/faker#3482</a></li>
<li>feat(locale): add zh_CN food by <a
href="https://github.com/yyz945947732"><code>@​yyz945947732</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3479">faker-js/faker#3479</a></li>
<li>docs: more than 70 locales by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3483">faker-js/faker#3483</a></li>
<li>feat(locale): update zh_CN location by <a
href="https://github.com/yyz945947732"><code>@​yyz945947732</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3481">faker-js/faker#3481</a></li>
<li>feat(locale): update zh_CN animal by <a
href="https://github.com/yyz945947732"><code>@​yyz945947732</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3480">faker-js/faker#3480</a></li>
<li>refactor(locale): ko state data update by <a
href="https://github.com/seoahan"><code>@​seoahan</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3487">faker-js/faker#3487</a></li>
<li>feat(locale): add zh_CN book by <a
href="https://github.com/yyz945947732"><code>@​yyz945947732</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3477">faker-js/faker#3477</a></li>
<li>feat(locale): add Japanese date and month definitions by <a
href="https://github.com/matsueushi"><code>@​matsueushi</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3492">faker-js/faker#3492</a></li>
<li>feat(locale): add vehicle locale data for Japanese by <a
href="https://github.com/noritaka1166"><code>@​noritaka1166</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3490">faker-js/faker#3490</a></li>
<li>feat(locale): update Japanese company categories by <a
href="https://github.com/noritaka1166"><code>@​noritaka1166</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3489">faker-js/faker#3489</a></li>
<li>feat(locale): add Japanese science locale data including elements
and units by <a
href="https://github.com/noritaka1166"><code>@​noritaka1166</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3491">faker-js/faker#3491</a></li>
<li>feat(locale): add Japanese sex definitions for person locale by <a
href="https://github.com/noritaka1166"><code>@​noritaka1166</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3495">faker-js/faker#3495</a></li>
<li>refactor(locale): rename pt-BR streetSuffix to streetPrefix by <a
href="https://github.com/glmchalita"><code>@​glmchalita</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3493">faker-js/faker#3493</a></li>
<li>feat(locale): update zh_CN word by <a
href="https://github.com/yyz945947732"><code>@​yyz945947732</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3478">faker-js/faker#3478</a></li>
<li>refactor(locale): normalize internet data by <a
href="https://github.com/xDivisionByZerox"><code>@​xDivisionByZerox</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3502">faker-js/faker#3502</a></li>
<li>chore(deps): update eslint by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3500">faker-js/faker#3500</a></li>
<li>chore(deps): update dependency eslint-plugin-unicorn to v59 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3501">faker-js/faker#3501</a></li>
<li>chore(deps): update vitest by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3497">faker-js/faker#3497</a></li>
<li>chore(deps): update cypress/browsers docker tag to v24 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3507">faker-js/faker#3507</a></li>
<li>chore(deps): update all non-major dependencies by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3498">faker-js/faker#3498</a></li>
<li>chore(deps): update
mcr.microsoft.com/devcontainers/typescript-node:22 docker digest to
fb211a0 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3508">faker-js/faker#3508</a></li>
<li>feat(locale): Add additional Japanese last names to the locale data
by <a
href="https://github.com/noritaka1166"><code>@​noritaka1166</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3484">faker-js/faker#3484</a></li>
<li>chore(deps): update eslint by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3511">faker-js/faker#3511</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3486">faker-js/faker#3486</a></li>
<li>fix(locale): ko modified street_name to street_name_part by <a
href="https://github.com/seoahan"><code>@​seoahan</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3485">faker-js/faker#3485</a></li>
<li>fix(locale): correct Japanese country names by <a
href="https://github.com/matsueushi"><code>@​matsueushi</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3510">faker-js/faker#3510</a></li>
<li>chore(release): 9.8.0 by <a
href="https://github.com/fakerjs-bot"><code>@​fakerjs-bot</code></a> in
<a
href="https://redirect.github.com/faker-js/faker/pull/3515">faker-js/faker#3515</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/alixlahuec"><code>@​alixlahuec</code></a> made
their first contribution in <a
href="https://redirect.github.com/faker-js/faker/pull/3476">faker-js/faker#3476</a></li>
<li><a
href="https://github.com/yyz945947732"><code>@​yyz945947732</code></a>
made their first contribution in <a
href="https://redirect.github.com/faker-js/faker/pull/3479">faker-js/faker#3479</a></li>
<li><a href="https://github.com/seoahan"><code>@​seoahan</code></a> made
their first contribution in <a
href="https://redirect.github.com/faker-js/faker/pull/3487">faker-js/faker#3487</a></li>
<li><a
href="https://github.com/noritaka1166"><code>@​noritaka1166</code></a>
made their first contribution in <a
href="https://redirect.github.com/faker-js/faker/pull/3490">faker-js/faker#3490</a></li>
<li><a
href="https://github.com/glmchalita"><code>@​glmchalita</code></a> made
their first contribution in <a
href="https://redirect.github.com/faker-js/faker/pull/3493">faker-js/faker#3493</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/faker-js/faker/compare/v9.7.0...v9.8.0">https://github.com/faker-js/faker/compare/v9.7.0...v9.8.0</a></p>
<h2>v9.7.0</h2>
<h2>What's Changed</h2>
<ul>
<li>feat(locale): Add bn_BD locale by <a
href="https://github.com/AbrarShahriar"><code>@​AbrarShahriar</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3439">faker-js/faker#3439</a></li>
<li>fix(airline): Air France and KLM Royal Dutch Airlines by <a
href="https://github.com/chimurai"><code>@​chimurai</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3440">faker-js/faker#3440</a></li>
<li>infra(comment-issue): fix display of thumbs up emoji by <a
href="https://github.com/xDivisionByZerox"><code>@​xDivisionByZerox</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3444">faker-js/faker#3444</a></li>
<li>feat(locale): add localize sex support for zh_CN &amp; zh_TW by <a
href="https://github.com/sd44"><code>@​sd44</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3450">faker-js/faker#3450</a></li>
<li>fix(iban): more strict pattern for IE and PS by <a
href="https://github.com/xDivisionByZerox"><code>@​xDivisionByZerox</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3464">faker-js/faker#3464</a></li>
<li>chore(deps): update devdependencies by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3456">faker-js/faker#3456</a></li>
<li>chore(deps): update all non-major dependencies by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3457">faker-js/faker#3457</a></li>
<li>chore(deps): update dependency prettier to v3.5.3 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3455">faker-js/faker#3455</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/faker-js/faker/blob/next/CHANGELOG.md"><code>@​faker-js/faker</code>'s
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/faker-js/faker/compare/v9.7.0...v9.8.0">9.8.0</a>
(2025-05-13)</h2>
<h3>New Locales</h3>
<ul>
<li><strong>locale:</strong> Add additional Japanese last names to the
locale data (<a
href="https://redirect.github.com/faker-js/faker/issues/3484">#3484</a>)
(<a
href="72e66c3a3a">72e66c3</a>)</li>
<li><strong>locale:</strong> add Japanese date and month definitions (<a
href="https://redirect.github.com/faker-js/faker/issues/3492">#3492</a>)
(<a
href="b70e7934b7">b70e793</a>)</li>
<li><strong>locale:</strong> add Japanese science locale data including
elements and units (<a
href="https://redirect.github.com/faker-js/faker/issues/3491">#3491</a>)
(<a
href="54fd5519e9">54fd551</a>)</li>
<li><strong>locale:</strong> add Japanese sex definitions for person
locale (<a
href="https://redirect.github.com/faker-js/faker/issues/3495">#3495</a>)
(<a
href="1dbd8fa511">1dbd8fa</a>)</li>
<li><strong>locale:</strong> add vehicle locale data for Japanese (<a
href="https://redirect.github.com/faker-js/faker/issues/3490">#3490</a>)
(<a
href="dfadb1da74">dfadb1d</a>)</li>
<li><strong>locale:</strong> add zh_CN book (<a
href="https://redirect.github.com/faker-js/faker/issues/3477">#3477</a>)
(<a
href="786a3d0bd8">786a3d0</a>)</li>
<li><strong>locale:</strong> add zh_CN food (<a
href="https://redirect.github.com/faker-js/faker/issues/3479">#3479</a>)
(<a
href="6c883e74b8">6c883e7</a>)</li>
<li><strong>locale:</strong> update Japanese company categories (<a
href="https://redirect.github.com/faker-js/faker/issues/3489">#3489</a>)
(<a
href="8c0953a261">8c0953a</a>)</li>
<li><strong>locale:</strong> update zh_CN animal (<a
href="https://redirect.github.com/faker-js/faker/issues/3480">#3480</a>)
(<a
href="38ee7b81a8">38ee7b8</a>)</li>
<li><strong>locale:</strong> update zh_CN location (<a
href="https://redirect.github.com/faker-js/faker/issues/3481">#3481</a>)
(<a
href="456f10276b">456f102</a>)</li>
<li><strong>locale:</strong> update zh_CN word (<a
href="https://redirect.github.com/faker-js/faker/issues/3478">#3478</a>)
(<a
href="aa98867765">aa98867</a>)</li>
</ul>
<h3>Changed Locales</h3>
<ul>
<li><strong>locale:</strong> ko state data update (<a
href="https://redirect.github.com/faker-js/faker/issues/3487">#3487</a>)
(<a
href="b611ec2e51">b611ec2</a>)</li>
<li><strong>locale:</strong> normalize internet data (<a
href="https://redirect.github.com/faker-js/faker/issues/3502">#3502</a>)
(<a
href="e6151e4efd">e6151e4</a>)</li>
<li><strong>locale:</strong> rename pt-BR streetSuffix to streetPrefix
(<a
href="https://redirect.github.com/faker-js/faker/issues/3493">#3493</a>)
(<a
href="7c23db316e">7c23db3</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>locale:</strong> correct Japanese country names (<a
href="https://redirect.github.com/faker-js/faker/issues/3510">#3510</a>)
(<a
href="046bb81558">046bb81</a>)</li>
<li><strong>locale:</strong> correct the name of element Lv in Japanese
(<a
href="https://redirect.github.com/faker-js/faker/issues/3509">#3509</a>)
(<a
href="6a7ef4c2ad">6a7ef4c</a>)</li>
<li><strong>locale:</strong> ko modified street_name to street_name_part
(<a
href="https://redirect.github.com/faker-js/faker/issues/3485">#3485</a>)
(<a
href="c15da8efec">c15da8e</a>)</li>
</ul>
<h2><a
href="https://github.com/faker-js/faker/compare/v9.6.0...v9.7.0">9.7.0</a>
(2025-04-13)</h2>
<h3>New Locales</h3>
<ul>
<li><strong>locale:</strong> Add bn_BD locale (<a
href="https://redirect.github.com/faker-js/faker/issues/3439">#3439</a>)
(<a
href="fef0ad7859">fef0ad7</a>)</li>
<li><strong>locale:</strong> add cy locale, start with date (<a
href="https://redirect.github.com/faker-js/faker/issues/3462">#3462</a>)
(<a
href="f70a6f7a65">f70a6f7</a>)</li>
<li><strong>locale:</strong> add finance support for ja locale (<a
href="https://redirect.github.com/faker-js/faker/issues/3449">#3449</a>)
(<a
href="b2c5298c94">b2c5298</a>)</li>
<li><strong>locale:</strong> add localize sex support for zh_CN &amp;
zh_TW (<a
href="https://redirect.github.com/faker-js/faker/issues/3450">#3450</a>)
(<a
href="048c32581b">048c325</a>)</li>
<li><strong>locale:</strong> add Tamil language support (<a
href="https://redirect.github.com/faker-js/faker/issues/3468">#3468</a>)
(<a
href="cdf6dc4a97">cdf6dc4</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>airline:</strong> Air France and KLM Royal Dutch Airlines
(<a
href="https://redirect.github.com/faker-js/faker/issues/3440">#3440</a>)
(<a
href="8a2d168f62">8a2d168</a>)</li>
<li><strong>iban:</strong> more strict pattern for IE and PS (<a
href="https://redirect.github.com/faker-js/faker/issues/3464">#3464</a>)
(<a
href="7b12056713">7b12056</a>)</li>
<li><strong>locale:</strong> rename ja and zh_CN company affix files (<a
href="https://redirect.github.com/faker-js/faker/issues/3448">#3448</a>)
(<a
href="1e551c5f47">1e551c5</a>)</li>
<li><strong>number:</strong> don't ignore multipleOf in float when
min=max (<a
href="https://redirect.github.com/faker-js/faker/issues/3417">#3417</a>)
(<a
href="e4cc4e50d1">e4cc4e5</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="549d71cf33"><code>549d71c</code></a>
chore(release): 9.8.0 (<a
href="https://redirect.github.com/faker-js/faker/issues/3515">#3515</a>)</li>
<li><a
href="046bb81558"><code>046bb81</code></a>
fix(locale): correct Japanese country names (<a
href="https://redirect.github.com/faker-js/faker/issues/3510">#3510</a>)</li>
<li><a
href="c15da8efec"><code>c15da8e</code></a>
fix(locale): ko modified street_name to street_name_part (<a
href="https://redirect.github.com/faker-js/faker/issues/3485">#3485</a>)</li>
<li><a
href="d6ba4cc1b1"><code>d6ba4cc</code></a>
chore(deps): lock file maintenance (<a
href="https://redirect.github.com/faker-js/faker/issues/3486">#3486</a>)</li>
<li><a
href="2faf57b01c"><code>2faf57b</code></a>
chore(deps): update eslint (<a
href="https://redirect.github.com/faker-js/faker/issues/3511">#3511</a>)</li>
<li><a
href="6a7ef4c2ad"><code>6a7ef4c</code></a>
fix(locale): correct the name of element Lv in Japanese (<a
href="https://redirect.github.com/faker-js/faker/issues/3509">#3509</a>)</li>
<li><a
href="72e66c3a3a"><code>72e66c3</code></a>
feat(locale): Add additional Japanese last names to the locale data (<a
href="https://redirect.github.com/faker-js/faker/issues/3484">#3484</a>)</li>
<li><a
href="c0d5217e10"><code>c0d5217</code></a>
chore(deps): update mcr.microsoft.com/devcontainers/typescript-node:22
docker...</li>
<li><a
href="049875d283"><code>049875d</code></a>
chore(deps): update all non-major dependencies (<a
href="https://redirect.github.com/faker-js/faker/issues/3498">#3498</a>)</li>
<li><a
href="07087ff6ef"><code>07087ff</code></a>
chore(deps): update cypress/browsers docker tag to v24 (<a
href="https://redirect.github.com/faker-js/faker/issues/3507">#3507</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/faker-js/faker/compare/v9.6.0...v9.8.0">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~faker-bot">faker-bot</a>, a new releaser
for <code>@​faker-js/faker</code> since your current version.</p>
</details>
<br />

Updates `@next/third-parties` from 15.2.1 to 15.3.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/vercel/next.js/releases"><code>@​next/third-parties</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v15.3.2</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting bug fixes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>backport: fix(turbopack): Store persistence of wrapped task on
RawVc::LocalOutput (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78488">#78488</a>)
(<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78883">#78883</a>)</li>
<li><code>@​next/mdx</code>: Use stable turbopack config options (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78880">#78880</a>)</li>
<li>Fix react-compiler: Fix detection of interest (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78879">#78879</a>)</li>
<li>Fix turbopack: Backport sourcemap bugfix (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78881">#78881</a>)</li>
<li>[next-server] preserve rsc query for rsc redirects (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78876">#78876</a>)</li>
<li>Update middleware public/static matching (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78875">#78875</a>)</li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/ijjk"><code>@​ijjk</code></a>, <a
href="https://github.com/huozhi"><code>@​huozhi</code></a>, <a
href="https://github.com/kdy1"><code>@​kdy1</code></a>, <a
href="https://github.com/wbinnssmith"><code>@​wbinnssmith</code></a>,
and <a href="https://github.com/bgw"><code>@​bgw</code></a> for
helping!</p>
<h2>v15.3.1</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting bug fixes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>chore: Backport SWC-based RC optimization (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78260">#78260</a>)</li>
<li>fix: bump image-size@1.2.1 (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78164">#78164</a>)</li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/kdy1"><code>@​kdy1</code></a> and <a
href="https://github.com/styfle"><code>@​styfle</code></a> for
helping!</p>
<h2>v15.3.1-canary.15</h2>
<h3>Core Changes</h3>
<ul>
<li>[Turbopack] refactor persistent caching from log based to cow
approach: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76234">#76234</a></li>
</ul>
<h3>Misc Changes</h3>
<ul>
<li>fix(turbo-tasks-fs): Handle filesystem watcher rescan events: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78045">#78045</a></li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/bgw"><code>@​bgw</code></a> and <a
href="https://github.com/sokra"><code>@​sokra</code></a> for
helping!</p>
<h2>v15.3.1-canary.14</h2>
<h3>Core Changes</h3>
<ul>
<li>Add graceful error boundary for bots requests: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78298">#78298</a></li>
<li>make sure eslint-plugin-next is built when running 'pnpm dev': <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78305">#78305</a></li>
<li>Migrate pages API routes to handler interface: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/78166">#78166</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="d9ec4a4b57"><code>d9ec4a4</code></a>
v15.3.2</li>
<li><a
href="fa536cf2c9"><code>fa536cf</code></a>
v15.3.1</li>
<li><a
href="b2ff04995b"><code>b2ff049</code></a>
v15.3.0</li>
<li><a
href="60bfe64295"><code>60bfe64</code></a>
v15.3.0-canary.46</li>
<li><a
href="f71c4a1582"><code>f71c4a1</code></a>
v15.3.0-canary.45</li>
<li><a
href="4451bae75d"><code>4451bae</code></a>
v15.3.0-canary.44</li>
<li><a
href="87d7d8eb7a"><code>87d7d8e</code></a>
v15.3.0-canary.43</li>
<li><a
href="82ab39f801"><code>82ab39f</code></a>
v15.3.0-canary.42</li>
<li><a
href="8f1409d6ce"><code>8f1409d</code></a>
v15.3.0-canary.41</li>
<li><a
href="2139369821"><code>2139369</code></a>
v15.3.0-canary.40</li>
<li>Additional commits viewable in <a
href="https://github.com/vercel/next.js/commits/v15.3.2/packages/third-parties">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-alert-dialog` from 1.1.6 to 1.1.13
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-avatar` from 1.1.3 to 1.1.9
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-checkbox` from 1.1.4 to 1.3.1
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-collapsible` from 1.1.3 to 1.1.10
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-context-menu` from 2.2.6 to 2.2.14
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-dialog` from 1.1.6 to 1.1.13
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-dropdown-menu` from 2.1.6 to 2.1.14
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-label` from 2.1.2 to 2.1.6
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-popover` from 1.1.6 to 1.1.13
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-radio-group` from 1.2.3 to 1.3.6
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-scroll-area` from 1.2.3 to 1.2.8
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-select` from 2.1.6 to 2.2.4
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-separator` from 1.1.2 to 1.1.6
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-slot` from 1.1.2 to 1.2.2
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-switch` from 1.1.3 to 1.2.4
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-tabs` from 1.1.4 to 1.1.11
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-toast` from 1.2.6 to 1.2.13
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-tooltip` from 1.1.8 to 1.2.6
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@sentry/nextjs` from 9.10.1 to 9.19.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-javascript/releases"><code>@​sentry/nextjs</code>'s
releases</a>.</em></p>
<blockquote>
<h2>9.19.0</h2>
<ul>
<li>feat(react-router): Add otel instrumentation for server requests (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16147">#16147</a>)</li>
<li>feat(remix): Vendor in
<code>opentelemetry-instrumentation-remix</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16145">#16145</a>)</li>
<li>fix(browser): Ensure spans auto-ended for navigations have
<code>cancelled</code> reason (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16277">#16277</a>)</li>
<li>fix(node): Pin <code>@fastify/otel</code> fork to direct url to
allow installing without git (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16287">#16287</a>)</li>
<li>fix(react): Handle nested parameterized routes in reactrouterv3
transaction normalization (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16274">#16274</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/sidx1024"><code>@​sidx1024</code></a>. Thank
you for your contribution!</p>
<h2>Bundle size 📦</h2>
<table>
<thead>
<tr>
<th>Path</th>
<th>Size</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>@​sentry/browser</code></td>
<td>23.4 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> - with treeshaking flags</td>
<td>23.06 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing)</td>
<td>37.31 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay)</td>
<td>74.53 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay) - with
treeshaking flags</td>
<td>67.72 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay with
Canvas)</td>
<td>79.19 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay, Feedback)</td>
<td>91 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Feedback)</td>
<td>39.8 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. sendFeedback)</td>
<td>28.03 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. FeedbackAsync)</td>
<td>32.79 KB</td>
</tr>
<tr>
<td><code>@​sentry/react</code></td>
<td>25.17 KB</td>
</tr>
<tr>
<td><code>@​sentry/react</code> (incl. Tracing)</td>
<td>39.27 KB</td>
</tr>
<tr>
<td><code>@​sentry/vue</code></td>
<td>27.67 KB</td>
</tr>
<tr>
<td><code>@​sentry/vue</code> (incl. Tracing)</td>
<td>39.07 KB</td>
</tr>
<tr>
<td><code>@​sentry/svelte</code></td>
<td>23.43 KB</td>
</tr>
<tr>
<td>CDN Bundle</td>
<td>24.58 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing)</td>
<td>37.33 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay)</td>
<td>72.37 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay, Feedback)</td>
<td>77.68 KB</td>
</tr>
<tr>
<td>CDN Bundle - uncompressed</td>
<td>71.72 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing) - uncompressed</td>
<td>110.5 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay) - uncompressed</td>
<td>221.78 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay, Feedback) - uncompressed</td>
<td>234.31 KB</td>
</tr>
<tr>
<td><code>@​sentry/nextjs</code> (client)</td>
<td>40.88 KB</td>
</tr>
<tr>
<td><code>@​sentry/sveltekit</code> (client)</td>
<td>37.77 KB</td>
</tr>
<tr>
<td><code>@​sentry/node</code></td>
<td>154.29 KB</td>
</tr>
<tr>
<td><code>@​sentry/node</code> - without tracing</td>
<td>95.6 KB</td>
</tr>
<tr>
<td><code>@​sentry/aws-serverless</code></td>
<td>120.36 KB</td>
</tr>
</tbody>
</table>
<h2>9.18.0</h2>
<h3>Important changes</h3>
<ul>
<li><strong>feat: Support Node 24 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16236">#16236</a>)</strong></li>
</ul>
<p>We now also publish profiling binaries for Node 24.</p>
<h3>Other changes</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-javascript/blob/develop/CHANGELOG.md"><code>@​sentry/nextjs</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>9.19.0</h2>
<ul>
<li>feat(react-router): Add otel instrumentation for server requests (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16147">#16147</a>)</li>
<li>feat(remix): Vendor in
<code>opentelemetry-instrumentation-remix</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16145">#16145</a>)</li>
<li>fix(browser): Ensure spans auto-ended for navigations have
<code>cancelled</code> reason (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16277">#16277</a>)</li>
<li>fix(node): Pin <code>@fastify/otel</code> fork to direct url to
allow installing without git (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16287">#16287</a>)</li>
<li>fix(react): Handle nested parameterized routes in reactrouterv3
transaction normalization (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16274">#16274</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/sidx1024"><code>@​sidx1024</code></a>. Thank
you for your contribution!</p>
<h2>9.18.0</h2>
<h3>Important changes</h3>
<ul>
<li><strong>feat: Support Node 24 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16236">#16236</a>)</strong></li>
</ul>
<p>We now also publish profiling binaries for Node 24.</p>
<h3>Other changes</h3>
<ul>
<li>deps(node): Bump <code>import-in-the-middle</code> to
<code>1.13.1</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16260">#16260</a>)</li>
<li>feat: Export <code>consoleLoggingIntegration</code> from vercel edge
sdk (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16228">#16228</a>)</li>
<li>feat(cloudflare): Add support for email, queue, and tail handler (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16233">#16233</a>)</li>
<li>feat(cloudflare): Improve http span data (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16232">#16232</a>)</li>
<li>feat(nextjs): Add more attributes for generation functions (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16214">#16214</a>)</li>
<li>feat(opentelemetry): Widen peer dependencies to support Otel v2 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16246">#16246</a>)</li>
<li>fix(core): Gracefully handle invalid baggage entries (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16257">#16257</a>)</li>
<li>fix(node): Ensure traces are propagated without spans in Node 22+
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16221">#16221</a>)</li>
<li>fix(node): Use sentry forked <code>@fastify/otel</code> dependency
with pinned Otel v1 deps (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16256">#16256</a>)</li>
<li>fix(remix): Remove vendored types (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16218">#16218</a>)</li>
</ul>
<h2>9.17.0</h2>
<ul>
<li>feat(node): Migrate to <code>@fastify/otel</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15542">#15542</a>)</li>
</ul>
<h2>9.16.1</h2>
<ul>
<li>fix(core): Make sure logs get flushed in server-runtime-client (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16222">#16222</a>)</li>
<li>ref(node): Remove vercel flushing code that does nothing (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16217">#16217</a>)</li>
</ul>
<h2>9.16.0</h2>
<h3>Important changes</h3>
<ul>
<li><strong>feat: Create a Vite plugin that injects sentryConfig into
the global config (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16197">#16197</a>)</strong></li>
</ul>
<p>Add a new plugin <code>makeConfigInjectorPlugin</code> within our
existing vite plugin that updates the global vite config with sentry
options</p>
<ul>
<li><strong>feat(browser): Add option to sample linked traces
consistently (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/16037">#16037</a>)</strong></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="d69ea01d02"><code>d69ea01</code></a>
release: 9.19.0</li>
<li><a
href="9acaae28c1"><code>9acaae2</code></a>
Merge pull request <a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/16288">#16288</a>
from getsentry/prepare-release/9.19.0</li>
<li><a
href="ed276633ce"><code>ed27663</code></a>
meta(changelog): Update changelog for 9.19.0</li>
<li><a
href="4b039f8a6e"><code>4b039f8</code></a>
fix(node): Pin <code>@fastify/otel</code> fork to direct url to allow
installing without...</li>
<li><a
href="26731edc99"><code>26731ed</code></a>
feat(react-router): Add otel instrumentation for server requests (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/16147">#16147</a>)</li>
<li><a
href="efb86de64b"><code>efb86de</code></a>
chore: Add external contributor to CHANGELOG.md (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/16282">#16282</a>)</li>
<li><a
href="e83658f23a"><code>e83658f</code></a>
fix(browser): Ensure spans auto-ended for navigations have
<code>cancelled</code> reason...</li>
<li><a
href="b9984338be"><code>b998433</code></a>
feat(remix): Vendor in <code>opentelemetry-instrumentation-remix</code>
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/16145">#16145</a>)</li>
<li><a
href="f55018fbf6"><code>f55018f</code></a>
fix(react): Handle nested parameterized routes in reactrouterv3
transaction n...</li>
<li><a
href="6d2b971980"><code>6d2b971</code></a>
Merge pull request <a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/16276">#16276</a>
from getsentry/master</li>
<li>Additional commits viewable in <a
href="https://github.com/getsentry/sentry-javascript/compare/9.10.1...9.19.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `@supabase/ssr` from 0.5.2 to 0.6.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/ssr/releases"><code>@​supabase/ssr</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v0.6.1</h2>
<h2><a
href="https://github.com/supabase/ssr/compare/v0.6.0...v0.6.1">0.6.1</a>
(2025-03-16)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>force release (<a
href="https://redirect.github.com/supabase/ssr/issues/98">#98</a>) (<a
href="66710e82aa">66710e8</a>)</li>
<li><strong>revert:</strong> &quot;feat: improve cookie chunk handling
via base64url+length encoding (<a
href="https://redirect.github.com/supabase/ssr/issues/90">#90</a>)&quot;
(<a href="https://redirect.github.com/supabase/ssr/issues/100">#100</a>)
(<a
href="2ea8e23525">2ea8e23</a>)</li>
</ul>
<h2>v0.6.1-rc.3</h2>
<p>This is a release candidate. See release-please PR <a
href="https://redirect.github.com/supabase/ssr/issues/99">#99</a> for
context.</p>
<h2>v0.6.1-rc.2</h2>
<p>This is a release candidate. See release-please PR <a
href="https://redirect.github.com/supabase/ssr/issues/99">#99</a> for
context.</p>
<h2>v0.6.0</h2>
<h2><a
href="https://github.com/supabase/ssr/compare/v0.5.2...v0.6.0">0.6.0</a>
(2025-02-27)</h2>
<h3>Features</h3>
<ul>
<li>improve cookie chunk handling via base64url+length encoding (<a
href="https://redirect.github.com/supabase/ssr/issues/90">#90</a>) (<a
href="6deb6871ca">6deb687</a>)</li>
<li>upgrade cookie dependency and cleanup imports (<a
href="https://redirect.github.com/supabase/ssr/issues/77">#77</a>) (<a
href="95245282e6">9524528</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>add <code>create*Client</code> string in <code>x-client-info</code>
(<a href="https://redirect.github.com/supabase/ssr/issues/85">#85</a>)
(<a
href="f271accfea">f271acc</a>)</li>
</ul>
<h2>v0.6.0-rc.5</h2>
<p>This is a release candidate. See release-please PR <a
href="https://redirect.github.com/supabase/ssr/issues/80">#80</a> for
context.</p>
<h2>v0.6.0-rc.3</h2>
<p>This is a release candidate. See release-please PR <a
href="https://redirect.github.com/supabase/ssr/issues/80">#80</a> for
context.</p>
<h2>v0.6.0-rc.2</h2>
<p>This is a release candidate. See release-please PR <a
href="https://redirect.github.com/supabase/ssr/issues/80">#80</a> for
context.</p>
<h2>v0.6.0-rc.1</h2>
<p>This is a release candidate. See release-please PR <a
href="https://redirect.github.com/supabase/ssr/issues/80">#80</a> for
context.</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/ssr/blob/main/CHANGELOG.md"><code>@​supabase/ssr</code>'s
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/supabase/ssr/compare/v0.6.0...v0.6.1">0.6.1</a>
(2025-03-16)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>force release (<a
href="https://redirect.github.com/supabase/ssr/issues/98">#98</a>) (<a
href="66710e82aa">66710e8</a>)</li>
<li><strong>revert:</strong> &quot;feat: improve cookie chunk handling
via base64url+length encoding (<a
href="https://redirect.github.com/supabase/ssr/issues/90">#90</a>)&quot;
(<a href="https://redirect.github.com/supabase/ssr/issues/100">#100</a>)
(<a
href="2ea8e23525">2ea8e23</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="a1b60ba826"><code>a1b60ba</code></a>
chore(main): release 0.6.1 (<a
href="https://redirect.github.com/supabase/ssr/issues/99">#99</a>)</li>
<li><a
href="2ea8e23525"><code>2ea8e23</code></a>
fix(revert): &quot;feat: improve cookie chunk handling via
base64url+length encodi...</li>
<li><a
href="66710e82aa"><code>66710e8</code></a>
fix: force release (<a
href="https://redirect.github.com/supabase/ssr/issues/98">#98</a>)</li>
<li><a
href="37909b21f1"><code>37909b2</code></a>
Revert &quot;chore(main): release 0.6.0&quot; (<a
href="https://redirect.github.com/supabase/ssr/issues/95">#95</a>)</li>
<li><a
href="be9cd6c70e"><code>be9cd6c</code></a>
chore(main): release 0.6.0 (<a
href="https://redirect.github.com/supabase/ssr/issues/80">#80</a>)</li>
<li><a
href="6deb6871ca"><code>6deb687</code></a>
feat: improve cookie chunk handling via base64url+length encoding (<a
href="https://redirect.github.com/supabase/ssr/issues/90">#90</a>)</li>
<li><a
href="ef429dfbd1"><code>ef429df</code></a>
build(deps): bump vite from 5.4.10 to 5.4.14 (<a
href="https://redirect.github.com/supabase/ssr/issues/88">#88</a>)</li>
<li><a
href="ca469593e3"><code>ca46959</code></a>
ci: publish with provenance (<a
href="https://redirect.github.com/supabase/ssr/issues/86">#86</a>)</li>
<li><a
href="f271accfea"><code>f271acc</code></a>
fix: add <code>create*Client</code> string in <code>x-client-info</code>
(<a
href="https://redirect.github.com/supabase/ssr/issues/85">#85</a>)</li>
<li><a
href="95245282e6"><code>9524528</code></a>
feat: upgrade cookie dependency and cleanup imports (<a
href="https://redirect.github.com/supabase/ssr/issues/77">#77</a>)</li>
<li>See full diff in <a
href="https://github.com/supabase/ssr/compare/v0.5.2...v0.6.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `@supabase/supabase-js` from 2.49.1 to 2.49.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-js/releases"><code>@​supabase/supabase-js</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v2.49.4</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.49.3...v2.49.4">2.49.4</a>
(2025-03-29)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>deps:</strong> upgrade postgrest-js to 1.19.4 (<a
href="692e8e846b">692e8e8</a>)</li>
</ul>
<h2>v2.49.3</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.49.2...v2.49.3">2.49.3</a>
(2025-03-24)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>bump auth-js to 2.69.1 (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1383">#1383</a>)
(<a
href="f08bfd9c96">f08bfd9</a>)</li>
</ul>
<h2>v2.49.2</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.49.1...v2.49.2">2.49.2</a>
(2025-03-24)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>bump auth-js to v2.69.0 (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1380">#1380</a>)
(<a
href="73ab30dd61">73ab30d</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="692e8e846b"><code>692e8e8</code></a>
fix(deps): upgrade postgrest-js to 1.19.4</li>
<li><a
href="f08bfd9c96"><code>f08bfd9</code></a>
fix: bump auth-js to 2.69.1 (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1383">#1383</a>)</li>
<li><a
href="73ab30dd61"><code>73ab30d</code></a>
fix: bump auth-js to v2.69.0 (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1380">#1380</a>)</li>
<li><a
href="a5219a81b3"><code>a5219a8</code></a>
chore: add open role to README.md (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1378">#1378</a>)</li>
<li>See full diff in <a
href="https://github.com/supabase/supabase-js/compare/v2.49.1...v2.49.4">compare
view</a></li>
</ul>
</details>
<br />

Updates `@tanstack/react-table` from 8.21.2 to 8.21.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/TanStack/table/releases"><code>@​tanstack/react-table</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.21.3</h2>
<p>Version 8.21.3 - 4/14/25, 8:19 PM</p>
<h2>Changes</h2>
<h3>Fix</h3>
<ul>
<li>table-core: use right Document instance on getResizeHandler
(column-sizing feature) (<a
href="https://github.com/TanStack/table/tree/HEAD/packages/react-table/issues/5989">#5989</a>)
(54ce673) by <a
href="https://github.com/riccardoperra"><code>@​riccardoperra</code></a></li>
</ul>
<h3>Docs</h3>
<ul>
<li>fix all 158 broken links (<a
href="https://github.com/TanStack/table/tree/HEAD/packages/react-table/issues/5972">#5972</a>)
(f7bf6f1) by <a
href="https://github.com/kisaragi-hiu"><code>@​kisaragi-hiu</code></a></li>
<li>add vue example for grouping (<a
href="https://github.com/TanStack/table/tree/HEAD/packages/react-table/issues/5941">#5941</a>)
(3efa59c) by Harshil Patel</li>
</ul>
<h2>Packages</h2>
<ul>
<li><code>@tanstack/table-core@8.21.3</code></li>
<li><code>@tanstack/angular-table@8.21.3</code></li>
<li><code>@tanstack/lit-table@8.21.3</code></li>
<li><code>@tanstack/qwik-table@8.21.3</code></li>
<li><code>@tanstack/react-table@8.21.3</code></li>
<li><code>@tanstack/solid-table@8.21.3</code></li>
<li><code>@tanstack/svelte-table@8.21.3</code></li>
<li><code>@tanstack/vue-table@8.21.3</code></li>
<li><code>@tanstack/react-table-devtools@8.21.3</code></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="f4dc742b7b"><code>f4dc742</code></a>
release: v8.21.3</li>
<li>See full diff in <a
href="https://github.com/TanStack/table/commits/v8.21.3/packages/react-table">compare
view</a></li>
</ul>
</details>
<br />

Updates `@xyflow/react` from 12.4.2 to 12.6.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/xyflow/xyflow/releases"><code>@​xyflow/react</code>'s
releases</a>.</em></p>
<blockquote>
<h2><code>@xyflow/react@12.6.4</code></h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5263">#5263</a> <a
href="e4a8d4b43f"><code>e4a8d4b4</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Multi select key works when input is focused</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5266">#5266</a> <a
href="77107453fa"><code>77107453</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! - Fix
connection snapping for handles larger than connectionRadius</p>
</li>
<li>
<p>Updated dependencies [<a
href="77107453fa"><code>77107453</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.61</code></li>
</ul>
</li>
</ul>
<h2><code>@xyflow/react@12.6.3</code></h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5259">#5259</a> <a
href="77bf79c40e"><code>77bf79c4</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! - Fix
background-color css variable fallback.</p>
</li>
<li>
<p>Updated dependencies [<a
href="77bf79c40e"><code>77bf79c4</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.60</code></li>
</ul>
</li>
</ul>
<h2><code>@xyflow/react@12.6.2</code></h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5257">#5257</a> <a
href="b1314be04d"><code>b1314be0</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Use OnReconnect from system</p>
</li>
<li>
<p>Updated dependencies [<a
href="a95f0e2fbf"><code>a95f0e2f</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.59</code></li>
</ul>
</li>
</ul>
<h2><code>@xyflow/react@12.6.1</code></h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5249">#5249</a> <a
href="895b5d81c8"><code>895b5d81</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Call <code>onNodesChange</code> for uncontrolled flows that use
<code>updateNode</code></p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5247">#5247</a> <a
href="67e1cb6891"><code>67e1cb68</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Cleanup TSDoc annotations for ReactFlow</p>
</li>
<li>
<p>Updated dependencies [<a
href="2a03213b06"><code>2a03213b</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.58</code></li>
</ul>
</li>
</ul>
<h2><code>@xyflow/react@12.6.0</code></h2>
<h3>Minor Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5219">#5219</a> <a
href="4236adbc46"><code>4236adbc</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Add initialMinZoom, initialMaxZoom and initialFitViewOptions to
ReactFlowProvider</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5227">#5227</a> <a
href="a7d10ffce5"><code>a7d10ffc</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Add <code>resizeDirection</code> prop for the
<code>NodeResizeControl</code> component</p>
</li>
</ul>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5217">#5217</a> <a
href="bce74e8811"><code>bce74e88</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Keep node selection on pane click if elementsSelectable=false</p>
</li>
<li>
<p>Updated dependencies [<a
href="a7d10ffce5"><code>a7d10ffc</code></a>,
<a
href="4e681f9c52"><code>4e681f9c</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.57</code></li>
</ul>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/xyflow/xyflow/blob/main/packages/react/CHANGELOG.md"><code>@​xyflow/react</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>12.6.4</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5263">#5263</a> <a
href="e4a8d4b43f"><code>e4a8d4b4</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Multi select key works when input is focused</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5266">#5266</a> <a
href="77107453fa"><code>77107453</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! - Fix
connection snapping for handles larger than connectionRadius</p>
</li>
<li>
<p>Updated dependencies [<a
href="77107453fa"><code>77107453</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.61</code></li>
</ul>
</li>
</ul>
<h2>12.6.3</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5259">#5259</a> <a
href="77bf79c40e"><code>77bf79c4</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! - Fix
background-color css variable fallback.</p>
</li>
<li>
<p>Updated dependencies [<a
href="77bf79c40e"><code>77bf79c4</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.60</code></li>
</ul>
</li>
</ul>
<h2>12.6.2</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5257">#5257</a> <a
href="b1314be04d"><code>b1314be0</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Use OnReconnect from system</p>
</li>
<li>
<p>Updated dependencies [<a
href="a95f0e2fbf"><code>a95f0e2f</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.59</code></li>
</ul>
</li>
</ul>
<h2>12.6.1</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5249">#5249</a> <a
href="895b5d81c8"><code>895b5d81</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Call <code>onNodesChange</code> for uncontrolled flows that use
<code>updateNode</code></p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5247">#5247</a> <a
href="67e1cb6891"><code>67e1cb68</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Cleanup TSDoc annotations for ReactFlow</p>
</li>
<li>
<p>Updated dependencies [<a
href="2a03213b06"><code>2a03213b</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.58</code></li>
</ul>
</li>
</ul>
<h2>12.6.0</h2>
<h3>Minor Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5219">#5219</a> <a
href="4236adbc46"><code>4236adbc</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Add initialMinZoom, initialMaxZoom and initialFitViewOptions to
ReactFlowProvider</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5227">#5227</a> <a
href="a7d10ffce5"><code>a7d10ffc</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Add <code>resizeDirection</code> prop for the
<code>NodeResizeControl</code> component</p>
</li>
</ul>
<h3>Patch Changes</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="0076d82222"><code>0076d82</code></a>
chore(packages): bump</li>
<li><a
href="8ce85fd8f0"><code>8ce85fd</code></a>
fix(react): multi select works when input is focused</li>
<li><a
href="cee50e1dd0"><code>cee50e1</code></a>
chore(packages): bump</li>
<li><a
href="673a3a8718"><code>673a3a8</code></a>
merge main</li>
<li><a
href="9893b6321b"><code>9893b63</code></a>
chore(packages): bump</li>
<li><a
href="69021e8bbe"><code>69021e8</code></a>
chore(react): use OnReconnect from system</li>
<li><a
href="b30a496bf4"><code>b30a496</code></a>
merge main</li>
<li><a
href="2df553e611"><code>2df553e</code></a>
chore(packages): bump</li>
<li><a href="https://g...

_Description has been truncated_

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-05-16 13:13:07 +00:00
Abhimanyu Yadav
0e53c540d4 fix(blocks): Disable Twitter and Todoist blocks if their OAuth is not configured (#9954)
I am disabling all the Twitter and Todoist blocks whose OAuth is not
configured.

> I have already checked it locally. When OAuth is not set, the blocks
> do not appear in the block menu
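
A minimal sketch of that gating pattern, assuming the check happens at block-registration time; the env-var names and block class below are hypothetical stand-ins, not the platform's actual identifiers:

```python
import os

# Hypothetical: treat a provider's OAuth as configured only when both
# client credentials are present in the environment.
def oauth_is_configured(prefix: str) -> bool:
    return bool(os.getenv(f"{prefix}_CLIENT_ID") and os.getenv(f"{prefix}_CLIENT_SECRET"))

class TwitterPostBlock:  # stand-in for a real Twitter block class
    # A disabled block is skipped during registration, so it never
    # shows up in the builder's block menu.
    disabled = not oauth_is_configured("TWITTER")
```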
2025-05-16 12:12:48 +00:00
dependabot[bot]
e48aec921e chore(backend/deps-dev): Bump 3 dev dependencies to latest minor versions (#9852)
Bumps [poethepoet](https://github.com/nat-n/poethepoet),
[pyright](https://github.com/RobertCraigie/pyright-python) and
[ruff](https://github.com/astral-sh/ruff) to their latest versions.

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-05-16 11:30:42 +00:00
dependabot[bot]
d754c2349c chore(libs/deps-dev): Update ruff from 0.11.2 to 0.11.10 (#9775)
Bumps the development-dependencies group in
/autogpt_platform/autogpt_libs with 1 update:
[ruff](https://github.com/astral-sh/ruff).

Updates `ruff` from 0.11.2 to 0.11.10

[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=pip&previous-version=0.11.2&new-version=0.11.4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

> **Note**
> Automatic rebases have been disabled on this pull request as it has been open for over 30 days.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-16 10:42:45 +00:00
dependabot[bot]
870f8265b3 chore(backend/deps): Bump 18 dependencies to latest minor versions (#9930)
Bumps the production-dependencies group with 18 updates in the
/autogpt_platform/backend directory:

| Package | From | To |
| --- | --- | --- |
| [anthropic](https://github.com/anthropics/anthropic-sdk-python) | `0.49.0` | `0.51.0` |
| [click](https://github.com/pallets/click) | `8.1.8` | `8.2.0` |
| [e2b-code-interpreter](https://github.com/e2b-dev/code-interpreter) | `1.1.1` | `1.5.0` |
| [google-api-python-client](https://github.com/googleapis/google-api-python-client) | `2.166.0` | `2.169.0` |
| [google-auth-oauthlib](https://github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib) | `1.2.1` | `1.2.2` |
| [groq](https://github.com/groq/groq-python) | `0.20.0` | `0.24.0` |
| [launchdarkly-server-sdk](https://github.com/launchdarkly/python-server-sdk) | `9.10.0` | `9.11.0` |
| [mem0ai](https://github.com/mem0ai/mem0) | `0.1.80` | `0.1.98` |
| [ollama](https://github.com/ollama/ollama-python) | `0.4.7` | `0.4.8` |
| [openai](https://github.com/openai/openai-python) | `1.70.0` | `1.78.1` |
| [poetry](https://github.com/python-poetry/poetry) | `2.1.2` | `2.1.3` |
| [pydantic](https://github.com/pydantic/pydantic) | `2.11.1` | `2.11.4` |
| [pydantic-settings](https://github.com/pydantic/pydantic-settings) | `2.8.1` | `2.9.1` |
| [replicate](https://github.com/replicate/replicate-python) | `1.0.4` | `1.0.6` |
| [sentry-sdk](https://github.com/getsentry/sentry-python) | `2.25.1` | `2.28.0` |
| [supabase](https://github.com/supabase/supabase-py) | `2.15.0` | `2.15.1` |
| [tenacity](https://github.com/jd/tenacity) | `9.0.0` | `9.1.2` |
| [uvicorn](https://github.com/encode/uvicorn) | `0.34.0` | `0.34.2` |


Updates `anthropic` from 0.49.0 to 0.51.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/anthropics/anthropic-sdk-python/releases">anthropic's
releases</a>.</em></p>
<blockquote>
<h2>v0.51.0</h2>
<h2>0.51.0 (2025-05-07)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.50.0...v0.51.0">v0.50.0...v0.51.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> adds web search capabilities to the Claude API (sketched below)
(<a
href="bec0cf93c2">bec0cf9</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>pydantic v1:</strong> more robust ModelField.annotation
check (<a
href="c50f406767">c50f406</a>)</li>
<li><strong>sockets:</strong> handle non-portable socket flags (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/935">#935</a>)
(<a
href="205c8dda37">205c8dd</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li>broadly detect json family of content-type headers (<a
href="66bbb3a668">66bbb3a</a>)</li>
<li><strong>ci:</strong> only use depot for staging repos (<a
href="c867a11af3">c867a11</a>)</li>
<li><strong>ci:</strong> run on more branches and use depot runners (<a
href="95f5f17be0">95f5f17</a>)</li>
<li><strong>internal:</strong> add back missing custom modifications for
Web Search (<a
href="f43ba69d53">f43ba69</a>)</li>
<li><strong>internal:</strong> minor formatting changes (<a
href="8afef086af">8afef08</a>)</li>
<li>use lazy imports for resources (<a
href="704be817f4">704be81</a>)</li>
</ul>
<h2>v0.50.0</h2>
<h2>0.50.0 (2025-04-22)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.49.0...v0.50.0">v0.49.0...v0.50.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> extract ContentBlockDelta events into their
own schemas (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/920">#920</a>)
(<a
href="ae773d673a">ae773d6</a>)</li>
<li><strong>api:</strong> manual updates (<a
href="46ac1f8d1c">46ac1f8</a>)</li>
<li><strong>api:</strong> manual updates (<a
href="48d9739ad7">48d9739</a>)</li>
<li><strong>api:</strong> manual updates (<a
href="66e8cc3fb2">66e8cc3</a>)</li>
<li><strong>api:</strong> manual updates (<a
href="a74746e0df">a74746e</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>ci:</strong> ensure pip is always available (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/907">#907</a>)
(<a
href="36326871c1">3632687</a>)</li>
<li><strong>ci:</strong> remove publishing patch (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/908">#908</a>)
(<a
href="cae032381b">cae0323</a>)</li>
<li><strong>client:</strong> deduplicate stop reason type (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/913">#913</a>)
(<a
href="3ab0194550">3ab0194</a>)</li>
<li><strong>client:</strong> send all configured auth headers (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/929">#929</a>)
(<a
href="9d2581e79f">9d2581e</a>)</li>
<li><strong>perf:</strong> optimize some hot paths (<a
href="cff76cb00b">cff76cb</a>)</li>
<li><strong>perf:</strong> skip traversing types for NotGiven values (<a
href="dadac7fa72">dadac7f</a>)</li>
<li><strong>project:</strong> bump httpx minimum version to 0.25.0 (<a
href="b554138c2f">b554138</a>),
closes <a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/902">#902</a></li>
<li><strong>types:</strong> handle more discriminated union shapes (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/906">#906</a>)
(<a
href="2fc179a4d2">2fc179a</a>)</li>
<li><strong>vertex:</strong> explicitly include requests extra (<a
href="2b1221b76b">2b1221b</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/anthropics/anthropic-sdk-python/blob/main/CHANGELOG.md">anthropic's
changelog</a>.</em></p>
<blockquote>
<h2>0.51.0 (2025-05-07)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.50.0...v0.51.0">v0.50.0...v0.51.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> adds web search capabilities to the Claude API
(<a
href="bec0cf93c2">bec0cf9</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>pydantic v1:</strong> more robust ModelField.annotation
check (<a
href="c50f406767">c50f406</a>)</li>
<li><strong>sockets:</strong> handle non-portable socket flags (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/935">#935</a>)
(<a
href="205c8dda37">205c8dd</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li>broadly detect json family of content-type headers (<a
href="66bbb3a668">66bbb3a</a>)</li>
<li><strong>ci:</strong> only use depot for staging repos (<a
href="c867a11af3">c867a11</a>)</li>
<li><strong>ci:</strong> run on more branches and use depot runners (<a
href="95f5f17be0">95f5f17</a>)</li>
<li><strong>internal:</strong> add back missing custom modifications for
Web Search (<a
href="f43ba69d53">f43ba69</a>)</li>
<li><strong>internal:</strong> minor formatting changes (<a
href="8afef086af">8afef08</a>)</li>
<li>use lazy imports for resources (<a
href="704be817f4">704be81</a>)</li>
</ul>
<h2>0.50.0 (2025-04-22)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.49.0...v0.50.0">v0.49.0...v0.50.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> extract ContentBlockDelta events into their
own schemas (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/920">#920</a>)
(<a
href="ae773d673a">ae773d6</a>)</li>
<li><strong>api:</strong> manual updates (<a
href="46ac1f8d1c">46ac1f8</a>)</li>
<li><strong>api:</strong> manual updates (<a
href="48d9739ad7">48d9739</a>)</li>
<li><strong>api:</strong> manual updates (<a
href="66e8cc3fb2">66e8cc3</a>)</li>
<li><strong>api:</strong> manual updates (<a
href="a74746e0df">a74746e</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>ci:</strong> ensure pip is always available (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/907">#907</a>)
(<a
href="36326871c1">3632687</a>)</li>
<li><strong>ci:</strong> remove publishing patch (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/908">#908</a>)
(<a
href="cae032381b">cae0323</a>)</li>
<li><strong>client:</strong> deduplicate stop reason type (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/913">#913</a>)
(<a
href="3ab0194550">3ab0194</a>)</li>
<li><strong>client:</strong> send all configured auth headers (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/929">#929</a>)
(<a
href="9d2581e79f">9d2581e</a>)</li>
<li><strong>perf:</strong> optimize some hot paths (<a
href="cff76cb00b">cff76cb</a>)</li>
<li><strong>perf:</strong> skip traversing types for NotGiven values (<a
href="dadac7fa72">dadac7f</a>)</li>
<li><strong>project:</strong> bump httpx minimum version to 0.25.0 (<a
href="b554138c2f">b554138</a>),
closes <a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/902">#902</a></li>
<li><strong>types:</strong> handle more discriminated union shapes (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/906">#906</a>)
(<a
href="2fc179a4d2">2fc179a</a>)</li>
<li><strong>vertex:</strong> explicitly include requests extra (<a
href="2b1221b76b">2b1221b</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="e42451ab3f"><code>e42451a</code></a>
release: 0.51.0</li>
<li><a
href="4c7f97f3ea"><code>4c7f97f</code></a>
chore(internal): add back missing custom modifications for Web
Search</li>
<li><a
href="2da00f26c5"><code>2da00f2</code></a>
feat(api): adds web search capabilities to the Claude API</li>
<li><a
href="51fd796456"><code>51fd796</code></a>
fix(sockets): handle non-portable socket flags (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/935">#935</a>)</li>
<li><a
href="ac6cfee090"><code>ac6cfee</code></a>
chore: use lazy imports for resources</li>
<li><a
href="215f5bbe56"><code>215f5bb</code></a>
chore: broadly detect json family of content-type headers</li>
<li><a
href="bcaa8a582b"><code>bcaa8a5</code></a>
chore(ci): only use depot for staging repos</li>
<li><a
href="a41e9c346a"><code>a41e9c3</code></a>
chore(ci): run on more branches and use depot runners</li>
<li><a
href="bfebcd91c6"><code>bfebcd9</code></a>
chore(internal): minor formatting changes</li>
<li><a
href="e3548ac1c5"><code>e3548ac</code></a>
fix(pydantic v1): more robust ModelField.annotation check</li>
<li>Additional commits viewable in <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.49.0...v0.51.0">compare
view</a></li>
</ul>
</details>
<br />
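
The headline 0.51.0 feature above is web search. A minimal sketch against the Messages API, assuming anthropic >= 0.51.0, an `ANTHROPIC_API_KEY` in the environment, and the `web_search_20250305` tool type from Anthropic's docs; the model alias and prompt are illustrative only:

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

response = client.messages.create(
    model="claude-3-7-sonnet-latest",
    max_tokens=1024,
    # Server-side web search tool; `max_uses` caps how many searches
    # the model may run while answering this one request.
    tools=[{"type": "web_search_20250305", "name": "web_search", "max_uses": 3}],
    messages=[{"role": "user", "content": "What changed in the anthropic Python SDK recently?"}],
)
print(response.content)  # content blocks, including search-result citations
```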

Updates `click` from 8.1.8 to 8.2.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pallets/click/releases">click's
releases</a>.</em></p>
<blockquote>
<h2>8.2.0</h2>
<p>This is the Click 8.2.0 feature release. A feature release may
include new features, remove previously deprecated code, add new
deprecation, or introduce potentially breaking changes.</p>
<p>We encourage everyone to upgrade. You can read more about our <a
href="https://palletsprojects.com/versions">Version Support Policy</a>
on our website.</p>
<p>PyPI: <a
href="https://pypi.org/project/click/8.2.0/">https://pypi.org/project/click/8.2.0/</a>
Changes: <a
href="https://click.palletsprojects.com/en/stable/changes/">https://click.palletsprojects.com/en/stable/changes/</a>
Milestone <a
href="https://github.com/pallets/click/milestone/15">https://github.com/pallets/click/milestone/15</a></p>
<ul>
<li>Drop support for Python 3.7, 3.8, and 3.9. <a
href="https://redirect.github.com/pallets/click/issues/2588">#2588</a>,
<a
href="https://redirect.github.com/pallets/click/issues/2893">#2893</a></li>
<li>Use modern packaging metadata with <code>pyproject.toml</code>
instead of <code>setup.cfg</code>. <a
href="https://redirect.github.com/pallets/click/issues/2438">#2438</a></li>
<li>Use <code>flit_core</code> instead of <code>setuptools</code> as
build backend. <a
href="https://redirect.github.com/pallets/click/issues/2543">#2543</a></li>
<li>Deprecate the <code>__version__</code> attribute. Use feature
detection, or
<code>importlib.metadata.version(&quot;click&quot;)</code>, instead (sketched below). <a
href="https://redirect.github.com/pallets/click/issues/2598">#2598</a></li>
<li><code>BaseCommand</code> is deprecated. <code>Command</code> is the
base class for all commands. <a
href="https://redirect.github.com/pallets/click/issues/2589">#2589</a></li>
<li><code>MultiCommand</code> is deprecated. <code>Group</code> is the
base class for all group commands. <a
href="https://redirect.github.com/pallets/click/issues/2590">#2590</a></li>
<li>The current parser and related classes and methods, are deprecated.
<a
href="https://redirect.github.com/pallets/click/issues/2205">#2205</a>
<ul>
<li><code>OptionParser</code> and the <code>parser</code> module, which
is a modified copy of <code>optparse</code> in the standard
library.</li>
<li><code>Context.protected_args</code> is unneeded.
<code>Context.args</code> contains any remaining arguments while
parsing.</li>
<li><code>Parameter.add_to_parser</code> (on both <code>Argument</code>
and <code>Option</code>) is unneeded. Parsing works directly without
building a separate parser.</li>
<li><code>split_arg_string</code> is moved from <code>parser</code> to
<code>shell_completion</code>.</li>
</ul>
</li>
<li>Enable deferred evaluation of annotations with <code>from __future__
import annotations</code>. <a
href="https://redirect.github.com/pallets/click/issues/2270">#2270</a></li>
<li>When generating a command's name from a decorated function's name,
the suffixes <code>_command</code>, <code>_cmd</code>,
<code>_group</code>, and <code>_grp</code> are removed. <a
href="https://redirect.github.com/pallets/click/issues/2322">#2322</a></li>
<li>Show the <code>types.ParamType.name</code> for
<code>types.Choice</code> options within <code>--help</code> message if
<code>show_choices=False</code> is specified. <a
href="https://redirect.github.com/pallets/click/issues/2356">#2356</a></li>
<li>Do not display default values in prompts when
<code>Option.show_default</code> is <code>False</code>. <a
href="https://redirect.github.com/pallets/click/issues/2509">#2509</a></li>
<li>Add <code>get_help_extra</code> method on <code>Option</code> to
fetch the generated extra items used in <code>get_help_record</code> to
render help text. <a
href="https://redirect.github.com/pallets/click/issues/2516">#2516</a>
<a
href="https://redirect.github.com/pallets/click/issues/2517">#2517</a></li>
<li>Keep stdout and stderr streams independent in
<code>CliRunner</code>. Always collect stderr output and never raise an
exception. Add a new output stream to simulate what the user sees in their
terminal. Removes the <code>mix_stderr</code> parameter in
<code>CliRunner</code>. <a
href="https://redirect.github.com/pallets/click/issues/2522">#2522</a>
<a
href="https://redirect.github.com/pallets/click/issues/2523">#2523</a></li>
<li><code>Option.show_envvar</code> now also shows environment variable
in error messages. <a
href="https://redirect.github.com/pallets/click/issues/2695">#2695</a>
<a
href="https://redirect.github.com/pallets/click/issues/2696">#2696</a></li>
<li><code>Context.close</code> will be called on exit. This results in
all <code>Context.call_on_close</code> callbacks and context managers
added via <code>Context.with_resource</code> to be closed on exit as
well. <a
href="https://redirect.github.com/pallets/click/issues/2680">#2680</a></li>
<li>Add <code>ProgressBar(hidden: bool)</code> to allow hiding the
progressbar. <a
href="https://redirect.github.com/pallets/click/issues/2609">#2609</a></li>
<li>A <code>UserWarning</code> will be shown when multiple parameters
attempt to use the same name. <a
href="https://redirect.github.com/pallets/click/issues/2396">#2396</a></li>
<li>When using <code>Option.envvar</code> with
<code>Option.flag_value</code>, the <code>flag_value</code> will always
be used instead of the value of the environment variable. <a
href="https://redirect.github.com/pallets/click/issues/2746">#2746</a>
<a
href="https://redirect.github.com/pallets/click/issues/2788">#2788</a></li>
<li>Add <code>Choice.get_invalid_choice_message</code> method for
customizing the invalid choice message. <a
href="https://redirect.github.com/pallets/click/issues/2621">#2621</a>
<a
href="https://redirect.github.com/pallets/click/issues/2622">#2622</a></li>
<li>If help is shown because <code>no_args_is_help</code> is enabled
(defaults to <code>True</code> for groups, <code>False</code> for
commands), the exit code is 2 instead of 0. <a
href="https://redirect.github.com/pallets/click/issues/1489">#1489</a>
<a
href="https://redirect.github.com/pallets/click/issues/1489">#1489</a></li>
<li>Contexts created during shell completion are closed properly, fixing
a <code>ResourceWarning</code> when using <code>click.File</code>. <a
href="https://redirect.github.com/pallets/click/issues/2644">#2644</a>
<a
href="https://redirect.github.com/pallets/click/issues/2800">#2800</a>
<a
href="https://redirect.github.com/pallets/click/issues/2767">#2767</a></li>
<li><code>click.edit(filename)</code> now supports passing an iterable
of filenames in case the editor supports editing multiple files at once.
Its return type is now also typed: <code>AnyStr</code> if
<code>text</code> is passed, otherwise <code>None</code>. <a
href="https://redirect.github.com/pallets/click/issues/2067">#2067</a>
<a
href="https://redirect.github.com/pallets/click/issues/2068">#2068</a></li>
<li>Specialized typing of <code>progressbar(length=...)</code> as
<code>ProgressBar[int]</code>. <a
href="https://redirect.github.com/pallets/click/issues/2630">#2630</a></li>
<li>Improve <code>echo_via_pager</code> behaviour in face of errors. <a
href="https://redirect.github.com/pallets/click/issues/2674">#2674</a>
<ul>
<li>Terminate the pager in case a generator passed to
<code>echo_via_pager</code> raises an exception.</li>
<li>Ensure to always close the pipe to the pager process and wait for it
to terminate.</li>
<li><code>echo_via_pager</code> will not ignore
<code>KeyboardInterrupt</code> anymore. This allows the user to search
for future output of the generator when using less and then aborting the
program using ctrl-c.</li>
</ul>
</li>
<li><code>deprecated: bool | str</code> can now be used on options and
arguments. This previously was only available for <code>Command</code>.
The message can now also be customised by using a <code>str</code>
instead of a <code>bool</code>. <a
href="https://redirect.github.com/pallets/click/issues/2263">#2263</a>
<a
href="https://redirect.github.com/pallets/click/issues/2271">#2271</a>
<ul>
<li><code>Command.deprecated</code> formatting in <code>--help</code>
changed from <code>(Deprecated) help</code> to <code>help
(DEPRECATED)</code>.</li>
<li>Parameters cannot be required nor prompted or an error is
raised.</li>
<li>A warning will be printed when something deprecated is used.</li>
</ul>
</li>
<li>Add a <code>catch_exceptions</code> parameter to
<code>CliRunner</code>. If <code>catch_exceptions</code> is not passed
to <code>CliRunner.invoke</code>, the value from <code>CliRunner</code>
is used. <a
href="https://redirect.github.com/pallets/click/issues/2817">#2817</a>
<a
href="https://redirect.github.com/pallets/click/issues/2818">#2818</a></li>
<li><code>Option.flag_value</code> will no longer have a default value
set based on <code>Option.default</code> if <code>Option.is_flag</code>
is <code>False</code>. This results in <code>Option.default</code> not
needing to implement <code>__bool__</code>. <a
href="https://redirect.github.com/pallets/click/issues/2829">#2829</a></li>
<li>Incorrect <code>click.edit</code> typing has been corrected. <a
href="https://redirect.github.com/pallets/click/issues/2804">#2804</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pallets/click/blob/main/CHANGES.rst">click's
changelog</a>.</em></p>
<blockquote>
<h2>Version 8.2.0</h2>
<p>Released 2025-05-10</p>
<ul>
<li>
<p>Drop support for Python 3.7, 3.8, and 3.9. :pr:<code>2588</code>
:pr:<code>2893</code></p>
</li>
<li>
<p>Use modern packaging metadata with <code>pyproject.toml</code>
instead of <code>setup.cfg</code>.
:pr:<code>2438</code></p>
</li>
<li>
<p>Use <code>flit_core</code> instead of <code>setuptools</code> as
build backend. :pr:<code>2543</code></p>
</li>
<li>
<p>Deprecate the <code>__version__</code> attribute. Use feature
detection, or
<code>importlib.metadata.version(&quot;click&quot;)</code>, instead.
:issue:<code>2598</code></p>
</li>
<li>
<p><code>BaseCommand</code> is deprecated. <code>Command</code> is the
base class for all
commands. :issue:<code>2589</code></p>
</li>
<li>
<p><code>MultiCommand</code> is deprecated. <code>Group</code> is the
base class for all group
commands. :issue:<code>2590</code></p>
</li>
<li>
<p>The current parser and related classes and methods, are deprecated.
:issue:<code>2205</code></p>
<ul>
<li><code>OptionParser</code> and the <code>parser</code> module, which
is a modified copy of
<code>optparse</code> in the standard library.</li>
<li><code>Context.protected_args</code> is unneeded.
<code>Context.args</code> contains any
remaining arguments while parsing.</li>
<li><code>Parameter.add_to_parser</code> (on both <code>Argument</code>
and <code>Option</code>) is
unneeded. Parsing works directly without building a separate
parser.</li>
<li><code>split_arg_string</code> is moved from <code>parser</code> to
<code>shell_completion</code>.</li>
</ul>
</li>
<li>
<p>Enable deferred evaluation of annotations with
<code>from __future__ import annotations</code>.
:pr:<code>2270</code></p>
</li>
<li>
<p>When generating a command's name from a decorated function's name,
the
suffixes <code>_command</code>, <code>_cmd</code>, <code>_group</code>,
and <code>_grp</code> are removed.
:issue:<code>2322</code></p>
</li>
<li>
<p>Show the <code>types.ParamType.name</code> for
<code>types.Choice</code> options within
<code>--help</code> message if <code>show_choices=False</code> is
specified.
:issue:<code>2356</code></p>
</li>
<li>
<p>Do not display default values in prompts when
<code>Option.show_default</code> is
<code>False</code>. :pr:<code>2509</code></p>
</li>
<li>
<p>Add <code>get_help_extra</code> method on <code>Option</code> to
fetch the generated extra
items used in <code>get_help_record</code> to render help text.
:issue:<code>2516</code>
:pr:<code>2517</code></p>
</li>
<li>
<p>Keep stdout and stderr streams independent in <code>CliRunner</code>.
Always
collect stderr output and never raise an exception. Add a new
output stream to simulate what the user sees in their terminal. Removes
the <code>mix_stderr</code> parameter in <code>CliRunner</code>.
:issue:<code>2522</code> :pr:<code>2523</code></p>
</li>
<li>
<p><code>Option.show_envvar</code> now also shows environment variable
in error messages.
:issue:<code>2695</code> :pr:<code>2696</code></p>
</li>
<li>
<p><code>Context.close</code> will be called on exit. This results in
all
<code>Context.call_on_close</code> callbacks and context managers added
via
<code>Context.with_resource</code> to be closed on exit as well.
:pr:<code>2680</code></p>
</li>
<li>
<p>Add <code>ProgressBar(hidden: bool)</code> to allow hiding the
progressbar. :issue:<code>2609</code></p>
</li>
<li>
<p>A <code>UserWarning</code> will be shown when multiple parameters
attempt to use the same name. :issue:<code>2396</code></p>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="219206a186"><code>219206a</code></a>
release version 8.2.0</li>
<li><a
href="498f882604"><code>498f882</code></a>
drop end of life python versions (<a
href="https://redirect.github.com/pallets/click/issues/2893">#2893</a>)</li>
<li><a
href="ba770cbc96"><code>ba770cb</code></a>
drop end of life python versions</li>
<li><a
href="f14b75063f"><code>f14b750</code></a>
update dev dependencies</li>
<li><a
href="9982faee85"><code>9982fae</code></a>
Update CHANGES.rst</li>
<li><a
href="7318f5f11b"><code>7318f5f</code></a>
Update CHANGES.rst</li>
<li><a
href="b7c0ab471c"><code>b7c0ab4</code></a>
Merge <code>stable</code> into <code>main</code>; Release 8.2.0 (<a
href="https://redirect.github.com/pallets/click/issues/2873">#2873</a>)</li>
<li><a
href="c9b96fe08d"><code>c9b96fe</code></a>
Merge branch 'main' into main</li>
<li><a
href="ab21233fc8"><code>ab21233</code></a>
Rewrite second half of options docs (<a
href="https://redirect.github.com/pallets/click/issues/2848">#2848</a>)</li>
<li><a
href="8c89a14362"><code>8c89a14</code></a>
Merge branch 'main' into options_docs_2</li>
<li>Additional commits viewable in <a
href="https://github.com/pallets/click/compare/8.1.8...8.2.0">compare
view</a></li>
</ul>
</details>
<br />
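
The 8.2.0 notes above bundle several migration-relevant changes: `__version__`, `BaseCommand`, and `MultiCommand` are deprecated, and `CliRunner` loses `mix_stderr` in favour of always-separate streams. A minimal sketch of the new idioms, assuming click >= 8.2.0; the toy CLI is illustrative only:

```python
import importlib.metadata

import click
from click.testing import CliRunner

# 8.2 deprecates `click.__version__`; query the package metadata instead.
CLICK_VERSION = importlib.metadata.version("click")

@click.group()  # backed by `Group`, not the deprecated `MultiCommand`
def cli():
    """Toy CLI used only to exercise the 8.2 behaviours."""

@cli.command()
@click.option("--name", default="world", show_default=True)
def greet(name: str) -> None:
    click.echo(f"Hello, {name}!")
    click.echo("diagnostics", err=True)  # lands on the now-independent stderr

if __name__ == "__main__":
    # `mix_stderr` is gone: stdout and stderr are always collected separately,
    # and `catch_exceptions` can now be set once on the runner itself.
    runner = CliRunner(catch_exceptions=False)
    result = runner.invoke(cli, ["greet", "--name", "click"])
    print(result.stdout)  # "Hello, click!"
    print(result.stderr)  # "diagnostics"; accessing it no longer raises
```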

Updates `e2b-code-interpreter` from 1.1.1 to 1.5.0
<details>
<summary>Commits</summary>
<ul>
<li><a
href="c590c09188"><code>c590c09</code></a>
Bump e2b core to 1.4.0 (<a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/101">#101</a>)</li>
<li><a
href="0ba58b4a97"><code>0ba58b4</code></a>
vitest: continue after failing test (<a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/100">#100</a>)</li>
<li><a
href="e80eb185dd"><code>e80eb18</code></a>
added ts-kernel tests (<a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/99">#99</a>)</li>
<li><a
href="8251c2fb3c"><code>8251c2f</code></a>
[skip ci] Release new versions</li>
<li><a
href="95163ce294"><code>95163ce</code></a>
Added TypeScript support to the code interpreter (sketched below) (<a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/91">#91</a>)</li>
<li><a
href="120d097e24"><code>120d097</code></a>
[skip ci] Release new versions</li>
<li><a
href="9f9a423e45"><code>9f9a423</code></a>
Updated template with SWC compiler for TypeScript (<a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/97">#97</a>)</li>
<li><a
href="3be5d224f8"><code>3be5d22</code></a>
[skip ci] Release new versions</li>
<li><a
href="2ee6ffb9aa"><code>2ee6ffb</code></a>
Update JS SDK target to es2017 (<a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/96">#96</a>)</li>
<li><a
href="2bfcfbb615"><code>2bfcfbb</code></a>
Security patches (<a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/95">#95</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/e2b-dev/code-interpreter/compare/@e2b/code-interpreter@1.1.1...@e2b/code-interpreter@1.5.0">compare
view</a></li>
</ul>
</details>
<br />
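
The commits above add a TypeScript kernel alongside the default Python one. A small sketch of how that surfaces in the Python SDK, assuming the 1.x `Sandbox.run_code(..., language=...)` signature and an `E2B_API_KEY` in the environment:

```python
from e2b_code_interpreter import Sandbox  # e2b-code-interpreter >= 1.5.0

sandbox = Sandbox()  # starts a cloud sandbox; requires E2B_API_KEY
try:
    py = sandbox.run_code("print(1 + 1)")  # default Python kernel
    # TypeScript execution landed in the 1.x releases bumped here.
    ts = sandbox.run_code("console.log(1 + 1)", language="ts")
    print(py.logs.stdout, ts.logs.stdout)  # each is a list of stdout lines
finally:
    sandbox.kill()  # always release the sandbox
```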

Updates `google-api-python-client` from 2.166.0 to 2.169.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/google-api-python-client/releases">google-api-python-client's
releases</a>.</em></p>
<blockquote>
<h2>v2.169.0</h2>
<h2><a
href="https://github.com/googleapis/google-api-python-client/compare/v2.168.0...v2.169.0">2.169.0</a>
(2025-04-29)</h2>
<h3>Features</h3>
<ul>
<li><strong>aiplatform:</strong> Update the api <a href="6ccd7f5371"><code>6ccd7f5</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>artifactregistry:</strong> Update the api <a href="ac6477013f"><code>ac64770</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>blockchainnodeengine:</strong> Update the api <a href="4b1c7d2466"><code>4b1c7d2</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>cloudchannel:</strong> Update the api <a href="5e15d858dd"><code>5e15d85</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>cloudresourcemanager:</strong> Update the api <a href="94d98b6e6a"><code>94d98b6</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>connectors:</strong> Update the api <a href="a456e2ceaf"><code>a456e2c</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>dataflow:</strong> Update the api <a href="ea28e0289e"><code>ea28e02</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>dataform:</strong> Update the api <a href="35990c2ffd"><code>35990c2</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>datamigration:</strong> Update the api <a href="d9218bd460"><code>d9218bd</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>dataplex:</strong> Update the api <a href="6df52e86e8"><code>6df52e8</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>dialogflow:</strong> Update the api <a href="bb0b59eedc"><code>bb0b59e</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>discoveryengine:</strong> Update the api <a href="5af62dfdcf"><code>5af62df</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>displayvideo:</strong> Update the api <a href="67effcf14b"><code>67effcf</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>documentai:</strong> Update the api <a href="fadb34b1a6"><code>fadb34b</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>drive:</strong> Update the api <a href="7d2063f757"><code>7d2063f</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>file:</strong> Update the api <a href="a820c78341"><code>a820c78</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>firebaseml:</strong> Update the api <a href="e45e9b0fc2"><code>e45e9b0</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>gkehub:</strong> Update the api <a href="3c38499a90"><code>3c38499</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>iamcredentials:</strong> Update the api <a href="c59ac8442a"><code>c59ac84</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>integrations:</strong> Update the api <a href="fa57493279"><code>fa57493</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>merchantapi:</strong> Update the api <a href="8e00992fb2"><code>8e00992</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>networkconnectivity:</strong> Update the api <a href="b3c8df3478"><code>b3c8df3</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>networkmanagement:</strong> Update the api <a href="62a2c6a476"><code>62a2c6a</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>networksecurity:</strong> Update the api <a href="39706e9fd8"><code>39706e9</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>oracledatabase:</strong> Update the api <a href="6678c3beeb"><code>6678c3b</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>orgpolicy:</strong> Update the api <a href="d1244740ed"><code>d124474</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>sheets:</strong> Update the api <a href="8b09804195"><code>8b09804</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>sqladmin:</strong> Update the api <a href="900e43c86d"><code>900e43c</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>tpu:</strong> Update the api <a href="183d48eb24"><code>183d48e</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>translate:</strong> Update the api <a href="c230f82427"><code>c230f82</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>vpcaccess:</strong> Update the api <a href="cae086c59a"><code>cae086c</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>youtube:</strong> Update the api <a href="d0b32ba5ba"><code>d0b32ba</code></a> (<a href="14bd22993e">14bd229</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>admin:</strong> Update the api <a href="82e6e7d206"><code>82e6e7d</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>cloudbuild:</strong> Update the api <a href="009d14e0bd"><code>009d14e</code></a> (<a href="14bd22993e">14bd229</a>)</li>
<li><strong>storage:</strong> Update the api <a href="e70b2a8745"><code>e70b2a8</code></a> (<a href="14bd22993e">14bd229</a>)</li>
</ul>
<h2>v2.168.0</h2>
<h2><a
href="https://github.com/googleapis/google-api-python-client/compare/v2.167.0...v2.168.0">2.168.0</a>
(2025-04-22)</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="bece3b3b3a"><code>bece3b3</code></a>
chore(main): release 2.169.0 (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2600">#2600</a>)</li>
<li><a
href="14bd22993e"><code>14bd229</code></a>
chore: Update discovery artifacts (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2599">#2599</a>)</li>
<li><a
href="ac22e7ce69"><code>ac22e7c</code></a>
chore(main): release 2.168.0 (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2595">#2595</a>)</li>
<li><a
href="27e0dff489"><code>27e0dff</code></a>
chore: Update discovery artifacts (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2597">#2597</a>)</li>
<li><a
href="e2aaf41f7e"><code>e2aaf41</code></a>
chore: Update discovery artifacts (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2596">#2596</a>)</li>
<li><a
href="607bd3d516"><code>607bd3d</code></a>
fix: remove setup.cfg configuration for creating universal wheels (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2580">#2580</a>)</li>
<li><a
href="236c82dc14"><code>236c82d</code></a>
chore(main): release 2.167.0 (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2591">#2591</a>)</li>
<li><a
href="139f58fb30"><code>139f58f</code></a>
fix: explicitly declare support for Python 3.13 (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2593">#2593</a>)</li>
<li><a
href="f84e041b4b"><code>f84e041</code></a>
chore: Update discovery artifacts (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2594">#2594</a>)</li>
<li><a
href="a9db82a60a"><code>a9db82a</code></a>
chore: Update discovery artifacts (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2590">#2590</a>)</li>
<li>See full diff in <a
href="https://github.com/googleapis/google-api-python-client/compare/v2.166.0...v2.169.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `google-auth-oauthlib` from 1.2.1 to 1.2.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib/releases">google-auth-oauthlib's
releases</a>.</em></p>
<blockquote>
<h2>v1.2.2</h2>
<h2><a
href="https://github.com/googleapis/google-auth-library-python-oauthlib/compare/v1.2.1...v1.2.2">1.2.2</a>
(2025-04-01)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Do not include docs/conf.py &amp; scripts in wheel (<a
href="https://redirect.github.com/googleapis/google-auth-library-python-oauthlib/issues/328">#328</a>)
(<a
href="78940dfce4">78940df</a>)</li>
<li>Let OS select an available port when running TestInstalledAppFlow
(<a
href="https://redirect.github.com/googleapis/google-auth-library-python-oauthlib/issues/407">#407</a>)
(<a
href="6060d65626">6060d65</a>),
closes <a
href="https://redirect.github.com/googleapis/google-auth-library-python-oauthlib/issues/381">#381</a></li>
<li>Remove setup.cfg configuration for creating universal wheels (<a
href="https://redirect.github.com/googleapis/google-auth-library-python-oauthlib/issues/405">#405</a>)
(<a
href="0b962ed5aa">0b962ed</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/google-auth-library-python-oauthlib/blob/main/CHANGELOG.md">google-auth-oauthlib's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/googleapis/google-auth-library-python-oauthlib/compare/v1.2.1...v1.2.2">1.2.2</a>
(2025-04-01)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Do not include docs/conf.py &amp; scripts in wheel (<a
href="https://redirect.github.com/googleapis/google-auth-library-python-oauthlib/issues/328">#328</a>)
(<a
href="78940dfce4">78940df</a>)</li>
<li>Let OS select an available port when running TestInstalledAppFlow
(<a
href="https://redirect.github.com/googleapis/google-auth-library-python-oauthlib/issues/407">#407</a>)
(<a
href="6060d65626">6060d65</a>),
closes <a
href="https://redirect.github.com/googleapis/google-auth-library-python-oauthlib/issues/381">#381</a></li>
<li>Remove setup.cfg configuration for creating universal wheels (<a
href="https://redirect.github.com/googleapis/google-auth-library-python-oauthlib/issues/405">#405</a>)
(<a
href="0b962ed5aa">0b962ed</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="cc29cc3c37"><code>cc29cc3</code></a>
chore(main): release 1.2.2 (<a
href="https://redirect.github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib/issues/368">#368</a>)</li>
<li><a
href="6060d65626"><code>6060d65</code></a>
fix: Let OS select an available port when running TestInstalledAppFlow
(<a
href="https://redirect.github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib/issues/407">#407</a>)</li>
<li><a
href="0b962ed5aa"><code>0b962ed</code></a>
fix: remove setup.cfg configuration for creating universal wheels (<a
href="https://redirect.github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib/issues/405">#405</a>)</li>
<li><a
href="dedc58a521"><code>dedc58a</code></a>
chore: remove unused files (<a
href="https://redirect.github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib/issues/402">#402</a>)</li>
<li><a
href="63442e94fd"><code>63442e9</code></a>
chore(python): conditionally load credentials in .kokoro/build.sh (<a
href="https://redirect.github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib/issues/398">#398</a>)</li>
<li><a
href="9a1dfab8e9"><code>9a1dfab</code></a>
chore: check if port is in use before returning the port to start a new
serve...</li>
<li><a
href="9c3861009e"><code>9c38610</code></a>
chore: Reduce priority of flaky tests (<a
href="https://redirect.github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib/issues/390">#390</a>)</li>
<li><a
href="780f6a6cae"><code>780f6a6</code></a>
chore(python): Update the python version in docs presubmit to use 3.10
(<a
href="https://redirect.github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib/issues/387">#387</a>)</li>
<li><a
href="2a561a6975"><code>2a561a6</code></a>
chore(deps): update all dependencies (<a
href="https://redirect.github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib/issues/382">#382</a>)</li>
<li><a
href="c220b451d1"><code>c220b45</code></a>
chore(python): update dependencies in .kokoro/docker/docs (<a
href="https://redirect.github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib/issues/380">#380</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib/compare/v1.2.1...v1.2.2">compare
view</a></li>
</ul>
</details>
<br />

Updates `groq` from 0.20.0 to 0.24.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/groq/groq-python/releases">groq's
releases</a>.</em></p>
<blockquote>
<h2>v0.24.0</h2>
<h2>0.24.0 (2025-05-02)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.23.1...v0.24.0">v0.23.1...v0.24.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> api update (<a
href="e65ff4d299">e65ff4d</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>add include/exclude_domains to all chat completions overloads (sketched below) (<a
href="7616f4b2e9">7616f4b</a>)</li>
</ul>
<h2>v0.23.1</h2>
<h2>0.23.1 (2025-04-24)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.23.0...v0.23.1">v0.23.0...v0.23.1</a></p>
<h3>Bug Fixes</h3>
<ul>
<li>add executed_tools to streaming choicedelta (<a
href="fb26fbcd0b">fb26fbc</a>)</li>
<li><strong>pydantic v1:</strong> more robust ModelField.annotation
check (<a
href="40aaee2cd7">40aaee2</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li>broadly detect json family of content-type headers (<a
href="2411533949">2411533</a>)</li>
<li><strong>ci:</strong> add timeout thresholds for CI jobs (<a
href="aae461436e">aae4614</a>)</li>
<li><strong>ci:</strong> only use depot for staging repos (<a
href="b6d1b47c1c">b6d1b47</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="1da64f5c78">1da64f5</a>)</li>
<li><strong>internal:</strong> fix list file params (<a
href="a9b18debf8">a9b18de</a>)</li>
<li><strong>internal:</strong> import reformatting (<a
href="5068736832">5068736</a>)</li>
<li><strong>internal:</strong> minor formatting changes (<a
href="bc26d603a5">bc26d60</a>)</li>
<li><strong>internal:</strong> refactor retries to not use recursion (<a
href="488b9fe0a8">488b9fe</a>)</li>
</ul>
<h2>v0.23.0</h2>
<h2>0.23.0 (2025-04-22)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.22.0...v0.23.0">v0.22.0...v0.23.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> api update (<a
href="f5cbd0148e">f5cbd01</a>)</li>
<li><strong>api:</strong> api update (<a
href="e7c5514b3e">e7c5514</a>)</li>
<li><strong>api:</strong> api update (<a
href="9d5b7c8ba4">9d5b7c8</a>)</li>
<li><strong>api:</strong> api update (<a
href="73357e15c4">73357e1</a>)</li>
<li><strong>api:</strong> api update (<a
href="b1d6697301">b1d6697</a>)</li>
<li><strong>api:</strong> api update (<a
href="98ef30efd2">98ef30e</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/groq/groq-python/blob/main/CHANGELOG.md">groq's
changelog</a>.</em></p>
<blockquote>
<h2>0.24.0 (2025-05-02)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.23.1...v0.24.0">v0.23.1...v0.24.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> api update (<a
href="e65ff4d299">e65ff4d</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>add include/exclude_domains to all chat completions overloads (<a
href="7616f4b2e9">7616f4b</a>)</li>
</ul>
<h2>0.23.1 (2025-04-24)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.23.0...v0.23.1">v0.23.0...v0.23.1</a></p>
<h3>Bug Fixes</h3>
<ul>
<li>add executed_tools to streaming choicedelta (<a
href="fb26fbcd0b">fb26fbc</a>)</li>
<li><strong>pydantic v1:</strong> more robust ModelField.annotation
check (<a
href="40aaee2cd7">40aaee2</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li>broadly detect json family of content-type headers (<a
href="2411533949">2411533</a>)</li>
<li><strong>ci:</strong> add timeout thresholds for CI jobs (<a
href="aae461436e">aae4614</a>)</li>
<li><strong>ci:</strong> only use depot for staging repos (<a
href="b6d1b47c1c">b6d1b47</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="1da64f5c78">1da64f5</a>)</li>
<li><strong>internal:</strong> fix list file params (<a
href="a9b18debf8">a9b18de</a>)</li>
<li><strong>internal:</strong> import reformatting (<a
href="5068736832">5068736</a>)</li>
<li><strong>internal:</strong> minor formatting changes (<a
href="bc26d603a5">bc26d60</a>)</li>
<li><strong>internal:</strong> refactor retries to not use recursion (<a
href="488b9fe0a8">488b9fe</a>)</li>
</ul>
<h2>0.23.0 (2025-04-22)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.22.0...v0.23.0">v0.22.0...v0.23.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> api update (<a
href="f5cbd0148e">f5cbd01</a>)</li>
<li><strong>api:</strong> api update (<a
href="e7c5514b3e">e7c5514</a>)</li>
<li><strong>api:</strong> api update (<a
href="9d5b7c8ba4">9d5b7c8</a>)</li>
<li><strong>api:</strong> api update (<a
href="73357e15c4">73357e1</a>)</li>
<li><strong>api:</strong> api update (<a
href="b1d6697301">b1d6697</a>)</li>
<li><strong>api:</strong> api update (<a
href="98ef30efd2">98ef30e</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="e1280b2c3e"><code>e1280b2</code></a>
release: 0.24.0 (<a
href="https://redirect.github.com/groq/groq-python/issues/240">#240</a>)</li>
<li><a
href="ab7fe20db7"><code>ab7fe20</code></a>
release: 0.23.1 (<a
href="https://redirect.github.com/groq/groq-python/issues/239">#239</a>)</li>
<li><a href="https://github.com/groq/groq-python/commit/f3d1ea0...

_Description has been truncated_

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-05-16 10:26:39 +00:00
dependabot[bot]
ba91c9f736 chore(libs/deps): Update 4 dependencies (#9908)
Bumps the production-dependencies group with 4 updates in the
/autogpt_platform/autogpt_libs directory:
[google-cloud-logging](https://github.com/googleapis/python-logging),
[pydantic](https://github.com/pydantic/pydantic),
[pydantic-settings](https://github.com/pydantic/pydantic-settings) and
[supabase](https://github.com/supabase/supabase-py).

Updates `google-cloud-logging` from 3.11.4 to 3.12.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/python-logging/releases">google-cloud-logging's
releases</a>.</em></p>
<blockquote>
<h2>v3.12.1</h2>
<h2><a
href="https://github.com/googleapis/python-logging/compare/v3.12.0...v3.12.1">3.12.1</a>
(2025-04-21)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Make logging handler close conditional to having the transport
opened (<a
href="https://redirect.github.com/googleapis/python-logging/issues/990">#990</a>)
(<a
href="66c6b91725">66c6b91</a>)</li>
</ul>
<h2>v3.12.0</h2>
<h2><a
href="https://github.com/googleapis/python-logging/compare/v3.11.4...v3.12.0">3.12.0</a>
(2025-04-10)</h2>
<h3>Features</h3>
<ul>
<li>Add REST Interceptors which support reading metadata (<a
href="681bcc5c1f">681bcc5</a>)</li>
<li>Add support for opt-in debug logging (<a
href="681bcc5c1f">681bcc5</a>)</li>
<li>Added flushes/close functionality to logging handlers (<a
href="https://redirect.github.com/googleapis/python-logging/issues/917">#917</a>)
(<a
href="d179304b34">d179304</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>Allow protobuf 6.x (<a
href="https://redirect.github.com/googleapis/python-logging/issues/977">#977</a>)
(<a
href="6757890013">6757890</a>)</li>
<li><strong>deps:</strong> Require google-cloud-audit-log &gt;= 0.3.1
(<a
href="https://redirect.github.com/googleapis/python-logging/issues/979">#979</a>)
(<a
href="1cc00ecf64">1cc00ec</a>)</li>
<li>Fix typing issue with gRPC metadata when key ends in -bin (<a
href="681bcc5c1f">681bcc5</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Added documentation on log_level and excluded_loggers params in
setup_logging (<a
href="https://redirect.github.com/googleapis/python-logging/issues/971">#971</a>)
(<a
href="70d9d25bf8">70d9d25</a>)</li>
<li>Update README to break infinite redirect loop (<a
href="https://redirect.github.com/googleapis/python-logging/issues/972">#972</a>)
(<a
href="52cd907bb3">52cd907</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/python-logging/blob/main/CHANGELOG.md">google-cloud-logging's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/googleapis/python-logging/compare/v3.12.0...v3.12.1">3.12.1</a>
(2025-04-21)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Make logging handler close conditional to having the transport
opened (<a
href="https://redirect.github.com/googleapis/python-logging/issues/990">#990</a>)
(<a
href="66c6b91725">66c6b91</a>)</li>
</ul>
<h2><a
href="https://github.com/googleapis/python-logging/compare/v3.11.4...v3.12.0">3.12.0</a>
(2025-04-10)</h2>
<h3>Features</h3>
<ul>
<li>Add REST Interceptors which support reading metadata (<a
href="681bcc5c1f">681bcc5</a>)</li>
<li>Add support for opt-in debug logging (<a
href="681bcc5c1f">681bcc5</a>)</li>
<li>Added flushes/close functionality to logging handlers (<a
href="https://redirect.github.com/googleapis/python-logging/issues/917">#917</a>)
(<a
href="d179304b34">d179304</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>Allow protobuf 6.x (<a
href="https://redirect.github.com/googleapis/python-logging/issues/977">#977</a>)
(<a
href="6757890013">6757890</a>)</li>
<li><strong>deps:</strong> Require google-cloud-audit-log &gt;= 0.3.1
(<a
href="https://redirect.github.com/googleapis/python-logging/issues/979">#979</a>)
(<a
href="1cc00ecf64">1cc00ec</a>)</li>
<li>Fix typing issue with gRPC metadata when key ends in -bin (<a
href="681bcc5c1f">681bcc5</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Added documentation on log_level and excluded_loggers params in
setup_logging (<a
href="https://redirect.github.com/googleapis/python-logging/issues/971">#971</a>)
(<a
href="70d9d25bf8">70d9d25</a>)</li>
<li>Update README to break infinite redirect loop (<a
href="https://redirect.github.com/googleapis/python-logging/issues/972">#972</a>)
(<a
href="52cd907bb3">52cd907</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="f4fb25ab6f"><code>f4fb25a</code></a>
chore(main): release 3.12.1 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/992">#992</a>)</li>
<li><a
href="66c6b91725"><code>66c6b91</code></a>
fix: make logging handler close conditional to having the transport
opened (#...</li>
<li><a
href="5f89b5f77d"><code>5f89b5f</code></a>
chore(main): release 3.12.0 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/973">#973</a>)</li>
<li><a
href="5db27c2ac0"><code>5db27c2</code></a>
chore(python): remove .flake8 configuration file in templates (<a
href="https://redirect.github.com/googleapis/python-logging/issues/983">#983</a>)</li>
<li><a
href="d179304b34"><code>d179304</code></a>
feat: Added flushes/close functionality to logging handlers (<a
href="https://redirect.github.com/googleapis/python-logging/issues/917">#917</a>)</li>
<li><a
href="1cc00ecf64"><code>1cc00ec</code></a>
fix(deps): require google-cloud-audit-log &gt;= 0.3.1 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/979">#979</a>)</li>
<li><a
href="42387bf63f"><code>42387bf</code></a>
chore: Update gapic-generator-python to 1.23.6 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/982">#982</a>)</li>
<li><a
href="52cd907bb3"><code>52cd907</code></a>
docs: update README to break infinite redirect loop (<a
href="https://redirect.github.com/googleapis/python-logging/issues/972">#972</a>)</li>
<li><a
href="6757890013"><code>6757890</code></a>
fix: Allow protobuf 6.x (<a
href="https://redirect.github.com/googleapis/python-logging/issues/977">#977</a>)</li>
<li><a
href="40b1a528df"><code>40b1a52</code></a>
chore: remove unused files (<a
href="https://redirect.github.com/googleapis/python-logging/issues/976">#976</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/googleapis/python-logging/compare/v3.11.4...v3.12.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `pydantic` from 2.11.1 to 2.11.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/releases">pydantic's
releases</a>.</em></p>
<blockquote>
<h2>v2.11.4 2025-04-29</h2>
<h3>What's Changed</h3>
<h4>Packaging</h4>
<ul>
<li>Bump <code>mkdocs-llmstxt</code> to v0.2.0 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11725">#11725</a></li>
</ul>
<h4>Changes</h4>
<ul>
<li>Allow config and bases to be specified together in
<code>create_model()</code> by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11714">#11714</a>.
This change was backported as it was previously possible (although not
meant to be supported)
to provide <code>model_config</code> as a field, which would make it
possible to provide both configuration
and bases.</li>
</ul>
<h4>Fixes</h4>
<ul>
<li>Remove generics cache workaround by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11755">#11755</a></li>
<li>Remove coercion of decimal constraints by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11772">#11772</a></li>
<li>Fix crash when expanding root type in the mypy plugin by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11735">#11735</a></li>
<li>Fix issue with recursive generic models by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11775">#11775</a></li>
<li>Traverse <code>function-before</code> schemas during schema
gathering by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11801">#11801</a></li>
</ul>
<h2>v2.11.3 2025-04-08</h2>
<!-- raw HTML omitted -->
<h2>What's Changed</h2>
<h3>Packaging</h3>
<ul>
<li>Update V1 copy to v1.10.21 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11706">pydantic/pydantic#11706</a></li>
</ul>
<h3>Fixes</h3>
<ul>
<li>Preserve field description when rebuilding model fields by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11698">pydantic/pydantic#11698</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic/compare/v2.11.2...v2.11.3">https://github.com/pydantic/pydantic/compare/v2.11.2...v2.11.3</a></p>
<h2>v2.11.2 2025-04-03</h2>
<!-- raw HTML omitted -->
<h2>What's Changed</h2>
<h3>Fixes</h3>
<ul>
<li>Bump <code>pydantic-core</code> to v2.33.1 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11678">pydantic/pydantic#11678</a></li>
<li>Make sure <code>__pydantic_private__</code> exists before setting
private attributes by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11666">pydantic/pydantic#11666</a></li>
<li>Do not override <code>FieldInfo._complete</code> when using field
from parent class by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11668">pydantic/pydantic#11668</a></li>
<li>Provide the available definitions when applying discriminated unions
by <a href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11670">pydantic/pydantic#11670</a></li>
<li>Do not expand root type in the mypy plugin for variables by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11676">pydantic/pydantic#11676</a></li>
<li>Mention the attribute name in model fields deprecation message by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11674">pydantic/pydantic#11674</a></li>
<li>Properly validate parameterized mappings by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11658">pydantic/pydantic#11658</a></li>
<li>Prepare release v2.11.2 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11684">pydantic/pydantic#11684</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic/compare/v2.11.1...v2.11.2">https://github.com/pydantic/pydantic/compare/v2.11.1...v2.11.2</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/blob/main/HISTORY.md">pydantic's
changelog</a>.</em></p>
<blockquote>
<h2>v2.11.4 (2025-04-29)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.11.4">GitHub
release</a></p>
<h3>What's Changed</h3>
<h4>Packaging</h4>
<ul>
<li>Bump <code>mkdocs-llmstxt</code> to v0.2.0 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11725">#11725</a></li>
</ul>
<h4>Changes</h4>
<ul>
<li>Allow config and bases to be specified together in
<code>create_model()</code> by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11714">#11714</a>.
This change was backported as it was previously possible (although not
meant to be supported)
to provide <code>model_config</code> as a field, which would make it
possible to provide both configuration
and bases.</li>
</ul>
<h4>Fixes</h4>
<ul>
<li>Remove generics cache workaround by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11755">#11755</a></li>
<li>Remove coercion of decimal constraints by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11772">#11772</a></li>
<li>Fix crash when expanding root type in the mypy plugin by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11735">#11735</a></li>
<li>Fix issue with recursive generic models by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11775">#11775</a></li>
<li>Traverse <code>function-before</code> schemas during schema
gathering by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11801">#11801</a></li>
</ul>
<h2>v2.11.3 (2025-04-08)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.11.3">GitHub
release</a></p>
<h3>What's Changed</h3>
<h4>Packaging</h4>
<ul>
<li>Update V1 copy to v1.10.21 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11706">#11706</a></li>
</ul>
<h4>Fixes</h4>
<ul>
<li>Preserve field description when rebuilding model fields by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11698">#11698</a></li>
</ul>
<h2>v2.11.2 (2025-04-03)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.11.2">GitHub
release</a></p>
<h3>What's Changed</h3>
<h4>Fixes</h4>
<ul>
<li>Bump <code>pydantic-core</code> to v2.33.1 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11678">#11678</a></li>
<li>Make sure <code>__pydantic_private__</code> exists before setting
private attributes by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11666">#11666</a></li>
<li>Do not override <code>FieldInfo._complete</code> when using field
from parent class by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11668">#11668</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="d444cd1cf6"><code>d444cd1</code></a>
Prepare release v2.11.4</li>
<li><a
href="828fc48d55"><code>828fc48</code></a>
Add documentation note about common pitfall with the annotated
pattern</li>
<li><a
href="42bf1fd784"><code>42bf1fd</code></a>
Bump <code>pydantic-core</code> to v2.33.2 (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11804">#11804</a>)</li>
<li><a
href="7b3f513215"><code>7b3f513</code></a>
Allow config and bases to be specified together in
<code>create_model()</code></li>
<li><a
href="fc521388f2"><code>fc52138</code></a>
Traverse <code>function-before</code> schemas during schema
gathering</li>
<li><a
href="25af78934a"><code>25af789</code></a>
Fix issue with recursive generic models</li>
<li><a
href="91ef6bb39e"><code>91ef6bb</code></a>
Update monthly download count in documentation</li>
<li><a
href="a830775328"><code>a830775</code></a>
Bump <code>mkdocs-llmstxt</code> to v0.2.0</li>
<li><a
href="f5d1c87128"><code>f5d1c87</code></a>
Fix crash when expanding root type in the mypy plugin</li>
<li><a
href="c80bb355d7"><code>c80bb35</code></a>
Remove coercion of decimal constraints</li>
<li>Additional commits viewable in <a
href="https://github.com/pydantic/pydantic/compare/v2.11.1...v2.11.4">compare
view</a></li>
</ul>
</details>
<br />

Updates `pydantic-settings` from 2.8.1 to 2.9.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic-settings/releases">pydantic-settings's
releases</a>.</em></p>
<blockquote>
<h2>v2.9.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Drop support for Python 3.8 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/560">pydantic/pydantic-settings#560</a></li>
<li>Switch to <code>typing-inspection</code> by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/556">pydantic/pydantic-settings#556</a></li>
<li>Introduce <code>uv</code> for Project Management by <a
href="https://github.com/KanchiShimono"><code>@​KanchiShimono</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/547">pydantic/pydantic-settings#547</a></li>
<li>Refactor sources.py into a subpackage (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/546">#546</a>)
by <a href="https://github.com/ezwiefel"><code>@​ezwiefel</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/548">pydantic/pydantic-settings#548</a></li>
<li>chore: cleanup by <a
href="https://github.com/CodeWithEmad"><code>@​CodeWithEmad</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/563">pydantic/pydantic-settings#563</a></li>
<li>Fix typo in documentation by <a
href="https://github.com/CodeWithEmad"><code>@​CodeWithEmad</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/564">pydantic/pydantic-settings#564</a></li>
<li>Add support for AWS Secrets Manager by <a
href="https://github.com/mavwolverine"><code>@​mavwolverine</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/532">pydantic/pydantic-settings#532</a></li>
<li>Fix minor typo: conotations =&gt; connotations by <a
href="https://github.com/svenevs"><code>@​svenevs</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/577">pydantic/pydantic-settings#577</a></li>
<li>Azure Key Vault: Don't load disabled secret by <a
href="https://github.com/AndreuCodina"><code>@​AndreuCodina</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/578">pydantic/pydantic-settings#578</a></li>
<li>Add support for GCP Secret Manager by <a
href="https://github.com/ezwiefel"><code>@​ezwiefel</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/567">pydantic/pydantic-settings#567</a></li>
<li>CLI JSON Optional Default by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/581">pydantic/pydantic-settings#581</a></li>
<li>Fix for env nested enum. by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/589">pydantic/pydantic-settings#589</a></li>
<li>CLI submodel suppress. by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/587">pydantic/pydantic-settings#587</a></li>
<li>Cli retrieve unknown args by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/588">pydantic/pydantic-settings#588</a></li>
<li>Update pydantic by <a
href="https://github.com/hramezani"><code>@​hramezani</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/593">pydantic/pydantic-settings#593</a></li>
<li>Fix check in CI by <a
href="https://github.com/hramezani"><code>@​hramezani</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/595">pydantic/pydantic-settings#595</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/ezwiefel"><code>@​ezwiefel</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/548">pydantic/pydantic-settings#548</a></li>
<li><a
href="https://github.com/CodeWithEmad"><code>@​CodeWithEmad</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/563">pydantic/pydantic-settings#563</a></li>
<li><a
href="https://github.com/mavwolverine"><code>@​mavwolverine</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/532">pydantic/pydantic-settings#532</a></li>
<li><a href="https://github.com/svenevs"><code>@​svenevs</code></a> made
their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/577">pydantic/pydantic-settings#577</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic-settings/compare/v2.8.1...v2.9.0">https://github.com/pydantic/pydantic-settings/compare/v2.8.1...v2.9.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="1874740923"><code>1874740</code></a>
Prepare release 2.9.1 (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/600">#600</a>)</li>
<li><a
href="88e77bc8aa"><code>88e77bc</code></a>
Fix typo in gcp secret manager error message (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/598">#598</a>)</li>
<li><a
href="e973d9afc8"><code>e973d9a</code></a>
fix: Expose ConfigFileSourceMixing on top level
sources/__init__.py (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/597">#597</a>)</li>
<li><a
href="8c0f5f18b0"><code>8c0f5f1</code></a>
Fix check in CI (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/595">#595</a>)</li>
<li><a
href="0ac2312042"><code>0ac2312</code></a>
Prepare release 2.9.0 (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/594">#594</a>)</li>
<li><a
href="f3e5ac382c"><code>f3e5ac3</code></a>
Update pydantic (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/593">#593</a>)</li>
<li><a
href="20640b0efe"><code>20640b0</code></a>
Cli retrieve unknown args (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/588">#588</a>)</li>
<li><a
href="ed7fd42bfb"><code>ed7fd42</code></a>
CLI submodel suppress. (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/587">#587</a>)</li>
<li><a
href="e9fb3164eb"><code>e9fb316</code></a>
Fix for env nested enum. (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/589">#589</a>)</li>
<li><a
href="0e9b329c74"><code>0e9b329</code></a>
CLI JSON Optional Default (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/581">#581</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pydantic/pydantic-settings/compare/v2.8.1...v2.9.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `supabase` from 2.15.0 to 2.15.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-py/releases">supabase's
releases</a>.</em></p>
<blockquote>
<h2>v2.15.1</h2>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.15.0...v2.15.1">2.15.1</a>
(2025-04-28)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>postgrest:</strong> add missing count, head, and get params
(<a
href="https://redirect.github.com/supabase/supabase-py/issues/1098">#1098</a>)
(<a
href="e9c219ebda">e9c219e</a>)</li>
<li><strong>realtime:</strong> bump realtime from 2.4.2 to 2.4.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1112">#1112</a>)
(<a
href="1d429c6555">1d429c6</a>)</li>
<li>remove return type from postgrest methods (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1110">#1110</a>)
(<a
href="6664f42157">6664f42</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-py/blob/main/CHANGELOG.md">supabase's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.15.0...v2.15.1">2.15.1</a>
(2025-04-28)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>postgrest:</strong> add missing count, head, and get params
(<a
href="https://redirect.github.com/supabase/supabase-py/issues/1098">#1098</a>)
(<a
href="e9c219ebda">e9c219e</a>)</li>
<li><strong>realtime:</strong> bump realtime from 2.4.2 to 2.4.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1112">#1112</a>)
(<a
href="1d429c6555">1d429c6</a>)</li>
<li>remove return type from postgrest methods (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1110">#1110</a>)
(<a
href="6664f42157">6664f42</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9fdf32f36c"><code>9fdf32f</code></a>
chore(main): release 2.15.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1099">#1099</a>)</li>
<li><a
href="1d429c6555"><code>1d429c6</code></a>
fix(realtime): bump realtime from 2.4.2 to 2.4.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1112">#1112</a>)</li>
<li><a
href="b12940429d"><code>b129404</code></a>
chore(deps-dev): bump commitizen from 4.4.1 to 4.6.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1104">#1104</a>)</li>
<li><a
href="ad8f99e4a0"><code>ad8f99e</code></a>
chore(deps-dev): bump pytest-cov from 6.0.0 to 6.1.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1100">#1100</a>)</li>
<li><a
href="6664f42157"><code>6664f42</code></a>
fix: remove return type from postgrest methods (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1110">#1110</a>)</li>
<li><a
href="c0ca1758ba"><code>c0ca175</code></a>
ci: explicit permissions and remove _target (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1102">#1102</a>)</li>
<li><a
href="e9c219ebda"><code>e9c219e</code></a>
fix(postgrest): add missing count, head, and get params (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1098">#1098</a>)</li>
<li><a
href="d5aa9ce4c7"><code>d5aa9ce</code></a>
chore(deps-dev): bump python-dotenv from 1.0.1 to 1.1.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1085">#1085</a>)</li>
<li><a
href="8ff9ba8e23"><code>8ff9ba8</code></a>
chore(deps-dev): bump pytest-asyncio from 0.25.3 to 0.26.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1084">#1084</a>)</li>
<li>See full diff in <a
href="https://github.com/supabase/supabase-py/compare/v2.15.0...v2.15.1">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-05-16 09:44:23 +00:00
Emmanuel Ferdman
e5368f3857 fix: Resolve logger.warn(..) deprecation warnings (#9938)
This small PR resolves the deprecation warnings from the standard `logging` library:
```
DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead
```
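
For reference, the fix is a one-line rename at each call site (a minimal sketch; actual call sites vary):

```python
import logging

logger = logging.getLogger(__name__)

# Deprecated alias, triggers the DeprecationWarning above:
# logger.warn("something looks off")

# Preferred spelling, identical behavior at WARNING level:
logger.warning("something looks off")
```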
2025-05-16 10:56:03 +02:00
Reinier van der Leer
c73c6fe5c3 Merge branch 'master' into dev 2025-05-15 19:28:40 +02:00
Nicholas Tindle
9bef383df2 fix(backend): Make URL pinning work with extra_url_validator (#9940)
GitHub blocks use a URL transformer passed to `Requests` to convert web
URLs to API URLs. This doesn't always work with the anti-SSRF URL
pinning mechanism that was implemented in #8531.

### Changes 🏗️
In `Requests.request(..)`:
- Apply `validate_url` *after* `extra_url_validator`, to prevent a
mismatch between `pinned_url` and `original_hostname` (see the sketch
after this list)
- Simplify logic & add clarifying comments
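
A minimal sketch of the new ordering, with a hypothetical `_prepare_url` standing in for the relevant part of `Requests.request(..)`:

```python
from typing import Callable, Optional

def _prepare_url(
    url: str,
    validate_url: Callable[[str], str],
    extra_url_validator: Optional[Callable[[str], str]] = None,
) -> str:
    # Transform first (e.g. web URL -> GitHub API URL) so the anti-SSRF
    # validation pins the URL that will actually be requested, keeping
    # pinned_url consistent with original_hostname.
    if extra_url_validator is not None:
        url = extra_url_validator(url)
    return validate_url(url)
```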

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Tested the github blocks that had the issue

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-05-15 13:12:39 +00:00
Chirag Modi
2dc038b6c0 Add Llama API Support (#9899)
This PR adds Llama API support.

### Changes 🏗️
We add both backend and frontend support.

**Backend**:
- Add llama_api provider
- Include models supported by Llama API along with configs
- llm_call
- credential store and llama_api_key field in Settings

**Frontend**:
- Llama API as a type
- Credentials input and provider for Llama API
 

### Checklist 📋

#### For code changes:
- [X] I have clearly listed my changes in the PR description
- [X] I have tested my changes according to the test plan:

**Test Plan**:

<details>
  <summary>AI Text Generator</summary>
  
  - [X] Start-up backend and frontend:
    - Start backend with Docker services: `docker compose up -d --build`
    - Start frontend: `npm install && npm run dev`
    - Visit http://localhost:3000/ and test inference and structured outputs
  - [X] Create from scratch 
  - [X] Request for Llama API Credentials
  
<img width="2015" alt="image"
src="https://github.com/user-attachments/assets/3dede402-3718-4441-9327-ecab25c63ebf"
/>

  - [X] Execute an agent with at least 3 blocks
 
<img width="2026" alt="image"
src="https://github.com/user-attachments/assets/59d6d56b-2ccc-4af5-b511-4af312c3f7f8"
/>

  - [X] Confirm it executes correctly
</details>

<details>
  <summary>Structured Response Generator</summary>
  
  - [X] Start-up backend and frontend:
    - Start backend with Docker services: `docker compose up -d --build`
    - Start frontend: `npm install && npm run dev`
    - Visit http://localhost:3000/ and test inference and structured outputs
  - [X] Create from scratch 
  - [X] Execute an agent 
<img width="2023" alt="image"
src="https://github.com/user-attachments/assets/d1107638-bf1b-45b1-a296-1e0fac29525b"
/>

  - [X] Confirm it executes correctly
</details>

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-05-14 19:45:40 +00:00
Krzysztof Czerwinski
cd6deb87c3 fix(frontend): Revert congrats onboarding screen action (#9943)
Revert congrats onboarding screen action of #9916
2025-05-14 19:19:27 +00:00
Zamil Majdy
1999ba38d9 feat(backend): Truncate logging on NotificationManager and LLMBlocks (#9939)
> Log entry with size 865.8K exceeds maximum size of 256.0K

Some logs are just too large.

### Changes 🏗️

Truncate logging on NotificationManager and LLMBlocks
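
For illustration, a minimal sketch of the kind of truncation helper this implies (the name and limit are assumptions, not the PR's actual code):

```python
MAX_LOG_CHARS = 10_000  # assumed cap; the real threshold isn't shown here

def truncate(text: str, limit: int = MAX_LOG_CHARS) -> str:
    """Trim oversized log payloads so entries stay under the sink's size cap."""
    if len(text) <= limit:
        return text
    return f"{text[:limit]}... <truncated {len(text) - limit} chars>"
```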

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Existing CI
2025-05-14 19:18:59 +00:00
Nicholas Tindle
e8fa996c2f refactor(frontend): Move from remote loaded GTM to local (#9933)
<!-- Clearly explain the need for these changes: -->

As part of a small security review, we found that Google won't let you
load GTM with subresource integrity. That seems unwise, but the general
workaround is to serve a static copy of the script yourself.

### Changes 🏗️
Moves all the Google Analytics scripts in-house instead of loading them
via CDN.
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] tested it in a new isolated environment to confirm events still
work as expected everywhere they are used

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-05-14 16:07:40 +00:00
Krzysztof Czerwinski
e22d2c848a feat(frontend): Onboarding Updates 3 (#9916)
A collection of UX updates and bug fixes for onboarding and the wallet.

### Changes 🏗️

- Show spinner loading indicator when onboarding button is clicked
- Use `getLibraryAgentByStoreListingVersionID` instead of
`addMarketplaceAgentToLibrary` on congrats screen
- Fix `Not enough segments` issue: don't fetch onboarding when user is
logged out
- Minor updates
  - Fill some missing deps in deps arrays
  - `Spinner` component, styles updates
  - Use `useMemo`/`useCallback`
- Show error toast when onboarding agent fails to run:

<img width="405" alt="Screenshot 2025-05-06 at 5 09 01 PM"
src="https://github.com/user-attachments/assets/dd1272da-326a-448d-995d-98ac773b3ee4"
/>

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Onboarding can be completed
  - [x] Failing agent shows toast
  - [x] Wallet can be opened and works properly (tasks, confetti)
  - [x] Dependency arrays don't cause infinite loops
2025-05-14 14:29:29 +00:00
Reinier van der Leer
9471fd6b58 fix(frontend): Re-subscribe on WebSocket re-connect (#9935)
- Resolves #9929

### Changes 🏗️

- Implement `BackendAPI.onWebSocketConnect(..)`

- Improve reliability and reactivity of `/library/agents/[id]`:
  - Refresh page data and (re)subscribe on WebSocket (re)connect
- Break up multi-action hooks into smaller parts to reduce unnecessary
re-renders and requests
  - Reduce duplicate requests

- Use `onWebSocketConnect` in `useAgentGraph` as well

- Tidy up `autogpt-server-api/client.ts` a bit

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Go to `/library/agents/[id]`
  - Run the agent
    - [x] -> UI should update normally with execution updates
- Suspend your computer, or restart the backend (or WS server) to break
the connection
    - DO NOT REFRESH THE TAB ITSELF
- [x] -> On reconnect, page data should be refreshed (check in network
tab of dev tools)
  - Run the agent again
    - [x] -> UI should update normally with execution updates
2025-05-14 00:39:14 +00:00
Krzysztof Czerwinski
c4bbfd5050 feat(frontend): Update login and signup feedback (#9917)
Currently, both the login and signup pages show the same feedback on
error on cloud (join the waitlist).

### Changes 🏗️
Update login&signup feedback for cloud.

- Use cards for login and signup feedback instead of html list
- Make login page show prompt to redirect user to signup

<img width="474" alt="Screenshot 2025-05-07 at 4 01 07 PM"
src="https://github.com/user-attachments/assets/45f189ea-5fea-45bb-89f9-7323418d69ea"
/>
<img width="476" alt="Screenshot 2025-05-07 at 4 03 29 PM"
src="https://github.com/user-attachments/assets/96f4cd7f-f3e6-44b2-b647-96ee98063572"
/>

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Signup&login works with correct credentials
  - [x] Signup&login shows correct error message on wrong credentials
  - [x] Links work correctly
2025-05-13 15:57:55 +00:00
Toran Bruce Richards
08639bb1f0 fix(backend): Change library agents fetch sort from version to updatedAt (#9932)
<!-- Clearly explain the need for these changes: -->

### Changes 🏗️
This pull request includes a small change to the `get_my_agents`
function in `autogpt_platform/backend/backend/server/v2/store/db.py`.
The change updates the sorting order for retrieving library agents to
use the `updatedAt` field instead of `agentGraphVersion`.

This aligns with the later sorting that is applied to the fetched page
in the frontend.
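
In Prisma terms, the change amounts to something like the following hypothetical sketch (model and field access are illustrative, not the actual `db.py` code):

```python
from prisma import Prisma

async def get_my_agents(user_id: str):
    db = Prisma()
    await db.connect()
    return await db.libraryagent.find_many(
        where={"userId": user_id},
        order={"updatedAt": "desc"},  # was: {"agentGraphVersion": "desc"}
    )
```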

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [x] `.env.example` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my
changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-05-13 07:49:55 +00:00
Zamil Majdy
4d99ae27c9 fix(backend): Avoid executor stuck on cleanup waiting execution completion 2025-05-12 22:30:58 +01:00
Zamil Majdy
64ff161323 fix(backend): Avoid executor stuck on cleanup waiting execution completion 2025-05-12 22:29:41 +01:00
Nicholas Tindle
2b5b93a0f7 feat(backend): send emails daily instead of hourly (#9927)
<!-- Clearly explain the need for these changes: -->

John thinks an hourly email is a bit much; I'm inclined to agree.

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->
Swaps the batching interval for agent-run emails from one hour to one
day.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] No testing done
2025-05-12 18:24:14 +00:00
Nicholas Tindle
79cc08787b feat(blocks): initial google calendar (#9920)
We want to add support for reading + writing events for Google Calendar
so agents can do cool stuff with that!

### Changes 🏗️

* Add support for Google Calendar blocks (the underlying API calls are sketched below)
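
For context, a minimal sketch of the underlying Google Calendar API calls such blocks wrap (using the official `google-api-python-client`; this is not the block implementation itself):

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials(token="<oauth2-access-token>")  # placeholder token
service = build("calendar", "v3", credentials=creds)

# Read upcoming events
events = service.events().list(
    calendarId="primary", maxResults=10, singleEvents=True, orderBy="startTime"
).execute()

# Write a new event
created = service.events().insert(
    calendarId="primary",
    body={
        "summary": "Agent-created event",
        "start": {"dateTime": "2025-05-13T10:00:00Z"},
        "end": {"dateTime": "2025-05-13T11:00:00Z"},
    },
).execute()
```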

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Run manual tests of each block
  - [x] Build and run automatic tests
2025-05-12 18:21:05 +00:00
Reinier van der Leer
b740a6edc0 fix(backend): Unbreak existing Agent Executor nodes (#9928)
- Follow-up fix for #9862

### Changes 🏗️

- Rename the stored `data` input field to `inputs` on all Agent Executor
nodes in the DB

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Dry-run the query in Supabase to verify that the data operation
works
  - [x] CI
2025-05-12 11:41:07 +00:00
Zamil Majdy
c5946927ea Merge branch 'master' of github.com:Significant-Gravitas/AutoGPT into dev 2025-05-10 05:21:04 +01:00
Zamil Majdy
30086357bc Merge branch 'dev' of github.com:Significant-Gravitas/AutoGPT into dev 2025-05-10 05:19:38 +01:00
Zamil Majdy
e090195e57 feat(backend): User notification service to send a system alert 2025-05-10 05:19:22 +01:00
Zamil Majdy
d2bf0af3cd feat(backend): User notification service to send a system alert 2025-05-10 05:13:38 +01:00
Zamil Majdy
4413366ea7 fix(blocks): Fix load_all_blocks in concurrent calls (#9926)
```
2025-05-09 11:54:15,171 ERROR  Error executing graph 954e6fc8-9c90-46fa-be5b-4063eb519ec7: Block ID AIMusicGeneratorBlock error: 44f6c8ad-d75c-4ae1-8209-aad1c0326928 is already in use
```

### Changes 🏗️

This PR avoids global variables interfering with the way
`load_all_blocks` is cached (see the sketch below).
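
A minimal sketch of the cached shape, assuming a hypothetical `_discover_block_classes` helper (the real discovery logic differs):

```python
from functools import cache

def _discover_block_classes() -> list[type]:
    return []  # stand-in for the real block discovery

@cache
def load_all_blocks() -> dict[str, type]:
    # Build the registry inside the cached function instead of mutating a
    # module-level global, so concurrent callers can't double-register and
    # trip "Block ID ... is already in use".
    blocks: dict[str, type] = {}
    for block_cls in _discover_block_classes():
        block_id = block_cls.id
        if block_id in blocks:
            raise ValueError(f"Block ID {block_id} is already in use")
        blocks[block_id] = block_cls
    return blocks
```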

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] CI
2025-05-09 13:48:15 +00:00
Zamil Majdy
3d05c26f26 fix(backend): Fix walltime & cputime stats on interrupted execution 2025-05-09 14:09:56 +01:00
Zamil Majdy
c736d401a6 fix(backend): Fix walltime & cputime stats on interrupted execution 2025-05-09 14:08:37 +01:00
Zamil Majdy
e8bc83445a feat(backend): Immediate alert on late execution check job failure (#9925)
Currently, there is no guarantee that an error will be reported right
away, and late execution is a serious issue that needs to be addressed
quickly.

### Changes 🏗️

Provide a direct alert when late execution occurs (sketched below).
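
A minimal sketch of what such a job can look like (names and threshold are assumptions; the real version reports to Sentry):

```python
import logging
from datetime import datetime, timedelta, timezone

LATE_THRESHOLD = timedelta(minutes=5)  # configurable in the real service

def check_late_executions(fetch_queued) -> None:
    # fetch_queued is a hypothetical callable returning
    # (execution_id, queued_at) pairs for not-yet-started executions.
    now = datetime.now(timezone.utc)
    late = [(i, t) for i, t in fetch_queued() if now - t > LATE_THRESHOLD]
    if late:
        logging.getLogger(__name__).error(
            "Found %d late executions: %s", len(late), [i for i, _ in late]
        )
```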

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Manual run on the late executions job on artificially created late
executions.
2025-05-09 07:31:23 +00:00
Zamil Majdy
8de88395f1 fix(backend): Continue stats accounting on aborted or broken executions (#9921)
This is a follow-up to
https://github.com/Significant-Gravitas/AutoGPT/pull/9903

Previously, the continued graph execution restarted all the execution
stats from zero, making the stats misleading.

### Changes 🏗️

Continue the execution stats when continuing the graph execution.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Existing tests, manual graph run with the graph execution aborted
midway.
2025-05-09 06:46:30 +00:00
Zamil Majdy
82cf0bcde7 refactor(backend): Clear out Notification Service code blockage (#9915)
Some of the code paths in the notification & scheduler service were
synchronous HTTP calls executing a long-running job that blocks. This
left the service threads busy-waiting.

### Changes 🏗️

* Remove queue_notification API
* Remove DTO
* Move heavy tasks into the executor (see the sketch after this list)
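
A minimal sketch of the non-blocking shape (names are hypothetical stand-ins for the heavy work that previously blocked the handler):

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

_executor = ProcessPoolExecutor()

def process_notification_batch(batch: list[dict]) -> None:
    ...  # heavy, long-running work

async def handle_notification_request(batch: list[dict]) -> None:
    # Hand the job to the executor instead of running it inline,
    # so the service thread is freed immediately.
    loop = asyncio.get_running_loop()
    await loop.run_in_executor(_executor, process_notification_batch, batch)
```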

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Manually executing notification service jobs through the scheduler
API
2025-05-09 06:16:43 +00:00
Krzysztof Czerwinski
089e7aae88 fix(frontend): Catch exception on agent listing page (#9923)
The listing page throws an exception on deployment because of a Supabase
auth issue.

### Changes 🏗️

Catch the exception when getting the library agent. This reverts the
listing page's behavior so it always shows "Add to Library" when the
user is logged in.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  - [ ] ...
2025-05-08 22:29:40 +00:00
Toran Bruce Richards
74e6a6a43a fix(frontend/library): Quick Patch for Rendering Agent Outputs (#9922)
<!-- Clearly explain the need for these changes: -->
This is a quick, temporary tweak to improve how output text is displayed
on the Agent Runs screen.

It anticipates that these outputs will be properly improved in the near
future; for now it simply displays text in a human-readable format.

### Changes 🏗️
There is one change in this PR:
- The class of the Agent Output textbox is changed to properly display
text without impacting the design.
Below is a before and after of this change:

**Before**

![image](https://github.com/user-attachments/assets/5ecc16c8-8ffd-4cc6-a627-00502848450e)

**After**

![image](https://github.com/user-attachments/assets/5601c2a0-f32f-4c75-afec-d86dc0c5081d)


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

---------

Co-authored-by: Bentlybro <Github@bentlybro.com>
2025-05-08 11:41:27 +00:00
Reinier van der Leer
433b76b539 fix(backend/scheduler): Unbreak Scheduler.get_execution_schedules (#9919)
- Resolves #9918
- Follow-up fix for #9914

### Changes 🏗️

- In `get_graph_execution_schedules`, skip jobs whose kwargs can't be
parsed as `GraphExecutionJobArgs` (see the sketch after this list)
- Rename methods of `Scheduler` to clarify their scope (scheduled
*graph* executions)
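
A minimal sketch of the tolerant listing, assuming `GraphExecutionJobArgs` is a Pydantic v2 model (the fields below are toy stand-ins):

```python
from pydantic import BaseModel, ValidationError

class GraphExecutionJobArgs(BaseModel):  # toy stand-in for the real model
    graph_id: str
    user_id: str

def get_graph_execution_schedules(jobs):
    schedules = []
    for job in jobs:
        try:
            args = GraphExecutionJobArgs.model_validate(job.kwargs)
        except ValidationError:
            continue  # not a scheduled graph execution; skip instead of failing
        schedules.append((job.id, args))
    return schedules
```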

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Go to `/library/agents/[id]` (which calls `GET /api/schedules`)
    - [x] -> `GET /api/schedules` request returns HTTP 200
2025-05-08 11:26:51 +00:00
Reinier van der Leer
1ad6c76f9c feat(backend): Require discriminator value on graph save (#9858)
If a node has a multi-credentials input (e.g. AI Text Generator block)
but the discriminator value (e.g. model choice) is missing, the input
can't be discriminated into a single-provider input. Discrimination into
a single-provider input is necessary to make a graph-level credentials
input for use in the Library.
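
As a toy illustration of why the discriminator needs a value (a Pydantic discriminated union; these classes are not the platform's actual models):

```python
from typing import Annotated, Literal, Union
from pydantic import BaseModel, Field

class OpenAICredentialsInput(BaseModel):
    provider: Literal["openai"] = "openai"
    api_key: str

class AnthropicCredentialsInput(BaseModel):
    provider: Literal["anthropic"] = "anthropic"
    api_key: str

# Without a concrete value for the discriminator field, this union can't
# collapse to a single provider's credentials input.
CredentialsInput = Annotated[
    Union[OpenAICredentialsInput, AnthropicCredentialsInput],
    Field(discriminator="provider"),
]
```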

### Changes 🏗️

- feat(backend): Require discriminator fields to always have a value

- dx(frontend): Improve typing of discriminator stuff
- dx(frontend): Fix typing in `NodeOneOfDiscriminatorField` component

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Saving & running graphs with and without credentials works
normally
- Note: We don't have any blocks with a discriminator that doesn't have
a default value, so currently I don't think it's possible to produce a
case where this mechanism would be triggered.
2025-05-08 09:45:22 +00:00
Bently
104928c614 feat(platform): Add captcha to login, signup and password reset pages (#9847)
This PR adds Cloudflare's Turnstile CAPTCHA to the login, signup, and
password reset pages. It is set up to only show and work when `BEHAVE_AS`
is set to `CLOUD`, so it will not show for locally hosted users.

### Changes 🏗️

#### Backend Changes
-
**[backend/server/v2/turnstile/routes.py](https://github.com/Significant-Gravitas/AutoGPT/compare/dev...bently/secrt-1169-implement-captcha-on-sign-up?expand=1#diff-2c5c2cb13346370fc48bdde8691a0d3bbfc030f7718288101b67b641c7948c10)**:
Created API endpoint at `/api/turnstile/verify` to proxy verification
requests to Cloudflare
-
**[backend/server/v2/turnstile/service.py](https://github.com/Significant-Gravitas/AutoGPT/compare/dev...bently/secrt-1169-implement-captcha-on-sign-up?expand=1#diff-296991fdc3ea821ae5a568ca96bb89789f2fc7dda7b62f59ef6bcadfaea16e56)**:
Implements service to verify CAPTCHA tokens with Cloudflare using
server-side secret key
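
A minimal sketch of the server-side check, following Cloudflare's documented siteverify API (error handling elided):

```python
import httpx

SITEVERIFY_URL = "https://challenges.cloudflare.com/turnstile/v0/siteverify"

async def verify_turnstile_token(token: str, secret_key: str) -> bool:
    async with httpx.AsyncClient() as client:
        resp = await client.post(
            SITEVERIFY_URL,
            data={"secret": secret_key, "response": token},
        )
    return bool(resp.json().get("success"))
```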

#### Frontend Changes
-
**[frontend/src/lib/turnstile.ts](https://github.com/Significant-Gravitas/AutoGPT/compare/dev...bently/secrt-1169-implement-captcha-on-sign-up?expand=1#diff-a698e2718e0f6b0afe1d0c7fda571a7bfcbec6aeacc963c2b3620cc683dc4448)**:
Client-side function to call the backend verification endpoint
-
**[frontend/src/components/auth/Turnstile.tsx](https://github.com/Significant-Gravitas/AutoGPT/compare/dev...bently/secrt-1169-implement-captcha-on-sign-up?expand=1#diff-71a73d58d0ba5e46e5702f2f2599284e72a8fcf6c5d0b5c72e7358570d631aa7)**:
Reusable Turnstile component that renders and manages the CAPTCHA widget
-
**[frontend/src/hooks/useTurnstile.ts](https://github.com/Significant-Gravitas/AutoGPT/compare/dev...bently/secrt-1169-implement-captcha-on-sign-up?expand=1#diff-4a6a9363243ab2a88dbfb498917f464896ada059617bd8b0fb51df532c73827d)**:
Custom hook that manages Turnstile state and conditionally activates
based on environment

#### Auth Flow Integration
- Modified server actions in `login`, `signup`, and `reset_password` to
accept and verify Turnstile tokens
- Updated auth page components to integrate the CAPTCHA widget with form
submissions

### Configuration Changes
- Added two new environment variables:
- `NEXT_PUBLIC_CLOUDFLARE_TURNSTILE_SITE_KEY`: Public site key for
frontend
- `CLOUDFLARE_TURNSTILE_SECRET_KEY`: Secret key for backend verification

### Test Plan 📋
- Ask Bently for the keys to test locally!
- [x] Test login, signup and password reset with Turnstile enabled
(BEHAVE_AS=CLOUD)
- [x] Verify CAPTCHA appears and must be completed before form
submission
  - [x] Verify error message appears if CAPTCHA is not completed
  - [x] Verify form submission works after completing CAPTCHA
- [x] Test login, signup and password reset with Turnstile disabled
(BEHAVE_AS=LOCAL)
  - [x] Verify CAPTCHA does not appear
  - [x] Verify form submission works without CAPTCHA
- [x] Test with invalid site key to ensure proper error handling

---------

Co-authored-by: Krzysztof Czerwinski <34861343+kcze@users.noreply.github.com>
2025-05-07 21:08:12 +00:00
Reinier van der Leer
0726a00fb7 fix(backend): Include sub-graphs in graph-level credentials support (#9862)
The Library Agent credentials UX (#9789) currently doesn't work for
sub-graphs.

### Changes 🏗️

- Include sub-graphs in generating `Graph.credentials_input_schema`
- Propagate `node_credentials_input_map` into `AgentExecutionBlock`
executions
- Fix: also apply `node_credentials_input_map` in `_enqueue_next_nodes`

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Import a graph with sub-graphs that need credentials
  - Run this agent from the Library
  - [x] -> Should work
2025-05-07 17:28:39 +00:00
Zamil Majdy
ac8ef9bdb2 feat(backend): Introduce late execution check scheduled job (#9914)
Introduce a late execution check scheduled job. The late threshold
duration is configurable.
This initial version only reports the error to Sentry.

### Changes 🏗️

* Added late execution check scheduled job
* Move the weekly notification processing job registration out of the
API call and invoke it directly from the scheduler service.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Manual firing of scheduled job through an exposed API
2025-05-07 05:00:37 +00:00
Zamil Majdy
519ad94ec9 Merge branch 'master' of github.com:Significant-Gravitas/AutoGPT into dev 2025-05-06 20:30:36 +07:00
Nicholas Tindle
505320fcd3 feat(backend): Move Scheduler (#9904)
<!-- Clearly explain the need for these changes: -->
We don't want the scheduler to scale with the REST API.

### Changes 🏗️
Pulls the scheduler out into its own service.
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] test it

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-05-05 18:59:28 +00:00
Krzysztof Czerwinski
6f1578239a feat(platform): Update Marketplace Agent listing buttons (#9843)
Currently, agent listings on the Marketplace have bad UX.

### Changes 🏗️

- Add function and endpoint to check if user has `LibraryAgent` by given
`storeListingVersionId`
- Redesign listing buttons
- `Add to library` shown when user is logged in and doesn't have an
agent in library
  - `See runs` shown when the user is logged in and has the agent in the library
  - `Download agent` always shown
  - Disabled buttons during processing (adding/downloading)
- Stop raising when the owner tries to add their own agent; now it
simply redirects to the Library.
- Remove the button appearing/flickering after a delay on the listing
page; logged-in status is now checked in a server component.
- Show error toast on adding/redirecting to library and downloading
error
- Update breadcrumbs and page title to say `Marketplace` instead of
`Store`
- `font-geist` -> `font-sans` (`font-geist` var doesn't exist)

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Button on a listing is `Add to library` (no library agent)
  - [x] Agent can be added and user is redirected
- [x] Button on the listing is `See runs` and clicking it redirects to
the library agent
  - [x] Remove agent from library
  - [x] Buttons shows `Add to library` again
  - [x] Agent can be re-added
  - [x] Agent can be downloaded
  - [x] `Add to library` Button is hidden when user is logged out

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-05-05 16:47:58 +00:00
Zamil Majdy
79319ad1a7 fix(backend): Avoid broken process pool by not failing process initializer (#9907)
The process pool's initializer should never fail, but we do
network-related work there.
When it does fail, the pool ends up in a broken state.

### Changes 🏗️

Remove the health check step from the process initializer (see the sketch below).
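
The underlying rule of thumb, sketched here under assumed names: a `ProcessPoolExecutor` initializer that raises puts the whole pool into a broken state, so anything fallible (like a network health check) has to be removed or swallowed:

```python
import logging
from concurrent.futures import ProcessPoolExecutor

logger = logging.getLogger(__name__)


def _setup_worker() -> None:
    """Per-process setup (logging, connections, ...); placeholder here."""


def _init_worker() -> None:
    # Never let the initializer raise: one failing child marks the pool
    # broken and every future submitted afterwards fails.
    try:
        _setup_worker()
    except Exception as exc:
        logger.warning("Worker initialization degraded: %s", exc)


pool = ProcessPoolExecutor(max_workers=4, initializer=_init_worker)
```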

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Existing CI test
2025-05-05 13:27:04 +00:00
Nicholas Tindle
afb66f75ec fix: disable google sheets in prod based on oauth review (#9906)
<!-- Clearly explain the need for these changes: -->

Our OAuth review requires us to drop this in favor of a different scope
that will require additional work

### Changes 🏗️
Disables the oauth sheets scopes in prod

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] set env locally
2025-05-02 19:40:51 +00:00
Krzysztof Czerwinski
59ec61ef98 feat(platform): Onboarding design&UX update (#9905)
A collection of updates regarding onboarding and wallet.

### Changes 🏗️

- `try-except` instead of `if` when rewarding (skip unnecessary db call)
- Make external services question onboarding step optional
- Add `SmartImage` component to lazy load images with pulse animation
and use it throughout onboarding
- Use store agent name instead of graph name (run page)
- Fix some images breaking layout on the agent card (run page)
- Center agent card vertically and horizontally (center on the left half
of page) (run page)
- Delay and tweak confetti when opening wallet and when task finished
(wallet)
- Flash wallet when credits change value
- Make tutorial video grayscale on completed steps (wallet)
- Fix confetti triggering on page refresh (wallet)
- Redirect to agent run page instead of Library after onboarding
- Expand task groups by default (wallet) - this means tutorial videos
are visible by default

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Services step is optional and skipping it doesn't break onboarding
  - [x] `SmartImage` works properly
  - [x] Agent card is aligned properly, including on page scroll
  - [x] Wallet flash when credits value change
  - [x] User is redirected to the agent runs page after onboarding
2025-05-02 14:42:01 +00:00
Zamil Majdy
d7077b5161 feat(backend): Continue instead of retrying aborted/broken agent execution (#9903)
Currently, the agent/graph execution engine is consuming the execution
queue and acknowledges the message after fully completing its execution
or failing it.

However, if the agent executor fails due to a hardware/resource issue, or
does not manage to acknowledge the execution message, another agent
executor will pick it up and start the execution again from the beginning.

The scope of this PR is to make the next executor pick up the next work
to continue the pre-existing execution instead of starting it all over
from the beginning.

### Changes 🏗️

* Removed `start_node_execs` from `GraphExecutionEntry`
* Populate the starting graph node from the DB query instead (fetching
Running & Queued node executions).
* Removed `get_incomplete_node_executions` from DB manager.
* Use get_node_executions with a status filter instead.
* Allow graph execution to end in a non-FAILED/COMPLETED status; e.g., when
the executor is interrupted, the execution should stay in the RUNNING
status so other executors can continue the task (see the sketch below).
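
Roughly, the resume logic means deriving the start set from persisted state rather than from the queue message; a sketch under assumed data shapes:

```python
from enum import Enum


class Status(Enum):
    QUEUED = "QUEUED"
    RUNNING = "RUNNING"
    COMPLETED = "COMPLETED"
    FAILED = "FAILED"


def starting_node_ids(node_executions: list[tuple[str, Status]]) -> list[str]:
    """Pick up where the previous executor left off: every node execution
    still in RUNNING or QUEUED is a starting point for the new executor."""
    return [
        node_id
        for node_id, status in node_executions
        if status in (Status.RUNNING, Status.QUEUED)
    ]
```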

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Run an agent, stop the executor midway, re-run the executor; the
execution should be continued instead of restarted.
2025-05-01 16:02:03 +00:00
Zamil Majdy
475c5a5cc3 fix(backend): Avoid executing any agent with zero balance (#9901)
### Changes 🏗️

* Avoid executing any agent with a zero balance.
* Make node execution count global across agents for a single user.
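
Conceptually (hypothetical names, not the real accounting code), the two changes amount to a hard guard before execution plus a per-user rather than per-agent execution counter:

```python
from collections import Counter


class InsufficientBalanceError(Exception):
    """Raised before execution when the user has no credit left."""


def ensure_can_execute(user_balance: int) -> None:
    # Hard guard at the start of every graph execution.
    if user_balance <= 0:
        raise InsufficientBalanceError("Top up your balance to run agents.")


# Keyed by user_id only, so the cost threshold applies globally across
# all of a user's agents instead of per agent.
node_execution_count: Counter[str] = Counter()
```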

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Run agents by tweaking the `execution_cost_count_threshold` &
`execution_cost_per_threshold` values.
2025-05-01 15:11:38 +00:00
Zamil Majdy
f5a07f1a35 hotfix(backend): Avoid executing any agent with zero balance (#9902)
### Changes 🏗️

* Avoid executing any agent with a zero balance.
* Make node execution count global across agents for a single user.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Run agents by tweaking the `execution_cost_count_threshold` &
`execution_cost_per_threshold` values.
2025-05-01 10:11:09 -05:00
Zamil Majdy
86d5cfe60b feat(backend): Support flexible RPC client (#9842)
Using sync code in an async route often introduces blocking event-loop
code that impacts stability.

The current RPC system only provides a synchronous client for calling the
service endpoints.
The scope of this PR is to provide an entirely decoupled signature
between client and server, allowing client code to mix & match async &
sync options without changing the async/sync nature of the server.

### Changes 🏗️

* Add support for flexible async/sync RPC client.
* Migrate scheduler client to all-async client.
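
The shape of the idea, sketched with `httpx` as an assumed transport (the real RPC layer may differ): the same endpoint gets both a blocking and a non-blocking client method, independent of how the server implements it:

```python
from typing import Any

import httpx  # assumed transport for this sketch


class ServiceClient:
    def __init__(self, base_url: str) -> None:
        self._base_url = base_url

    def call(self, method: str, **params: Any) -> Any:
        # Blocking variant: fine in plain sync code, never in an async route.
        resp = httpx.post(f"{self._base_url}/{method}", json=params)
        resp.raise_for_status()
        return resp.json()

    async def acall(self, method: str, **params: Any) -> Any:
        # Non-blocking variant: keeps the event loop free in async routes.
        async with httpx.AsyncClient() as client:
            resp = await client.post(f"{self._base_url}/{method}", json=params)
            resp.raise_for_status()
            return resp.json()
```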

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Scheduler route test.
  - [x] Modified service_test.py
  - [x] Run normal agent executions
2025-05-01 04:38:06 +00:00
Bently
602f887623 feat(frontend): fix admin add dollars (#9898)
Fixes the admin "add dollars" flow. In the ``add-money-button.tsx`` file,
the handleApproveSubmit action was trying to use formatCredits for the
value, which is wrong; this fix changes it:

```diff
 <form action={handleApproveSubmit}>
   <input type="hidden" name="id" value={userId} />
   <input
     type="hidden"
     name="amount"
-    value={formatCredits(Number(dollarAmount))}
+    value={Math.round(parseFloat(dollarAmount) * 100)}
   />
```
I was able to add $1, $0.10, and $0.01

![image](https://github.com/user-attachments/assets/3a3126c2-5f17-4c9b-8657-4372332a0ea3)
2025-04-30 17:24:26 +00:00
Bentlybro
1edde778c5 Merge branch 'master' into dev 2025-04-30 16:46:50 +01:00
Zamil Majdy
3526986f98 fix(backend): Failing test on a new Pydantic version (#9897)
```
FAILED test/model_test.py::test_agent_preset_from_db - pydantic_core._pydantic_core.ValidationError: 1 validation error for AgentNodeExecutionInputOutput

E       pydantic_core._pydantic_core.ValidationError: 1 validation error for AgentNodeExecutionInputOutput
E       data
E         JSON input should be string, bytes or bytearray [type=json_type, input_value=Json, input_type=Json]
E           For further information visit https://errors.pydantic.dev/2.11/v/json_type
```

### Changes 🏗️

Manually creating a Prisma model often breaks, and we have such an
instance in the test.
This PR fixes the test to make the new Pydantic happy.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] CI
2025-04-30 13:59:17 +00:00
Nicholas Tindle
04c4340ee3 feat(frontend,backend): user spending admin dashboard (#9751)
<!-- Clearly explain the need for these changes: -->
We need a way to refund people who spend money on agents without
performing manual DB actions

### Changes 🏗️
- Adds a bunch of functionality for refunding users
- Adds reasons and admin ID for actions
- Add admin to db manager
- Add UI for this for the admin panel
- Clean up pagination controls
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Test by importing dev db as baseline
- [x] Add transactions on top for "refund", and make sure all existing
transactions work

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-04-29 17:39:25 +00:00
Zamil Majdy
9fa62c03f6 feat(backend): Improve cancel execution reliability (#9889)
When an executor dies, an ongoing execution will not be retried and will
just get stuck in the RUNNING status.
This change avoids such a scenario by allowing execution of an entry that
is not in QUEUED status, at the low-probability risk of double
execution.

### Changes 🏗️

* Allow non-QUEUED status to be re-executed.
* Improve cleanup of node & graph executor.
* Make cancellation-request consumption a separate thread to avoid it
being blocked by other messages (see the sketch below).
* Remove unused retry loop on the execution manager.
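
For the cancellation change, the gist is a dedicated consumer so cancel requests never queue behind ordinary execution messages; a minimal sketch with a plain `queue.Queue` standing in for the message bus:

```python
import queue
import threading


def start_cancel_consumer(cancel_queue: "queue.Queue[str]") -> threading.Thread:
    """Consume cancellation requests on their own thread."""

    def _consume() -> None:
        while True:
            graph_exec_id = cancel_queue.get()  # blocks only this thread
            print(f"cancelling execution {graph_exec_id}")  # placeholder action

    thread = threading.Thread(target=_consume, daemon=True, name="cancel-consumer")
    thread.start()
    return thread
```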

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Run agent, kill the server, re-run it, agent restarted.
2025-04-29 17:06:03 +00:00
Mareddy Lohith Reddy
d5dc687484 fix: handle empty 204 responses in SendWebRequestBlock (#9887)
<!-- Clearly explain the need for these changes: -->
This PR fixes [Issue
#9883](https://github.com/Significant-Gravitas/AutoGPT/issues/9883),
where the SendWebRequestBlock crashes when receiving a 204 No Content
response, such as when posting to a Discord webhook. The fix ensures
that empty responses are handled gracefully, and the block does not
crash.

### Changes 🏗️
- Added a check to handle empty HTTP responses (like 204 status) in
SendWebRequestBlock
- Fallback to empty string or None if there is no response content
- Prevents server errors when parsing non-existent response bodies
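
The fix boils down to checking for an empty body before parsing; a sketch using `requests` (the block's actual HTTP client may differ):

```python
import requests


def parse_body(response: requests.Response):
    # 204 No Content (or any empty body) has nothing to parse: fall back
    # to None instead of letting .json() raise on an empty string.
    if response.status_code == 204 or not response.content:
        return None
    try:
        return response.json()
    except ValueError:
        return response.text  # non-JSON body: pass it through as text
```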

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Send a POST request to an endpoint that returns 204 No Content
  - [x]  Confirm that SendWebRequestBlock handles it without crashing
  - [x] Confirm that regular 200 OK JSON responses still work

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
Co-authored-by: Lohith-11 <lohithr011@gamil.com>
Co-authored-by: Toran Bruce Richards <toran.richards@gmail.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-04-28 19:16:04 +00:00
Japh
fb5ce0a16d Add Note to "Getting Started" page for Raspberry Pi 5 page size issue (#9888)
Add Note to "Getting Started" page for Raspberry Pi 5 page size issue
with `supabase-vector` that prevents `docker compose up` from running
successfully.

<!-- Clearly explain the need for these changes: -->

### Changes 🏗️

- Added a Note to the "Getting Started" page that explains a change in
Raspberry Pi OS for Raspberry Pi 5s, and how to revert the change to
avoid an issue running the backend on Docker.
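
For context (not quoted from the PR, but the commonly documented workaround): Raspberry Pi OS on the Pi 5 defaults to a 16K-page kernel, which `supabase-vector` cannot run under; switching back to the 4K-page kernel, e.g. by adding `kernel=kernel8.img` to `/boot/firmware/config.txt` and rebooting, lets `docker compose up` succeed.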

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] No code changes

#### For configuration changes:
- [x] No configuration changes

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-04-28 19:07:44 +00:00
Nicholas Tindle
a1f17ca797 fix: use subheading for agent info not description (#9891)
<!-- Clearly explain the need for these changes: -->
We oopsed and used the wrong attribute for the short description.
### Changes 🏗️
Uses the subheading instead now
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] check the expected text shows
2025-04-28 18:38:43 +00:00
Nicholas Tindle
8fdfd75cc4 feat: allow admins to download agents for review (#9881)
<!-- Clearly explain the need for these changes: -->
For admins to approve agents for the marketplace, we need to be able to
run them. This is a quick workaround for downloading them so you can put
them in your marketplace to check.

### Changes 🏗️
- clones various endpoints related to downloading into an admin side
with logging, and admin checks
- adds download button and removes open in builder action
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] Test downloading agents from local marketplace
2025-04-28 17:58:23 +00:00
Abhimanyu Yadav
5b5b2043e8 fix(frontend): Add support to optional multiselect (#9885)
- fix #9882 

We’re currently using optional multi-select, and it’s working great.
We’re able to correctly determine the data type for it. However, there’s
a small issue. We’re not using the correct subSchema that is inside
anyOf on the multi select input. This is why we’re getting the problem
on the Twitter block. It’s the only one that’s using this type of input,
so it’s the only one that’s affected.
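
Illustratively (a Python sketch of the schema logic, not the frontend code): for an optional field, the real input schema is the non-null member inside `anyOf`, not the `anyOf` wrapper itself:

```python
def resolve_optional_schema(schema: dict) -> dict:
    """Optional fields look like {"anyOf": [<real schema>, {"type": "null"}]};
    return the real subschema instead of the wrapper."""
    for sub in schema.get("anyOf", []):
        if sub.get("type") != "null":
            return sub
    return schema  # not optional: the schema itself is already the right one
```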

![Screenshot 2025-04-26 at 5 39
51 PM](https://github.com/user-attachments/assets/834d64d8-84dc-4dbd-a03a-df03172ecee5)

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-04-28 15:18:29 +00:00
Nicholas Tindle
7d83f1db05 feat(block): bring back PrintConsoleBlock (#9850)
### Changes 🏗️

Bring back PrintConsoleBlock

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Print console block

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-04-28 09:55:57 +00:00
Zamil Majdy
f07696e3c1 fix(backend): Fix top-up with zero transaction flow (#9886)
A transaction with a zero payment amount will not generate a payment ID,
so checkout failed in this scenario.

### Changes 🏗️

Don't use the payment ID as the transaction key on a top-up with a zero
payment amount (see the sketch below).
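
In effect (hypothetical names, not the real billing code):

```python
import uuid


def transaction_key(payment_id: str | None, amount: int) -> str:
    # A zero-amount top-up yields no payment ID from the payment provider,
    # so fall back to a locally generated key instead of failing checkout.
    if amount == 0 or not payment_id:
        return f"top-up-{uuid.uuid4()}"
    return payment_id
```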

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Top-up with stripe coupon
2025-04-27 14:22:08 +00:00
Zamil Majdy
96a173a85f fix(backend): Avoid releasing lock that is no longer owned by the current thread (#9878)
There are instances of node executions that failed and ended up stuck
in the RUNNING status because the execution failed to release the lock:
```
2025-04-24 20:53:31,573 INFO  [ExecutionManager|uid:25eba2d1-e9c1-44bc-88c7-43e0f4fbad5a|gid:01f8c315-c163-4dd1-a8a0-d396477c5a9f|nid:f8bf84ae-b1f0-4434-8f04-80f43852bc30]|geid:2e1b35c6-0d2f-4e97-adea-f6fe0d9965d0|neid:590b29ea-63ee-4e24-a429-de5a3e191e72|-] Failed node execution 590b29ea-63ee-4e24-a429-de5a3e191e72: Cannot release a lock that's no longer owned
```

### Changes 🏗️

Check the ownership of the lock before releasing.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Existing CI tests.

(cherry picked from commit ef022720d5)
2025-04-25 23:55:18 +07:00
sentry-autofix[bot]
9715ea5313 fix: handle token limits and estimate token count for llm calls (#9880)
👋 Hi there! This PR was automatically generated by Autofix 🤖

This fix was triggered by Toran Bruce Richards.

Fixes
[AUTOGPT-SERVER-1ZY](https://sentry.io/organizations/significant-gravitas/issues/6386687527/).
The issue was that `llm_call` calculates `max_tokens` without
considering `input_tokens`, causing OpenRouter API errors when the
context window is exceeded.

- Implements a function `estimate_token_count` to estimate the number of
tokens in a list of messages.
- Calculates available tokens based on the context window, estimated
input tokens, and user-defined max tokens.
- Adjusts `max_tokens` for LLM calls to prevent exceeding context window
limits.
- Reduces `max_tokens` by 15% and retries if a token limit error is
encountered during LLM calls.
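
A rough sketch of the described budgeting (the ~4-characters-per-token heuristic and all names are illustrative; the actual code may use a real tokenizer):

```python
def estimate_token_count(messages: list[dict]) -> int:
    # Crude estimate: ~4 characters per token.
    return sum(len(m.get("content", "")) for m in messages) // 4


def available_max_tokens(
    context_window: int, messages: list[dict], user_max_tokens: int | None
) -> int:
    remaining = context_window - estimate_token_count(messages)
    if user_max_tokens is not None:
        remaining = min(remaining, user_max_tokens)
    return max(remaining, 1)


def shrink_on_token_error(current_max_tokens: int) -> int:
    # On a token-limit error, reduce the budget by 15% and retry.
    return max(int(current_max_tokens * 0.85), 1)
```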

If you have any questions or feedback for the Sentry team about this
fix, please email [autofix@sentry.io](mailto:autofix@sentry.io) with the
Run ID: 32838.

---------

Co-authored-by: sentry-autofix[bot] <157164994+sentry-autofix[bot]@users.noreply.github.com>
Co-authored-by: Krzysztof Czerwinski <kpczerwinski@gmail.com>
2025-04-25 13:45:47 +00:00
Zamil Majdy
ef022720d5 fix(backend): Avoid releasing lock that is no longer owned by the current thread (#9878)
There are instances of node executions that failed and ended up stuck
in the RUNNING status because the execution failed to release the lock:
```
2025-04-24 20:53:31,573 INFO  [ExecutionManager|uid:25eba2d1-e9c1-44bc-88c7-43e0f4fbad5a|gid:01f8c315-c163-4dd1-a8a0-d396477c5a9f|nid:f8bf84ae-b1f0-4434-8f04-80f43852bc30]|geid:2e1b35c6-0d2f-4e97-adea-f6fe0d9965d0|neid:590b29ea-63ee-4e24-a429-de5a3e191e72|-] Failed node execution 590b29ea-63ee-4e24-a429-de5a3e191e72: Cannot release a lock that's no longer owned
```

### Changes 🏗️

Check the ownership of the lock before releasing.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Existing CI tests.
2025-04-25 07:39:10 +00:00
Zamil Majdy
4ddb206f86 feat(frontend): Add billing page toggle (#9877)
### Changes 🏗️

Provide a system toggle for disabling the billing page:
NEXT_PUBLIC_SHOW_BILLING_PAGE

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Toggle `NEXT_PUBLIC_SHOW_BILLING_PAGE` value.
2025-04-24 19:33:20 +00:00
Zamil Majdy
91f34966c8 fix(block): Fix Smart Decision Block missing input beads & incompatibility with inputs containing special characters (#9875)
The Smart Decision Block was not able to work with a sub-agent whose input
has a custom name, and the beads were not properly propagated in the
execution UI. The scope of this PR is to fix that.

### Changes 🏗️

* Introduce an easy-to-parse tool edge format:
`{tool}_^_{func}_~_{arg}` (parsed as in the sketch below). Graphs using
SmartDecisionBlock need to be re-saved before execution to work.
* Reduce clutter in the Smart Decision Block logic.
* Fix beads not being shown for Smart Decision Block tool calls.
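
The delimiters `_^_` and `_~_` are chosen to be unlikely substrings of real names, which is what makes the format robust to special characters in inputs; parsing it is mechanical (illustrative sketch):

```python
def parse_tool_edge(edge_name: str) -> tuple[str, str, str]:
    """Split "{tool}_^_{func}_~_{arg}" into its three parts."""
    tool, rest = edge_name.split("_^_", 1)
    func, arg = rest.split("_~_", 1)
    return tool, func, arg


# Inputs with spaces or punctuation survive the round trip:
assert parse_tool_edge("my agent_^_run_~_weird input!") == (
    "my agent", "run", "weird input!"
)
```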

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Execute an SDM with some special character input as a tool

<img width="672" alt="image"
src="https://github.com/user-attachments/assets/873556b3-c16a-4dd1-ad84-bc86c636c406"
/>
2025-04-24 19:24:41 +00:00
Krzysztof Czerwinski
11a69170b5 feat(frontend): Update "Edit a copy" modal and buttons (#9876)
Update "Edit a copy" modal text when copying marketplace agent in
Library. Update agent action buttons to reflect the design accurately.

### Changes 🏗️

- Update modal text
- Disable copying owned agents (only marketplace allowed)
- `Open in Builder` -> `Customize agent`
- Disabled `Customize agent` instead of hiding
- Change `Delete agent` to non-destructive design

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  - [ ] ...
2025-04-24 16:30:43 +00:00
Krzysztof Czerwinski
0675a41e42 fix(backend): Strip secrets, credentials when forking agent (#9874)
Strip secrets, credentials when forking agent

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  - [ ] ...
2025-04-24 15:09:41 +00:00
Bentlybro
56ce1a0c1c Merge branch 'master' into dev 2025-04-24 14:21:34 +01:00
Zamil Majdy
7fbe135ec8 feat(backend): Expose execution prometheus metrics (#9866)
Currently, we have no visibility into the state of the execution manager;
the scope of this PR is to open up its observability by exposing
Prometheus metrics.

### Changes 🏗️

Re-use the execution manager port to expose the Prometheus metrics.
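
With the standard `prometheus_client` library this is a few lines; a standalone sketch (metric names are hypothetical, and the real change re-uses the existing service port rather than starting a separate server):

```python
import time

from prometheus_client import Counter, Gauge, start_http_server

# Hypothetical metrics; the actual metric set may differ.
active_runs = Gauge("executor_active_runs", "Graph executions currently running")
finished_runs = Counter("executor_finished_runs", "Graph executions finished")

if __name__ == "__main__":
    start_http_server(8002)  # serves GET /metrics on port 8002
    while True:
        time.sleep(60)
```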

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Hit /metrics on 8002 port
2025-04-24 07:48:38 +00:00
Zamil Majdy
eb6a0b34e1 feat(backend): Use forkserver on process creation if possible (#9864)
### Changes 🏗️

Set the process start method to forkserver instead of spawn, where
possible, for performance benefits (see the sketch below).
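
A sketch of the fallback pattern: `forkserver` avoids re-importing everything per worker, but is not available on all platforms, so `spawn` remains the fallback:

```python
import multiprocessing


def preferred_start_method() -> str:
    methods = multiprocessing.get_all_start_methods()
    # forkserver spawns workers faster, but is unavailable on some
    # platforms (e.g. Windows), where spawn stays the safe default.
    return "forkserver" if "forkserver" in methods else "spawn"


ctx = multiprocessing.get_context(preferred_start_method())
```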

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Existing tests
2025-04-24 07:36:36 +00:00
Zamil Majdy
1e3236a041 feat(backend): Add retry on executor process initialization (#9865)
Executor process initialization can fail and cause this error:
```
concurrent.futures.process.BrokenProcessPool: A child process terminated abruptly, the process pool is not usable anymore
```

### Changes 🏗️

Add a retry to reduce the chance of the initialization error happening (see the sketch below).
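
Schematically (an illustrative retry around pool creation, not the actual code):

```python
import time
from concurrent.futures import ProcessPoolExecutor
from concurrent.futures.process import BrokenProcessPool


def submit_with_retry(fn, *args, attempts: int = 3):
    # If a child dies during start-up the pool is broken; recreating the
    # pool and retrying usually recovers from the transient failure.
    for attempt in range(attempts):
        try:
            with ProcessPoolExecutor() as pool:
                return pool.submit(fn, *args).result()
        except BrokenProcessPool:
            if attempt == attempts - 1:
                raise
            time.sleep(2**attempt)  # brief backoff before recreating the pool
```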

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Existing tests
2025-04-23 21:24:17 +00:00
Krzysztof Czerwinski
160a622ba4 feat(platform): Forking agent in Library (#9870)
This PR introduces an agent-copying feature in the Library. Users can copy
and download their library agents, but they can edit only the ones they
own (including copied ones).

### Changes 🏗️

- DB migration: add relation in `AgentGraph`: `forked_from_id` and
`forked_from_version`
- Add `fork_graph` function that makes a hard copy of an agent graph and its
nodes (all with new IDs; see the sketch below)
- Add `fork_library_agent` that copies library agent and its graph for a
user
- Add endpoint `/library/agents/{libraryAgentId}/fork`
- Add UI to `library/agents/[id]/page.tsx`: `Edit a copy` button with
dialog confirmation
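
The hard-copy step boils down to remapping every ID; a sketch over a plain-dict graph shape (the real function also persists the copy and records `forked_from_id`/`forked_from_version`):

```python
import uuid
from copy import deepcopy


def fork_graph(graph: dict) -> dict:
    """Deep-copy a graph, giving the graph and every node a fresh ID and
    rewriting edges to point at the new node IDs."""
    forked = deepcopy(graph)
    id_map = {node["id"]: str(uuid.uuid4()) for node in forked["nodes"]}
    forked["id"] = str(uuid.uuid4())
    for node in forked["nodes"]:
        node["id"] = id_map[node["id"]]
    for edge in forked.get("edges", []):
        edge["source"] = id_map.get(edge["source"], edge["source"])
        edge["target"] = id_map.get(edge["target"], edge["target"])
    return forked
```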

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Agent can be copied, edited and runs
2025-04-23 16:28:42 +00:00
Toran Bruce Richards
e2a226dc49 Update repo-close-stale-issues.yml 2025-04-23 14:51:18 +01:00
Zamil Majdy
5047e99fd1 fix(frontend): Hide Google Maps Key ID filter (#9861)
### Changes 🏗️


![image](https://github.com/user-attachments/assets/d6b9f971-d914-4ff1-9319-a903707a2c72)

Hide the Google Maps system key ID on the frontend UI.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan
2025-04-22 16:50:05 +00:00
179 changed files with 7451 additions and 3335 deletions

View File

@@ -16,7 +16,7 @@ jobs:
# operations-per-run: 5000
stale-issue-message: >
This issue has automatically been marked as _stale_ because it has not had
- any activity in the last 50 days. You can _unstale_ it by commenting or
+ any activity in the last 170 days. You can _unstale_ it by commenting or
removing the label. Otherwise, this issue will be closed in 10 days.
stale-pr-message: >
This pull request has automatically been marked as _stale_ because it has
@@ -25,7 +25,7 @@ jobs:
close-issue-message: >
This issue was closed automatically because it has been stale for 10 days
with no activity.
- days-before-stale: 100
+ days-before-stale: 170
days-before-close: 10
# Do not touch meta issues:
exempt-issue-labels: meta,fridge,project management

6
.vscode/launch.json vendored
View File

@@ -32,9 +32,9 @@
"type": "debugpy",
"request": "launch",
"module": "backend.app",
- // "env": {
- //   "ENV": "dev"
- // },
+ "env": {
+   "OBJC_DISABLE_INITIALIZE_FORK_SAFETY": "YES"
+ },
"envFile": "${workspaceFolder}/backend/.env",
"justMyCode": false,
"cwd": "${workspaceFolder}/autogpt_platform/backend"

50
AGENTS.md Normal file
View File

@@ -0,0 +1,50 @@
# AutoGPT Platform Contribution Guide
This guide provides context for Codex when updating the **autogpt_platform** folder.
## Directory overview
- `autogpt_platform/backend` FastAPI based backend service.
- `autogpt_platform/autogpt_libs` Shared Python libraries.
- `autogpt_platform/frontend` Next.js + Typescript frontend.
- `autogpt_platform/docker-compose.yml` development stack.
See `docs/content/platform/getting-started.md` for setup instructions.
## Code style
- Format Python code with `poetry run format`.
- Format frontend code using `yarn format`.
## Testing
- Backend: `poetry run test` (runs pytest with a docker based postgres + prisma).
- Frontend: `yarn test` or `yarn test-ui` for Playwright tests. See `docs/content/platform/contributing/tests.md` for tips.
Always run the relevant linters and tests before committing.
Use conventional commit messages for all commits (e.g. `feat(backend): add API`).
Types:
- feat
- fix
- refactor
- ci
- dx (developer experience)
Scopes:
- platform
- platform/library
- platform/marketplace
- backend
- backend/executor
- frontend
- frontend/library
- frontend/marketplace
- blocks
## Pull requests
- Use the template in `.github/PULL_REQUEST_TEMPLATE.md`.
- Rely on the pre-commit checks for linting and formatting
- Fill out the **Changes** section and the checklist.
- Use conventional commit titles with a scope (e.g. `feat(frontend): add feature`).
- Keep out-of-scope changes under 20% of the PR.
- Ensure PR descriptions are complete.
- For changes touching `data/*.py`, validate user ID checks or explain why not needed.
- If adding protected frontend routes, update `frontend/lib/supabase/middleware.ts`.
- Use the Linear ticket branch structure if given (e.g. codex/open-1668-resume-dropped-runs)

View File

@@ -16,7 +16,7 @@ logger = logging.getLogger(__name__)
async def auth_middleware(request: Request):
if not settings.ENABLE_AUTH:
# If authentication is disabled, allow the request to proceed
- logger.warn("Auth disabled")
+ logger.warning("Auth disabled")
return {}
security = HTTPBearer()

View File

@@ -31,7 +31,7 @@ class RedisKeyedMutex:
try:
yield
finally:
- if lock.locked():
+ if lock.locked() and lock.owned():
lock.release()
def acquire(self, key: Any) -> "RedisLock":

View File

@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 2.1.2 and should not be changed by hand.
# This file is automatically @generated by Poetry 2.1.1 and should not be changed by hand.
[[package]]
name = "aiohappyeyeballs"
@@ -177,7 +177,7 @@ files = [
{file = "async-timeout-4.0.3.tar.gz", hash = "sha256:4640d96be84d82d02ed59ea2b7105a0f7b33abe8703703cd0ab0bf87c427522f"},
{file = "async_timeout-4.0.3-py3-none-any.whl", hash = "sha256:7405140ff1230c310e51dc27b3145b9092d659ce68ff733fb0cefe3ee42be028"},
]
markers = {main = "python_version == \"3.10\"", dev = "python_full_version < \"3.11.3\""}
markers = {main = "python_version < \"3.11\"", dev = "python_full_version < \"3.11.3\""}
[[package]]
name = "attrs"
@@ -375,7 +375,7 @@ description = "Backport of PEP 654 (exception groups)"
optional = false
python-versions = ">=3.7"
groups = ["main"]
markers = "python_version == \"3.10\""
markers = "python_version < \"3.11\""
files = [
{file = "exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b"},
{file = "exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc"},
@@ -562,19 +562,19 @@ protobuf = ">=3.20.2,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4
[[package]]
name = "google-cloud-audit-log"
version = "0.3.0"
version = "0.3.2"
description = "Google Cloud Audit Protos"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "google_cloud_audit_log-0.3.0-py2.py3-none-any.whl", hash = "sha256:8340793120a1d5aa143605def8704ecdcead15106f754ef1381ae3bab533722f"},
{file = "google_cloud_audit_log-0.3.0.tar.gz", hash = "sha256:901428b257020d8c1d1133e0fa004164a555e5a395c7ca3cdbb8486513df3a65"},
{file = "google_cloud_audit_log-0.3.2-py3-none-any.whl", hash = "sha256:daaedfb947a0d77f524e1bd2b560242ab4836fe1afd6b06b92f152b9658554ed"},
{file = "google_cloud_audit_log-0.3.2.tar.gz", hash = "sha256:2598f1533a7d7cdd6c7bf448c12e5519c1d53162d78784e10bcdd1df67791bc3"},
]
[package.dependencies]
googleapis-common-protos = ">=1.56.2,<2.0dev"
protobuf = ">=3.20.2,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<6.0.0dev"
googleapis-common-protos = ">=1.56.2,<2.0.0"
protobuf = ">=3.20.2,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<7.0.0"
[[package]]
name = "google-cloud-core"
@@ -597,30 +597,30 @@ grpc = ["grpcio (>=1.38.0,<2.0dev)", "grpcio-status (>=1.38.0,<2.0.dev0)"]
[[package]]
name = "google-cloud-logging"
version = "3.11.4"
version = "3.12.1"
description = "Stackdriver Logging API client library"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "google_cloud_logging-3.11.4-py2.py3-none-any.whl", hash = "sha256:1d465ac62df29fb94bba4d6b4891035e57d573d84541dd8a40eebbc74422b2f0"},
{file = "google_cloud_logging-3.11.4.tar.gz", hash = "sha256:32305d989323f3c58603044e2ac5d9cf23e9465ede511bbe90b4309270d3195c"},
{file = "google_cloud_logging-3.12.1-py2.py3-none-any.whl", hash = "sha256:6817878af76ec4e7568976772839ab2c43ddfd18fbbf2ce32b13ef549cd5a862"},
{file = "google_cloud_logging-3.12.1.tar.gz", hash = "sha256:36efc823985055b203904e83e1c8f9f999b3c64270bcda39d57386ca4effd678"},
]
[package.dependencies]
google-api-core = {version = ">=1.34.1,<2.0.dev0 || >=2.11.dev0,<3.0.0dev", extras = ["grpc"]}
google-auth = ">=2.14.1,<2.24.0 || >2.24.0,<2.25.0 || >2.25.0,<3.0.0dev"
google-cloud-appengine-logging = ">=0.1.3,<2.0.0dev"
google-cloud-audit-log = ">=0.2.4,<1.0.0dev"
google-cloud-core = ">=2.0.0,<3.0.0dev"
grpc-google-iam-v1 = ">=0.12.4,<1.0.0dev"
google-api-core = {version = ">=1.34.1,<2.0.dev0 || >=2.11.dev0,<3.0.0", extras = ["grpc"]}
google-auth = ">=2.14.1,<2.24.0 || >2.24.0,<2.25.0 || >2.25.0,<3.0.0"
google-cloud-appengine-logging = ">=0.1.3,<2.0.0"
google-cloud-audit-log = ">=0.3.1,<1.0.0"
google-cloud-core = ">=2.0.0,<3.0.0"
grpc-google-iam-v1 = ">=0.12.4,<1.0.0"
opentelemetry-api = ">=1.9.0"
proto-plus = [
{version = ">=1.25.0,<2.0.0dev", markers = "python_version >= \"3.13\""},
{version = ">=1.22.2,<2.0.0dev", markers = "python_version >= \"3.11\" and python_version < \"3.13\""},
{version = ">=1.22.0,<2.0.0dev", markers = "python_version < \"3.11\""},
{version = ">=1.25.0,<2.0.0", markers = "python_version >= \"3.13\""},
{version = ">=1.22.2,<2.0.0", markers = "python_version >= \"3.11\" and python_version < \"3.13\""},
{version = ">=1.22.0,<2.0.0", markers = "python_version < \"3.11\""},
]
protobuf = ">=3.20.2,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<6.0.0dev"
protobuf = ">=3.20.2,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<7.0.0"
[[package]]
name = "googleapis-common-protos"
@@ -1238,19 +1238,19 @@ pyasn1 = ">=0.4.6,<0.7.0"
[[package]]
name = "pydantic"
version = "2.11.1"
version = "2.11.4"
description = "Data validation using Python type hints"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pydantic-2.11.1-py3-none-any.whl", hash = "sha256:5b6c415eee9f8123a14d859be0c84363fec6b1feb6b688d6435801230b56e0b8"},
{file = "pydantic-2.11.1.tar.gz", hash = "sha256:442557d2910e75c991c39f4b4ab18963d57b9b55122c8b2a9cd176d8c29ce968"},
{file = "pydantic-2.11.4-py3-none-any.whl", hash = "sha256:d9615eaa9ac5a063471da949c8fc16376a84afb5024688b3ff885693506764eb"},
{file = "pydantic-2.11.4.tar.gz", hash = "sha256:32738d19d63a226a52eed76645a98ee07c1f410ee41d93b4afbfa85ed8111c2d"},
]
[package.dependencies]
annotated-types = ">=0.6.0"
pydantic-core = "2.33.0"
pydantic-core = "2.33.2"
typing-extensions = ">=4.12.2"
typing-inspection = ">=0.4.0"
@@ -1260,111 +1260,111 @@ timezone = ["tzdata ; python_version >= \"3.9\" and platform_system == \"Windows
[[package]]
name = "pydantic-core"
version = "2.33.0"
version = "2.33.2"
description = "Core functionality for Pydantic validation and serialization"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pydantic_core-2.33.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:71dffba8fe9ddff628c68f3abd845e91b028361d43c5f8e7b3f8b91d7d85413e"},
{file = "pydantic_core-2.33.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:abaeec1be6ed535a5d7ffc2e6c390083c425832b20efd621562fbb5bff6dc518"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:759871f00e26ad3709efc773ac37b4d571de065f9dfb1778012908bcc36b3a73"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dcfebee69cd5e1c0b76a17e17e347c84b00acebb8dd8edb22d4a03e88e82a207"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1b1262b912435a501fa04cd213720609e2cefa723a07c92017d18693e69bf00b"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4726f1f3f42d6a25678c67da3f0b10f148f5655813c5aca54b0d1742ba821b8f"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e790954b5093dff1e3a9a2523fddc4e79722d6f07993b4cd5547825c3cbf97b5"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:34e7fb3abe375b5c4e64fab75733d605dda0f59827752debc99c17cb2d5f3276"},
{file = "pydantic_core-2.33.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:ecb158fb9b9091b515213bed3061eb7deb1d3b4e02327c27a0ea714ff46b0760"},
{file = "pydantic_core-2.33.0-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:4d9149e7528af8bbd76cc055967e6e04617dcb2a2afdaa3dea899406c5521faa"},
{file = "pydantic_core-2.33.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:e81a295adccf73477220e15ff79235ca9dcbcee4be459eb9d4ce9a2763b8386c"},
{file = "pydantic_core-2.33.0-cp310-cp310-win32.whl", hash = "sha256:f22dab23cdbce2005f26a8f0c71698457861f97fc6318c75814a50c75e87d025"},
{file = "pydantic_core-2.33.0-cp310-cp310-win_amd64.whl", hash = "sha256:9cb2390355ba084c1ad49485d18449b4242da344dea3e0fe10babd1f0db7dcfc"},
{file = "pydantic_core-2.33.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:a608a75846804271cf9c83e40bbb4dab2ac614d33c6fd5b0c6187f53f5c593ef"},
{file = "pydantic_core-2.33.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e1c69aa459f5609dec2fa0652d495353accf3eda5bdb18782bc5a2ae45c9273a"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b9ec80eb5a5f45a2211793f1c4aeddff0c3761d1c70d684965c1807e923a588b"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e925819a98318d17251776bd3d6aa9f3ff77b965762155bdad15d1a9265c4cfd"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5bf68bb859799e9cec3d9dd8323c40c00a254aabb56fe08f907e437005932f2b"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1b2ea72dea0825949a045fa4071f6d5b3d7620d2a208335207793cf29c5a182d"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1583539533160186ac546b49f5cde9ffc928062c96920f58bd95de32ffd7bffd"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:23c3e77bf8a7317612e5c26a3b084c7edeb9552d645742a54a5867635b4f2453"},
{file = "pydantic_core-2.33.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a7a7f2a3f628d2f7ef11cb6188bcf0b9e1558151d511b974dfea10a49afe192b"},
{file = "pydantic_core-2.33.0-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:f1fb026c575e16f673c61c7b86144517705865173f3d0907040ac30c4f9f5915"},
{file = "pydantic_core-2.33.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:635702b2fed997e0ac256b2cfbdb4dd0bf7c56b5d8fba8ef03489c03b3eb40e2"},
{file = "pydantic_core-2.33.0-cp311-cp311-win32.whl", hash = "sha256:07b4ced28fccae3f00626eaa0c4001aa9ec140a29501770a88dbbb0966019a86"},
{file = "pydantic_core-2.33.0-cp311-cp311-win_amd64.whl", hash = "sha256:4927564be53239a87770a5f86bdc272b8d1fbb87ab7783ad70255b4ab01aa25b"},
{file = "pydantic_core-2.33.0-cp311-cp311-win_arm64.whl", hash = "sha256:69297418ad644d521ea3e1aa2e14a2a422726167e9ad22b89e8f1130d68e1e9a"},
{file = "pydantic_core-2.33.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:6c32a40712e3662bebe524abe8abb757f2fa2000028d64cc5a1006016c06af43"},
{file = "pydantic_core-2.33.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8ec86b5baa36f0a0bfb37db86c7d52652f8e8aa076ab745ef7725784183c3fdd"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4deac83a8cc1d09e40683be0bc6d1fa4cde8df0a9bf0cda5693f9b0569ac01b6"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:175ab598fb457a9aee63206a1993874badf3ed9a456e0654273e56f00747bbd6"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5f36afd0d56a6c42cf4e8465b6441cf546ed69d3a4ec92724cc9c8c61bd6ecf4"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0a98257451164666afafc7cbf5fb00d613e33f7e7ebb322fbcd99345695a9a61"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ecc6d02d69b54a2eb83ebcc6f29df04957f734bcf309d346b4f83354d8376862"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:1a69b7596c6603afd049ce7f3835bcf57dd3892fc7279f0ddf987bebed8caa5a"},
{file = "pydantic_core-2.33.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ea30239c148b6ef41364c6f51d103c2988965b643d62e10b233b5efdca8c0099"},
{file = "pydantic_core-2.33.0-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:abfa44cf2f7f7d7a199be6c6ec141c9024063205545aa09304349781b9a125e6"},
{file = "pydantic_core-2.33.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:20d4275f3c4659d92048c70797e5fdc396c6e4446caf517ba5cad2db60cd39d3"},
{file = "pydantic_core-2.33.0-cp312-cp312-win32.whl", hash = "sha256:918f2013d7eadea1d88d1a35fd4a1e16aaf90343eb446f91cb091ce7f9b431a2"},
{file = "pydantic_core-2.33.0-cp312-cp312-win_amd64.whl", hash = "sha256:aec79acc183865bad120b0190afac467c20b15289050648b876b07777e67ea48"},
{file = "pydantic_core-2.33.0-cp312-cp312-win_arm64.whl", hash = "sha256:5461934e895968655225dfa8b3be79e7e927e95d4bd6c2d40edd2fa7052e71b6"},
{file = "pydantic_core-2.33.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f00e8b59e1fc8f09d05594aa7d2b726f1b277ca6155fc84c0396db1b373c4555"},
{file = "pydantic_core-2.33.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1a73be93ecef45786d7d95b0c5e9b294faf35629d03d5b145b09b81258c7cd6d"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ff48a55be9da6930254565ff5238d71d5e9cd8c5487a191cb85df3bdb8c77365"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:26a4ea04195638dcd8c53dadb545d70badba51735b1594810e9768c2c0b4a5da"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:41d698dcbe12b60661f0632b543dbb119e6ba088103b364ff65e951610cb7ce0"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ae62032ef513fe6281ef0009e30838a01057b832dc265da32c10469622613885"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f225f3a3995dbbc26affc191d0443c6c4aa71b83358fd4c2b7d63e2f6f0336f9"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5bdd36b362f419c78d09630cbaebc64913f66f62bda6d42d5fbb08da8cc4f181"},
{file = "pydantic_core-2.33.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:2a0147c0bef783fd9abc9f016d66edb6cac466dc54a17ec5f5ada08ff65caf5d"},
{file = "pydantic_core-2.33.0-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:c860773a0f205926172c6644c394e02c25421dc9a456deff16f64c0e299487d3"},
{file = "pydantic_core-2.33.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:138d31e3f90087f42aa6286fb640f3c7a8eb7bdae829418265e7e7474bd2574b"},
{file = "pydantic_core-2.33.0-cp313-cp313-win32.whl", hash = "sha256:d20cbb9d3e95114325780f3cfe990f3ecae24de7a2d75f978783878cce2ad585"},
{file = "pydantic_core-2.33.0-cp313-cp313-win_amd64.whl", hash = "sha256:ca1103d70306489e3d006b0f79db8ca5dd3c977f6f13b2c59ff745249431a606"},
{file = "pydantic_core-2.33.0-cp313-cp313-win_arm64.whl", hash = "sha256:6291797cad239285275558e0a27872da735b05c75d5237bbade8736f80e4c225"},
{file = "pydantic_core-2.33.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:7b79af799630af263eca9ec87db519426d8c9b3be35016eddad1832bac812d87"},
{file = "pydantic_core-2.33.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eabf946a4739b5237f4f56d77fa6668263bc466d06a8036c055587c130a46f7b"},
{file = "pydantic_core-2.33.0-cp313-cp313t-win_amd64.whl", hash = "sha256:8a1d581e8cdbb857b0e0e81df98603376c1a5c34dc5e54039dcc00f043df81e7"},
{file = "pydantic_core-2.33.0-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:7c9c84749f5787781c1c45bb99f433402e484e515b40675a5d121ea14711cf61"},
{file = "pydantic_core-2.33.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:64672fa888595a959cfeff957a654e947e65bbe1d7d82f550417cbd6898a1d6b"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:26bc7367c0961dec292244ef2549afa396e72e28cc24706210bd44d947582c59"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ce72d46eb201ca43994303025bd54d8a35a3fc2a3495fac653d6eb7205ce04f4"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:14229c1504287533dbf6b1fc56f752ce2b4e9694022ae7509631ce346158de11"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:085d8985b1c1e48ef271e98a658f562f29d89bda98bf120502283efbc87313eb"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:31860fbda80d8f6828e84b4a4d129fd9c4535996b8249cfb8c720dc2a1a00bb8"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f200b2f20856b5a6c3a35f0d4e344019f805e363416e609e9b47c552d35fd5ea"},
{file = "pydantic_core-2.33.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:5f72914cfd1d0176e58ddc05c7a47674ef4222c8253bf70322923e73e14a4ac3"},
{file = "pydantic_core-2.33.0-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:91301a0980a1d4530d4ba7e6a739ca1a6b31341252cb709948e0aca0860ce0ae"},
{file = "pydantic_core-2.33.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:7419241e17c7fbe5074ba79143d5523270e04f86f1b3a0dff8df490f84c8273a"},
{file = "pydantic_core-2.33.0-cp39-cp39-win32.whl", hash = "sha256:7a25493320203005d2a4dac76d1b7d953cb49bce6d459d9ae38e30dd9f29bc9c"},
{file = "pydantic_core-2.33.0-cp39-cp39-win_amd64.whl", hash = "sha256:82a4eba92b7ca8af1b7d5ef5f3d9647eee94d1f74d21ca7c21e3a2b92e008358"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:e2762c568596332fdab56b07060c8ab8362c56cf2a339ee54e491cd503612c50"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:5bf637300ff35d4f59c006fff201c510b2b5e745b07125458a5389af3c0dff8c"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:62c151ce3d59ed56ebd7ce9ce5986a409a85db697d25fc232f8e81f195aa39a1"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9ee65f0cc652261744fd07f2c6e6901c914aa6c5ff4dcfaf1136bc394d0dd26b"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:024d136ae44d233e6322027bbf356712b3940bee816e6c948ce4b90f18471b3d"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:e37f10f6d4bc67c58fbd727108ae1d8b92b397355e68519f1e4a7babb1473442"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:502ed542e0d958bd12e7c3e9a015bce57deaf50eaa8c2e1c439b512cb9db1e3a"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:715c62af74c236bf386825c0fdfa08d092ab0f191eb5b4580d11c3189af9d330"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:bccc06fa0372151f37f6b69834181aa9eb57cf8665ed36405fb45fbf6cac3bae"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5d8dc9f63a26f7259b57f46a7aab5af86b2ad6fbe48487500bb1f4b27e051e4c"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:30369e54d6d0113d2aa5aee7a90d17f225c13d87902ace8fcd7bbf99b19124db"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f3eb479354c62067afa62f53bb387827bee2f75c9c79ef25eef6ab84d4b1ae3b"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0310524c833d91403c960b8a3cf9f46c282eadd6afd276c8c5edc617bd705dc9"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:eddb18a00bbb855325db27b4c2a89a4ba491cd6a0bd6d852b225172a1f54b36c"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:ade5dbcf8d9ef8f4b28e682d0b29f3008df9842bb5ac48ac2c17bc55771cc976"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:2c0afd34f928383e3fd25740f2050dbac9d077e7ba5adbaa2227f4d4f3c8da5c"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:7da333f21cd9df51d5731513a6d39319892947604924ddf2e24a4612975fb936"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:4b6d77c75a57f041c5ee915ff0b0bb58eabb78728b69ed967bc5b780e8f701b8"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:ba95691cf25f63df53c1d342413b41bd7762d9acb425df8858d7efa616c0870e"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:4f1ab031feb8676f6bd7c85abec86e2935850bf19b84432c64e3e239bffeb1ec"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:58c1151827eef98b83d49b6ca6065575876a02d2211f259fb1a6b7757bd24dd8"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a66d931ea2c1464b738ace44b7334ab32a2fd50be023d863935eb00f42be1778"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0bcf0bab28995d483f6c8d7db25e0d05c3efa5cebfd7f56474359e7137f39856"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:89670d7a0045acb52be0566df5bc8b114ac967c662c06cf5e0c606e4aadc964b"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:b716294e721d8060908dbebe32639b01bfe61b15f9f57bcc18ca9a0e00d9520b"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fc53e05c16697ff0c1c7c2b98e45e131d4bfb78068fffff92a82d169cbb4c7b7"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:68504959253303d3ae9406b634997a2123a0b0c1da86459abbd0ffc921695eac"},
{file = "pydantic_core-2.33.0.tar.gz", hash = "sha256:40eb8af662ba409c3cbf4a8150ad32ae73514cd7cb1f1a2113af39763dd616b3"},
{file = "pydantic_core-2.33.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2b3d326aaef0c0399d9afffeb6367d5e26ddc24d351dbc9c636840ac355dc5d8"},
{file = "pydantic_core-2.33.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e5b2671f05ba48b94cb90ce55d8bdcaaedb8ba00cc5359f6810fc918713983d"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0069c9acc3f3981b9ff4cdfaf088e98d83440a4c7ea1bc07460af3d4dc22e72d"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d53b22f2032c42eaaf025f7c40c2e3b94568ae077a606f006d206a463bc69572"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0405262705a123b7ce9f0b92f123334d67b70fd1f20a9372b907ce1080c7ba02"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4b25d91e288e2c4e0662b8038a28c6a07eaac3e196cfc4ff69de4ea3db992a1b"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6bdfe4b3789761f3bcb4b1ddf33355a71079858958e3a552f16d5af19768fef2"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:efec8db3266b76ef9607c2c4c419bdb06bf335ae433b80816089ea7585816f6a"},
{file = "pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:031c57d67ca86902726e0fae2214ce6770bbe2f710dc33063187a68744a5ecac"},
{file = "pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:f8de619080e944347f5f20de29a975c2d815d9ddd8be9b9b7268e2e3ef68605a"},
{file = "pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:73662edf539e72a9440129f231ed3757faab89630d291b784ca99237fb94db2b"},
{file = "pydantic_core-2.33.2-cp310-cp310-win32.whl", hash = "sha256:0a39979dcbb70998b0e505fb1556a1d550a0781463ce84ebf915ba293ccb7e22"},
{file = "pydantic_core-2.33.2-cp310-cp310-win_amd64.whl", hash = "sha256:b0379a2b24882fef529ec3b4987cb5d003b9cda32256024e6fe1586ac45fc640"},
{file = "pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7"},
{file = "pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e"},
{file = "pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d"},
{file = "pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30"},
{file = "pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf"},
{file = "pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51"},
{file = "pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab"},
{file = "pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65"},
{file = "pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc"},
{file = "pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b"},
{file = "pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1"},
{file = "pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6"},
{file = "pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea"},
{file = "pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290"},
{file = "pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2"},
{file = "pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab"},
{file = "pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f"},
{file = "pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56"},
{file = "pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5"},
{file = "pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e"},
{file = "pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162"},
{file = "pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849"},
{file = "pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9"},
{file = "pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9"},
{file = "pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac"},
{file = "pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5"},
{file = "pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9"},
{file = "pydantic_core-2.33.2-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a2b911a5b90e0374d03813674bf0a5fbbb7741570dcd4b4e85a2e48d17def29d"},
{file = "pydantic_core-2.33.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:6fa6dfc3e4d1f734a34710f391ae822e0a8eb8559a85c6979e14e65ee6ba2954"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c54c939ee22dc8e2d545da79fc5381f1c020d6d3141d3bd747eab59164dc89fb"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:53a57d2ed685940a504248187d5685e49eb5eef0f696853647bf37c418c538f7"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:09fb9dd6571aacd023fe6aaca316bd01cf60ab27240d7eb39ebd66a3a15293b4"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0e6116757f7959a712db11f3e9c0a99ade00a5bbedae83cb801985aa154f071b"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d55ab81c57b8ff8548c3e4947f119551253f4e3787a7bbc0b6b3ca47498a9d3"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c20c462aa4434b33a2661701b861604913f912254e441ab8d78d30485736115a"},
{file = "pydantic_core-2.33.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:44857c3227d3fb5e753d5fe4a3420d6376fa594b07b621e220cd93703fe21782"},
{file = "pydantic_core-2.33.2-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:eb9b459ca4df0e5c87deb59d37377461a538852765293f9e6ee834f0435a93b9"},
{file = "pydantic_core-2.33.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:9fcd347d2cc5c23b06de6d3b7b8275be558a0c90549495c699e379a80bf8379e"},
{file = "pydantic_core-2.33.2-cp39-cp39-win32.whl", hash = "sha256:83aa99b1285bc8f038941ddf598501a86f1536789740991d7d8756e34f1e74d9"},
{file = "pydantic_core-2.33.2-cp39-cp39-win_amd64.whl", hash = "sha256:f481959862f57f29601ccced557cc2e817bce7533ab8e01a797a48b49c9692b3"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5c4aa4e82353f65e548c476b37e64189783aa5384903bfea4f41580f255fddfa"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d946c8bf0d5c24bf4fe333af284c59a19358aa3ec18cb3dc4370080da1e8ad29"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:87b31b6846e361ef83fedb187bb5b4372d0da3f7e28d85415efa92d6125d6e6d"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa9d91b338f2df0508606f7009fde642391425189bba6d8c653afd80fd6bb64e"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2058a32994f1fde4ca0480ab9d1e75a0e8c87c22b53a3ae66554f9af78f2fe8c"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:0e03262ab796d986f978f79c943fc5f620381be7287148b8010b4097f79a39ec"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1a8695a8d00c73e50bff9dfda4d540b7dee29ff9b8053e38380426a85ef10052"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fa754d1850735a0b0e03bcffd9d4b4343eb417e47196e4485d9cca326073a42c"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:a11c8d26a50bfab49002947d3d237abe4d9e4b5bdc8846a63537b6488e197808"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:87acbfcf8e90ca885206e98359d7dca4bcbb35abdc0ff66672a293e1d7a19101"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:7f92c15cd1e97d4b12acd1cc9004fa092578acfa57b67ad5e43a197175d01a64"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d3f26877a748dc4251cfcfda9dfb5f13fcb034f5308388066bcfe9031b63ae7d"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dac89aea9af8cd672fa7b510e7b8c33b0bba9a43186680550ccf23020f32d535"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:970919794d126ba8645f3837ab6046fb4e72bbc057b3709144066204c19a455d"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:3eb3fe62804e8f859c49ed20a8451342de53ed764150cb14ca71357c765dc2a6"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:3abcd9392a36025e3bd55f9bd38d908bd17962cc49bc6da8e7e96285336e2bca"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:3a1c81334778f9e3af2f8aeb7a960736e5cab1dfebfb26aabca09afd2906c039"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:2807668ba86cb38c6817ad9bc66215ab8584d1d304030ce4f0887336f28a5e27"},
{file = "pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc"},
]
[package.dependencies]
@@ -1372,22 +1372,25 @@ typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
[[package]]
name = "pydantic-settings"
version = "2.8.1"
version = "2.9.1"
description = "Settings management using Pydantic"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pydantic_settings-2.8.1-py3-none-any.whl", hash = "sha256:81942d5ac3d905f7f3ee1a70df5dfb62d5569c12f51a5a647defc1c3d9ee2e9c"},
{file = "pydantic_settings-2.8.1.tar.gz", hash = "sha256:d5c663dfbe9db9d5e1c646b2e161da12f0d734d422ee56f567d0ea2cee4e8585"},
{file = "pydantic_settings-2.9.1-py3-none-any.whl", hash = "sha256:59b4f431b1defb26fe620c71a7d3968a710d719f5f4cdbbdb7926edeb770f6ef"},
{file = "pydantic_settings-2.9.1.tar.gz", hash = "sha256:c509bf79d27563add44e8446233359004ed85066cd096d8b510f715e6ef5d268"},
]
[package.dependencies]
pydantic = ">=2.7.0"
python-dotenv = ">=0.21.0"
typing-inspection = ">=0.4.0"
[package.extras]
aws-secrets-manager = ["boto3 (>=1.35.0)", "boto3-stubs[secretsmanager]"]
azure-key-vault = ["azure-identity (>=1.16.0)", "azure-keyvault-secrets (>=4.8.0)"]
gcp-secret-manager = ["google-cloud-secret-manager (>=2.23.1)"]
toml = ["tomli (>=2.0.1)"]
yaml = ["pyyaml (>=6.0.1)"]
@@ -1575,30 +1578,30 @@ pyasn1 = ">=0.1.3"
[[package]]
name = "ruff"
version = "0.11.2"
version = "0.11.10"
description = "An extremely fast Python linter and code formatter, written in Rust."
optional = false
python-versions = ">=3.7"
groups = ["dev"]
files = [
{file = "ruff-0.11.2-py3-none-linux_armv6l.whl", hash = "sha256:c69e20ea49e973f3afec2c06376eb56045709f0212615c1adb0eda35e8a4e477"},
{file = "ruff-0.11.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:2c5424cc1c4eb1d8ecabe6d4f1b70470b4f24a0c0171356290b1953ad8f0e272"},
{file = "ruff-0.11.2-py3-none-macosx_11_0_arm64.whl", hash = "sha256:ecf20854cc73f42171eedb66f006a43d0a21bfb98a2523a809931cda569552d9"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0c543bf65d5d27240321604cee0633a70c6c25c9a2f2492efa9f6d4b8e4199bb"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:20967168cc21195db5830b9224be0e964cc9c8ecf3b5a9e3ce19876e8d3a96e3"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:955a9ce63483999d9f0b8f0b4a3ad669e53484232853054cc8b9d51ab4c5de74"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:86b3a27c38b8fce73bcd262b0de32e9a6801b76d52cdb3ae4c914515f0cef608"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a3b66a03b248c9fcd9d64d445bafdf1589326bee6fc5c8e92d7562e58883e30f"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0397c2672db015be5aa3d4dac54c69aa012429097ff219392c018e21f5085147"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:869bcf3f9abf6457fbe39b5a37333aa4eecc52a3b99c98827ccc371a8e5b6f1b"},
{file = "ruff-0.11.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:2a2b50ca35457ba785cd8c93ebbe529467594087b527a08d487cf0ee7b3087e9"},
{file = "ruff-0.11.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:7c69c74bf53ddcfbc22e6eb2f31211df7f65054bfc1f72288fc71e5f82db3eab"},
{file = "ruff-0.11.2-py3-none-musllinux_1_2_i686.whl", hash = "sha256:6e8fb75e14560f7cf53b15bbc55baf5ecbe373dd5f3aab96ff7aa7777edd7630"},
{file = "ruff-0.11.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:842a472d7b4d6f5924e9297aa38149e5dcb1e628773b70e6387ae2c97a63c58f"},
{file = "ruff-0.11.2-py3-none-win32.whl", hash = "sha256:aca01ccd0eb5eb7156b324cfaa088586f06a86d9e5314b0eb330cb48415097cc"},
{file = "ruff-0.11.2-py3-none-win_amd64.whl", hash = "sha256:3170150172a8f994136c0c66f494edf199a0bbea7a409f649e4bc8f4d7084080"},
{file = "ruff-0.11.2-py3-none-win_arm64.whl", hash = "sha256:52933095158ff328f4c77af3d74f0379e34fd52f175144cefc1b192e7ccd32b4"},
{file = "ruff-0.11.2.tar.gz", hash = "sha256:ec47591497d5a1050175bdf4e1a4e6272cddff7da88a2ad595e1e326041d8d94"},
{file = "ruff-0.11.10-py3-none-linux_armv6l.whl", hash = "sha256:859a7bfa7bc8888abbea31ef8a2b411714e6a80f0d173c2a82f9041ed6b50f58"},
{file = "ruff-0.11.10-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:968220a57e09ea5e4fd48ed1c646419961a0570727c7e069842edd018ee8afed"},
{file = "ruff-0.11.10-py3-none-macosx_11_0_arm64.whl", hash = "sha256:1067245bad978e7aa7b22f67113ecc6eb241dca0d9b696144256c3a879663bca"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f4854fd09c7aed5b1590e996a81aeff0c9ff51378b084eb5a0b9cd9518e6cff2"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b4564e9f99168c0f9195a0fd5fa5928004b33b377137f978055e40008a082c5"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5b6a9cc5b62c03cc1fea0044ed8576379dbaf751d5503d718c973d5418483641"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:607ecbb6f03e44c9e0a93aedacb17b4eb4f3563d00e8b474298a201622677947"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7b3a522fa389402cd2137df9ddefe848f727250535c70dafa840badffb56b7a4"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2f071b0deed7e9245d5820dac235cbdd4ef99d7b12ff04c330a241ad3534319f"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a60e3a0a617eafba1f2e4186d827759d65348fa53708ca547e384db28406a0b"},
{file = "ruff-0.11.10-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:da8ec977eaa4b7bf75470fb575bea2cb41a0e07c7ea9d5a0a97d13dbca697bf2"},
{file = "ruff-0.11.10-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:ddf8967e08227d1bd95cc0851ef80d2ad9c7c0c5aab1eba31db49cf0a7b99523"},
{file = "ruff-0.11.10-py3-none-musllinux_1_2_i686.whl", hash = "sha256:5a94acf798a82db188f6f36575d80609072b032105d114b0f98661e1679c9125"},
{file = "ruff-0.11.10-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:3afead355f1d16d95630df28d4ba17fb2cb9c8dfac8d21ced14984121f639bad"},
{file = "ruff-0.11.10-py3-none-win32.whl", hash = "sha256:dc061a98d32a97211af7e7f3fa1d4ca2fcf919fb96c28f39551f35fc55bdbc19"},
{file = "ruff-0.11.10-py3-none-win_amd64.whl", hash = "sha256:5cc725fbb4d25b0f185cb42df07ab6b76c4489b4bfb740a175f3a59c70e8a224"},
{file = "ruff-0.11.10-py3-none-win_arm64.whl", hash = "sha256:ef69637b35fb8b210743926778d0e45e1bffa850a7c61e428c6b971549b5f5d1"},
{file = "ruff-0.11.10.tar.gz", hash = "sha256:d522fb204b4959909ecac47da02830daec102eeb100fb50ea9554818d47a5fa6"},
]
[[package]]
@@ -1660,14 +1663,14 @@ test = ["pylint", "pytest", "pytest-black", "pytest-cov", "pytest-pylint"]
[[package]]
name = "supabase"
version = "2.15.0"
version = "2.15.1"
description = "Supabase client for Python."
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "supabase-2.15.0-py3-none-any.whl", hash = "sha256:a665c7ab6c8ad1d80609ab62ad657f66fdaf38070ec9e0db5c7887fd72b109c0"},
{file = "supabase-2.15.0.tar.gz", hash = "sha256:2e66289ad74ae9c4cb04a69f9de00cd2ce880cd890de23269a40ac5b69151d26"},
{file = "supabase-2.15.1-py3-none-any.whl", hash = "sha256:749299cdd74ecf528f52045c1e60d9dba81cc2054656f754c0ca7fba0dd34827"},
{file = "supabase-2.15.1.tar.gz", hash = "sha256:66e847dab9346062aa6a25b4e81ac786b972c5d4299827c57d1d5bd6a0346070"},
]
[package.dependencies]
@@ -1701,7 +1704,7 @@ description = "A lil' TOML parser"
optional = false
python-versions = ">=3.8"
groups = ["main"]
markers = "python_version == \"3.10\""
markers = "python_version < \"3.11\""
files = [
{file = "tomli-2.1.0-py3-none-any.whl", hash = "sha256:a5c57c3d1c56f5ccdf89f6523458f60ef716e210fc47c4cfb188c5ba473e0391"},
{file = "tomli-2.1.0.tar.gz", hash = "sha256:3f646cae2aec94e17d04973e4249548320197cfabdf130015d023de4b74d8ab8"},
@@ -2034,4 +2037,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.1"
python-versions = ">=3.10,<4.0"
content-hash = "c8e23c0609cae0717447f575849b658bee9203b784ec7270b62629cddbbbd9ca"
content-hash = "78ebf65cdef769cfbe92fe204f01e32d219cca9ee5a6ca9e657aa0630be63802"

View File

@@ -7,20 +7,20 @@ readme = "README.md"
packages = [{ include = "autogpt_libs" }]
[tool.poetry.dependencies]
python = ">=3.10,<4.0"
colorama = "^0.4.6"
expiringdict = "^1.2.2"
google-cloud-logging = "^3.11.4"
pydantic = "^2.11.1"
pydantic-settings = "^2.8.1"
google-cloud-logging = "^3.12.1"
pydantic = "^2.11.4"
pydantic-settings = "^2.9.1"
pyjwt = "^2.10.1"
pytest-asyncio = "^0.26.0"
pytest-mock = "^3.14.0"
python = ">=3.10,<4.0"
supabase = "^2.15.0"
supabase = "^2.15.1"
[tool.poetry.group.dev.dependencies]
redis = "^5.2.1"
ruff = "^0.11.0"
ruff = "^0.11.10"
[build-system]
requires = ["poetry-core"]

View File

@@ -66,6 +66,13 @@ MEDIA_GCS_BUCKET_NAME=
## and tunnel it to your locally running backend.
PLATFORM_BASE_URL=http://localhost:3000
## Cloudflare Turnstile (CAPTCHA) Configuration
## Get these from the Cloudflare Turnstile dashboard: https://dash.cloudflare.com/?to=/:account/turnstile
## This is the backend secret key
TURNSTILE_SECRET_KEY=
## This is the verify URL
TURNSTILE_VERIFY_URL=https://challenges.cloudflare.com/turnstile/v0/siteverify
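For context, these two variables feed the backend's server-side CAPTCHA check. A minimal sketch of what a verify call against Cloudflare's documented siteverify endpoint typically looks like (illustrative, not the platform's actual handler; field names follow Cloudflare's public Turnstile API):

```python
# Hypothetical verification helper; env var names match the block above.
import os
import requests

def verify_turnstile(client_token: str) -> bool:
    resp = requests.post(
        os.environ["TURNSTILE_VERIFY_URL"],
        data={
            "secret": os.environ["TURNSTILE_SECRET_KEY"],  # backend secret key
            "response": client_token,  # token issued by the Turnstile widget
        },
        timeout=10,
    )
    return resp.json().get("success", False)
```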
## == INTEGRATION CREDENTIALS == ##
# Each set of server side credentials is required for the corresponding 3rd party
# integration to work.
@@ -122,6 +129,7 @@ OPENAI_API_KEY=
ANTHROPIC_API_KEY=
GROQ_API_KEY=
OPEN_ROUTER_API_KEY=
LLAMA_API_KEY=
# Reddit
# Go to https://www.reddit.com/prefs/apps and create a new app

View File

@@ -1,3 +1,4 @@
import functools
import importlib
import os
import re
@@ -10,17 +11,11 @@ if TYPE_CHECKING:
T = TypeVar("T")
_AVAILABLE_BLOCKS: dict[str, type["Block"]] = {}
@functools.cache
def load_all_blocks() -> dict[str, type["Block"]]:
from backend.data.block import Block
if _AVAILABLE_BLOCKS:
return _AVAILABLE_BLOCKS
# Dynamically load all modules under backend.blocks
AVAILABLE_MODULES = []
current_dir = Path(__file__).parent
modules = [
str(f.relative_to(current_dir))[:-3].replace(os.path.sep, ".")
@@ -35,9 +30,9 @@ def load_all_blocks() -> dict[str, type["Block"]]:
)
importlib.import_module(f".{module}", package=__name__)
AVAILABLE_MODULES.append(module)
# Load all Block instances from the available modules
available_blocks: dict[str, type["Block"]] = {}
for block_cls in all_subclasses(Block):
class_name = block_cls.__name__
@@ -58,7 +53,7 @@ def load_all_blocks() -> dict[str, type["Block"]]:
f"Block ID {block.name} error: {block.id} is not a valid UUID"
)
if block.id in _AVAILABLE_BLOCKS:
if block.id in available_blocks:
raise ValueError(
f"Block ID {block.name} error: {block.id} is already in use"
)
@@ -89,9 +84,9 @@ def load_all_blocks() -> dict[str, type["Block"]]:
f"{block.name} has a boolean field with no default value"
)
_AVAILABLE_BLOCKS[block.id] = block_cls
available_blocks[block.id] = block_cls
return _AVAILABLE_BLOCKS
return available_blocks
__all__ = ["load_all_blocks"]

View File

@@ -1,5 +1,5 @@
import logging
from typing import Any
from typing import Any, Optional
from backend.data.block import (
Block,
@@ -11,7 +11,7 @@ from backend.data.block import (
get_block,
)
from backend.data.execution import ExecutionStatus
from backend.data.model import SchemaField
from backend.data.model import CredentialsMetaInput, SchemaField
from backend.util import json
logger = logging.getLogger(__name__)
@@ -23,17 +23,21 @@ class AgentExecutorBlock(Block):
graph_id: str = SchemaField(description="Graph ID")
graph_version: int = SchemaField(description="Graph Version")
data: BlockInput = SchemaField(description="Input data for the graph")
inputs: BlockInput = SchemaField(description="Input data for the graph")
input_schema: dict = SchemaField(description="Input schema for the graph")
output_schema: dict = SchemaField(description="Output schema for the graph")
node_credentials_input_map: Optional[
dict[str, dict[str, CredentialsMetaInput]]
] = SchemaField(default=None, hidden=True)
@classmethod
def get_input_schema(cls, data: BlockInput) -> dict[str, Any]:
return data.get("input_schema", {})
@classmethod
def get_input_defaults(cls, data: BlockInput) -> BlockInput:
return data.get("data", {})
return data.get("inputs", {})
@classmethod
def get_missing_input(cls, data: BlockInput) -> set[str]:
@@ -67,7 +71,8 @@ class AgentExecutorBlock(Block):
graph_id=input_data.graph_id,
graph_version=input_data.graph_version,
user_id=input_data.user_id,
inputs=input_data.data,
inputs=input_data.inputs,
node_credentials_input_map=input_data.node_credentials_input_map,
)
log_id = f"Graph #{input_data.graph_id}-V{input_data.graph_version}, exec-id: {graph_exec.id}"
logger.info(f"Starting execution of {log_id}")

View File

@@ -1,8 +1,8 @@
from enum import Enum
from typing import Literal
import replicate
from pydantic import SecretStr
from replicate.client import Client as ReplicateClient
from replicate.helpers import FileOutput
from backend.data.block import Block, BlockCategory, BlockSchema
@@ -170,7 +170,7 @@ class AIImageGeneratorBlock(Block):
):
try:
# Initialize Replicate client
client = replicate.Client(api_token=credentials.api_key.get_secret_value())
client = ReplicateClient(api_token=credentials.api_key.get_secret_value())
# Run the model with input parameters
output = client.run(model_name, input=input_params, wait=False)
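The same import swap recurs in the music-generator and Flux blocks below: `ReplicateClient` is the identical class, imported from `replicate.client` directly instead of through the package's top-level re-export. Assuming the package layout is unchanged, the two spellings are equivalent:

```python
import replicate
from replicate.client import Client as ReplicateClient

client_a = replicate.Client(api_token="r8_example")  # via top-level re-export
client_b = ReplicateClient(api_token="r8_example")   # direct import, same class
assert type(client_a) is type(client_b)
```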

View File

@@ -3,8 +3,8 @@ import time
from enum import Enum
from typing import Literal
import replicate
from pydantic import SecretStr
from replicate.client import Client as ReplicateClient
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import (
@@ -196,7 +196,7 @@ class AIMusicGeneratorBlock(Block):
normalization_strategy: NormalizationStrategy,
):
# Initialize Replicate client with the API key
client = replicate.Client(api_token=api_key.get_secret_value())
client = ReplicateClient(api_token=api_key.get_secret_value())
# Run the model with parameters
output = client.run(

View File

@@ -88,6 +88,33 @@ class StoreValueBlock(Block):
yield "output", input_data.data or input_data.input
class PrintToConsoleBlock(Block):
class Input(BlockSchema):
text: Any = SchemaField(description="The data to print to the console.")
class Output(BlockSchema):
output: Any = SchemaField(description="The data printed to the console.")
status: str = SchemaField(description="The status of the print operation.")
def __init__(self):
super().__init__(
id="f3b1c1b2-4c4f-4f0d-8d2f-4c4f0d8d2f4c",
description="Print the given text to the console, this is used for a debugging purpose.",
categories={BlockCategory.BASIC},
input_schema=PrintToConsoleBlock.Input,
output_schema=PrintToConsoleBlock.Output,
test_input={"text": "Hello, World!"},
test_output=[
("output", "Hello, World!"),
("status", "printed"),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
yield "output", input_data.text
yield "status", "printed"
class FindInDictionaryBlock(Block):
class Input(BlockSchema):
input: Any = SchemaField(description="Dictionary to lookup from")

View File

@@ -1,19 +1,30 @@
from typing import overload
from urllib.parse import urlparse
from backend.blocks.github._auth import (
GithubCredentials,
GithubFineGrainedAPICredentials,
)
from backend.util.request import Requests
from backend.util.request import URL, Requests
def _convert_to_api_url(url: str) -> str:
@overload
def _convert_to_api_url(url: str) -> str: ...
@overload
def _convert_to_api_url(url: URL) -> URL: ...
def _convert_to_api_url(url: str | URL) -> str | URL:
"""
Converts a standard GitHub URL to the corresponding GitHub API URL.
Handles repository URLs, issue URLs, pull request URLs, and more.
"""
parsed_url = urlparse(url)
path_parts = parsed_url.path.strip("/").split("/")
if url_as_str := isinstance(url, str):
url = urlparse(url)
path_parts = url.path.strip("/").split("/")
if len(path_parts) >= 2:
owner, repo = path_parts[0], path_parts[1]
@@ -28,7 +39,7 @@ def _convert_to_api_url(url: str) -> str:
else:
raise ValueError("Invalid GitHub URL format.")
return api_url
return api_url if url_as_str else urlparse(api_url)
def _get_headers(credentials: GithubCredentials) -> dict[str, str]:
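The `@overload` pair above lets type checkers see that a string in gives a string out and a `URL` in gives a `URL` out, while one implementation serves both via the walrus-captured `url_as_str` flag. A self-contained illustration of the same pattern using the stdlib `ParseResult` (the platform's `URL` type is assumed to be similar):

```python
from typing import overload
from urllib.parse import ParseResult, urlparse, urlunparse

@overload
def strip_fragment(url: str) -> str: ...
@overload
def strip_fragment(url: ParseResult) -> ParseResult: ...
def strip_fragment(url: str | ParseResult) -> str | ParseResult:
    if url_as_str := isinstance(url, str):
        url = urlparse(url)
    cleaned = url._replace(fragment="")
    # return the same shape the caller passed in
    return urlunparse(cleaned) if url_as_str else cleaned

assert strip_fragment("https://github.com/a/b#readme") == "https://github.com/a/b"
```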

View File

@@ -0,0 +1,598 @@
import enum
import uuid
from datetime import datetime, timedelta, timezone
from typing import Literal
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build
from pydantic import BaseModel
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.settings import AppEnvironment, Settings
from ._auth import (
GOOGLE_OAUTH_IS_CONFIGURED,
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
GoogleCredentials,
GoogleCredentialsField,
GoogleCredentialsInput,
)
class CalendarEvent(BaseModel):
"""Structured representation of a Google Calendar event."""
id: str
title: str
start_time: str
end_time: str
is_all_day: bool
location: str | None
description: str | None
organizer: str | None
attendees: list[str]
has_video_call: bool
video_link: str | None
calendar_link: str
is_recurring: bool
class GoogleCalendarReadEventsBlock(Block):
class Input(BlockSchema):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/calendar.readonly"]
)
calendar_id: str = SchemaField(
description="Calendar ID (use 'primary' for your main calendar)",
default="primary",
)
max_events: int = SchemaField(
description="Maximum number of events to retrieve", default=10
)
start_time: datetime = SchemaField(
description="Retrieve events starting from this time",
default_factory=lambda: datetime.now(tz=timezone.utc),
)
time_range_days: int = SchemaField(
description="Number of days to look ahead for events", default=30
)
search_term: str | None = SchemaField(
description="Optional search term to filter events by", default=None
)
page_token: str | None = SchemaField(
description="Page token from previous request to get the next batch of events. You can use this if you have lots of events you want to process in a loop",
default=None,
)
include_declined_events: bool = SchemaField(
description="Include events you've declined", default=False
)
class Output(BlockSchema):
events: list[CalendarEvent] = SchemaField(
description="List of calendar events in the requested time range",
default_factory=list,
)
event: CalendarEvent = SchemaField(
description="One of the calendar events in the requested time range"
)
next_page_token: str | None = SchemaField(
description="Token for retrieving the next page of events if more exist",
default=None,
)
error: str = SchemaField(
description="Error message if the request failed",
)
def __init__(self):
settings = Settings()
# Create realistic test data for events
test_now = datetime.now(tz=timezone.utc)
test_tomorrow = test_now + timedelta(days=1)
test_event_dict = {
"id": "event1id",
"title": "Team Meeting",
"start_time": test_tomorrow.strftime("%Y-%m-%d %H:%M"),
"end_time": (test_tomorrow + timedelta(hours=1)).strftime("%Y-%m-%d %H:%M"),
"is_all_day": False,
"location": "Conference Room A",
"description": "Weekly team sync",
"organizer": "manager@example.com",
"attendees": ["colleague1@example.com", "colleague2@example.com"],
"has_video_call": True,
"video_link": "https://meet.google.com/abc-defg-hij",
"calendar_link": "https://calendar.google.com/calendar/event?eid=event1id",
"is_recurring": True,
}
super().__init__(
id="80bc3ed1-e9a4-449e-8163-a8fc86f74f6a",
description="Retrieves upcoming events from a Google Calendar with filtering options",
categories={BlockCategory.PRODUCTIVITY, BlockCategory.DATA},
input_schema=GoogleCalendarReadEventsBlock.Input,
output_schema=GoogleCalendarReadEventsBlock.Output,
disabled=not GOOGLE_OAUTH_IS_CONFIGURED
or settings.config.app_env == AppEnvironment.PRODUCTION,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"calendar_id": "primary",
"max_events": 5,
"start_time": test_now.isoformat(),
"time_range_days": 7,
"search_term": None,
"include_declined_events": False,
"page_token": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("event", test_event_dict),
("events", [test_event_dict]),
],
test_mock={
"_read_calendar": lambda *args, **kwargs: {
"items": [
{
"id": "event1id",
"summary": "Team Meeting",
"start": {
"dateTime": test_tomorrow.isoformat(),
"timeZone": "UTC",
},
"end": {
"dateTime": (
test_tomorrow + timedelta(hours=1)
).isoformat(),
"timeZone": "UTC",
},
"location": "Conference Room A",
"description": "Weekly team sync",
"organizer": {"email": "manager@example.com"},
"attendees": [
{"email": "colleague1@example.com"},
{"email": "colleague2@example.com"},
],
"conferenceData": {
"conferenceUrl": "https://meet.google.com/abc-defg-hij"
},
"htmlLink": "https://calendar.google.com/calendar/event?eid=event1id",
"recurrence": ["RRULE:FREQ=WEEKLY;COUNT=10"],
}
],
"nextPageToken": None,
},
"_format_events": lambda *args, **kwargs: [test_event_dict],
},
)
def run(
self, input_data: Input, *, credentials: GoogleCredentials, **kwargs
) -> BlockOutput:
try:
service = self._build_service(credentials, **kwargs)
# Calculate end time based on start time and time range
end_time = input_data.start_time + timedelta(
days=input_data.time_range_days
)
# Call Google Calendar API
result = self._read_calendar(
service=service,
calendarId=input_data.calendar_id,
time_min=input_data.start_time.isoformat(),
time_max=end_time.isoformat(),
max_results=input_data.max_events,
single_events=True,
search_term=input_data.search_term,
show_deleted=False,
show_hidden=input_data.include_declined_events,
page_token=input_data.page_token,
)
# Format events into a user-friendly structure
formatted_events = self._format_events(result.get("items", []))
# Include next page token if available
if next_page_token := result.get("nextPageToken"):
yield "next_page_token", next_page_token
for event in formatted_events:
yield "event", event
yield "events", formatted_events
except Exception as e:
yield "error", str(e)
@staticmethod
def _build_service(credentials: GoogleCredentials, **kwargs):
creds = Credentials(
token=(
credentials.access_token.get_secret_value()
if credentials.access_token
else None
),
refresh_token=(
credentials.refresh_token.get_secret_value()
if credentials.refresh_token
else None
),
token_uri="https://oauth2.googleapis.com/token",
client_id=Settings().secrets.google_client_id,
client_secret=Settings().secrets.google_client_secret,
scopes=credentials.scopes,
)
return build("calendar", "v3", credentials=creds)
def _read_calendar(
self,
service,
calendarId: str,
time_min: str,
time_max: str,
max_results: int,
single_events: bool,
search_term: str | None = None,
show_deleted: bool = False,
show_hidden: bool = False,
page_token: str | None = None,
) -> dict:
"""Read calendar events with optional filtering."""
calendar = service.events()
# Build query parameters
params = {
"calendarId": calendarId,
"timeMin": time_min,
"timeMax": time_max,
"maxResults": max_results,
"singleEvents": single_events,
"orderBy": "startTime",
"showDeleted": show_deleted,
"showHiddenInvitations": show_hidden,
**({"pageToken": page_token} if page_token else {}),
}
# Add search term if provided
if search_term:
params["q"] = search_term
result = calendar.list(**params).execute()
return result
def _format_events(self, events: list[dict]) -> list[CalendarEvent]:
"""Format Google Calendar API events into user-friendly structure."""
formatted_events = []
for event in events:
# Determine if all-day event
is_all_day = "date" in event.get("start", {})
# Format start and end times
if is_all_day:
start_time = event.get("start", {}).get("date", "")
end_time = event.get("end", {}).get("date", "")
else:
# Convert ISO format to more readable format
start_datetime = datetime.fromisoformat(
event.get("start", {}).get("dateTime", "").replace("Z", "+00:00")
)
end_datetime = datetime.fromisoformat(
event.get("end", {}).get("dateTime", "").replace("Z", "+00:00")
)
start_time = start_datetime.strftime("%Y-%m-%d %H:%M")
end_time = end_datetime.strftime("%Y-%m-%d %H:%M")
# Extract attendees
attendees = []
for attendee in event.get("attendees", []):
if email := attendee.get("email"):
attendees.append(email)
# Check for video call link
has_video_call = False
video_link = None
if conf_data := event.get("conferenceData"):
if conf_url := conf_data.get("conferenceUrl"):
has_video_call = True
video_link = conf_url
elif entry_points := conf_data.get("entryPoints", []):
for entry in entry_points:
if entry.get("entryPointType") == "video":
has_video_call = True
video_link = entry.get("uri")
break
# Create formatted event
formatted_event = CalendarEvent(
id=event.get("id", ""),
title=event.get("summary", "Untitled Event"),
start_time=start_time,
end_time=end_time,
is_all_day=is_all_day,
location=event.get("location"),
description=event.get("description"),
organizer=event.get("organizer", {}).get("email"),
attendees=attendees,
has_video_call=has_video_call,
video_link=video_link,
calendar_link=event.get("htmlLink", ""),
is_recurring=bool(event.get("recurrence")),
)
formatted_events.append(formatted_event)
return formatted_events
class ReminderPreset(enum.Enum):
"""Common reminder times before an event."""
TEN_MINUTES = 10
THIRTY_MINUTES = 30
ONE_HOUR = 60
ONE_DAY = 1440 # 24 hours in minutes
class RecurrenceFrequency(enum.Enum):
"""Frequency options for recurring events."""
DAILY = "DAILY"
WEEKLY = "WEEKLY"
MONTHLY = "MONTHLY"
YEARLY = "YEARLY"
class ExactTiming(BaseModel):
"""Model for specifying start and end times."""
discriminator: Literal["exact_timing"]
start_datetime: datetime
end_datetime: datetime
class DurationTiming(BaseModel):
"""Model for specifying start time and duration."""
discriminator: Literal["duration_timing"]
start_datetime: datetime
duration_minutes: int
class OneTimeEvent(BaseModel):
"""Model for a one-time event."""
discriminator: Literal["one_time"]
class RecurringEvent(BaseModel):
"""Model for a recurring event."""
discriminator: Literal["recurring"]
frequency: RecurrenceFrequency
count: int
class GoogleCalendarCreateEventBlock(Block):
class Input(BlockSchema):
credentials: GoogleCredentialsInput = GoogleCredentialsField(
["https://www.googleapis.com/auth/calendar"]
)
# Event Details
event_title: str = SchemaField(description="Title of the event")
location: str | None = SchemaField(
description="Location of the event", default=None
)
description: str | None = SchemaField(
description="Description of the event", default=None
)
# Timing
timing: ExactTiming | DurationTiming = SchemaField(
discriminator="discriminator",
advanced=False,
description="Specify when the event starts and ends",
default_factory=lambda: DurationTiming(
discriminator="duration_timing",
start_datetime=datetime.now().replace(microsecond=0, second=0, minute=0)
+ timedelta(hours=1),
duration_minutes=60,
),
)
# Calendar selection
calendar_id: str = SchemaField(
description="Calendar ID (use 'primary' for your main calendar)",
default="primary",
)
# Guests
guest_emails: list[str] = SchemaField(
description="Email addresses of guests to invite", default_factory=list
)
send_notifications: bool = SchemaField(
description="Send email notifications to guests", default=True
)
# Extras
add_google_meet: bool = SchemaField(
description="Include a Google Meet video conference link", default=False
)
recurrence: OneTimeEvent | RecurringEvent = SchemaField(
discriminator="discriminator",
description="Whether the event repeats",
default_factory=lambda: OneTimeEvent(discriminator="one_time"),
)
reminder_minutes: list[ReminderPreset] = SchemaField(
description="When to send reminders before the event",
default_factory=lambda: [ReminderPreset.TEN_MINUTES],
)
class Output(BlockSchema):
event_id: str = SchemaField(description="ID of the created event")
event_link: str = SchemaField(
description="Link to view the event in Google Calendar"
)
error: str = SchemaField(description="Error message if event creation failed")
def __init__(self):
settings = Settings()
super().__init__(
id="ed2ec950-fbff-4204-94c0-023fb1d625e0",
description="This block creates a new event in Google Calendar with customizable parameters.",
categories={BlockCategory.PRODUCTIVITY},
input_schema=GoogleCalendarCreateEventBlock.Input,
output_schema=GoogleCalendarCreateEventBlock.Output,
disabled=not GOOGLE_OAUTH_IS_CONFIGURED
or settings.config.app_env == AppEnvironment.PRODUCTION,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"event_title": "Team Meeting",
"location": "Conference Room A",
"description": "Weekly team sync-up",
"calendar_id": "primary",
"guest_emails": ["colleague1@example.com", "colleague2@example.com"],
"add_google_meet": True,
"send_notifications": True,
"reminder_minutes": [
ReminderPreset.TEN_MINUTES.value,
ReminderPreset.ONE_HOUR.value,
],
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("event_id", "abc123event_id"),
("event_link", "https://calendar.google.com/calendar/event?eid=abc123"),
],
test_mock={
"_create_event": lambda *args, **kwargs: {
"id": "abc123event_id",
"htmlLink": "https://calendar.google.com/calendar/event?eid=abc123",
}
},
)
def run(
self, input_data: Input, *, credentials: GoogleCredentials, **kwargs
) -> BlockOutput:
try:
service = self._build_service(credentials, **kwargs)
# Get start and end times based on the timing option
if input_data.timing.discriminator == "exact_timing":
start_datetime = input_data.timing.start_datetime
end_datetime = input_data.timing.end_datetime
else: # duration_timing
start_datetime = input_data.timing.start_datetime
end_datetime = start_datetime + timedelta(
minutes=input_data.timing.duration_minutes
)
# Format datetimes for Google Calendar API
start_time_str = start_datetime.isoformat()
end_time_str = end_datetime.isoformat()
# Build the event body
event_body = {
"summary": input_data.event_title,
"start": {"dateTime": start_time_str},
"end": {"dateTime": end_time_str},
}
# Add optional fields
if input_data.location:
event_body["location"] = input_data.location
if input_data.description:
event_body["description"] = input_data.description
# Add guests
if input_data.guest_emails:
event_body["attendees"] = [
{"email": email} for email in input_data.guest_emails
]
# Add reminders
if input_data.reminder_minutes:
event_body["reminders"] = {
"useDefault": False,
"overrides": [
{"method": "popup", "minutes": reminder.value}
for reminder in input_data.reminder_minutes
],
}
# Add Google Meet
if input_data.add_google_meet:
event_body["conferenceData"] = {
"createRequest": {
"requestId": f"meet-{uuid.uuid4()}",
"conferenceSolutionKey": {"type": "hangoutsMeet"},
}
}
# Add recurrence
if input_data.recurrence.discriminator == "recurring":
rule = f"RRULE:FREQ={input_data.recurrence.frequency.value}"
rule += f";COUNT={input_data.recurrence.count}"
event_body["recurrence"] = [rule]
# Create the event
result = self._create_event(
service=service,
calendar_id=input_data.calendar_id,
event_body=event_body,
send_notifications=input_data.send_notifications,
conference_data_version=1 if input_data.add_google_meet else 0,
)
yield "event_id", result.get("id", "")
yield "event_link", result.get("htmlLink", "")
except Exception as e:
yield "error", str(e)
@staticmethod
def _build_service(credentials: GoogleCredentials, **kwargs):
creds = Credentials(
token=(
credentials.access_token.get_secret_value()
if credentials.access_token
else None
),
refresh_token=(
credentials.refresh_token.get_secret_value()
if credentials.refresh_token
else None
),
token_uri="https://oauth2.googleapis.com/token",
client_id=Settings().secrets.google_client_id,
client_secret=Settings().secrets.google_client_secret,
scopes=credentials.scopes,
)
return build("calendar", "v3", credentials=creds)
def _create_event(
self,
service,
calendar_id: str,
event_body: dict,
send_notifications: bool = False,
conference_data_version: int = 0,
) -> dict:
"""Create a new event in Google Calendar."""
calendar = service.events()
# Make the API call
result = calendar.insert(
calendarId=calendar_id,
body=event_body,
sendNotifications=send_notifications,
conferenceDataVersion=conference_data_version,
).execute()
return result
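A noteworthy pattern in this new file is the discriminated-union inputs (`ExactTiming | DurationTiming`, `OneTimeEvent | RecurringEvent`): the literal `discriminator` field tells pydantic which model to validate against. In plain pydantic (the platform's `SchemaField` adds UI metadata on top):

```python
from datetime import datetime
from typing import Literal, Union
from pydantic import BaseModel, Field

class ExactTiming(BaseModel):
    discriminator: Literal["exact_timing"]
    start_datetime: datetime
    end_datetime: datetime

class DurationTiming(BaseModel):
    discriminator: Literal["duration_timing"]
    start_datetime: datetime
    duration_minutes: int

class EventInput(BaseModel):
    timing: Union[ExactTiming, DurationTiming] = Field(discriminator="discriminator")

# The discriminator value selects the branch during validation:
e = EventInput(timing={
    "discriminator": "duration_timing",
    "start_datetime": "2025-01-01T09:00:00",
    "duration_minutes": 60,
})
assert isinstance(e.timing, DurationTiming)
```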

View File

@@ -3,7 +3,7 @@ from googleapiclient.discovery import build
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.settings import Settings
from backend.util.settings import AppEnvironment, Settings
from ._auth import (
GOOGLE_OAUTH_IS_CONFIGURED,
@@ -36,13 +36,15 @@ class GoogleSheetsReadBlock(Block):
)
def __init__(self):
settings = Settings()
super().__init__(
id="5724e902-3635-47e9-a108-aaa0263a4988",
description="This block reads data from a Google Sheets spreadsheet.",
categories={BlockCategory.DATA},
input_schema=GoogleSheetsReadBlock.Input,
output_schema=GoogleSheetsReadBlock.Output,
disabled=not GOOGLE_OAUTH_IS_CONFIGURED,
disabled=not GOOGLE_OAUTH_IS_CONFIGURED
or settings.config.app_env == AppEnvironment.PRODUCTION,
test_input={
"spreadsheet_id": "1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs74OgvE2upms",
"range": "Sheet1!A1:B2",

View File

@@ -82,7 +82,15 @@ class SendWebRequestBlock(Block):
json=body if input_data.json_format else None,
data=body if not input_data.json_format else None,
)
result = response.json() if input_data.json_format else response.text
if input_data.json_format:
if response.status_code == 204 or not response.content.strip():
result = None
else:
result = response.json()
else:
result = response.text
yield "response", result
except HTTPError as e:
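The guard exists because `Response.json()` raises on an empty body, which is exactly what a 204 No Content returns. A minimal reproduction with the stdlib parser that `requests` delegates to:

```python
import json

try:
    json.loads("")  # what the old unconditional response.json() path hit on 204s
except json.JSONDecodeError:
    result = None   # the new behavior: empty/204 bodies map to None
```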

View File

@@ -23,10 +23,11 @@ from backend.data.model import (
)
from backend.integrations.providers import ProviderName
from backend.util import json
from backend.util.logging import TruncatedLogger
from backend.util.settings import BehaveAs, Settings
from backend.util.text import TextFormatter
logger = logging.getLogger(__name__)
logger = TruncatedLogger(logging.getLogger(__name__), "[LLM-Block]")
fmt = TextFormatter()
LLMProviderName = Literal[
@@ -35,6 +36,7 @@ LLMProviderName = Literal[
ProviderName.OLLAMA,
ProviderName.OPENAI,
ProviderName.OPEN_ROUTER,
ProviderName.LLAMA_API,
]
AICredentials = CredentialsMetaInput[LLMProviderName, Literal["api_key"]]
@@ -99,6 +101,8 @@ class LlmModel(str, Enum, metaclass=LlmModelMeta):
GPT4_TURBO = "gpt-4-turbo"
GPT3_5_TURBO = "gpt-3.5-turbo"
# Anthropic models
CLAUDE_4_OPUS = "claude-opus-4-20250514"
CLAUDE_4_SONNET = "claude-sonnet-4-20250514"
CLAUDE_3_7_SONNET = "claude-3-7-sonnet-20250219"
CLAUDE_3_5_SONNET = "claude-3-5-sonnet-latest"
CLAUDE_3_5_HAIKU = "claude-3-5-haiku-latest"
@@ -140,6 +144,11 @@ class LlmModel(str, Enum, metaclass=LlmModelMeta):
GRYPHE_MYTHOMAX_L2_13B = "gryphe/mythomax-l2-13b"
META_LLAMA_4_SCOUT = "meta-llama/llama-4-scout"
META_LLAMA_4_MAVERICK = "meta-llama/llama-4-maverick"
# Llama API models
LLAMA_API_LLAMA_4_SCOUT = "Llama-4-Scout-17B-16E-Instruct-FP8"
LLAMA_API_LLAMA4_MAVERICK = "Llama-4-Maverick-17B-128E-Instruct-FP8"
LLAMA_API_LLAMA3_3_8B = "Llama-3.3-8B-Instruct"
LLAMA_API_LLAMA3_3_70B = "Llama-3.3-70B-Instruct"
@property
def metadata(self) -> ModelMetadata:
@@ -177,6 +186,12 @@ MODEL_METADATA = {
), # gpt-4-turbo-2024-04-09
LlmModel.GPT3_5_TURBO: ModelMetadata("openai", 16385, 4096), # gpt-3.5-turbo-0125
# https://docs.anthropic.com/en/docs/about-claude/models
LlmModel.CLAUDE_4_OPUS: ModelMetadata(
"anthropic", 200000, 8192
), # claude-4-opus-20250514
LlmModel.CLAUDE_4_SONNET: ModelMetadata(
"anthropic", 200000, 8192
), # claude-4-sonnet-20250514
LlmModel.CLAUDE_3_7_SONNET: ModelMetadata(
"anthropic", 200000, 8192
), # claude-3-7-sonnet-20250219
@@ -229,6 +244,11 @@ MODEL_METADATA = {
LlmModel.GRYPHE_MYTHOMAX_L2_13B: ModelMetadata("open_router", 4096, 4096),
LlmModel.META_LLAMA_4_SCOUT: ModelMetadata("open_router", 131072, 131072),
LlmModel.META_LLAMA_4_MAVERICK: ModelMetadata("open_router", 1048576, 1000000),
# Llama API models
LlmModel.LLAMA_API_LLAMA_4_SCOUT: ModelMetadata("llama_api", 128000, 4028),
LlmModel.LLAMA_API_LLAMA4_MAVERICK: ModelMetadata("llama_api", 128000, 4028),
LlmModel.LLAMA_API_LLAMA3_3_8B: ModelMetadata("llama_api", 128000, 4028),
LlmModel.LLAMA_API_LLAMA3_3_70B: ModelMetadata("llama_api", 128000, 4028),
}
for model in LlmModel:
@@ -288,6 +308,13 @@ def convert_openai_tool_fmt_to_anthropic(
return anthropic_tools
def estimate_token_count(prompt_messages: list[dict]) -> int:
char_count = sum(len(str(msg.get("content", ""))) for msg in prompt_messages)
message_overhead = len(prompt_messages) * 4
estimated_tokens = (char_count // 4) + message_overhead
return int(estimated_tokens * 1.2)
def llm_call(
credentials: APIKeyCredentials,
llm_model: LlmModel,
@@ -319,7 +346,14 @@ def llm_call(
- completion_tokens: The number of tokens used in the completion.
"""
provider = llm_model.metadata.provider
max_tokens = max_tokens or llm_model.max_output_tokens or 4096
# Calculate available tokens based on context window and input length
estimated_input_tokens = estimate_token_count(prompt)
context_window = llm_model.context_window
model_max_output = llm_model.max_output_tokens or 4096
user_max = max_tokens or model_max_output
available_tokens = max(context_window - estimated_input_tokens, 0)
max_tokens = max(min(available_tokens, model_max_output, user_max), 0)
if provider == "openai":
tools_param = tools if tools else openai.NOT_GIVEN
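Worked numbers for the new budget, plugging illustrative values into the formulas above (two messages totaling 400 characters, an 8192-token context window, a 4096-token output cap):

```python
char_count = 400
message_overhead = 2 * 4                                             # 4 tokens per message
estimated_input = int(((char_count // 4) + message_overhead) * 1.2)  # -> 129

context_window, model_max_output, user_max = 8192, 4096, 4096
available = max(context_window - estimated_input, 0)                 # -> 8063
max_tokens = max(min(available, model_max_output, user_max), 0)      # -> 4096
# With a much larger prompt (estimated_input == 7000), available falls to
# 1192 and max_tokens shrinks to 1192 instead of overrunning the window.
```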
@@ -425,7 +459,7 @@ def llm_call(
if not tool_calls and resp.stop_reason == "tool_use":
logger.warning(
"Tool use stop reason but no tool calls found in content. %s", resp
f"Tool use stop reason but no tool calls found in content. {resp}"
)
return LLMResponse(
@@ -475,6 +509,7 @@ def llm_call(
model=llm_model.value,
prompt=f"{sys_messages}\n\n{usr_messages}",
stream=False,
options={"num_ctx": max_tokens},
)
return LLMResponse(
raw_response=response.get("response") or "",
@@ -500,9 +535,6 @@ def llm_call(
messages=prompt, # type: ignore
max_tokens=max_tokens,
tools=tools_param, # type: ignore
parallel_tool_calls=(
openai.NOT_GIVEN if parallel_tool_calls is None else parallel_tool_calls
),
)
# If there's no response, raise an error
@@ -526,6 +558,56 @@ def llm_call(
else:
tool_calls = None
return LLMResponse(
raw_response=response.choices[0].message,
prompt=prompt,
response=response.choices[0].message.content or "",
tool_calls=tool_calls,
prompt_tokens=response.usage.prompt_tokens if response.usage else 0,
completion_tokens=response.usage.completion_tokens if response.usage else 0,
)
elif provider == "llama_api":
tools_param = tools if tools else openai.NOT_GIVEN
client = openai.OpenAI(
base_url="https://api.llama.com/compat/v1/",
api_key=credentials.api_key.get_secret_value(),
)
response = client.chat.completions.create(
extra_headers={
"HTTP-Referer": "https://agpt.co",
"X-Title": "AutoGPT",
},
model=llm_model.value,
messages=prompt, # type: ignore
max_tokens=max_tokens,
tools=tools_param, # type: ignore
parallel_tool_calls=(
openai.NOT_GIVEN if parallel_tool_calls is None else parallel_tool_calls
),
)
# If there's no response, raise an error
if not response.choices:
if response:
raise ValueError(f"Llama API error: {response}")
else:
raise ValueError("No response from Llama API.")
if response.choices[0].message.tool_calls:
tool_calls = [
ToolContentBlock(
id=tool.id,
type=tool.type,
function=ToolCall(
name=tool.function.name, arguments=tool.function.arguments
),
)
for tool in response.choices[0].message.tool_calls
]
else:
tool_calls = None
return LLMResponse(
raw_response=response.choices[0].message,
prompt=prompt,
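The new `llama_api` branch (paired with the `LLAMA_API_KEY` env var added earlier) drives Meta's OpenAI-compatible endpoint with the stock `openai` client by overriding `base_url`, the same pattern the open_router branch uses. In isolation (endpoint URL and model name taken from the diff above):

```python
import openai

client = openai.OpenAI(
    base_url="https://api.llama.com/compat/v1/",
    api_key="<LLAMA_API_KEY>",
)
resp = client.chat.completions.create(
    model="Llama-3.3-70B-Instruct",
    messages=[{"role": "user", "content": "Say hi"}],
)
print(resp.choices[0].message.content)
```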
@@ -773,6 +855,16 @@ class AIStructuredResponseGeneratorBlock(AIBlockBase):
prompt.append({"role": "user", "content": retry_prompt})
except Exception as e:
logger.exception(f"Error calling LLM: {e}")
if (
"maximum context length" in str(e).lower()
or "token limit" in str(e).lower()
):
if input_data.max_tokens is None:
input_data.max_tokens = llm_model.max_output_tokens or 4096
input_data.max_tokens = int(input_data.max_tokens * 0.85)
logger.debug(
f"Reducing max_tokens to {input_data.max_tokens} for next attempt"
)
retry_prompt = f"Error calling LLM: {e}"
finally:
self.merge_stats(
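The retry handler above shrinks the output budget by 15% whenever the provider reports a context-length error, so repeated failures converge on a size that fits. The progression in isolation:

```python
max_tokens = 4096
for _ in range(3):
    max_tokens = int(max_tokens * 0.85)
    print(max_tokens)  # 3481, then 2958, then 2514
```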

View File

@@ -124,8 +124,10 @@ class AddMemoryBlock(Block, Mem0Base):
if isinstance(input_data.content, Conversation):
messages = input_data.content.messages
elif isinstance(input_data.content, Content):
messages = [{"role": "user", "content": input_data.content.content}]
else:
messages = [{"role": "user", "content": input_data.content}]
messages = [{"role": "user", "content": str(input_data.content)}]
params = {
"user_id": user_id,
@@ -152,7 +154,7 @@ class AddMemoryBlock(Block, Mem0Base):
yield "action", "NO_CHANGE"
except Exception as e:
yield "error", str(object=e)
yield "error", str(e)
class SearchMemoryBlock(Block, Mem0Base):
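The fix in miniature: `Content` is a wrapper model, so embedding it directly in the message dict produced a payload the JSON serializer rejected. Unwrap the known type, stringify anything else (class and helper names below are illustrative):

```python
from pydantic import BaseModel

class Content(BaseModel):
    content: str

def to_message(value) -> dict:
    if isinstance(value, Content):
        value = value.content  # unwrap to the underlying string
    elif not isinstance(value, str):
        value = str(value)     # last-resort stringification
    return {"role": "user", "content": value}

assert to_message(Content(content="hi")) == {"role": "user", "content": "hi"}
```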

View File

@@ -2,8 +2,8 @@ import os
from enum import Enum
from typing import Literal
import replicate
from pydantic import SecretStr
from replicate.client import Client as ReplicateClient
from replicate.helpers import FileOutput
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
@@ -198,7 +198,7 @@ class ReplicateFluxAdvancedModelBlock(Block):
safety_tolerance,
):
# Initialize Replicate client with the API key
client = replicate.Client(api_token=api_key.get_secret_value())
client = ReplicateClient(api_token=api_key.get_secret_value())
# Run the model with additional parameters
output: FileOutput | list[FileOutput] = client.run( # type: ignore (Replicate changed the return type without updating the type hint; it should be overloaded on `use_file_output` to return `FileOutput | list[FileOutput]`, but is annotated `Any | Iterator[Any]`)

View File

@@ -26,10 +26,10 @@ logger = logging.getLogger(__name__)
@thread_cached
def get_database_manager_client():
from backend.executor import DatabaseManager
from backend.executor import DatabaseManagerClient
from backend.util.service import get_service_client
return get_service_client(DatabaseManager)
return get_service_client(DatabaseManagerClient)
def _get_tool_requests(entry: dict[str, Any]) -> list[str]:
@@ -246,6 +246,10 @@ class SmartDecisionMakerBlock(Block):
test_credentials=llm.TEST_CREDENTIALS,
)
@staticmethod
def cleanup(s: str):
return re.sub(r"[^a-zA-Z0-9_-]", "_", s).lower()
@staticmethod
def _create_block_function_signature(
sink_node: "Node", links: list["Link"]
@@ -266,7 +270,7 @@ class SmartDecisionMakerBlock(Block):
block = sink_node.block
tool_function: dict[str, Any] = {
"name": re.sub(r"[^a-zA-Z0-9_-]", "_", block.name).lower(),
"name": SmartDecisionMakerBlock.cleanup(block.name),
"description": block.description,
}
@@ -281,7 +285,7 @@ class SmartDecisionMakerBlock(Block):
and sink_block_input_schema.model_fields[link.sink_name].description
else f"The {link.sink_name} of the tool"
)
properties[link.sink_name.lower()] = {
properties[SmartDecisionMakerBlock.cleanup(link.sink_name)] = {
"type": "string",
"description": description,
}
@@ -326,7 +330,7 @@ class SmartDecisionMakerBlock(Block):
)
tool_function: dict[str, Any] = {
"name": re.sub(r"[^a-zA-Z0-9_-]", "_", sink_graph_meta.name).lower(),
"name": SmartDecisionMakerBlock.cleanup(sink_graph_meta.name),
"description": sink_graph_meta.description,
}
@@ -341,7 +345,7 @@ class SmartDecisionMakerBlock(Block):
in sink_block_input_schema["properties"][link.sink_name]
else f"The {link.sink_name} of the tool"
)
properties[link.sink_name.lower()] = {
properties[SmartDecisionMakerBlock.cleanup(link.sink_name)] = {
"type": "string",
"description": description,
}
@@ -503,7 +507,7 @@ class SmartDecisionMakerBlock(Block):
tool_args = json.loads(tool_call.function.arguments)
for arg_name, arg_value in tool_args.items():
yield f"tools_^_{tool_name}_{arg_name}".lower(), arg_value
yield f"tools_^_{tool_name}_~_{arg_name}", arg_value
response.prompt.append(response.raw_response)
yield "conversations", response.prompt

View File

@@ -7,6 +7,7 @@ from typing_extensions import Optional
from backend.blocks.todoist._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TODOIST_OAUTH_IS_CONFIGURED,
TodoistCredentials,
TodoistCredentialsField,
TodoistCredentialsInput,
@@ -61,6 +62,7 @@ class TodoistCreateCommentBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistCreateCommentBlock.Input,
output_schema=TodoistCreateCommentBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"content": "Test comment",
@@ -164,6 +166,7 @@ class TodoistGetCommentsBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetCommentsBlock.Input,
output_schema=TodoistGetCommentsBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"id_type": {"discriminator": "task", "task_id": "2995104339"},
@@ -268,6 +271,7 @@ class TodoistGetCommentBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetCommentBlock.Input,
output_schema=TodoistGetCommentBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"comment_id": "2992679862",
@@ -346,6 +350,7 @@ class TodoistUpdateCommentBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistUpdateCommentBlock.Input,
output_schema=TodoistUpdateCommentBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"comment_id": "2992679862",
@@ -404,6 +409,7 @@ class TodoistDeleteCommentBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistDeleteCommentBlock.Input,
output_schema=TodoistDeleteCommentBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"comment_id": "2992679862",

View File

@@ -4,6 +4,7 @@ from typing_extensions import Optional
from backend.blocks.todoist._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TODOIST_OAUTH_IS_CONFIGURED,
TodoistCredentials,
TodoistCredentialsField,
TodoistCredentialsInput,
@@ -42,6 +43,7 @@ class TodoistCreateLabelBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistCreateLabelBlock.Input,
output_schema=TodoistCreateLabelBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"name": "Test Label",
@@ -130,6 +132,7 @@ class TodoistListLabelsBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistListLabelsBlock.Input,
output_schema=TodoistListLabelsBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={"credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[
@@ -211,6 +214,7 @@ class TodoistGetLabelBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetLabelBlock.Input,
output_schema=TodoistGetLabelBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"label_id": "2156154810",
@@ -293,6 +297,7 @@ class TodoistUpdateLabelBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistUpdateLabelBlock.Input,
output_schema=TodoistUpdateLabelBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"label_id": "2156154810",
@@ -364,6 +369,7 @@ class TodoistDeleteLabelBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistDeleteLabelBlock.Input,
output_schema=TodoistDeleteLabelBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"label_id": "2156154810",
@@ -415,6 +421,7 @@ class TodoistGetSharedLabelsBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetSharedLabelsBlock.Input,
output_schema=TodoistGetSharedLabelsBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={"credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[("labels", ["Label1", "Label2", "Label3"])],
@@ -471,6 +478,7 @@ class TodoistRenameSharedLabelsBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistRenameSharedLabelsBlock.Input,
output_schema=TodoistRenameSharedLabelsBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"name": "OldLabel",
@@ -526,6 +534,7 @@ class TodoistRemoveSharedLabelsBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistRemoveSharedLabelsBlock.Input,
output_schema=TodoistRemoveSharedLabelsBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "name": "LabelToRemove"},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],

View File

@@ -4,6 +4,7 @@ from typing_extensions import Optional
from backend.blocks.todoist._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TODOIST_OAUTH_IS_CONFIGURED,
TodoistCredentials,
TodoistCredentialsField,
TodoistCredentialsInput,
@@ -35,6 +36,7 @@ class TodoistListProjectsBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistListProjectsBlock.Input,
output_schema=TodoistListProjectsBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
},
@@ -150,6 +152,7 @@ class TodoistCreateProjectBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistCreateProjectBlock.Input,
output_schema=TodoistCreateProjectBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "name": "Test Project"},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
@@ -230,6 +233,7 @@ class TodoistGetProjectBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetProjectBlock.Input,
output_schema=TodoistGetProjectBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"project_id": "2203306141",
@@ -332,6 +336,7 @@ class TodoistUpdateProjectBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistUpdateProjectBlock.Input,
output_schema=TodoistUpdateProjectBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"project_id": "2203306141",
@@ -413,6 +418,7 @@ class TodoistDeleteProjectBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistDeleteProjectBlock.Input,
output_schema=TodoistDeleteProjectBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"project_id": "2203306141",
@@ -481,6 +487,7 @@ class TodoistListCollaboratorsBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistListCollaboratorsBlock.Input,
output_schema=TodoistListCollaboratorsBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"project_id": "2203306141",

View File

@@ -4,6 +4,7 @@ from typing_extensions import Optional
from backend.blocks.todoist._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TODOIST_OAUTH_IS_CONFIGURED,
TodoistCredentials,
TodoistCredentialsField,
TodoistCredentialsInput,
@@ -36,6 +37,7 @@ class TodoistListSectionsBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistListSectionsBlock.Input,
output_schema=TodoistListSectionsBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"project_id": "2203306141",
@@ -207,6 +209,7 @@ class TodoistGetSectionBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetSectionBlock.Input,
output_schema=TodoistGetSectionBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "section_id": "7025"},
test_credentials=TEST_CREDENTIALS,
test_output=[
@@ -275,6 +278,7 @@ class TodoistDeleteSectionBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistDeleteSectionBlock.Input,
output_schema=TodoistDeleteSectionBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "section_id": "7025"},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],

View File

@@ -7,6 +7,7 @@ from typing_extensions import Optional
from backend.blocks.todoist._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TODOIST_OAUTH_IS_CONFIGURED,
TodoistCredentials,
TodoistCredentialsField,
TodoistCredentialsInput,
@@ -86,6 +87,7 @@ class TodoistCreateTaskBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistCreateTaskBlock.Input,
output_schema=TodoistCreateTaskBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"content": "Buy groceries",
@@ -217,6 +219,7 @@ class TodoistGetTasksBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetTasksBlock.Input,
output_schema=TodoistGetTasksBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"project_id": "2203306141",
@@ -309,6 +312,7 @@ class TodoistGetTaskBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetTaskBlock.Input,
output_schema=TodoistGetTaskBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "task_id": "2995104339"},
test_credentials=TEST_CREDENTIALS,
test_output=[
@@ -428,6 +432,7 @@ class TodoistUpdateTaskBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistUpdateTaskBlock.Input,
output_schema=TodoistUpdateTaskBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"task_id": "2995104339",
@@ -467,32 +472,24 @@ class TodoistUpdateTaskBlock(Block):
)
task_updates = {}
if input_data.content is not None:
task_updates["content"] = input_data.content
if input_data.description is not None:
task_updates["description"] = input_data.description
if input_data.project_id is not None:
task_updates["project_id"] = input_data.project_id
if input_data.section_id is not None:
task_updates["section_id"] = input_data.section_id
if input_data.parent_id is not None:
task_updates["parent_id"] = input_data.parent_id
if input_data.order is not None:
task_updates["order"] = input_data.order
if input_data.labels is not None:
task_updates["labels"] = input_data.labels
if input_data.priority is not None:
task_updates["priority"] = input_data.priority
if due_date is not None:
task_updates["due_date"] = due_date
if deadline_date is not None:
task_updates["deadline_date"] = deadline_date
if input_data.assignee_id is not None:
task_updates["assignee_id"] = input_data.assignee_id
if input_data.duration is not None:
task_updates["duration"] = input_data.duration
if input_data.duration_unit is not None:
task_updates["duration_unit"] = input_data.duration_unit
update_fields = {
"content": input_data.content,
"description": input_data.description,
"project_id": input_data.project_id,
"section_id": input_data.section_id,
"parent_id": input_data.parent_id,
"order": input_data.order,
"labels": input_data.labels,
"priority": input_data.priority,
"due_date": due_date,
"deadline_date": deadline_date,
"assignee_id": input_data.assignee_id,
"duration": input_data.duration,
"duration_unit": input_data.duration_unit,
}
# Filter out None values
task_updates = {k: v for k, v in update_fields.items() if v is not None}
self.update_task(
credentials,
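The TodoistUpdateTaskBlock hunk above replaces thirteen `if ... is not None` assignments with one candidate dict plus a filtering comprehension. A standalone sketch of the same technique:

```python
from typing import Any

def build_updates(**fields: Any) -> dict[str, Any]:
    # Keep only the fields the caller actually set; None means "unchanged".
    return {k: v for k, v in fields.items() if v is not None}

updates = build_updates(content="Buy milk", priority=None, labels=["home"])
assert updates == {"content": "Buy milk", "labels": ["home"]}
```

Like the original `is not None` chain, falsy-but-set values such as `0` or `""` still pass through; only `None` is dropped.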
@@ -526,6 +523,7 @@ class TodoistCloseTaskBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistCloseTaskBlock.Input,
output_schema=TodoistCloseTaskBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "task_id": "2995104339"},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
@@ -576,6 +574,7 @@ class TodoistReopenTaskBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistReopenTaskBlock.Input,
output_schema=TodoistReopenTaskBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "task_id": "2995104339"},
test_credentials=TEST_CREDENTIALS,
test_output=[
@@ -628,6 +627,7 @@ class TodoistDeleteTaskBlock(Block):
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistDeleteTaskBlock.Input,
output_schema=TodoistDeleteTaskBlock.Output,
disabled=not TODOIST_OAUTH_IS_CONFIGURED,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "task_id": "2995104339"},
test_credentials=TEST_CREDENTIALS,
test_output=[

View File

@@ -1,4 +1,4 @@
# Todo : Add new Type support
# Todo : Add new Type support, and disable block if its OAuth is not configured
# from typing import cast
# import tweepy

View File

@@ -1,4 +1,4 @@
# Todo : Add new Type support
# Todo : Add new Type support, and disable block if its OAuth is not configured
# from typing import cast

View File

@@ -4,6 +4,7 @@ import tweepy
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -44,6 +45,7 @@ class TwitterUnfollowListBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterUnfollowListBlock.Input,
output_schema=TwitterUnfollowListBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={"list_id": "123456789", "credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[
@@ -106,6 +108,7 @@ class TwitterFollowListBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterFollowListBlock.Input,
output_schema=TwitterFollowListBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={"list_id": "123456789", "credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -63,6 +64,7 @@ class TwitterGetListBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetListBlock.Input,
output_schema=TwitterGetListBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"list_id": "84839422",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -220,6 +222,7 @@ class TwitterGetOwnedListsBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetOwnedListsBlock.Input,
output_schema=TwitterGetOwnedListsBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"user_id": "2244994945",
"max_results": 10,

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -65,6 +66,7 @@ class TwitterRemoveListMemberBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterRemoveListMemberBlock.Input,
output_schema=TwitterRemoveListMemberBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"list_id": "123456789",
"user_id": "987654321",
@@ -138,6 +140,7 @@ class TwitterAddListMemberBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterAddListMemberBlock.Input,
output_schema=TwitterAddListMemberBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"list_id": "123456789",
"user_id": "987654321",
@@ -229,6 +232,7 @@ class TwitterGetListMembersBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetListMembersBlock.Input,
output_schema=TwitterGetListMembersBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"list_id": "123456789",
"max_results": 2,
@@ -405,6 +409,7 @@ class TwitterGetListMembershipsBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetListMembershipsBlock.Input,
output_schema=TwitterGetListMembershipsBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"user_id": "123456789",
"max_results": 1,

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -81,6 +82,7 @@ class TwitterGetListTweetsBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetListTweetsBlock.Input,
output_schema=TwitterGetListTweetsBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"list_id": "84839422",
"max_results": 1,

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -41,6 +42,7 @@ class TwitterDeleteListBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterDeleteListBlock.Input,
output_schema=TwitterDeleteListBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={"list_id": "1234567890", "credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
@@ -118,6 +120,7 @@ class TwitterUpdateListBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterUpdateListBlock.Input,
output_schema=TwitterUpdateListBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"list_id": "1234567890",
"name": "Updated List Name",
@@ -214,6 +217,7 @@ class TwitterCreateListBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterCreateListBlock.Input,
output_schema=TwitterCreateListBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"name": "New List Name",
"description": "New List Description",

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -52,6 +53,7 @@ class TwitterUnpinListBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterUnpinListBlock.Input,
output_schema=TwitterUnpinListBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={"list_id": "123456789", "credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
@@ -115,6 +117,7 @@ class TwitterPinListBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterPinListBlock.Input,
output_schema=TwitterPinListBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={"list_id": "123456789", "credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
@@ -184,6 +187,7 @@ class TwitterGetPinnedListsBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetPinnedListsBlock.Input,
output_schema=TwitterGetPinnedListsBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"expansions": None,
"list_fields": None,

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -78,6 +79,7 @@ class TwitterSearchSpacesBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterSearchSpacesBlock.Input,
output_schema=TwitterSearchSpacesBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"query": "tech",
"max_results": 1,

View File

@@ -7,6 +7,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -94,6 +95,7 @@ class TwitterGetSpacesBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetSpacesBlock.Input,
output_schema=TwitterGetSpacesBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"identifier": {
"discriminator": "space_list",
@@ -249,6 +251,7 @@ class TwitterGetSpaceByIdBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetSpaceByIdBlock.Input,
output_schema=TwitterGetSpaceByIdBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"space_id": "1DXxyRYNejbKM",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -409,6 +412,7 @@ class TwitterGetSpaceBuyersBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetSpaceBuyersBlock.Input,
output_schema=TwitterGetSpaceBuyersBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"space_id": "1DXxyRYNejbKM",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -537,6 +541,7 @@ class TwitterGetSpaceTweetsBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetSpaceTweetsBlock.Input,
output_schema=TwitterGetSpaceTweetsBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"space_id": "1DXxyRYNejbKM",
"credentials": TEST_CREDENTIALS_INPUT,

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -55,6 +56,7 @@ class TwitterBookmarkTweetBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterBookmarkTweetBlock.Input,
output_schema=TwitterBookmarkTweetBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -147,6 +149,7 @@ class TwitterGetBookmarkedTweetsBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetBookmarkedTweetsBlock.Input,
output_schema=TwitterGetBookmarkedTweetsBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"max_results": 2,
"pagination_token": None,
@@ -330,6 +333,7 @@ class TwitterRemoveBookmarkTweetBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterRemoveBookmarkTweetBlock.Input,
output_schema=TwitterRemoveBookmarkTweetBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,

View File

@@ -3,6 +3,7 @@ import tweepy
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -38,6 +39,7 @@ class TwitterHideReplyBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterHideReplyBlock.Input,
output_schema=TwitterHideReplyBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -109,6 +111,7 @@ class TwitterUnhideReplyBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterUnhideReplyBlock.Input,
output_schema=TwitterUnhideReplyBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -60,6 +61,7 @@ class TwitterLikeTweetBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterLikeTweetBlock.Input,
output_schema=TwitterLikeTweetBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -159,6 +161,7 @@ class TwitterGetLikingUsersBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetLikingUsersBlock.Input,
output_schema=TwitterGetLikingUsersBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1234567890",
"max_results": 1,
@@ -337,6 +340,7 @@ class TwitterGetLikedTweetsBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetLikedTweetsBlock.Input,
output_schema=TwitterGetLikedTweetsBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"user_id": "1234567890",
"max_results": 2,
@@ -531,6 +535,7 @@ class TwitterUnlikeTweetBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterUnlikeTweetBlock.Input,
output_schema=TwitterUnlikeTweetBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,

View File

@@ -8,6 +8,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -131,6 +132,7 @@ class TwitterPostTweetBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterPostTweetBlock.Input,
output_schema=TwitterPostTweetBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_text": "This is a test tweet.",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -263,6 +265,7 @@ class TwitterDeleteTweetBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterDeleteTweetBlock.Input,
output_schema=TwitterDeleteTweetBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -357,6 +360,7 @@ class TwitterSearchRecentTweetsBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterSearchRecentTweetsBlock.Input,
output_schema=TwitterSearchRecentTweetsBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"query": "from:twitterapi #twitterapi",
"credentials": TEST_CREDENTIALS_INPUT,

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -86,6 +87,7 @@ class TwitterGetQuoteTweetsBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetQuoteTweetsBlock.Input,
output_schema=TwitterGetQuoteTweetsBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1234567890",
"max_results": 2,

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -52,6 +53,7 @@ class TwitterRetweetBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterRetweetBlock.Input,
output_schema=TwitterRetweetBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -128,6 +130,7 @@ class TwitterRemoveRetweetBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterRemoveRetweetBlock.Input,
output_schema=TwitterRemoveRetweetBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -231,6 +234,7 @@ class TwitterGetRetweetersBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetRetweetersBlock.Input,
output_schema=TwitterGetRetweetersBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,

View File

@@ -7,6 +7,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -91,6 +92,7 @@ class TwitterGetUserMentionsBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetUserMentionsBlock.Input,
output_schema=TwitterGetUserMentionsBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"user_id": "12345",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -332,6 +334,7 @@ class TwitterGetHomeTimelineBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetHomeTimelineBlock.Input,
output_schema=TwitterGetHomeTimelineBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"max_results": 2,
@@ -568,6 +571,7 @@ class TwitterGetUserTweetsBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetUserTweetsBlock.Input,
output_schema=TwitterGetUserTweetsBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"user_id": "12345",
"credentials": TEST_CREDENTIALS_INPUT,

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -67,6 +68,7 @@ class TwitterGetTweetBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetTweetBlock.Input,
output_schema=TwitterGetTweetBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_id": "1460323737035677698",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -229,6 +231,7 @@ class TwitterGetTweetsBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetTweetsBlock.Input,
output_schema=TwitterGetTweetsBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"tweet_ids": ["1460323737035677698"],
"credentials": TEST_CREDENTIALS_INPUT,

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -64,6 +65,7 @@ class TwitterGetBlockedUsersBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetBlockedUsersBlock.Input,
output_schema=TwitterGetBlockedUsersBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"max_results": 10,
"pagination_token": "",

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -55,6 +56,7 @@ class TwitterUnfollowUserBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterUnfollowUserBlock.Input,
output_schema=TwitterUnfollowUserBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"target_user_id": "12345",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -126,6 +128,7 @@ class TwitterFollowUserBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterFollowUserBlock.Input,
output_schema=TwitterFollowUserBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"target_user_id": "12345",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -213,6 +216,7 @@ class TwitterGetFollowersBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetFollowersBlock.Input,
output_schema=TwitterGetFollowersBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"target_user_id": "12345",
"max_results": 1,
@@ -386,6 +390,7 @@ class TwitterGetFollowingBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetFollowingBlock.Input,
output_schema=TwitterGetFollowingBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"target_user_id": "12345",
"max_results": 1,

View File

@@ -6,6 +6,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -55,6 +56,7 @@ class TwitterUnmuteUserBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterUnmuteUserBlock.Input,
output_schema=TwitterUnmuteUserBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"target_user_id": "12345",
"credentials": TEST_CREDENTIALS_INPUT,
@@ -139,6 +141,7 @@ class TwitterGetMutedUsersBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetMutedUsersBlock.Input,
output_schema=TwitterGetMutedUsersBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"max_results": 2,
"pagination_token": "",
@@ -289,6 +292,7 @@ class TwitterMuteUserBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterMuteUserBlock.Input,
output_schema=TwitterMuteUserBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"target_user_id": "12345",
"credentials": TEST_CREDENTIALS_INPUT,

View File

@@ -7,6 +7,7 @@ from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TWITTER_OAUTH_IS_CONFIGURED,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
@@ -75,6 +76,7 @@ class TwitterGetUserBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetUserBlock.Input,
output_schema=TwitterGetUserBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"identifier": {"discriminator": "username", "username": "twitter"},
"credentials": TEST_CREDENTIALS_INPUT,
@@ -251,6 +253,7 @@ class TwitterGetUsersBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetUsersBlock.Input,
output_schema=TwitterGetUsersBlock.Output,
disabled=not TWITTER_OAUTH_IS_CONFIGURED,
test_input={
"identifier": {
"discriminator": "username_list",

View File

@@ -26,6 +26,7 @@ from backend.integrations.credentials_store import (
groq_credentials,
ideogram_credentials,
jina_credentials,
llama_api_credentials,
open_router_credentials,
openai_credentials,
replicate_credentials,
@@ -46,6 +47,8 @@ MODEL_COST: dict[LlmModel, int] = {
LlmModel.GPT4O: 3,
LlmModel.GPT4_TURBO: 10,
LlmModel.GPT3_5_TURBO: 1,
LlmModel.CLAUDE_4_OPUS: 21,
LlmModel.CLAUDE_4_SONNET: 5,
LlmModel.CLAUDE_3_7_SONNET: 5,
LlmModel.CLAUDE_3_5_SONNET: 4,
LlmModel.CLAUDE_3_5_HAIKU: 1, # $0.80 / $4.00
@@ -81,6 +84,10 @@ MODEL_COST: dict[LlmModel, int] = {
LlmModel.GRYPHE_MYTHOMAX_L2_13B: 1,
LlmModel.META_LLAMA_4_SCOUT: 1,
LlmModel.META_LLAMA_4_MAVERICK: 1,
LlmModel.LLAMA_API_LLAMA_4_SCOUT: 1,
LlmModel.LLAMA_API_LLAMA4_MAVERICK: 1,
LlmModel.LLAMA_API_LLAMA3_3_8B: 1,
LlmModel.LLAMA_API_LLAMA3_3_70B: 1,
}
for model in LlmModel:
@@ -153,6 +160,23 @@ LLM_COST = (
for model, cost in MODEL_COST.items()
if MODEL_METADATA[model].provider == "open_router"
]
# Llama API Models
+ [
BlockCost(
cost_type=BlockCostType.RUN,
cost_filter={
"model": model,
"credentials": {
"id": llama_api_credentials.id,
"provider": llama_api_credentials.provider,
"type": llama_api_credentials.type,
},
},
cost_amount=cost,
)
for model, cost in MODEL_COST.items()
if MODEL_METADATA[model].provider == "llama_api"
]
)
# =============== This is the exhaustive list of cost for each Block =============== #

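The Llama API cost hunk follows the same shape as the existing provider sections: filter `MODEL_COST` down to one provider and emit one `BlockCost` entry per model. A simplified, runnable sketch (the dataclass and the `MODEL_PROVIDER` mapping are stand-ins for the real backend types and `MODEL_METADATA`):

```python
from dataclasses import dataclass

@dataclass
class BlockCost:  # stand-in for backend.data.cost.BlockCost
    cost_type: str
    cost_filter: dict
    cost_amount: int

MODEL_COST = {"llama-4-scout": 1, "llama-3.3-70b": 1, "gpt-4o": 3}
MODEL_PROVIDER = {"llama-4-scout": "llama_api", "llama-3.3-70b": "llama_api", "gpt-4o": "openai"}

llama_api_costs = [
    BlockCost(cost_type="run", cost_filter={"model": model}, cost_amount=cost)
    for model, cost in MODEL_COST.items()
    if MODEL_PROVIDER[model] == "llama_api"
]
assert len(llama_api_costs) == 2
```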
View File

@@ -1,11 +1,10 @@
import asyncio
import logging
from abc import ABC, abstractmethod
from collections import defaultdict
from datetime import datetime, timezone
from typing import Any, cast
import stripe
from autogpt_libs.utils.cache import thread_cached
from prisma import Json
from prisma.enums import (
CreditRefundRequestStatus,
@@ -20,7 +19,7 @@ from prisma.types import (
CreditTransactionCreateInput,
CreditTransactionWhereInput,
)
from tenacity import retry, stop_after_attempt, wait_exponential
from pydantic import BaseModel
from backend.data import db
from backend.data.block_cost_config import BLOCK_COSTS
@@ -28,15 +27,17 @@ from backend.data.cost import BlockCost
from backend.data.model import (
AutoTopUpConfig,
RefundRequest,
TopUpType,
TransactionHistory,
UserTransaction,
)
from backend.data.notifications import NotificationEventDTO, RefundRequestData
from backend.data.user import get_user_by_id
from backend.executor.utils import UsageTransactionMetadata
from backend.notifications import NotificationManager
from backend.data.notifications import NotificationEventModel, RefundRequestData
from backend.data.user import get_user_by_id, get_user_email_by_id
from backend.notifications.notifications import queue_notification_async
from backend.server.model import Pagination
from backend.server.v2.admin.model import UserHistoryResponse
from backend.util.exceptions import InsufficientBalanceError
from backend.util.service import get_service_client
from backend.util.retry import func_retry
from backend.util.settings import Settings
settings = Settings()
@@ -45,6 +46,17 @@ logger = logging.getLogger(__name__)
base_url = settings.config.frontend_base_url or settings.config.platform_base_url
class UsageTransactionMetadata(BaseModel):
graph_exec_id: str | None = None
graph_id: str | None = None
node_id: str | None = None
node_exec_id: str | None = None
block_id: str | None = None
block: str | None = None
input: dict[str, Any] | None = None
reason: str | None = None
class UserCreditBase(ABC):
@abstractmethod
async def get_credits(self, user_id: str) -> int:
@@ -262,11 +274,7 @@ class UserCreditBase(ABC):
)
return transaction_balance, transaction_time
@retry(
stop=stop_after_attempt(5),
wait=wait_exponential(multiplier=1, min=1, max=10),
reraise=True,
)
@func_retry
async def _enable_transaction(
self,
transaction_key: str,
@@ -364,22 +372,17 @@ class UserCreditBase(ABC):
class UserCredit(UserCreditBase):
@thread_cached
def notification_client(self) -> NotificationManager:
return get_service_client(NotificationManager)
async def _send_refund_notification(
self,
notification_request: RefundRequestData,
notification_type: NotificationType,
):
await asyncio.to_thread(
lambda: self.notification_client().queue_notification(
NotificationEventDTO(
user_id=notification_request.user_id,
type=notification_type,
data=notification_request.model_dump(),
)
await queue_notification_async(
NotificationEventModel(
user_id=notification_request.user_id,
type=notification_type,
data=notification_request,
)
)
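The refactor above drops the thread-wrapped synchronous service client in favor of awaiting an async queue helper directly. A hedged sketch of the shape only (the real `queue_notification_async` lives in backend.notifications.notifications; this stub just assumes it enqueues and returns):

```python
import asyncio
from dataclasses import dataclass

@dataclass
class NotificationEvent:  # simplified stand-in for NotificationEventModel
    user_id: str
    type: str
    data: object

async def queue_notification_async(event: NotificationEvent) -> None:
    # Stub: assume the real helper enqueues the event and returns promptly.
    await asyncio.sleep(0)

async def send_refund_notification(event: NotificationEvent) -> None:
    # No asyncio.to_thread / lambda indirection needed once the helper is async.
    await queue_notification_async(event)

asyncio.run(send_refund_notification(NotificationEvent("u1", "REFUND_PROCESSED", {})))
```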
@@ -409,6 +412,7 @@ class UserCredit(UserCreditBase):
# Avoid multiple auto top-ups within the same graph execution.
key=f"AUTO-TOP-UP-{user_id}-{metadata.graph_exec_id}",
ceiling_balance=auto_top_up.threshold,
top_up_type=TopUpType.AUTO,
)
except Exception as e:
# Failed top-up is not critical, we can move on.
@@ -418,26 +422,30 @@ class UserCredit(UserCreditBase):
return balance
async def top_up_credits(self, user_id: str, amount: int):
await self._top_up_credits(user_id, amount)
async def top_up_credits(
self,
user_id: str,
amount: int,
top_up_type: TopUpType = TopUpType.UNCATEGORIZED,
):
await self._top_up_credits(
user_id=user_id, amount=amount, top_up_type=top_up_type
)
async def onboarding_reward(self, user_id: str, credits: int, step: OnboardingStep):
key = f"REWARD-{user_id}-{step.value}"
if not await CreditTransaction.prisma().find_first(
where={
"userId": user_id,
"transactionKey": key,
}
):
try:
await self._add_transaction(
user_id=user_id,
amount=credits,
transaction_type=CreditTransactionType.GRANT,
transaction_key=key,
transaction_key=f"REWARD-{user_id}-{step.value}",
metadata=Json(
{"reason": f"Reward for completing {step.value} onboarding step."}
),
)
except UniqueViolationError:
# Already rewarded for this step
pass
async def top_up_refund(
self, user_id: str, transaction_key: str, metadata: dict[str, str]
@@ -602,7 +610,7 @@ class UserCredit(UserCreditBase):
evidence_text += (
f"- {tx.description}: Amount ${tx.amount / 100:.2f} on {tx.transaction_time.isoformat()}, "
f"resulting balance ${tx.balance / 100:.2f} {additional_comment}\n"
f"resulting balance ${tx.running_balance / 100:.2f} {additional_comment}\n"
)
evidence_text += (
"\nThis evidence demonstrates that the transaction was authorized and that the charged amount was used to render the service as agreed."
@@ -621,7 +629,24 @@ class UserCredit(UserCreditBase):
amount: int,
key: str | None = None,
ceiling_balance: int | None = None,
top_up_type: TopUpType = TopUpType.UNCATEGORIZED,
metadata: dict | None = None,
):
# init metadata locally, without sharing a mutable default with callers
metadata = metadata or {}
if not metadata.get("reason"):
match top_up_type:
case TopUpType.MANUAL:
metadata["reason"] = f"Top up credits for {user_id}"
case TopUpType.AUTO:
metadata["reason"] = f"Auto top up credits for {user_id}"
case _:
metadata["reason"] = f"Top up reason unknown for {user_id}"
if amount < 0:
raise ValueError(f"Top up amount must not be negative: {amount}")
@@ -644,6 +669,7 @@ class UserCredit(UserCreditBase):
is_active=False,
transaction_key=key,
ceiling_balance=ceiling_balance,
metadata=(Json(metadata)),
)
customer_id = await get_stripe_customer_id(user_id)
@@ -786,10 +812,15 @@ class UserCredit(UserCreditBase):
# Check the Checkout Session's payment_status property
# to determine if fulfillment should be performed
if checkout_session.payment_status in ["paid", "no_payment_required"]:
assert isinstance(checkout_session.payment_intent, stripe.PaymentIntent)
if payment_intent := checkout_session.payment_intent:
assert isinstance(payment_intent, stripe.PaymentIntent)
new_transaction_key = payment_intent.id
else:
new_transaction_key = None
await self._enable_transaction(
transaction_key=credit_transaction.transactionKey,
new_transaction_key=checkout_session.payment_intent.id,
new_transaction_key=new_transaction_key,
user_id=credit_transaction.userId,
metadata=Json(checkout_session),
)
@@ -822,8 +853,9 @@ class UserCredit(UserCreditBase):
take=transaction_count_limit,
)
# doesn't fill current_balance, reason, user_email, admin_email, or extra_data
grouped_transactions: dict[str, UserTransaction] = defaultdict(
lambda: UserTransaction()
lambda: UserTransaction(user_id=user_id)
)
tx_time = None
for t in transactions:
@@ -853,7 +885,7 @@ class UserCredit(UserCreditBase):
if tx_time > gt.transaction_time:
gt.transaction_time = tx_time
gt.balance = t.runningBalance or 0
gt.running_balance = t.runningBalance or 0
return TransactionHistory(
transactions=list(grouped_transactions.values()),
@@ -903,6 +935,7 @@ class BetaUserCredit(UserCredit):
amount=max(self.num_user_credits_refill - balance, 0),
transaction_type=CreditTransactionType.GRANT,
transaction_key=f"MONTHLY-CREDIT-TOP-UP-{cur_time}",
metadata=Json({"reason": "Monthly credit refill"}),
)
return balance
except UniqueViolationError:
@@ -912,7 +945,7 @@ class BetaUserCredit(UserCredit):
class DisabledUserCredit(UserCreditBase):
async def get_credits(self, *args, **kwargs) -> int:
return 0
return 100
async def get_transaction_history(self, *args, **kwargs) -> TransactionHistory:
return TransactionHistory(transactions=[], next_transaction_time=None)
@@ -990,3 +1023,81 @@ async def get_auto_top_up(user_id: str) -> AutoTopUpConfig:
return AutoTopUpConfig(threshold=0, amount=0)
return AutoTopUpConfig.model_validate(user.topUpConfig)
async def admin_get_user_history(
page: int = 1,
page_size: int = 20,
search: str | None = None,
transaction_filter: CreditTransactionType | None = None,
) -> UserHistoryResponse:
if page < 1 or page_size < 1:
raise ValueError("Invalid pagination input")
where_clause: CreditTransactionWhereInput = {}
if transaction_filter:
where_clause["type"] = transaction_filter
if search:
where_clause["OR"] = [
{"userId": {"contains": search, "mode": "insensitive"}},
{"User": {"is": {"email": {"contains": search, "mode": "insensitive"}}}},
{"User": {"is": {"name": {"contains": search, "mode": "insensitive"}}}},
]
transactions = await CreditTransaction.prisma().find_many(
where=where_clause,
skip=(page - 1) * page_size,
take=page_size,
include={"User": True},
order={"createdAt": "desc"},
)
total = await CreditTransaction.prisma().count(where=where_clause)
total_pages = (total + page_size - 1) // page_size
history = []
for tx in transactions:
admin_id = ""
admin_email = ""
reason = ""
metadata: dict = cast(dict, tx.metadata) or {}
if metadata:
admin_id = metadata.get("admin_id")
admin_email = (
(await get_user_email_by_id(admin_id) or f"Unknown Admin: {admin_id}")
if admin_id
else ""
)
reason = metadata.get("reason", "No reason provided")
balance, last_update = await get_user_credit_model()._get_credits(tx.userId)
history.append(
UserTransaction(
transaction_key=tx.transactionKey,
transaction_time=tx.createdAt,
transaction_type=tx.type,
amount=tx.amount,
current_balance=balance,
running_balance=tx.runningBalance or 0,
user_id=tx.userId,
user_email=(
tx.User.email
if tx.User
else (await get_user_by_id(tx.userId)).email
),
reason=reason,
admin_email=admin_email,
extra_data=str(metadata),
)
)
return UserHistoryResponse(
history=history,
pagination=Pagination(
total_items=total,
total_pages=total_pages,
current_page=page,
page_size=page_size,
),
)

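`admin_get_user_history` computes `total_pages` with the integer ceiling-division idiom; shown standalone:

```python
def total_pages(total_items: int, page_size: int) -> int:
    # Equivalent to math.ceil(total_items / page_size), without floats.
    return (total_items + page_size - 1) // page_size

assert total_pages(0, 20) == 0
assert total_pages(20, 20) == 1
assert total_pages(21, 20) == 2
```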
View File

@@ -24,13 +24,14 @@ from prisma.models import (
)
from prisma.types import (
AgentGraphExecutionCreateInput,
AgentGraphExecutionUpdateManyMutationInput,
AgentGraphExecutionWhereInput,
AgentNodeExecutionCreateInput,
AgentNodeExecutionInputOutputCreateInput,
AgentNodeExecutionUpdateInput,
AgentNodeExecutionWhereInput,
)
from pydantic import BaseModel
from pydantic import BaseModel, ConfigDict
from pydantic.fields import Field
from backend.server.v2.store.exceptions import DatabaseError
@@ -69,10 +70,55 @@ class GraphExecutionMeta(BaseDbModel):
ended_at: datetime
class Stats(BaseModel):
cost: int = Field(..., description="Execution cost (cents)")
duration: float = Field(..., description="Seconds from start to end of run")
node_exec_time: float = Field(..., description="Seconds of total node runtime")
node_exec_count: int = Field(..., description="Number of node executions")
model_config = ConfigDict(
extra="allow",
arbitrary_types_allowed=True,
)
cost: int = Field(
default=0,
description="Execution cost (cents)",
)
duration: float = Field(
default=0,
description="Seconds from start to end of run",
)
duration_cpu_only: float = Field(
default=0,
description="CPU sec of duration",
)
node_exec_time: float = Field(
default=0,
description="Seconds of total node runtime",
)
node_exec_time_cpu_only: float = Field(
default=0,
description="CPU sec of node_exec_time",
)
node_exec_count: int = Field(
default=0,
description="Number of node executions",
)
node_error_count: int = Field(
default=0,
description="Number of node errors",
)
error: str | None = Field(
default=None,
description="Error message if any",
)
def to_db(self) -> GraphExecutionStats:
return GraphExecutionStats(
cost=self.cost,
walltime=self.duration,
cputime=self.duration_cpu_only,
nodes_walltime=self.node_exec_time,
nodes_cputime=self.node_exec_time_cpu_only,
node_count=self.node_exec_count,
node_error_count=self.node_error_count,
error=self.error,
)
stats: Stats | None
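The reworked `Stats` model gives every field a default and sets `extra="allow"`, so stats blobs persisted by older code with missing or unexpected keys still validate. A minimal pydantic v2 sketch of that combination:

```python
from pydantic import BaseModel, ConfigDict, Field

class Stats(BaseModel):
    model_config = ConfigDict(extra="allow")
    cost: int = Field(default=0, description="Execution cost (cents)")
    duration: float = Field(default=0, description="Seconds from start to end")

# An old record with an unknown key and a missing field still parses.
s = Stats.model_validate({"cost": 3, "legacy_field": "kept"})
assert s.cost == 3 and s.duration == 0
assert s.model_extra == {"legacy_field": "kept"}
```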
@@ -106,8 +152,16 @@ class GraphExecutionMeta(BaseDbModel):
GraphExecutionMeta.Stats(
cost=stats.cost,
duration=stats.walltime,
duration_cpu_only=stats.cputime,
node_exec_time=stats.nodes_walltime,
node_exec_time_cpu_only=stats.nodes_cputime,
node_exec_count=stats.node_count,
node_error_count=stats.node_error_count,
error=(
str(stats.error)
if isinstance(stats.error, Exception)
else stats.error
),
)
if stats
else None
@@ -208,18 +262,6 @@ class GraphExecutionWithNodes(GraphExecution):
graph_id=self.graph_id,
graph_version=self.graph_version or 0,
graph_exec_id=self.id,
start_node_execs=[
NodeExecutionEntry(
user_id=self.user_id,
graph_exec_id=node_exec.graph_exec_id,
graph_id=node_exec.graph_id,
node_exec_id=node_exec.node_exec_id,
node_id=node_exec.node_id,
block_id=node_exec.block_id,
data=node_exec.input_data,
)
for node_exec in self.node_executions
],
node_credentials_input_map={}, # FIXME
)
@@ -280,13 +322,28 @@ class NodeExecutionResult(BaseModel):
end_time=_node_exec.endedTime,
)
def to_node_execution_entry(self) -> "NodeExecutionEntry":
return NodeExecutionEntry(
user_id=self.user_id,
graph_exec_id=self.graph_exec_id,
graph_id=self.graph_id,
node_exec_id=self.node_exec_id,
node_id=self.node_id,
block_id=self.block_id,
inputs=self.input_data,
)
# --------------------- Model functions --------------------- #
async def get_graph_executions(
graph_id: Optional[str] = None,
user_id: Optional[str] = None,
graph_id: str | None = None,
user_id: str | None = None,
statuses: list[ExecutionStatus] | None = None,
created_time_gte: datetime | None = None,
created_time_lte: datetime | None = None,
limit: int | None = None,
) -> list[GraphExecutionMeta]:
where_filter: AgentGraphExecutionWhereInput = {
"isDeleted": False,
@@ -295,10 +352,18 @@ async def get_graph_executions(
where_filter["userId"] = user_id
if graph_id:
where_filter["agentGraphId"] = graph_id
if created_time_gte or created_time_lte:
where_filter["createdAt"] = {
"gte": created_time_gte or datetime.min.replace(tzinfo=timezone.utc),
"lte": created_time_lte or datetime.max.replace(tzinfo=timezone.utc),
}
if statuses:
where_filter["OR"] = [{"executionStatus": status} for status in statuses]
executions = await AgentGraphExecution.prisma().find_many(
where=where_filter,
order={"createdAt": "desc"},
take=limit,
)
return [GraphExecutionMeta.from_db(execution) for execution in executions]
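The new `created_time_gte`/`created_time_lte` filter uses timezone-aware `datetime.min`/`datetime.max` as sentinels, so one combined `gte`/`lte` clause works even when only a single bound is supplied:

```python
from datetime import datetime, timezone

def created_at_filter(gte: datetime | None = None, lte: datetime | None = None):
    # No bounds given: no createdAt clause at all.
    if not (gte or lte):
        return None
    # Missing bound is replaced by the widest timezone-aware sentinel.
    return {
        "gte": gte or datetime.min.replace(tzinfo=timezone.utc),
        "lte": lte or datetime.max.replace(tzinfo=timezone.utc),
    }

assert created_at_filter() is None
assert created_at_filter(lte=datetime(2025, 1, 1, tzinfo=timezone.utc))["gte"].year == 1
```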
@@ -492,21 +557,12 @@ async def upsert_execution_output(
async def update_graph_execution_start_time(
graph_exec_id: str,
) -> GraphExecution | None:
count = await AgentGraphExecution.prisma().update_many(
where={
"id": graph_exec_id,
"executionStatus": ExecutionStatus.QUEUED,
},
res = await AgentGraphExecution.prisma().update(
where={"id": graph_exec_id},
data={
"executionStatus": ExecutionStatus.RUNNING,
"startedAt": datetime.now(tz=timezone.utc),
},
)
if count == 0:
return None
res = await AgentGraphExecution.prisma().find_unique(
where={"id": graph_exec_id},
include=GRAPH_EXECUTION_INCLUDE,
)
return GraphExecution.from_db(res) if res else None
@@ -517,9 +573,15 @@ async def update_graph_execution_stats(
status: ExecutionStatus,
stats: GraphExecutionStats | None = None,
) -> GraphExecution | None:
data = stats.model_dump() if stats else {}
if isinstance(data.get("error"), Exception):
data["error"] = str(data["error"])
update_data: AgentGraphExecutionUpdateManyMutationInput = {
"executionStatus": status
}
if stats:
stats_dict = stats.model_dump()
if isinstance(stats_dict.get("error"), Exception):
stats_dict["error"] = str(stats_dict["error"])
update_data["stats"] = Json(stats_dict)
updated_count = await AgentGraphExecution.prisma().update_many(
where={
@@ -529,10 +591,7 @@ async def update_graph_execution_stats(
{"executionStatus": ExecutionStatus.QUEUED},
],
},
data={
"executionStatus": status,
"stats": Json(data),
},
data=update_data,
)
if updated_count == 0:
return None
@@ -625,8 +684,9 @@ async def delete_graph_execution(
)
async def get_node_execution_results(
async def get_node_executions(
graph_exec_id: str,
node_id: str | None = None,
block_ids: list[str] | None = None,
statuses: list[ExecutionStatus] | None = None,
limit: int | None = None,
@@ -634,6 +694,8 @@ async def get_node_execution_results(
where_clause: AgentNodeExecutionWhereInput = {
"agentGraphExecutionId": graph_exec_id,
}
if node_id:
where_clause["agentNodeId"] = node_id
if block_ids:
where_clause["Node"] = {"is": {"agentBlockId": {"in": block_ids}}}
if statuses:
@@ -648,28 +710,6 @@ async def get_node_execution_results(
return res
async def get_graph_executions_in_timerange(
user_id: str, start_time: str, end_time: str
) -> list[GraphExecution]:
try:
executions = await AgentGraphExecution.prisma().find_many(
where={
"startedAt": {
"gte": datetime.fromisoformat(start_time),
"lte": datetime.fromisoformat(end_time),
},
"userId": user_id,
"isDeleted": False,
},
include=GRAPH_EXECUTION_INCLUDE,
)
return [GraphExecution.from_db(execution) for execution in executions]
except Exception as e:
raise DatabaseError(
f"Failed to get executions in timerange {start_time} to {end_time} for user {user_id}: {e}"
) from e
async def get_latest_node_execution(
node_id: str, graph_eid: str
) -> NodeExecutionResult | None:
@@ -690,20 +730,6 @@ async def get_latest_node_execution(
return NodeExecutionResult.from_db(execution)
async def get_incomplete_node_executions(
node_id: str, graph_eid: str
) -> list[NodeExecutionResult]:
executions = await AgentNodeExecution.prisma().find_many(
where={
"agentNodeId": node_id,
"agentGraphExecutionId": graph_eid,
"executionStatus": ExecutionStatus.INCOMPLETE,
},
include=EXECUTION_RESULT_INCLUDE,
)
return [NodeExecutionResult.from_db(execution) for execution in executions]
# ----------------- Execution Infrastructure ----------------- #
@@ -712,7 +738,6 @@ class GraphExecutionEntry(BaseModel):
graph_exec_id: str
graph_id: str
graph_version: int
start_node_execs: list["NodeExecutionEntry"]
node_credentials_input_map: Optional[dict[str, dict[str, CredentialsMetaInput]]]
@@ -723,7 +748,7 @@ class NodeExecutionEntry(BaseModel):
node_exec_id: str
node_id: str
block_id: str
data: BlockInput
inputs: BlockInput
class ExecutionQueue(Generic[T]):

View File

@@ -172,6 +172,8 @@ class BaseGraph(BaseDbModel):
description: str
nodes: list[Node] = []
links: list[Link] = []
forked_from_id: str | None = None
forked_from_version: int | None = None
@computed_field
@property
@@ -197,11 +199,6 @@ class BaseGraph(BaseDbModel):
)
)
@computed_field
@property
def credentials_input_schema(self) -> dict[str, Any]:
return self._credentials_input_schema.jsonschema()
@staticmethod
def _generate_schema(
*props: tuple[type[AgentInputBlock.Input] | type[AgentOutputBlock.Input], dict],
@@ -234,6 +231,15 @@ class BaseGraph(BaseDbModel):
"required": [p.name for p in schema_fields if p.value is None],
}
class Graph(BaseGraph):
sub_graphs: list[BaseGraph] = [] # Flattened sub-graphs
@computed_field
@property
def credentials_input_schema(self) -> dict[str, Any]:
return self._credentials_input_schema.jsonschema()
@property
def _credentials_input_schema(self) -> type[BlockSchema]:
graph_credentials_inputs = self.aggregate_credentials_inputs()
@@ -312,17 +318,14 @@ class BaseGraph(BaseDbModel):
),
(node.id, field_name),
)
for node in self.nodes
for graph in [self] + self.sub_graphs
for node in graph.nodes
for field_name, field_info in node.block.input_schema.get_credentials_fields_info().items()
)
)
}
class Graph(BaseGraph):
sub_graphs: list[BaseGraph] = [] # Flattened sub-graphs, only used in export
class GraphModel(Graph):
user_id: str
nodes: list[NodeModel] = [] # type: ignore
@@ -398,7 +401,7 @@ class GraphModel(Graph):
if node.block_id != AgentExecutorBlock().id:
continue
node.input_default["user_id"] = user_id
node.input_default.setdefault("data", {})
node.input_default.setdefault("inputs", {})
if (graph_id := node.input_default.get("graph_id")) in graph_id_map:
node.input_default["graph_id"] = graph_id_map[graph_id]
@@ -409,10 +412,13 @@ class GraphModel(Graph):
@staticmethod
def _validate_graph(graph: BaseGraph, for_run: bool = False):
def is_tool_pin(name: str) -> bool:
return name.startswith("tools_^_")
def sanitize(name):
sanitized_name = name.split("_#_")[0].split("_@_")[0].split("_$_")[0]
if sanitized_name.startswith("tools_^_"):
return sanitized_name.split("_^_")[0]
if is_tool_pin(sanitized_name):
return "tools"
return sanitized_name
# Validate smart decision maker nodes
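A standalone version of the pin-name sanitizer introduced above: strip the `_#_`/`_@_`/`_$_` suffixes, then collapse any `tools_^_*` pin onto the shared `tools` field via the extracted `is_tool_pin` helper:

```python
def is_tool_pin(name: str) -> bool:
    return name.startswith("tools_^_")

def sanitize(name: str) -> str:
    # Drop dynamic-pin suffixes, keeping the base field name.
    sanitized = name.split("_#_")[0].split("_@_")[0].split("_$_")[0]
    return "tools" if is_tool_pin(sanitized) else sanitized

assert sanitize("tools_^_pick_option") == "tools"
assert sanitize("value_#_3") == "value"
```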
@@ -422,10 +428,6 @@ class GraphModel(Graph):
if (block := get_block(node.block_id)) is not None
}
for node in graph.nodes:
if (block := nodes_block.get(node.id)) is None:
raise ValueError(f"Invalid block {node.block_id} for node #{node.id}")
input_links = defaultdict(list)
for link in graph.links:
@@ -440,8 +442,8 @@ class GraphModel(Graph):
[sanitize(name) for name in node.input_default]
+ [sanitize(link.sink_name) for link in input_links.get(node.id, [])]
)
input_schema = block.input_schema
for name in (required_fields := input_schema.get_required_fields()):
InputSchema = block.input_schema
for name in (required_fields := InputSchema.get_required_fields()):
if (
name not in provided_inputs
# Webhook payload is passed in by ExecutionManager
@@ -451,7 +453,7 @@ class GraphModel(Graph):
in (BlockType.WEBHOOK, BlockType.WEBHOOK_MANUAL)
)
# Checking availability of credentials is done by ExecutionManager
and name not in input_schema.get_credentials_fields()
and name not in InputSchema.get_credentials_fields()
# Validate only I/O nodes, or validate everything when executing
and (
for_run
@@ -478,37 +480,43 @@ class GraphModel(Graph):
)
# Get input schema properties and check dependencies
input_fields = input_schema.model_fields
input_fields = InputSchema.model_fields
def has_value(name):
def has_value(node: Node, name: str):
return (
node is not None
and name in node.input_default
name in node.input_default
and node.input_default[name] is not None
and str(node.input_default[name]).strip() != ""
) or (name in input_fields and input_fields[name].default is not None)
# Validate dependencies between fields
for field_name, field_info in input_fields.items():
# Apply input dependency validation only on run & field with depends_on
json_schema_extra = field_info.json_schema_extra or {}
if not (
for_run
and isinstance(json_schema_extra, dict)
and (
dependencies := cast(
list[str], json_schema_extra.get("depends_on", [])
)
)
):
for field_name in input_fields.keys():
field_json_schema = InputSchema.get_field_schema(field_name)
dependencies: list[str] = []
# Check regular field dependencies (only pre graph execution)
if for_run:
dependencies.extend(field_json_schema.get("depends_on", []))
# Require presence of credentials discriminator (always).
# The `discriminator` is either the name of a sibling field (str),
# or an object that discriminates between possible types for this field:
# {"propertyName": prop_name, "mapping": {prop_value: sub_schema}}
if (
discriminator := field_json_schema.get("discriminator")
) and isinstance(discriminator, str):
dependencies.append(discriminator)
if not dependencies:
continue
# Check if dependent field has value in input_default
field_has_value = has_value(field_name)
field_has_value = has_value(node, field_name)
field_is_required = field_name in required_fields
# Check for missing dependencies when dependent field is present
missing_deps = [dep for dep in dependencies if not has_value(dep)]
missing_deps = [dep for dep in dependencies if not has_value(node, dep)]
if missing_deps and (field_has_value or field_is_required):
raise ValueError(
f"Node {block.name} #{node.id}: Field `{field_name}` requires [{', '.join(missing_deps)}] to be set"
@@ -553,7 +561,7 @@ class GraphModel(Graph):
if block.block_type not in [BlockType.AGENT]
else vals.get("input_schema", {}).get("properties", {}).keys()
)
if sanitized_name not in fields and not name.startswith("tools_^_"):
if sanitized_name not in fields and not is_tool_pin(name):
fields_msg = f"Allowed fields: {fields}"
raise ValueError(f"{prefix}, `{name}` invalid, {fields_msg}")
@@ -570,6 +578,8 @@ class GraphModel(Graph):
id=graph.id,
user_id=graph.userId if not for_export else "",
version=graph.version,
forked_from_id=graph.forkedFromId,
forked_from_version=graph.forkedFromVersion,
is_active=graph.isActive,
name=graph.name or "",
description=graph.description or "",
@@ -682,6 +692,7 @@ async def get_graph(
version: int | None = None,
user_id: str | None = None,
for_export: bool = False,
include_subgraphs: bool = False,
) -> GraphModel | None:
"""
Retrieves a graph from the DB.
@@ -718,6 +729,58 @@ async def get_graph(
):
return None
if include_subgraphs or for_export:
sub_graphs = await get_sub_graphs(graph)
return GraphModel.from_db(
graph=graph,
sub_graphs=sub_graphs,
for_export=for_export,
)
return GraphModel.from_db(graph, for_export)
async def get_graph_as_admin(
graph_id: str,
version: int | None = None,
user_id: str | None = None,
for_export: bool = False,
) -> GraphModel | None:
"""
Intentionally parallels get_graph, but should only be used for admin tasks, because it can return any graph that has been submitted to the store.
Retrieves a graph from the DB.
Defaults to the version with `is_active` if `version` is not passed.
Returns `None` if the record is not found.
"""
logger.warning(f"Getting {graph_id=} {version=} as ADMIN {user_id=} {for_export=}")
where_clause: AgentGraphWhereInput = {
"id": graph_id,
}
if version is not None:
where_clause["version"] = version
graph = await AgentGraph.prisma().find_first(
where=where_clause,
include=AGENT_GRAPH_INCLUDE,
order={"version": "desc"},
)
# For access, the graph must be owned by the user or listed in the store
if graph is None or (
graph.userId != user_id
and not (
await StoreListingVersion.prisma().find_first(
where={
"agentGraphId": graph_id,
"agentGraphVersion": version or graph.version,
}
)
)
):
return None
if for_export:
sub_graphs = await get_sub_graphs(graph)
return GraphModel.from_db(
@@ -847,6 +910,27 @@ async def create_graph(graph: Graph, user_id: str) -> GraphModel:
raise ValueError(f"Created graph {graph.id} v{graph.version} is not in DB")
async def fork_graph(graph_id: str, graph_version: int, user_id: str) -> GraphModel:
"""
Forks a graph by copying it and all its nodes and links to a new graph.
"""
async with transaction() as tx:
graph = await get_graph(graph_id, graph_version, user_id, True)
if not graph:
raise ValueError(f"Graph {graph_id} v{graph_version} not found")
# Set forked_from ID and version to the graph itself, as it's about to be copied
graph.forked_from_id = graph.id
graph.forked_from_version = graph.version
graph.name = f"{graph.name} (copy)"
graph.reassign_ids(user_id=user_id, reassign_graph_id=True)
graph.validate_graph(for_run=False)
await __create_graph(tx, graph, user_id)
return graph
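A minimal usage sketch of the new fork flow (hypothetical IDs, import path assumed from context, and a live DB is required):

```python
import asyncio

from backend.data.graph import fork_graph  # import path assumed


async def main():
    forked = await fork_graph("graph-123", graph_version=1, user_id="user-456")
    # The copy records its origin before its own IDs are reassigned:
    assert forked.forked_from_id == "graph-123"
    assert forked.forked_from_version == 1
    assert forked.name.endswith("(copy)")


asyncio.run(main())
```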
async def __create_graph(tx, graph: Graph, user_id: str):
graphs = [graph] + graph.sub_graphs
@@ -859,6 +943,8 @@ async def __create_graph(tx, graph: Graph, user_id: str):
description=graph.description,
isActive=graph.is_active,
userId=user_id,
forkedFromId=graph.forked_from_id,
forkedFromVersion=graph.forked_from_version,
)
for graph in graphs
]

View File

@@ -1,6 +1,7 @@
from __future__ import annotations
import base64
import enum
import logging
from collections import defaultdict
from datetime import datetime, timezone
@@ -188,7 +189,7 @@ def SchemaField(
class _BaseCredentials(BaseModel):
id: str = Field(default_factory=lambda: str(uuid4()))
provider: str
title: Optional[str]
title: Optional[str] = None
@field_serializer("*")
def dump_secret_strings(value: Any, _info):
@@ -199,13 +200,13 @@ class _BaseCredentials(BaseModel):
class OAuth2Credentials(_BaseCredentials):
type: Literal["oauth2"] = "oauth2"
username: Optional[str]
username: Optional[str] = None
"""Username of the third-party service user that these credentials belong to"""
access_token: SecretStr
access_token_expires_at: Optional[int]
access_token_expires_at: Optional[int] = None
"""Unix timestamp (seconds) indicating when the access token expires (if at all)"""
refresh_token: Optional[SecretStr]
refresh_token_expires_at: Optional[int]
refresh_token: Optional[SecretStr] = None
refresh_token_expires_at: Optional[int] = None
"""Unix timestamp (seconds) indicating when the refresh token expires (if at all)"""
scopes: list[str]
metadata: dict[str, Any] = Field(default_factory=dict)
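The `Optional[...] = None` changes above matter under Pydantic v2, where `Optional` no longer implies a default value; a quick illustration:

```python
from typing import Optional

from pydantic import BaseModel, ValidationError


class Implicit(BaseModel):
    title: Optional[str]  # v2: still *required*, it just accepts None


class Explicit(BaseModel):
    title: Optional[str] = None  # truly optional


try:
    Implicit()
except ValidationError:
    print("title is required unless a default is given")

print(Explicit())  # title=None
```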
@@ -449,6 +450,12 @@ class ContributorDetails(BaseModel):
name: str = Field(title="Name", description="The name of the contributor.")
class TopUpType(enum.Enum):
AUTO = "AUTO"
MANUAL = "MANUAL"
UNCATEGORIZED = "UNCATEGORIZED"
class AutoTopUpConfig(BaseModel):
amount: int
"""Amount of credits to top up."""
@@ -461,12 +468,18 @@ class UserTransaction(BaseModel):
transaction_time: datetime = datetime.min.replace(tzinfo=timezone.utc)
transaction_type: CreditTransactionType = CreditTransactionType.USAGE
amount: int = 0
balance: int = 0
running_balance: int = 0
current_balance: int = 0
description: str | None = None
usage_graph_id: str | None = None
usage_execution_id: str | None = None
usage_node_count: int = 0
usage_start_time: datetime = datetime.max.replace(tzinfo=timezone.utc)
user_id: str
user_email: str | None = None
reason: str | None = None
admin_email: str | None = None
extra_data: str | None = None
class TransactionHistory(BaseModel):

View File

@@ -189,26 +189,14 @@ NotificationData = Annotated[
]
class NotificationEventDTO(BaseModel):
user_id: str
class BaseEventModel(BaseModel):
type: NotificationType
data: dict
created_at: datetime = Field(default_factory=lambda: datetime.now(tz=timezone.utc))
retry_count: int = 0
class SummaryParamsEventDTO(BaseModel):
user_id: str
type: NotificationType
data: dict
created_at: datetime = Field(default_factory=lambda: datetime.now(tz=timezone.utc))
class NotificationEventModel(BaseModel, Generic[NotificationDataType_co]):
user_id: str
type: NotificationType
class NotificationEventModel(BaseEventModel, Generic[NotificationDataType_co]):
data: NotificationDataType_co
created_at: datetime = Field(default_factory=lambda: datetime.now(tz=timezone.utc))
@property
def strategy(self) -> QueueType:
@@ -225,11 +213,8 @@ class NotificationEventModel(BaseModel, Generic[NotificationDataType_co]):
return NotificationTypeOverride(self.type).template
class SummaryParamsEventModel(BaseModel, Generic[SummaryParamsType_co]):
user_id: str
type: NotificationType
class SummaryParamsEventModel(BaseEventModel, Generic[SummaryParamsType_co]):
data: SummaryParamsType_co
created_at: datetime = Field(default_factory=lambda: datetime.now(tz=timezone.utc))
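The refactor above folds the duplicated `user_id`/`type`/`created_at` fields into a shared `BaseEventModel`; a stripped-down sketch of the resulting shape (field set inferred from the diff, `type` simplified to `str`):

```python
from datetime import datetime, timezone
from typing import Generic, TypeVar

from pydantic import BaseModel, Field

DataT = TypeVar("DataT")


class BaseEventModel(BaseModel):
    user_id: str
    type: str  # NotificationType in the real code
    created_at: datetime = Field(default_factory=lambda: datetime.now(tz=timezone.utc))


class NotificationEventModel(BaseEventModel, Generic[DataT]):
    data: DataT


class SummaryParamsEventModel(BaseEventModel, Generic[DataT]):
    data: DataT
```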
def get_notif_data_type(
@@ -384,7 +369,7 @@ class UserNotificationBatchDTO(BaseModel):
def get_batch_delay(notification_type: NotificationType) -> timedelta:
return {
NotificationType.AGENT_RUN: timedelta(minutes=60),
NotificationType.AGENT_RUN: timedelta(days=1),
NotificationType.ZERO_BALANCE: timedelta(minutes=60),
NotificationType.LOW_BALANCE: timedelta(minutes=60),
NotificationType.BLOCK_EXECUTION_FAILED: timedelta(minutes=60),

View File

@@ -124,7 +124,7 @@ async def get_user_integrations(user_id: str) -> UserIntegrations:
async def update_user_integrations(user_id: str, data: UserIntegrations):
encrypted_data = JSONCryptor().encrypt(data.model_dump())
encrypted_data = JSONCryptor().encrypt(data.model_dump(exclude_none=True))
await User.prisma().update(
where={"id": user_id},
data={"integrations": encrypted_data},

View File

@@ -1,9 +1,10 @@
from .database import DatabaseManager
from .database import DatabaseManager, DatabaseManagerClient
from .manager import ExecutionManager
from .scheduler import Scheduler
__all__ = [
"DatabaseManager",
"DatabaseManagerClient",
"ExecutionManager",
"Scheduler",
]

View File

@@ -1,13 +1,15 @@
import logging
from typing import Callable, Concatenate, ParamSpec, TypeVar, cast
from backend.data import db
from backend.data.credit import UsageTransactionMetadata, get_user_credit_model
from backend.data.execution import (
create_graph_execution,
get_graph_execution,
get_incomplete_node_executions,
get_graph_execution_meta,
get_graph_executions,
get_latest_node_execution,
get_node_execution_results,
get_node_executions,
update_graph_execution_start_time,
update_graph_execution_stats,
update_node_execution_stats,
@@ -39,12 +41,14 @@ from backend.data.user import (
update_user_integrations,
update_user_metadata,
)
from backend.util.service import AppService, exposed_run_and_wait
from backend.util.service import AppService, AppServiceClient, endpoint_to_sync, expose
from backend.util.settings import Config
config = Config()
_user_credit_model = get_user_credit_model()
logger = logging.getLogger(__name__)
P = ParamSpec("P")
R = TypeVar("R")
async def _spend_credits(
@@ -53,6 +57,10 @@ async def _spend_credits(
return await _user_credit_model.spend_credits(user_id, cost, metadata)
async def _get_credits(user_id: str) -> int:
return await _user_credit_model.get_credits(user_id)
class DatabaseManager(AppService):
def run_service(self) -> None:
@@ -69,58 +77,115 @@ class DatabaseManager(AppService):
def get_port(cls) -> int:
return config.database_api_port
@staticmethod
def _(
f: Callable[P, R], name: str | None = None
) -> Callable[Concatenate[object, P], R]:
if name is not None:
f.__name__ = name
return cast(Callable[Concatenate[object, P], R], expose(f))
# Executions
get_graph_execution = exposed_run_and_wait(get_graph_execution)
create_graph_execution = exposed_run_and_wait(create_graph_execution)
get_node_execution_results = exposed_run_and_wait(get_node_execution_results)
get_incomplete_node_executions = exposed_run_and_wait(
get_incomplete_node_executions
)
get_latest_node_execution = exposed_run_and_wait(get_latest_node_execution)
update_node_execution_status = exposed_run_and_wait(update_node_execution_status)
update_node_execution_status_batch = exposed_run_and_wait(
update_node_execution_status_batch
)
update_graph_execution_start_time = exposed_run_and_wait(
update_graph_execution_start_time
)
update_graph_execution_stats = exposed_run_and_wait(update_graph_execution_stats)
update_node_execution_stats = exposed_run_and_wait(update_node_execution_stats)
upsert_execution_input = exposed_run_and_wait(upsert_execution_input)
upsert_execution_output = exposed_run_and_wait(upsert_execution_output)
get_graph_execution = _(get_graph_execution)
get_graph_executions = _(get_graph_executions)
get_graph_execution_meta = _(get_graph_execution_meta)
create_graph_execution = _(create_graph_execution)
get_node_executions = _(get_node_executions)
get_latest_node_execution = _(get_latest_node_execution)
update_node_execution_status = _(update_node_execution_status)
update_node_execution_status_batch = _(update_node_execution_status_batch)
update_graph_execution_start_time = _(update_graph_execution_start_time)
update_graph_execution_stats = _(update_graph_execution_stats)
update_node_execution_stats = _(update_node_execution_stats)
upsert_execution_input = _(upsert_execution_input)
upsert_execution_output = _(upsert_execution_output)
# Graphs
get_node = exposed_run_and_wait(get_node)
get_graph = exposed_run_and_wait(get_graph)
get_connected_output_nodes = exposed_run_and_wait(get_connected_output_nodes)
get_graph_metadata = exposed_run_and_wait(get_graph_metadata)
get_node = _(get_node)
get_graph = _(get_graph)
get_connected_output_nodes = _(get_connected_output_nodes)
get_graph_metadata = _(get_graph_metadata)
# Credits
spend_credits = exposed_run_and_wait(_spend_credits)
spend_credits = _(_spend_credits, name="spend_credits")
get_credits = _(_get_credits, name="get_credits")
# User + User Metadata + User Integrations
get_user_metadata = exposed_run_and_wait(get_user_metadata)
update_user_metadata = exposed_run_and_wait(update_user_metadata)
get_user_integrations = exposed_run_and_wait(get_user_integrations)
update_user_integrations = exposed_run_and_wait(update_user_integrations)
get_user_metadata = _(get_user_metadata)
update_user_metadata = _(update_user_metadata)
get_user_integrations = _(get_user_integrations)
update_user_integrations = _(update_user_integrations)
# User Comms - async
get_active_user_ids_in_timerange = exposed_run_and_wait(
get_active_user_ids_in_timerange
)
get_user_email_by_id = exposed_run_and_wait(get_user_email_by_id)
get_user_email_verification = exposed_run_and_wait(get_user_email_verification)
get_user_notification_preference = exposed_run_and_wait(
get_user_notification_preference
)
get_active_user_ids_in_timerange = _(get_active_user_ids_in_timerange)
get_user_email_by_id = _(get_user_email_by_id)
get_user_email_verification = _(get_user_email_verification)
get_user_notification_preference = _(get_user_notification_preference)
# Notifications - async
create_or_add_to_user_notification_batch = exposed_run_and_wait(
create_or_add_to_user_notification_batch = _(
create_or_add_to_user_notification_batch
)
empty_user_notification_batch = exposed_run_and_wait(empty_user_notification_batch)
get_all_batches_by_type = exposed_run_and_wait(get_all_batches_by_type)
get_user_notification_batch = exposed_run_and_wait(get_user_notification_batch)
get_user_notification_oldest_message_in_batch = exposed_run_and_wait(
empty_user_notification_batch = _(empty_user_notification_batch)
get_all_batches_by_type = _(get_all_batches_by_type)
get_user_notification_batch = _(get_user_notification_batch)
get_user_notification_oldest_message_in_batch = _(
get_user_notification_oldest_message_in_batch
)
class DatabaseManagerClient(AppServiceClient):
d = DatabaseManager
_ = endpoint_to_sync
@classmethod
def get_service_type(cls):
return DatabaseManager
# Executions
get_graph_execution = _(d.get_graph_execution)
get_graph_executions = _(d.get_graph_executions)
get_graph_execution_meta = _(d.get_graph_execution_meta)
create_graph_execution = _(d.create_graph_execution)
get_node_executions = _(d.get_node_executions)
get_latest_node_execution = _(d.get_latest_node_execution)
update_node_execution_status = _(d.update_node_execution_status)
update_node_execution_status_batch = _(d.update_node_execution_status_batch)
update_graph_execution_start_time = _(d.update_graph_execution_start_time)
update_graph_execution_stats = _(d.update_graph_execution_stats)
update_node_execution_stats = _(d.update_node_execution_stats)
upsert_execution_input = _(d.upsert_execution_input)
upsert_execution_output = _(d.upsert_execution_output)
# Graphs
get_node = _(d.get_node)
get_graph = _(d.get_graph)
get_connected_output_nodes = _(d.get_connected_output_nodes)
get_graph_metadata = _(d.get_graph_metadata)
# Credits
spend_credits = _(d.spend_credits)
get_credits = _(d.get_credits)
# User + User Metadata + User Integrations
get_user_metadata = _(d.get_user_metadata)
update_user_metadata = _(d.update_user_metadata)
get_user_integrations = _(d.get_user_integrations)
update_user_integrations = _(d.update_user_integrations)
# User Comms - async
get_active_user_ids_in_timerange = _(d.get_active_user_ids_in_timerange)
get_user_email_by_id = _(d.get_user_email_by_id)
get_user_email_verification = _(d.get_user_email_verification)
get_user_notification_preference = _(d.get_user_notification_preference)
# Notifications - async
create_or_add_to_user_notification_batch = _(
d.create_or_add_to_user_notification_batch
)
empty_user_notification_batch = _(d.empty_user_notification_batch)
get_all_batches_by_type = _(d.get_all_batches_by_type)
get_user_notification_batch = _(d.get_user_notification_batch)
get_user_notification_oldest_message_in_batch = _(
d.get_user_notification_oldest_message_in_batch
)
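The `_` helper and the `DatabaseManagerClient` mirror are easier to see stripped down. A toy sketch (the real `expose`/`endpoint_to_sync` do RPC plumbing; here `expose` merely tags the function, and `_` is module-level instead of a staticmethod):

```python
from typing import Callable, Concatenate, ParamSpec, TypeVar, cast

P = ParamSpec("P")
R = TypeVar("R")


def expose(f: Callable[P, R]) -> Callable[P, R]:
    f.__exposed__ = True  # stand-in for real endpoint registration
    return f


def _(
    f: Callable[P, R], name: str | None = None
) -> Callable[Concatenate[object, P], R]:
    # Rename (optionally), expose, and re-type a module-level function so it
    # can sit on the service class with a method-like signature.
    if name is not None:
        f.__name__ = name
    return cast(Callable[Concatenate[object, P], R], expose(f))


def get_credits(user_id: str) -> int:  # hypothetical module-level DB function
    return 42


class DatabaseManager:
    get_credits = _(get_credits)


# Class-level access returns the plain function, so this prints 42; the real
# client instead wraps each endpoint with endpoint_to_sync and calls over RPC.
print(DatabaseManager.get_credits("user-1"))
```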

View File

@@ -5,35 +5,42 @@ import os
import signal
import sys
import threading
import time
from concurrent.futures import Future, ProcessPoolExecutor
from contextlib import contextmanager
from multiprocessing.pool import AsyncResult, Pool
from typing import TYPE_CHECKING, Any, Generator, TypeVar, cast
from typing import TYPE_CHECKING, Any, Generator, Optional, TypeVar, cast
from pika.adapters.blocking_connection import BlockingChannel
from pika.spec import Basic, BasicProperties
from redis.lock import Lock as RedisLock
from backend.blocks.io import AgentOutputBlock
from backend.data.model import GraphExecutionStats, NodeExecutionStats
from backend.data.model import (
CredentialsMetaInput,
GraphExecutionStats,
NodeExecutionStats,
)
from backend.data.notifications import (
AgentRunData,
LowBalanceData,
NotificationEventDTO,
NotificationEventModel,
NotificationType,
)
from backend.data.rabbitmq import SyncRabbitMQ
from backend.executor.utils import create_execution_queue_config
from backend.notifications.notifications import queue_notification
from backend.util.exceptions import InsufficientBalanceError
if TYPE_CHECKING:
from backend.executor import DatabaseManager
from backend.notifications.notifications import NotificationManager
from backend.executor import DatabaseManagerClient
from autogpt_libs.utils.cache import clear_thread_cache, thread_cached
from autogpt_libs.utils.cache import thread_cached
from prometheus_client import Gauge, start_http_server
from backend.blocks.agent import AgentExecutorBlock
from backend.data import redis
from backend.data.block import BlockData, BlockInput, BlockSchema, get_block
from backend.data.credit import UsageTransactionMetadata
from backend.data.execution import (
ExecutionQueue,
ExecutionStatus,
@@ -47,7 +54,6 @@ from backend.executor.utils import (
GRAPH_EXECUTION_CANCEL_QUEUE_NAME,
GRAPH_EXECUTION_QUEUE_NAME,
CancelExecutionEvent,
UsageTransactionMetadata,
block_usage_cost,
execution_usage_cost,
get_execution_event_bus,
@@ -59,16 +65,28 @@ from backend.integrations.creds_manager import IntegrationCredentialsManager
from backend.util import json
from backend.util.decorator import error_logged, time_measured
from backend.util.file import clean_exec_files
from backend.util.logging import configure_logging
from backend.util.logging import TruncatedLogger, configure_logging
from backend.util.process import AppProcess, set_service_name
from backend.util.service import close_service_client, get_service_client
from backend.util.retry import continuous_retry, func_retry
from backend.util.service import get_service_client
from backend.util.settings import Settings
logger = logging.getLogger(__name__)
settings = Settings()
active_runs_gauge = Gauge(
"execution_manager_active_runs", "Number of active graph runs"
)
pool_size_gauge = Gauge(
"execution_manager_pool_size", "Maximum number of graph workers"
)
utilization_gauge = Gauge(
"execution_manager_utilization_ratio",
"Ratio of active graph runs to max graph workers",
)
class LogMetadata:
class LogMetadata(TruncatedLogger):
def __init__(
self,
user_id: str,
@@ -77,8 +95,9 @@ class LogMetadata:
node_eid: str,
node_id: str,
block_name: str,
max_length: int = 1000,
):
self.metadata = {
metadata = {
"component": "ExecutionManager",
"user_id": user_id,
"graph_eid": graph_eid,
@@ -87,33 +106,13 @@ class LogMetadata:
"node_id": node_id,
"block_name": block_name,
}
self.prefix = f"[ExecutionManager|uid:{user_id}|gid:{graph_id}|nid:{node_id}]|geid:{graph_eid}|neid:{node_eid}|{block_name}]"
def info(self, msg: str, **extra):
msg = self._wrap(msg, **extra)
logger.info(msg, extra={"json_fields": {**self.metadata, **extra}})
def warning(self, msg: str, **extra):
msg = self._wrap(msg, **extra)
logger.warning(msg, extra={"json_fields": {**self.metadata, **extra}})
def error(self, msg: str, **extra):
msg = self._wrap(msg, **extra)
logger.error(msg, extra={"json_fields": {**self.metadata, **extra}})
def debug(self, msg: str, **extra):
msg = self._wrap(msg, **extra)
logger.debug(msg, extra={"json_fields": {**self.metadata, **extra}})
def exception(self, msg: str, **extra):
msg = self._wrap(msg, **extra)
logger.exception(msg, extra={"json_fields": {**self.metadata, **extra}})
def _wrap(self, msg: str, **extra):
extra_msg = str(extra or "")
if len(extra_msg) > 1000:
extra_msg = extra_msg[:1000] + "..."
return f"{self.prefix} {msg} {extra_msg}"
prefix = f"[ExecutionManager|uid:{user_id}|gid:{graph_id}|nid:{node_id}]|geid:{graph_eid}|neid:{node_eid}|{block_name}]"
super().__init__(
logger,
max_length=max_length,
prefix=prefix,
metadata=metadata,
)
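For reference, the behaviour that moved into `TruncatedLogger` (prefixing every message and truncating oversized `extra` payloads) amounts to roughly the following sketch, reconstructed from the removed `LogMetadata` methods; the real class lives in `backend.util.logging`:

```python
import logging


class TruncatedLogger:
    def __init__(
        self,
        logger: logging.Logger,
        max_length: int = 1000,
        prefix: str = "",
        metadata: dict | None = None,
    ):
        self.logger = logger
        self.max_length = max_length
        self.prefix = prefix
        self.metadata = metadata or {}

    def info(self, msg: str, **extra):
        self.logger.info(
            self._wrap(msg, **extra),
            extra={"json_fields": {**self.metadata, **extra}},
        )

    # warning/error/debug/exception follow the same pattern

    def _wrap(self, msg: str, **extra) -> str:
        extra_msg = str(extra or "")
        if len(extra_msg) > self.max_length:
            extra_msg = extra_msg[: self.max_length] + "..."
        return f"{self.prefix} {msg} {extra_msg}"
```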
T = TypeVar("T")
@@ -121,10 +120,13 @@ ExecutionStream = Generator[NodeExecutionEntry, None, None]
def execute_node(
db_client: "DatabaseManager",
db_client: "DatabaseManagerClient",
creds_manager: IntegrationCredentialsManager,
data: NodeExecutionEntry,
execution_stats: NodeExecutionStats | None = None,
node_credentials_input_map: Optional[
dict[str, dict[str, CredentialsMetaInput]]
] = None,
) -> ExecutionStream:
"""
Execute a node in the graph. This will trigger a block execution on a node,
@@ -172,7 +174,7 @@ def execute_node(
)
# Sanity check: validate the execution input.
input_data, error = validate_exec(node, data.data, resolve_input=False)
input_data, error = validate_exec(node, data.inputs, resolve_input=False)
if input_data is None:
log_metadata.error(f"Skip execution, input validation error: {error}")
push_output("error", error)
@@ -182,8 +184,12 @@ def execute_node(
# Re-shape the input data for agent block.
# AgentExecutorBlock specially separate the node input_data & its input_default.
if isinstance(node_block, AgentExecutorBlock):
input_data = {**node.input_default, "data": input_data}
data.data = input_data
_input_data = AgentExecutorBlock.Input(**node.input_default)
_input_data.inputs = input_data
if node_credentials_input_map:
_input_data.node_credentials_input_map = node_credentials_input_map
input_data = _input_data.model_dump()
data.inputs = input_data
# Execute the node
input_data_str = json.dumps(input_data)
@@ -230,6 +236,7 @@ def execute_node(
graph_exec_id=graph_exec_id,
graph_id=graph_id,
log_metadata=log_metadata,
node_credentials_input_map=node_credentials_input_map,
):
yield execution
@@ -248,13 +255,14 @@ def execute_node(
graph_exec_id=graph_exec_id,
graph_id=graph_id,
log_metadata=log_metadata,
node_credentials_input_map=node_credentials_input_map,
):
yield execution
raise e
finally:
# Ensure credentials are released even if execution fails
if creds_lock and creds_lock.locked():
if creds_lock and creds_lock.locked() and creds_lock.owned():
try:
creds_lock.release()
except Exception as e:
@@ -270,13 +278,14 @@ def execute_node(
def _enqueue_next_nodes(
db_client: "DatabaseManager",
db_client: "DatabaseManagerClient",
node: Node,
output: BlockData,
user_id: str,
graph_exec_id: str,
graph_id: str,
log_metadata: LogMetadata,
node_credentials_input_map: Optional[dict[str, dict[str, CredentialsMetaInput]]],
) -> list[NodeExecutionEntry]:
def add_enqueued_execution(
node_exec_id: str, node_id: str, block_id: str, data: BlockInput
@@ -292,7 +301,7 @@ def _enqueue_next_nodes(
node_exec_id=node_exec_id,
node_id=node_id,
block_id=block_id,
data=data,
inputs=data,
)
def register_next_executions(node_link: Link) -> list[NodeExecutionEntry]:
@@ -333,6 +342,15 @@ def _enqueue_next_nodes(
for name in static_link_names:
next_node_input[name] = latest_execution.input_data.get(name)
# Apply node credentials overrides
node_credentials = None
if node_credentials_input_map and (
node_credentials := node_credentials_input_map.get(next_node.id)
):
next_node_input.update(
{k: v.model_dump() for k, v in node_credentials.items()}
)
# Validate the input data for the next node.
next_node_input, validation_msg = validate_exec(next_node, next_node_input)
suffix = f"{next_output_name}>{next_input_name}~{next_node_exec_id}:{validation_msg}"
@@ -359,8 +377,10 @@ def _enqueue_next_nodes(
# If link is static, there could be some incomplete executions waiting for it.
# Load and complete the input missing input data, and try to re-enqueue them.
for iexec in db_client.get_incomplete_node_executions(
next_node_id, graph_exec_id
for iexec in db_client.get_node_executions(
node_id=next_node_id,
graph_exec_id=graph_exec_id,
statuses=[ExecutionStatus.INCOMPLETE],
):
idata = iexec.input_data
ineid = iexec.node_exec_id
@@ -373,6 +393,12 @@ def _enqueue_next_nodes(
for input_name in static_link_names:
idata[input_name] = next_node_input[input_name]
# Apply node credentials overrides
if node_credentials:
idata.update(
{k: v.model_dump() for k, v in node_credentials.items()}
)
idata, msg = validate_exec(next_node, idata)
suffix = f"{next_output_name}>{next_input_name}~{ineid}:{msg}"
if not idata:
@@ -422,6 +448,7 @@ class Executor:
"""
@classmethod
@func_retry
def on_node_executor_start(cls):
configure_logging()
set_service_name("NodeExecutor")
@@ -432,36 +459,28 @@ class Executor:
# Set up shutdown handlers
cls.shutdown_lock = threading.Lock()
atexit.register(cls.on_node_executor_stop) # handle regular shutdown
signal.signal( # handle termination
signal.SIGTERM, lambda _, __: cls.on_node_executor_sigterm()
)
atexit.register(cls.on_node_executor_stop)
signal.signal(signal.SIGTERM, lambda _, __: cls.on_node_executor_sigterm())
signal.signal(signal.SIGINT, lambda _, __: cls.on_node_executor_sigterm())
@classmethod
def on_node_executor_stop(cls):
def on_node_executor_stop(cls, log=logger.info):
if not cls.shutdown_lock.acquire(blocking=False):
return # already shutting down
logger.info(f"[on_node_executor_stop {cls.pid}] ⏳ Releasing locks...")
log(f"[on_node_executor_stop {cls.pid}] ⏳ Releasing locks...")
cls.creds_manager.release_all_locks()
logger.info(f"[on_node_executor_stop {cls.pid}] ⏳ Disconnecting Redis...")
log(f"[on_node_executor_stop {cls.pid}] ⏳ Disconnecting Redis...")
redis.disconnect()
logger.info(f"[on_node_executor_stop {cls.pid}] ⏳ Disconnecting DB manager...")
close_service_client(cls.db_client)
logger.info(f"[on_node_executor_stop {cls.pid}] ✅ Finished cleanup")
log(f"[on_node_executor_stop {cls.pid}] ⏳ Disconnecting DB manager...")
cls.db_client.close()
log(f"[on_node_executor_stop {cls.pid}] ✅ Finished NodeExec cleanup")
sys.exit(0)
@classmethod
def on_node_executor_sigterm(cls):
llprint(f"[on_node_executor_sigterm {cls.pid}] ⚠️ SIGTERM received")
if not cls.shutdown_lock.acquire(blocking=False):
return # already shutting down
llprint(f"[on_node_executor_stop {cls.pid}] ⏳ Releasing locks...")
cls.creds_manager.release_all_locks()
llprint(f"[on_node_executor_stop {cls.pid}] ⏳ Disconnecting Redis...")
redis.disconnect()
llprint(f"[on_node_executor_stop {cls.pid}] ✅ Finished cleanup")
sys.exit(0)
llprint(f"[on_node_executor_sigterm {cls.pid}] ⚠️ NodeExec SIGTERM received")
cls.on_node_executor_stop(log=llprint)
@classmethod
@error_logged
@@ -469,6 +488,9 @@ class Executor:
cls,
q: ExecutionQueue[NodeExecutionEntry],
node_exec: NodeExecutionEntry,
node_credentials_input_map: Optional[
dict[str, dict[str, CredentialsMetaInput]]
] = None,
) -> NodeExecutionStats:
log_metadata = LogMetadata(
user_id=node_exec.user_id,
@@ -481,7 +503,7 @@ class Executor:
execution_stats = NodeExecutionStats()
timing_info, _ = cls._on_node_execution(
q, node_exec, log_metadata, execution_stats
q, node_exec, log_metadata, execution_stats, node_credentials_input_map
)
execution_stats.walltime = timing_info.wall_time
execution_stats.cputime = timing_info.cpu_time
@@ -501,6 +523,9 @@ class Executor:
node_exec: NodeExecutionEntry,
log_metadata: LogMetadata,
stats: NodeExecutionStats | None = None,
node_credentials_input_map: Optional[
dict[str, dict[str, CredentialsMetaInput]]
] = None,
):
try:
log_metadata.info(f"Start node execution {node_exec.node_exec_id}")
@@ -509,6 +534,7 @@ class Executor:
creds_manager=cls.creds_manager,
data=node_exec,
execution_stats=stats,
node_credentials_input_map=node_credentials_input_map,
):
q.add(execution)
log_metadata.info(f"Finished node execution {node_exec.node_exec_id}")
@@ -527,6 +553,7 @@ class Executor:
stats.error = e
@classmethod
@func_retry
def on_graph_executor_start(cls):
configure_logging()
set_service_name("GraphExecutor")
@@ -534,23 +561,8 @@ class Executor:
cls.db_client = get_db_client()
cls.pool_size = settings.config.num_node_workers
cls.pid = os.getpid()
cls.notification_service = get_notification_service()
cls._init_node_executor_pool()
logger.info(
f"Graph executor {cls.pid} started with {cls.pool_size} node workers"
)
# Set up shutdown handler
atexit.register(cls.on_graph_executor_stop)
@classmethod
def on_graph_executor_stop(cls):
prefix = f"[on_graph_executor_stop {cls.pid}]"
logger.info(f"{prefix} ⏳ Terminating node executor pool...")
cls.executor.terminate()
logger.info(f"{prefix} ⏳ Disconnecting DB manager...")
close_service_client(cls.db_client)
logger.info(f"{prefix} ✅ Finished cleanup")
logger.info(f"GraphExec {cls.pid} started with {cls.pool_size} node workers")
@classmethod
def _init_node_executor_pool(cls):
@@ -572,22 +584,46 @@ class Executor:
node_eid="*",
block_name="-",
)
exec_meta = cls.db_client.update_graph_execution_start_time(
graph_exec.graph_exec_id
exec_meta = cls.db_client.get_graph_execution_meta(
user_id=graph_exec.user_id,
execution_id=graph_exec.graph_exec_id,
)
if exec_meta is None:
logger.warning(
f"Skipped graph execution {graph_exec.graph_exec_id}, the graph execution is not found or not currently in the QUEUED state."
log_metadata.warning(
f"Skipped graph execution #{graph_exec.graph_exec_id}, the graph execution is not found."
)
return
if exec_meta.status == ExecutionStatus.QUEUED:
log_metadata.info(f"⚙️ Starting graph execution #{graph_exec.graph_exec_id}")
exec_meta.status = ExecutionStatus.RUNNING
send_execution_update(
cls.db_client.update_graph_execution_start_time(
graph_exec.graph_exec_id
)
)
elif exec_meta.status == ExecutionStatus.RUNNING:
log_metadata.info(
f"⚙️ Graph execution #{graph_exec.graph_exec_id} is already running, continuing where it left off."
)
else:
log_metadata.warning(
f"Skipped graph execution {graph_exec.graph_exec_id}, the graph execution status is `{exec_meta.status}`."
)
return
send_execution_update(exec_meta)
timing_info, (exec_stats, status, error) = cls._on_graph_execution(
graph_exec, cancel, log_metadata
graph_exec=graph_exec,
cancel=cancel,
log_metadata=log_metadata,
execution_stats=(
exec_meta.stats.to_db() if exec_meta.stats else GraphExecutionStats()
),
)
exec_stats.walltime = timing_info.wall_time
exec_stats.cputime = timing_info.cpu_time
exec_stats.error = str(error)
exec_stats.walltime += timing_info.wall_time
exec_stats.cputime += timing_info.cpu_time
exec_stats.error = str(error) if error else exec_stats.error
if graph_exec_result := cls.db_client.update_graph_execution_stats(
graph_exec_id=graph_exec.graph_exec_id,
@@ -604,13 +640,15 @@ class Executor:
node_exec: NodeExecutionEntry,
execution_count: int,
execution_stats: GraphExecutionStats,
) -> int:
):
block = get_block(node_exec.block_id)
if not block:
logger.error(f"Block {node_exec.block_id} not found.")
return execution_count
return
cost, matching_filter = block_usage_cost(block=block, input_data=node_exec.data)
cost, matching_filter = block_usage_cost(
block=block, input_data=node_exec.inputs
)
if cost > 0:
cls.db_client.spend_credits(
user_id=node_exec.user_id,
@@ -623,11 +661,12 @@ class Executor:
block_id=node_exec.block_id,
block=block.name,
input=matching_filter,
reason=f"Ran block {node_exec.block_id} {block.name}",
),
)
execution_stats.cost += cost
cost, execution_count = execution_usage_cost(execution_count)
cost, usage_count = execution_usage_cost(execution_count)
if cost > 0:
cls.db_client.spend_credits(
user_id=node_exec.user_id,
@@ -636,15 +675,14 @@ class Executor:
graph_exec_id=node_exec.graph_exec_id,
graph_id=node_exec.graph_id,
input={
"execution_count": execution_count,
"execution_count": usage_count,
"charge": "Execution Cost",
},
reason=f"Execution Cost for {usage_count} blocks of ex_id:{node_exec.graph_exec_id} g_id:{node_exec.graph_id}",
),
)
execution_stats.cost += cost
return execution_count
@classmethod
@time_measured
def _on_graph_execution(
@@ -652,6 +690,7 @@ class Executor:
graph_exec: GraphExecutionEntry,
cancel: threading.Event,
log_metadata: LogMetadata,
execution_stats: GraphExecutionStats,
) -> tuple[GraphExecutionStats, ExecutionStatus, Exception | None]:
"""
Returns:
@@ -659,8 +698,6 @@ class Executor:
ExecutionStatus: The final status of the graph execution.
Exception | None: The error that occurred during the execution, if any.
"""
log_metadata.info(f"Start graph execution {graph_exec.graph_exec_id}")
execution_stats = GraphExecutionStats()
execution_status = ExecutionStatus.RUNNING
error = None
finished = False
@@ -681,11 +718,21 @@ class Executor:
cancel_thread.start()
try:
queue = ExecutionQueue[NodeExecutionEntry]()
for node_exec in graph_exec.start_node_execs:
queue.add(node_exec)
if cls.db_client.get_credits(graph_exec.user_id) <= 0:
raise InsufficientBalanceError(
user_id=graph_exec.user_id,
message="You have no credits left to run an agent.",
balance=0,
amount=1,
)
queue = ExecutionQueue[NodeExecutionEntry]()
for node_exec in cls.db_client.get_node_executions(
graph_exec.graph_exec_id,
statuses=[ExecutionStatus.RUNNING, ExecutionStatus.QUEUED],
):
queue.add(node_exec.to_node_execution_entry())
exec_cost_counter = 0
running_executions: dict[str, AsyncResult] = {}
def make_exec_callback(exec_data: NodeExecutionEntry):
@@ -741,9 +788,9 @@ class Executor:
)
try:
exec_cost_counter = cls._charge_usage(
cls._charge_usage(
node_exec=queued_node_exec,
execution_count=exec_cost_counter + 1,
execution_count=increment_execution_count(graph_exec.user_id),
execution_stats=execution_stats,
)
except InsufficientBalanceError as error:
@@ -773,7 +820,7 @@ class Executor:
if (node_creds_map := graph_exec.node_credentials_input_map) and (
node_field_creds_map := node_creds_map.get(node_id)
):
queued_node_exec.data.update(
queued_node_exec.inputs.update(
{
field_name: creds_meta.model_dump()
for field_name, creds_meta in node_field_creds_map.items()
@@ -783,7 +830,7 @@ class Executor:
# Initiate node execution
running_executions[queued_node_exec.node_id] = cls.executor.apply_async(
cls.on_node_execution,
(queue, queued_node_exec),
(queue, queued_node_exec, node_creds_map),
callback=make_exec_callback(queued_node_exec),
)
@@ -804,24 +851,21 @@ class Executor:
execution.wait(3)
log_metadata.info(f"Finished graph execution {graph_exec.graph_exec_id}")
execution_status = ExecutionStatus.COMPLETED
except Exception as e:
error = e
finally:
if error:
log_metadata.error(
f"Failed graph execution {graph_exec.graph_exec_id}: {error}"
)
execution_status = ExecutionStatus.FAILED
else:
execution_status = ExecutionStatus.COMPLETED
log_metadata.error(
f"Failed graph execution {graph_exec.graph_exec_id}: {error}"
)
execution_status = ExecutionStatus.FAILED
finally:
if not cancel.is_set():
finished = True
cancel.set()
cancel_thread.join()
clean_exec_files(graph_exec.graph_exec_id)
return execution_stats, execution_status, error
@classmethod
@@ -833,7 +877,7 @@ class Executor:
metadata = cls.db_client.get_graph_metadata(
graph_exec.graph_id, graph_exec.graph_version
)
outputs = cls.db_client.get_node_execution_results(
outputs = cls.db_client.get_node_executions(
graph_exec.graph_exec_id,
block_ids=[AgentOutputBlock().id],
)
@@ -846,21 +890,21 @@ class Executor:
for output in outputs
]
event = NotificationEventDTO(
user_id=graph_exec.user_id,
type=NotificationType.AGENT_RUN,
data=AgentRunData(
outputs=named_outputs,
agent_name=metadata.name if metadata else "Unknown Agent",
credits_used=exec_stats.cost,
execution_time=exec_stats.walltime,
graph_id=graph_exec.graph_id,
node_count=exec_stats.node_count,
).model_dump(),
queue_notification(
NotificationEventModel(
user_id=graph_exec.user_id,
type=NotificationType.AGENT_RUN,
data=AgentRunData(
outputs=named_outputs,
agent_name=metadata.name if metadata else "Unknown Agent",
credits_used=exec_stats.cost,
execution_time=exec_stats.walltime,
graph_id=graph_exec.graph_id,
node_count=exec_stats.node_count,
),
)
)
cls.notification_service.queue_notification(event)
@classmethod
def _handle_low_balance_notif(
cls,
@@ -874,8 +918,8 @@ class Executor:
base_url = (
settings.config.frontend_base_url or settings.config.platform_base_url
)
cls.notification_service.queue_notification(
NotificationEventDTO(
queue_notification(
NotificationEventModel(
user_id=user_id,
type=NotificationType.LOW_BALANCE,
data=LowBalanceData(
@@ -883,7 +927,7 @@ class Executor:
billing_page_link=f"{base_url}/profile/credits",
shortfall=shortfall,
agent_name=metadata.name if metadata else "Unknown Agent",
).model_dump(),
),
)
)
@@ -895,34 +939,19 @@ class ExecutionManager(AppProcess):
self.running = True
self.active_graph_runs: dict[str, tuple[Future, threading.Event]] = {}
@classmethod
def get_port(cls) -> int:
return settings.config.execution_manager_port
def run(self):
retry_count_max = settings.config.execution_manager_loop_max_retry
retry_count = 0
pool_size_gauge.set(self.pool_size)
active_runs_gauge.set(0)
utilization_gauge.set(0)
for retry_count in range(retry_count_max):
try:
self._run()
except Exception as e:
if not self.running:
break
logger.exception(
f"[{self.service_name}] Error in execution manager: {e}"
)
if retry_count >= retry_count_max:
logger.error(
f"[{self.service_name}] Max retries reached ({retry_count_max}), exiting..."
)
break
else:
logger.info(
f"[{self.service_name}] Retrying execution loop in {retry_count} seconds..."
)
time.sleep(retry_count)
self.metrics_server = threading.Thread(
target=start_http_server,
args=(settings.config.execution_manager_port,),
daemon=True,
)
self.metrics_server.start()
logger.info(f"[{self.service_name}] Starting execution manager...")
self._run()
def _run(self):
logger.info(f"[{self.service_name}] ⏳ Spawn max-{self.pool_size} workers...")
@@ -934,23 +963,41 @@ class ExecutionManager(AppProcess):
logger.info(f"[{self.service_name}] ⏳ Connecting to Redis...")
redis.connect()
# Consume Cancel & Run execution requests.
clear_thread_cache(get_execution_queue)
channel = get_execution_queue().get_channel()
channel.basic_qos(prefetch_count=self.pool_size)
channel.basic_consume(
threading.Thread(
target=lambda: self._consume_execution_cancel(),
daemon=True,
).start()
self._consume_execution_run()
@continuous_retry()
def _consume_execution_cancel(self):
cancel_client = SyncRabbitMQ(create_execution_queue_config())
cancel_client.connect()
cancel_channel = cancel_client.get_channel()
logger.info(f"[{self.service_name}] ⏳ Starting cancel message consumer...")
cancel_channel.basic_consume(
queue=GRAPH_EXECUTION_CANCEL_QUEUE_NAME,
on_message_callback=self._handle_cancel_message,
auto_ack=True,
)
channel.basic_consume(
cancel_channel.start_consuming()
raise RuntimeError(f"❌ cancel message consumer is stopped: {cancel_channel}")
@continuous_retry()
def _consume_execution_run(self):
run_client = SyncRabbitMQ(create_execution_queue_config())
run_client.connect()
run_channel = run_client.get_channel()
run_channel.basic_qos(prefetch_count=self.pool_size)
run_channel.basic_consume(
queue=GRAPH_EXECUTION_QUEUE_NAME,
on_message_callback=self._handle_run_message,
auto_ack=False,
)
logger.info(f"[{self.service_name}] Ready to consume messages...")
channel.start_consuming()
logger.info(f"[{self.service_name}] ⏳ Starting to consume run messages...")
run_channel.start_consuming()
raise RuntimeError(f"❌ run message consumer is stopped: {run_channel}")
def _handle_cancel_message(
self,
@@ -1020,11 +1067,15 @@ class ExecutionManager(AppProcess):
Executor.on_graph_execution, graph_exec_entry, cancel_event
)
self.active_graph_runs[graph_exec_id] = (future, cancel_event)
active_runs_gauge.set(len(self.active_graph_runs))
utilization_gauge.set(len(self.active_graph_runs) / self.pool_size)
def _on_run_done(f: Future):
logger.info(f"[{self.service_name}] Run completed for {graph_exec_id}")
try:
self.active_graph_runs.pop(graph_exec_id, None)
active_runs_gauge.set(len(self.active_graph_runs))
utilization_gauge.set(len(self.active_graph_runs) / self.pool_size)
if f.exception():
logger.error(
f"[{self.service_name}] Execution for {graph_exec_id} failed: {f.exception()}"
@@ -1043,42 +1094,41 @@ class ExecutionManager(AppProcess):
def cleanup(self):
super().cleanup()
self._on_cleanup()
logger.info(f"[{self.service_name}] ⏳ Shutting down service loop...")
def _on_cleanup(self, log=logger.info):
prefix = f"[{self.service_name}][on_graph_executor_stop {os.getpid()}]"
log(f"{prefix} ⏳ Shutting down service loop...")
self.running = False
logger.info(f"[{self.service_name}] ⏳ Shutting down RabbitMQ channel...")
log(f"{prefix} ⏳ Shutting down RabbitMQ channel...")
get_execution_queue().get_channel().stop_consuming()
logger.info(f"[{self.service_name}] ⏳ Shutting down graph executor pool...")
self.executor.shutdown(cancel_futures=True)
if hasattr(self, "executor"):
log(f"{prefix} ⏳ Shutting down GraphExec pool...")
self.executor.shutdown(cancel_futures=True, wait=False)
logger.info(f"[{self.service_name}] ⏳ Disconnecting Redis...")
log(f"{prefix} ⏳ Disconnecting Redis...")
redis.disconnect()
@property
def db_client(self) -> "DatabaseManager":
return get_db_client()
log(f"{prefix} ✅ Finished GraphExec cleanup")
sys.exit(0)
# ------- UTILITIES ------- #
@thread_cached
def get_db_client() -> "DatabaseManager":
from backend.executor import DatabaseManager
def get_db_client() -> "DatabaseManagerClient":
from backend.executor import DatabaseManagerClient
return get_service_client(DatabaseManager)
# Disable health check for the service client to avoid breaking process initializer.
return get_service_client(DatabaseManagerClient, health_check=False)
@thread_cached
def get_notification_service() -> "NotificationManager":
from backend.notifications import NotificationManager
return get_service_client(NotificationManager)
def send_execution_update(entry: GraphExecution | NodeExecutionResult):
def send_execution_update(entry: GraphExecution | NodeExecutionResult | None):
if entry is None:
return
return get_execution_event_bus().publish(entry)
@@ -1089,14 +1139,26 @@ def synchronized(key: str, timeout: int = 60):
lock.acquire()
yield
finally:
if lock.locked():
if lock.locked() and lock.owned():
lock.release()
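The added `owned()` guard (applied here and to the credentials locks elsewhere in this diff) prevents releasing a lock whose timeout lapsed and which another worker has since re-acquired. A sketch using the redis-py `Lock` API:

```python
from redis import Redis
from redis.lock import Lock

lock = Lock(Redis(), "sync:some-key", timeout=60)
lock.acquire()
try:
    ...  # long critical section; the 60s timeout may lapse midway
finally:
    # locked(): the lock key exists (someone holds it).
    # owned(): the stored token still matches *this* Lock instance,
    # so releasing cannot steal a lock now held by another worker.
    if lock.locked() and lock.owned():
        lock.release()
```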
def increment_execution_count(user_id: str) -> int:
"""
Increment the execution count for a given user;
this is used to charge the user for the execution cost.
"""
r = redis.get_redis()
k = f"uec:{user_id}" # User Execution Count global key
counter = cast(int, r.incr(k))
if counter == 1:
r.expire(k, settings.config.execution_counter_expiration_time)
return counter
def llprint(message: str):
"""
Low-level print/log helper function for use in signal handlers.
Regular log/print statements are not allowed in signal handlers.
"""
if logger.getEffectiveLevel() == logging.DEBUG:
os.write(sys.stdout.fileno(), (message + "\n").encode())
os.write(sys.stdout.fileno(), (message + "\n").encode())

View File

@@ -1,5 +1,6 @@
import logging
import os
from datetime import datetime, timedelta, timezone
from enum import Enum
from urllib.parse import parse_qs, urlencode, urlparse, urlunparse
@@ -12,13 +13,21 @@ from apscheduler.triggers.cron import CronTrigger
from autogpt_libs.utils.cache import thread_cached
from dotenv import load_dotenv
from prisma.enums import NotificationType
from pydantic import BaseModel
from pydantic import BaseModel, ValidationError
from sqlalchemy import MetaData, create_engine
from backend.data.block import BlockInput
from backend.data.execution import ExecutionStatus
from backend.executor import utils as execution_utils
from backend.notifications.notifications import NotificationManager
from backend.util.service import AppService, expose, get_service_client
from backend.notifications.notifications import NotificationManagerClient
from backend.util.metrics import sentry_capture_error
from backend.util.service import (
AppService,
AppServiceClient,
endpoint_to_async,
expose,
get_service_client,
)
from backend.util.settings import Config
@@ -59,13 +68,11 @@ def job_listener(event):
@thread_cached
def get_notification_client():
from backend.notifications import NotificationManager
return get_service_client(NotificationManager)
return get_service_client(NotificationManagerClient)
def execute_graph(**kwargs):
args = ExecutionJobArgs(**kwargs)
args = GraphExecutionJobArgs(**kwargs)
try:
log(f"Executing recurring job for graph #{args.graph_id}")
execution_utils.add_graph_execution(
@@ -78,6 +85,37 @@ def execute_graph(**kwargs):
logger.exception(f"Error executing graph {args.graph_id}: {e}")
class LateExecutionException(Exception):
pass
def report_late_executions() -> str:
late_executions = execution_utils.get_db_client().get_graph_executions(
statuses=[ExecutionStatus.QUEUED],
created_time_gte=datetime.now(timezone.utc)
- timedelta(seconds=config.execution_late_notification_checkrange_secs),
created_time_lte=datetime.now(timezone.utc)
- timedelta(seconds=config.execution_late_notification_threshold_secs),
limit=1000,
)
if not late_executions:
return "No late executions detected."
num_late_executions = len(late_executions)
num_users = len(set([r.user_id for r in late_executions]))
error = LateExecutionException(
f"Late executions detected: {num_late_executions} late executions from {num_users} users "
f"in the last {config.execution_late_notification_checkrange_secs} seconds. "
f"Graph has been queued for more than {config.execution_late_notification_threshold_secs} seconds. "
"Please check the executor status."
)
msg = str(error)
sentry_capture_error(error)
get_notification_client().discord_system_alert(msg)
return msg
def process_existing_batches(**kwargs):
args = NotificationJobArgs(**kwargs)
try:
@@ -103,7 +141,7 @@ class Jobstores(Enum):
WEEKLY_NOTIFICATIONS = "weekly_notifications"
class ExecutionJobArgs(BaseModel):
class GraphExecutionJobArgs(BaseModel):
graph_id: str
input_data: BlockInput
user_id: str
@@ -111,14 +149,16 @@ class ExecutionJobArgs(BaseModel):
cron: str
class ExecutionJobInfo(ExecutionJobArgs):
class GraphExecutionJobInfo(GraphExecutionJobArgs):
id: str
name: str
next_run_time: str
@staticmethod
def from_db(job_args: ExecutionJobArgs, job_obj: JobObj) -> "ExecutionJobInfo":
return ExecutionJobInfo(
def from_db(
job_args: GraphExecutionJobArgs, job_obj: JobObj
) -> "GraphExecutionJobInfo":
return GraphExecutionJobInfo(
id=job_obj.id,
name=job_obj.name,
next_run_time=job_obj.next_run_time.isoformat(),
@@ -151,6 +191,9 @@ class NotificationJobInfo(NotificationJobArgs):
class Scheduler(AppService):
scheduler: BlockingScheduler
def __init__(self, register_system_tasks: bool = True):
self.register_system_tasks = register_system_tasks
@classmethod
def get_port(cls) -> int:
return config.execution_scheduler_port
@@ -159,11 +202,6 @@ class Scheduler(AppService):
def db_pool_size(cls) -> int:
return config.scheduler_db_pool_size
@property
@thread_cached
def notification_client(self) -> NotificationManager:
return get_service_client(NotificationManager)
def run_service(self):
load_dotenv()
db_schema, db_url = _extract_schema_from_url(os.getenv("DIRECT_URL"))
@@ -193,6 +231,37 @@ class Scheduler(AppService):
Jobstores.WEEKLY_NOTIFICATIONS.value: MemoryJobStore(),
}
)
if self.register_system_tasks:
# Notification PROCESS WEEKLY SUMMARY
self.scheduler.add_job(
process_weekly_summary,
CronTrigger.from_crontab("0 * * * *"),
id="process_weekly_summary",
kwargs={},
replace_existing=True,
jobstore=Jobstores.WEEKLY_NOTIFICATIONS.value,
)
# Notification PROCESS EXISTING BATCHES
# self.scheduler.add_job(
# process_existing_batches,
# id="process_existing_batches",
# CronTrigger.from_crontab("0 12 * * 5"),
# replace_existing=True,
# jobstore=Jobstores.BATCHED_NOTIFICATIONS.value,
# )
# Notification LATE EXECUTIONS ALERT
self.scheduler.add_job(
report_late_executions,
id="report_late_executions",
trigger="interval",
replace_existing=True,
seconds=config.execution_late_notification_threshold_secs,
jobstore=Jobstores.EXECUTION.value,
)
self.scheduler.add_listener(job_listener, EVENT_JOB_EXECUTED | EVENT_JOB_ERROR)
self.scheduler.start()
@@ -203,15 +272,15 @@ class Scheduler(AppService):
self.scheduler.shutdown(wait=False)
@expose
def add_execution_schedule(
def add_graph_execution_schedule(
self,
graph_id: str,
graph_version: int,
cron: str,
input_data: BlockInput,
user_id: str,
) -> ExecutionJobInfo:
job_args = ExecutionJobArgs(
) -> GraphExecutionJobInfo:
job_args = GraphExecutionJobArgs(
graph_id=graph_id,
input_data=input_data,
user_id=user_id,
@@ -226,77 +295,66 @@ class Scheduler(AppService):
jobstore=Jobstores.EXECUTION.value,
)
log(f"Added job {job.id} with cron schedule '{cron}' input data: {input_data}")
return ExecutionJobInfo.from_db(job_args, job)
return GraphExecutionJobInfo.from_db(job_args, job)
@expose
def delete_schedule(self, schedule_id: str, user_id: str) -> ExecutionJobInfo:
def delete_graph_execution_schedule(
self, schedule_id: str, user_id: str
) -> GraphExecutionJobInfo:
job = self.scheduler.get_job(schedule_id, jobstore=Jobstores.EXECUTION.value)
if not job:
log(f"Job {schedule_id} not found.")
raise ValueError(f"Job #{schedule_id} not found.")
job_args = ExecutionJobArgs(**job.kwargs)
job_args = GraphExecutionJobArgs(**job.kwargs)
if job_args.user_id != user_id:
raise ValueError("User ID does not match the job's user ID.")
log(f"Deleting job {schedule_id}")
job.remove()
return ExecutionJobInfo.from_db(job_args, job)
return GraphExecutionJobInfo.from_db(job_args, job)
@expose
def get_execution_schedules(
def get_graph_execution_schedules(
self, graph_id: str | None = None, user_id: str | None = None
) -> list[ExecutionJobInfo]:
) -> list[GraphExecutionJobInfo]:
jobs: list[JobObj] = self.scheduler.get_jobs(jobstore=Jobstores.EXECUTION.value)
schedules = []
for job in self.scheduler.get_jobs(jobstore=Jobstores.EXECUTION.value):
logger.info(
for job in jobs:
logger.debug(
f"Found job {job.id} with cron schedule {job.trigger} and args {job.kwargs}"
)
job_args = ExecutionJobArgs(**job.kwargs)
try:
job_args = GraphExecutionJobArgs.model_validate(job.kwargs)
except ValidationError:
continue
if (
job.next_run_time is not None
and (graph_id is None or job_args.graph_id == graph_id)
and (user_id is None or job_args.user_id == user_id)
):
schedules.append(ExecutionJobInfo.from_db(job_args, job))
schedules.append(GraphExecutionJobInfo.from_db(job_args, job))
return schedules
@expose
def add_batched_notification_schedule(
self,
notification_types: list[NotificationType],
data: dict,
cron: str,
) -> NotificationJobInfo:
job_args = NotificationJobArgs(
notification_types=notification_types,
cron=cron,
)
job = self.scheduler.add_job(
process_existing_batches,
CronTrigger.from_crontab(cron),
kwargs=job_args.model_dump(),
replace_existing=True,
jobstore=Jobstores.BATCHED_NOTIFICATIONS.value,
)
log(f"Added job {job.id} with cron schedule '{cron}' input data: {data}")
return NotificationJobInfo.from_db(job_args, job)
def execute_process_existing_batches(self, kwargs: dict):
process_existing_batches(**kwargs)
@expose
def add_weekly_notification_schedule(self, cron: str) -> NotificationJobInfo:
def execute_process_weekly_summary(self):
process_weekly_summary()
job = self.scheduler.add_job(
process_weekly_summary,
CronTrigger.from_crontab(cron),
kwargs={},
replace_existing=True,
jobstore=Jobstores.WEEKLY_NOTIFICATIONS.value,
)
log(f"Added job {job.id} with cron schedule '{cron}'")
return NotificationJobInfo.from_db(
NotificationJobArgs(
cron=cron, notification_types=[NotificationType.WEEKLY_SUMMARY]
),
job,
)
@expose
def execute_report_late_executions(self):
return report_late_executions()
class SchedulerClient(AppServiceClient):
@classmethod
def get_service_type(cls):
return Scheduler
add_execution_schedule = endpoint_to_async(Scheduler.add_graph_execution_schedule)
delete_schedule = endpoint_to_async(Scheduler.delete_graph_execution_schedule)
get_execution_schedules = endpoint_to_async(Scheduler.get_graph_execution_schedules)
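Callers can then await the mirrored endpoints; a hypothetical call site:

```python
from backend.executor.scheduler import SchedulerClient  # import path assumed
from backend.util.service import get_service_client


async def list_schedules(user_id: str):
    scheduler = get_service_client(SchedulerClient)
    return await scheduler.get_execution_schedules(user_id=user_id)
```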

View File

@@ -41,7 +41,7 @@ from backend.util.settings import Config
from backend.util.type import convert
if TYPE_CHECKING:
from backend.executor import DatabaseManager
from backend.executor import DatabaseManagerClient
from backend.integrations.credentials_store import IntegrationCredentialsStore
config = Config()
@@ -82,40 +82,32 @@ def get_integration_credentials_store() -> "IntegrationCredentialsStore":
@thread_cached
def get_db_client() -> "DatabaseManager":
from backend.executor import DatabaseManager
def get_db_client() -> "DatabaseManagerClient":
from backend.executor import DatabaseManagerClient
return get_service_client(DatabaseManager)
return get_service_client(DatabaseManagerClient)
# ============ Execution Cost Helpers ============ #
class UsageTransactionMetadata(BaseModel):
graph_exec_id: str | None = None
graph_id: str | None = None
node_id: str | None = None
node_exec_id: str | None = None
block_id: str | None = None
block: str | None = None
input: BlockInput | None = None
def execution_usage_cost(execution_count: int) -> tuple[int, int]:
"""
Calculate the cost of executing a graph based on the number of executions.
Calculate the cost of executing a graph based on the current number of node executions.
Args:
execution_count: Number of executions
execution_count: Number of node executions
Returns:
Tuple of cost amount and remaining execution count
Tuple of the cost amount and the number of executions covered by that cost.
"""
return (
execution_count
// config.execution_cost_count_threshold
* config.execution_cost_per_threshold,
execution_count % config.execution_cost_count_threshold,
(
config.execution_cost_per_threshold
if execution_count % config.execution_cost_count_threshold == 0
else 0
),
config.execution_cost_count_threshold,
)
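A worked example of the new charging rule (threshold and fee values are hypothetical stand-ins for the config):

```python
THRESHOLD = 100  # stand-in for config.execution_cost_count_threshold
FEE = 5          # stand-in for config.execution_cost_per_threshold


def execution_usage_cost(execution_count: int) -> tuple[int, int]:
    return (FEE if execution_count % THRESHOLD == 0 else 0, THRESHOLD)


# Driven by the Redis-backed counter from increment_execution_count(), every
# 100th node execution of a user incurs the flat fee; the rest cost nothing:
assert execution_usage_cost(99) == (0, 100)
assert execution_usage_cost(100) == (5, 100)
assert execution_usage_cost(101) == (0, 100)
```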
@@ -266,7 +258,7 @@ def validate_exec(
If the data is valid, the first element will be the resolved input data, and
the second element will be the block name.
"""
node_block: Block | None = get_block(node.block_id)
node_block = get_block(node.block_id)
if not node_block:
return None, f"Block for {node.block_id} not found."
schema = node_block.input_schema
@@ -616,7 +608,10 @@ async def add_graph_execution_async(
ValueError: If the graph is not found or if there are validation errors.
""" # noqa
graph: GraphModel | None = await get_graph(
graph_id=graph_id, user_id=user_id, version=graph_version
graph_id=graph_id,
user_id=user_id,
version=graph_version,
include_subgraphs=True,
)
if not graph:
raise NotFoundError(f"Graph #{graph_id} not found.")
@@ -676,6 +671,9 @@ def add_graph_execution(
preset_id: Optional[str] = None,
graph_version: Optional[int] = None,
graph_credentials_inputs: Optional[dict[str, CredentialsMetaInput]] = None,
node_credentials_input_map: Optional[
dict[str, dict[str, CredentialsMetaInput]]
] = None,
) -> GraphExecutionWithNodes:
"""
Adds a graph execution to the queue and returns the execution entry.
@@ -688,6 +686,7 @@ def add_graph_execution(
graph_version: The version of the graph to execute.
graph_credentials_inputs: Credentials inputs to use in the execution.
Keys should map to the keys generated by `GraphModel.aggregate_credentials_inputs`.
node_credentials_input_map: Credentials inputs to use in the execution, mapped to specific nodes.
Returns:
GraphExecutionEntry: The entry for the graph execution.
Raises:
@@ -695,12 +694,15 @@ def add_graph_execution(
"""
db = get_db_client()
graph: GraphModel | None = db.get_graph(
graph_id=graph_id, user_id=user_id, version=graph_version
graph_id=graph_id,
user_id=user_id,
version=graph_version,
include_subgraphs=True,
)
if not graph:
raise NotFoundError(f"Graph #{graph_id} not found.")
node_credentials_input_map = (
node_credentials_input_map = node_credentials_input_map or (
make_node_credentials_input_map(graph, graph_credentials_inputs)
if graph_credentials_inputs
else None

View File

@@ -7,7 +7,7 @@ from typing import TYPE_CHECKING, Optional
from pydantic import SecretStr
if TYPE_CHECKING:
from backend.executor.database import DatabaseManager
from backend.executor.database import DatabaseManagerClient
from autogpt_libs.utils.cache import thread_cached
from autogpt_libs.utils.synchronize import RedisKeyedMutex
@@ -177,6 +177,14 @@ zerobounce_credentials = APIKeyCredentials(
expires_at=None,
)
llama_api_credentials = APIKeyCredentials(
id="d44045af-1c33-4833-9e19-752313214de2",
provider="llama_api",
api_key=SecretStr(settings.secrets.llama_api_key),
title="Use Credits for Llama API",
expires_at=None,
)
DEFAULT_CREDENTIALS = [
ollama_credentials,
revid_credentials,
@@ -210,11 +218,11 @@ class IntegrationCredentialsStore:
@property
@thread_cached
def db_manager(self) -> "DatabaseManager":
from backend.executor.database import DatabaseManager
def db_manager(self) -> "DatabaseManagerClient":
from backend.executor.database import DatabaseManagerClient
from backend.util.service import get_service_client
return get_service_client(DatabaseManager)
return get_service_client(DatabaseManagerClient)
def add_creds(self, user_id: str, credentials: Credentials) -> None:
with self.locked_user_integrations(user_id):

View File

@@ -93,7 +93,7 @@ class IntegrationCredentialsManager:
fresh_credentials = oauth_handler.refresh_tokens(credentials)
self.store.update_creds(user_id, fresh_credentials)
if _lock and _lock.locked():
if _lock and _lock.locked() and _lock.owned():
_lock.release()
credentials = fresh_credentials
@@ -145,7 +145,7 @@ class IntegrationCredentialsManager:
try:
yield
finally:
if lock.locked():
if lock.locked() and lock.owned():
lock.release()
def release_all_locks(self):

View File

@@ -20,6 +20,7 @@ class ProviderName(str, Enum):
IDEOGRAM = "ideogram"
JINA = "jina"
LINEAR = "linear"
LLAMA_API = "llama_api"
MEDIUM = "medium"
MEM0 = "mem0"
NOTION = "notion"

View File

@@ -1,5 +1,6 @@
from .notifications import NotificationManager
from .notifications import NotificationManager, NotificationManagerClient
__all__ = [
"NotificationManager",
"NotificationManagerClient",
]

View File

@@ -1,5 +1,6 @@
import logging
import time
from concurrent.futures import ThreadPoolExecutor
from datetime import datetime, timedelta, timezone
from typing import Callable
@@ -7,20 +8,18 @@ import aio_pika
from aio_pika.exceptions import QueueEmpty
from autogpt_libs.utils.cache import thread_cached
from prisma.enums import NotificationType
from pydantic import BaseModel
from backend.data import rabbitmq
from backend.data.notifications import (
BaseEventModel,
BaseSummaryData,
BaseSummaryParams,
DailySummaryData,
DailySummaryParams,
NotificationEventDTO,
NotificationEventModel,
NotificationResult,
NotificationTypeOverride,
QueueType,
SummaryParamsEventDTO,
SummaryParamsEventModel,
WeeklySummaryData,
WeeklySummaryParams,
@@ -28,96 +27,179 @@ from backend.data.notifications import (
get_notif_data_type,
get_summary_params_type,
)
from backend.data.rabbitmq import Exchange, ExchangeType, Queue, RabbitMQConfig
from backend.data.rabbitmq import (
AsyncRabbitMQ,
Exchange,
ExchangeType,
Queue,
RabbitMQConfig,
SyncRabbitMQ,
)
from backend.data.user import generate_unsubscribe_link
from backend.notifications.email import EmailSender
from backend.util.service import AppService, expose, get_service_client
from backend.util.logging import TruncatedLogger
from backend.util.metrics import discord_send_alert
from backend.util.service import (
AppService,
AppServiceClient,
expose,
get_service_client,
)
from backend.util.settings import Settings
logger = logging.getLogger(__name__)
logger = TruncatedLogger(logging.getLogger(__name__), "[NotificationManager]")
settings = Settings()
class NotificationEvent(BaseModel):
event: NotificationEventDTO
model: NotificationEventModel
NOTIFICATION_EXCHANGE = Exchange(name="notifications", type=ExchangeType.TOPIC)
DEAD_LETTER_EXCHANGE = Exchange(name="dead_letter", type=ExchangeType.TOPIC)
EXCHANGES = [NOTIFICATION_EXCHANGE, DEAD_LETTER_EXCHANGE]
background_executor = ThreadPoolExecutor(max_workers=2)
def create_notification_config() -> RabbitMQConfig:
"""Create RabbitMQ configuration for notifications"""
-    notification_exchange = Exchange(name="notifications", type=ExchangeType.TOPIC)
-    dead_letter_exchange = Exchange(name="dead_letter", type=ExchangeType.TOPIC)
     queues = [
         # Main notification queues
         Queue(
             name="immediate_notifications",
-            exchange=notification_exchange,
+            exchange=NOTIFICATION_EXCHANGE,
             routing_key="notification.immediate.#",
             arguments={
-                "x-dead-letter-exchange": dead_letter_exchange.name,
+                "x-dead-letter-exchange": DEAD_LETTER_EXCHANGE.name,
                 "x-dead-letter-routing-key": "failed.immediate",
             },
         ),
         Queue(
             name="admin_notifications",
-            exchange=notification_exchange,
+            exchange=NOTIFICATION_EXCHANGE,
             routing_key="notification.admin.#",
             arguments={
-                "x-dead-letter-exchange": dead_letter_exchange.name,
+                "x-dead-letter-exchange": DEAD_LETTER_EXCHANGE.name,
                 "x-dead-letter-routing-key": "failed.admin",
             },
         ),
         # Summary notification queues
         Queue(
             name="summary_notifications",
-            exchange=notification_exchange,
+            exchange=NOTIFICATION_EXCHANGE,
             routing_key="notification.summary.#",
             arguments={
-                "x-dead-letter-exchange": dead_letter_exchange.name,
+                "x-dead-letter-exchange": DEAD_LETTER_EXCHANGE.name,
                 "x-dead-letter-routing-key": "failed.summary",
             },
         ),
         # Batch Queue
         Queue(
             name="batch_notifications",
-            exchange=notification_exchange,
+            exchange=NOTIFICATION_EXCHANGE,
             routing_key="notification.batch.#",
             arguments={
-                "x-dead-letter-exchange": dead_letter_exchange.name,
+                "x-dead-letter-exchange": DEAD_LETTER_EXCHANGE.name,
                 "x-dead-letter-routing-key": "failed.batch",
             },
         ),
         # Failed notifications queue
         Queue(
             name="failed_notifications",
-            exchange=dead_letter_exchange,
+            exchange=DEAD_LETTER_EXCHANGE,
             routing_key="failed.#",
         ),
     ]
     return RabbitMQConfig(
-        exchanges=[
-            notification_exchange,
-            dead_letter_exchange,
-        ],
+        exchanges=EXCHANGES,
         queues=queues,
     )
 @thread_cached
- def get_scheduler():
-     from backend.executor import Scheduler
+ def get_db():
+     from backend.executor.database import DatabaseManagerClient

-     return get_service_client(Scheduler)
+     return get_service_client(DatabaseManagerClient)

 @thread_cached
- def get_db():
-     from backend.executor.database import DatabaseManager
-
-     return get_service_client(DatabaseManager)
+ def get_notification_queue() -> SyncRabbitMQ:
+     client = SyncRabbitMQ(create_notification_config())
+     client.connect()
+     return client

+ @thread_cached
+ async def get_async_notification_queue() -> AsyncRabbitMQ:
+     client = AsyncRabbitMQ(create_notification_config())
+     await client.connect()
+     return client
def get_routing_key(event_type: NotificationType) -> str:
    """Get the appropriate routing key for an event"""
    strategy = NotificationTypeOverride(event_type).strategy
if strategy == QueueType.IMMEDIATE:
return f"notification.immediate.{event_type.value}"
elif strategy == QueueType.BACKOFF:
return f"notification.backoff.{event_type.value}"
elif strategy == QueueType.ADMIN:
return f"notification.admin.{event_type.value}"
elif strategy == QueueType.BATCH:
return f"notification.batch.{event_type.value}"
elif strategy == QueueType.SUMMARY:
return f"notification.summary.{event_type.value}"
return f"notification.{event_type.value}"
def queue_notification(event: NotificationEventModel) -> NotificationResult:
"""Queue a notification - exposed method for other services to call"""
try:
logger.debug(f"Received Request to queue {event=}")
exchange = "notifications"
routing_key = get_routing_key(event.type)
queue = get_notification_queue()
queue.publish_message(
routing_key=routing_key,
message=event.model_dump_json(),
exchange=next(ex for ex in EXCHANGES if ex.name == exchange),
)
return NotificationResult(
success=True,
message=f"Notification queued with routing key: {routing_key}",
)
except Exception as e:
logger.exception(f"Error queueing notification: {e}")
return NotificationResult(success=False, message=str(e))
async def queue_notification_async(event: NotificationEventModel) -> NotificationResult:
"""Queue a notification - exposed method for other services to call"""
try:
logger.debug(f"Received Request to queue {event=}")
exchange = "notifications"
routing_key = get_routing_key(event.type)
queue = await get_async_notification_queue()
await queue.publish_message(
routing_key=routing_key,
message=event.model_dump_json(),
exchange=next(ex for ex in EXCHANGES if ex.name == exchange),
)
return NotificationResult(
success=True,
message=f"Notification queued with routing key: {routing_key}",
)
except Exception as e:
logger.exception(f"Error queueing notification: {e}")
return NotificationResult(success=False, message=str(e))
class NotificationManager(AppService):
@@ -147,23 +229,11 @@ class NotificationManager(AppService):
def get_port(cls) -> int:
return settings.config.notification_service_port
-    def get_routing_key(self, event_type: NotificationType) -> str:
-        """Get the appropriate routing key for an event"""
-        strategy = NotificationTypeOverride(event_type).strategy
-        if strategy == QueueType.IMMEDIATE:
-            return f"notification.immediate.{event_type.value}"
-        elif strategy == QueueType.BACKOFF:
-            return f"notification.backoff.{event_type.value}"
-        elif strategy == QueueType.ADMIN:
-            return f"notification.admin.{event_type.value}"
-        elif strategy == QueueType.BATCH:
-            return f"notification.batch.{event_type.value}"
-        elif strategy == QueueType.SUMMARY:
-            return f"notification.summary.{event_type.value}"
-        return f"notification.{event_type.value}"
@expose
def queue_weekly_summary(self):
background_executor.submit(self._queue_weekly_summary)
def _queue_weekly_summary(self):
"""Process weekly summary for specified notification types"""
try:
logger.info("Processing weekly summary queuing operation")
@@ -177,13 +247,13 @@ class NotificationManager(AppService):
for user in users:
self._queue_scheduled_notification(
-                    SummaryParamsEventDTO(
+                    SummaryParamsEventModel(
user_id=user,
type=NotificationType.WEEKLY_SUMMARY,
data=WeeklySummaryParams(
start_date=start_time,
end_date=current_time,
-                        ).model_dump(),
+                        ),
),
)
processed_count += 1
@@ -195,6 +265,9 @@ class NotificationManager(AppService):
@expose
def process_existing_batches(self, notification_types: list[NotificationType]):
background_executor.submit(self._process_existing_batches, notification_types)
def _process_existing_batches(self, notification_types: list[NotificationType]):
"""Process existing batches for specified notification types"""
try:
processed_count = 0
@@ -314,65 +387,23 @@ class NotificationManager(AppService):
}
    @expose
-   def queue_notification(self, event: NotificationEventDTO) -> NotificationResult:
-       """Queue a notification - exposed method for other services to call"""
-       try:
-           logger.info(f"Received Request to queue {event=}")
-           # Workaround for not being able to serialize generics over the expose bus
-           parsed_event = NotificationEventModel[
-               get_notif_data_type(event.type)
-           ].model_validate(event.model_dump())
-           routing_key = self.get_routing_key(parsed_event.type)
-           message = parsed_event.model_dump_json()
-           logger.info(f"Received Request to queue {message=}")
-           exchange = "notifications"
-           # Publish to RabbitMQ
-           self.run_and_wait(
-               self.rabbit.publish_message(
-                   routing_key=routing_key,
-                   message=message,
-                   exchange=next(
-                       ex for ex in self.rabbit_config.exchanges if ex.name == exchange
-                   ),
-               )
-           )
-           return NotificationResult(
-               success=True,
-               message=f"Notification queued with routing key: {routing_key}",
-           )
-       except Exception as e:
-           logger.exception(f"Error queueing notification: {e}")
-           return NotificationResult(success=False, message=str(e))
+   def discord_system_alert(self, content: str):
+       discord_send_alert(content)
-   def _queue_scheduled_notification(self, event: SummaryParamsEventDTO):
+   def _queue_scheduled_notification(self, event: SummaryParamsEventModel):
        """Queue a scheduled notification - exposed method for other services to call"""
        try:
-           logger.info(f"Received Request to queue scheduled notification {event=}")
-           parsed_event = SummaryParamsEventModel[
-               get_summary_params_type(event.type)
-           ].model_validate(event.model_dump())
-           routing_key = self.get_routing_key(event.type)
-           message = parsed_event.model_dump_json()
-           logger.info(f"Received Request to queue {message=}")
+           logger.debug(f"Received Request to queue scheduled notification {event=}")
            exchange = "notifications"
+           routing_key = get_routing_key(event.type)
            # Publish to RabbitMQ
            self.run_and_wait(
                self.rabbit.publish_message(
                    routing_key=routing_key,
-                   message=message,
-                   exchange=next(
-                       ex for ex in self.rabbit_config.exchanges if ex.name == exchange
-                   ),
+                   message=event.model_dump_json(),
+                   exchange=next(ex for ex in EXCHANGES if ex.name == exchange),
                )
            )
@@ -498,13 +529,12 @@ class NotificationManager(AppService):
)
return False
-   def _parse_message(self, message: str) -> NotificationEvent | None:
+   def _parse_message(self, message: str) -> NotificationEventModel | None:
        try:
-           event = NotificationEventDTO.model_validate_json(message)
-           model = NotificationEventModel[
+           event = BaseEventModel.model_validate_json(message)
+           return NotificationEventModel[
                get_notif_data_type(event.type)
            ].model_validate_json(message)
-           return NotificationEvent(event=event, model=model)
except Exception as e:
logger.error(f"Error parsing message due to non matching schema {e}")
return None
@@ -512,14 +542,12 @@ class NotificationManager(AppService):
def _process_admin_message(self, message: str) -> bool:
"""Process a single notification, sending to an admin, returning whether to put into the failed queue"""
try:
-           parsed = self._parse_message(message)
-           if not parsed:
+           event = self._parse_message(message)
+           if not event:
                return False
-           event = parsed.event
-           model = parsed.model
-           logger.debug(f"Processing notification for admin: {model}")
+           logger.debug(f"Processing notification for admin: {event}")
            recipient_email = settings.config.refund_notification_email
-           self.email_sender.send_templated(event.type, recipient_email, model)
+           self.email_sender.send_templated(event.type, recipient_email, event)
return True
except Exception as e:
logger.exception(f"Error processing notification for admin queue: {e}")
@@ -528,12 +556,10 @@ class NotificationManager(AppService):
def _process_immediate(self, message: str) -> bool:
"""Process a single notification immediately, returning whether to put into the failed queue"""
try:
-           parsed = self._parse_message(message)
-           if not parsed:
+           event = self._parse_message(message)
+           if not event:
                return False
-           event = parsed.event
-           model = parsed.model
-           logger.debug(f"Processing immediate notification: {model}")
+           logger.debug(f"Processing immediate notification: {event}")
recipient_email = get_db().get_user_email_by_id(event.user_id)
if not recipient_email:
@@ -554,7 +580,7 @@ class NotificationManager(AppService):
self.email_sender.send_templated(
notification=event.type,
user_email=recipient_email,
-               data=model,
+               data=event,
user_unsub_link=unsub_link,
)
return True
@@ -565,12 +591,10 @@ class NotificationManager(AppService):
def _process_batch(self, message: str) -> bool:
"""Process a single notification with a batching strategy, returning whether to put into the failed queue"""
try:
-           parsed = self._parse_message(message)
-           if not parsed:
+           event = self._parse_message(message)
+           if not event:
                return False
-           event = parsed.event
-           model = parsed.model
-           logger.info(f"Processing batch notification: {model}")
+           logger.info(f"Processing batch notification: {event}")
recipient_email = get_db().get_user_email_by_id(event.user_id)
if not recipient_email:
@@ -586,7 +610,7 @@ class NotificationManager(AppService):
)
return True
-           should_send = self._should_batch(event.user_id, event.type, model)
+           should_send = self._should_batch(event.user_id, event.type, event)
if not should_send:
logger.info("Batch not old enough to send")
@@ -628,7 +652,7 @@ class NotificationManager(AppService):
"""Process a single notification with a summary strategy, returning whether to put into the failed queue"""
try:
logger.info(f"Processing summary notification: {message}")
-           event = SummaryParamsEventDTO.model_validate_json(message)
+           event = BaseEventModel.model_validate_json(message)
model = SummaryParamsEventModel[
get_summary_params_type(event.type)
].model_validate_json(message)
@@ -709,22 +733,6 @@ class NotificationManager(AppService):
logger.info(f"[{self.service_name}] Started notification service")
# Set up scheduler for batch processing of all notification types
# this can be changed later to spawn different cleanups on different schedules
-       try:
-           get_scheduler().add_batched_notification_schedule(
-               notification_types=list(NotificationType),
-               data={},
-               cron="0 * * * *",
-           )
-           # get_scheduler().add_weekly_notification_schedule(
-           #     # weekly on Friday at 12pm
-           #     cron="0 12 * * 5",
-           # )
-           logger.info("Scheduled notification cleanup")
-       except Exception as e:
-           logger.error(f"Error scheduling notification cleanup: {e}")
# Set up queue consumers
channel = self.run_and_wait(self.rabbit.get_channel())
@@ -774,3 +782,13 @@ class NotificationManager(AppService):
super().cleanup()
logger.info(f"[{self.service_name}] ⏳ Disconnecting RabbitMQ...")
self.run_and_wait(self.rabbitmq_service.disconnect())
class NotificationManagerClient(AppServiceClient):
@classmethod
def get_service_type(cls):
return NotificationManager
process_existing_batches = NotificationManager.process_existing_batches
queue_weekly_summary = NotificationManager.queue_weekly_summary
discord_system_alert = NotificationManager.discord_system_alert
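A minimal usage sketch for the new client class, assuming the notification service is up and reachable (the alert text is illustrative):

from backend.notifications import NotificationManagerClient
from backend.util.service import get_service_client

# get_service_client() resolves host/port via get_service_type() and returns
# a thin HTTP proxy; each call below becomes a POST to the matching exposed route.
notif_client = get_service_client(NotificationManagerClient)
notif_client.queue_weekly_summary()
notif_client.discord_system_alert("Example: deployment finished")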

View File

@@ -1,5 +1,5 @@
from backend.app import run_processes
- from backend.executor import DatabaseManager, Scheduler
+ from backend.executor import DatabaseManager
from backend.notifications.notifications import NotificationManager
from backend.server.rest_api import AgentServer
@@ -11,7 +11,6 @@ def main():
run_processes(
NotificationManager(),
DatabaseManager(),
-        Scheduler(),
AgentServer(),
)

View File

@@ -0,0 +1,13 @@
from backend.app import run_processes
from backend.executor.scheduler import Scheduler
def main():
"""
Run all the processes required for the AutoGPT-server Scheduling System.
"""
run_processes(Scheduler())
if __name__ == "__main__":
main()

View File

@@ -122,7 +122,7 @@ async def get_graph_execution_results(
if not graph:
raise HTTPException(status_code=404, detail=f"Graph #{graph_id} not found.")
-    results = await execution_db.get_node_execution_results(graph_exec_id)
+    results = await execution_db.get_node_executions(graph_exec_id)
last_result = results[-1] if results else None
execution_status = (
last_result.status if last_result else AgentExecutionStatus.INCOMPLETE

View File

@@ -19,6 +19,7 @@ import backend.data.graph
import backend.data.user
import backend.server.routers.postmark.postmark
import backend.server.routers.v1
import backend.server.v2.admin.credit_admin_routes
import backend.server.v2.admin.store_admin_routes
import backend.server.v2.library.db
import backend.server.v2.library.model
@@ -26,6 +27,7 @@ import backend.server.v2.library.routes
import backend.server.v2.otto.routes
import backend.server.v2.store.model
import backend.server.v2.store.routes
import backend.server.v2.turnstile.routes
import backend.util.service
import backend.util.settings
from backend.blocks.llm import LlmModel
@@ -107,12 +109,20 @@ app.include_router(
tags=["v2", "admin"],
prefix="/api/store",
)
app.include_router(
backend.server.v2.admin.credit_admin_routes.router,
tags=["v2", "admin"],
prefix="/api/credits",
)
app.include_router(
backend.server.v2.library.routes.router, tags=["v2"], prefix="/api/library"
)
app.include_router(
backend.server.v2.otto.routes.router, tags=["v2"], prefix="/api/otto"
)
app.include_router(
backend.server.v2.turnstile.routes.router, tags=["v2"], prefix="/api/turnstile"
)
app.include_router(
backend.server.routers.postmark.postmark.router,

View File

@@ -57,7 +57,7 @@ from backend.data.user import (
update_user_email,
update_user_notification_preference,
)
- from backend.executor import Scheduler, scheduler
+ from backend.executor import scheduler
from backend.executor import utils as execution_utils
from backend.executor.utils import create_execution_queue_config
from backend.integrations.creds_manager import IntegrationCredentialsManager
@@ -83,8 +83,8 @@ if TYPE_CHECKING:
@thread_cached
- def execution_scheduler_client() -> Scheduler:
-     return get_service_client(Scheduler)
+ def execution_scheduler_client() -> scheduler.SchedulerClient:
+     return get_service_client(scheduler.SchedulerClient, health_check=False)
@thread_cached
@@ -422,7 +422,11 @@ async def get_graph(
for_export: bool = False,
) -> graph_db.GraphModel:
graph = await graph_db.get_graph(
-        graph_id, version, user_id=user_id, for_export=for_export
+        graph_id,
+        version,
+        user_id=user_id,
+        for_export=for_export,
+        include_subgraphs=True,  # needed to construct full credentials input schema
)
if not graph:
raise HTTPException(status_code=404, detail=f"Graph #{graph_id} not found.")
@@ -656,14 +660,18 @@ async def _cancel_execution(graph_exec_id: str):
exchange=execution_utils.GRAPH_EXECUTION_CANCEL_EXCHANGE,
)
-    # Update the status of the graph & node executions
-    await execution_db.update_graph_execution_stats(
+    # Update the status of the graph execution
+    graph_execution = await execution_db.update_graph_execution_stats(
graph_exec_id,
execution_db.ExecutionStatus.TERMINATED,
)
if graph_execution:
await execution_event_bus().publish(graph_execution)
# Update the status of the node executions
node_execs = [
node_exec.model_copy(update={"status": execution_db.ExecutionStatus.TERMINATED})
-        for node_exec in await execution_db.get_node_execution_results(
+        for node_exec in await execution_db.get_node_executions(
graph_exec_id=graph_exec_id,
statuses=[
execution_db.ExecutionStatus.QUEUED,
@@ -672,7 +680,6 @@ async def _cancel_execution(graph_exec_id: str):
],
)
]
await execution_db.update_node_execution_status_batch(
[node_exec.node_exec_id for node_exec in node_execs],
execution_db.ExecutionStatus.TERMINATED,
@@ -769,7 +776,7 @@ class ScheduleCreationRequest(pydantic.BaseModel):
async def create_schedule(
user_id: Annotated[str, Depends(get_user_id)],
schedule: ScheduleCreationRequest,
- ) -> scheduler.ExecutionJobInfo:
+ ) -> scheduler.GraphExecutionJobInfo:
graph = await graph_db.get_graph(
schedule.graph_id, schedule.graph_version, user_id=user_id
)
@@ -779,14 +786,12 @@ async def create_schedule(
detail=f"Graph #{schedule.graph_id} v.{schedule.graph_version} not found.",
)
-    return await asyncio.to_thread(
-        lambda: execution_scheduler_client().add_execution_schedule(
-            graph_id=schedule.graph_id,
-            graph_version=graph.version,
-            cron=schedule.cron,
-            input_data=schedule.input_data,
-            user_id=user_id,
-        )
-    )
+    return await execution_scheduler_client().add_execution_schedule(
+        graph_id=schedule.graph_id,
+        graph_version=graph.version,
+        cron=schedule.cron,
+        input_data=schedule.input_data,
+        user_id=user_id,
+    )
@@ -795,11 +800,11 @@ async def create_schedule(
tags=["schedules"],
dependencies=[Depends(auth_middleware)],
)
- def delete_schedule(
+ async def delete_schedule(
schedule_id: str,
user_id: Annotated[str, Depends(get_user_id)],
) -> dict[Any, Any]:
-    execution_scheduler_client().delete_schedule(schedule_id, user_id=user_id)
+    await execution_scheduler_client().delete_schedule(schedule_id, user_id=user_id)
return {"id": schedule_id}
@@ -808,11 +813,11 @@ def delete_schedule(
tags=["schedules"],
dependencies=[Depends(auth_middleware)],
)
- def get_execution_schedules(
+ async def get_execution_schedules(
user_id: Annotated[str, Depends(get_user_id)],
graph_id: str | None = None,
- ) -> list[scheduler.ExecutionJobInfo]:
-     return execution_scheduler_client().get_execution_schedules(
+ ) -> list[scheduler.GraphExecutionJobInfo]:
+     return await execution_scheduler_client().get_execution_schedules(
user_id=user_id,
graph_id=graph_id,
)

View File

@@ -0,0 +1,77 @@
import logging
import typing
from autogpt_libs.auth import requires_admin_user
from autogpt_libs.auth.depends import get_user_id
from fastapi import APIRouter, Body, Depends
from prisma import Json
from prisma.enums import CreditTransactionType
from backend.data.credit import admin_get_user_history, get_user_credit_model
from backend.server.v2.admin.model import AddUserCreditsResponse, UserHistoryResponse
logger = logging.getLogger(__name__)
_user_credit_model = get_user_credit_model()
router = APIRouter(
prefix="/admin",
tags=["credits", "admin"],
dependencies=[Depends(requires_admin_user)],
)
@router.post("/add_credits", response_model=AddUserCreditsResponse)
async def add_user_credits(
user_id: typing.Annotated[str, Body()],
amount: typing.Annotated[int, Body()],
comments: typing.Annotated[str, Body()],
admin_user: typing.Annotated[
str,
Depends(get_user_id),
],
):
""" """
logger.info(f"Admin user {admin_user} is adding {amount} credits to user {user_id}")
new_balance, transaction_key = await _user_credit_model._add_transaction(
user_id,
amount,
transaction_type=CreditTransactionType.GRANT,
metadata=Json({"admin_id": admin_user, "reason": comments}),
)
return {
"new_balance": new_balance,
"transaction_key": transaction_key,
}
@router.get(
"/users_history",
response_model=UserHistoryResponse,
)
async def admin_get_all_user_history(
admin_user: typing.Annotated[
str,
Depends(get_user_id),
],
search: typing.Optional[str] = None,
page: int = 1,
page_size: int = 20,
transaction_filter: typing.Optional[CreditTransactionType] = None,
):
""" """
logger.info(f"Admin user {admin_user} is getting grant history")
try:
resp = await admin_get_user_history(
page=page,
page_size=page_size,
search=search,
transaction_filter=transaction_filter,
)
logger.info(f"Admin user {admin_user} got {len(resp.history)} grant history")
return resp
except Exception as e:
logger.exception(f"Error getting grant history: {e}")
raise e
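A hedged example of exercising the new endpoint from a script; the base URL, port, and bearer token are placeholders, and the route ends up mounted under `/api/credits/admin` per the `rest_api.py` hunk later in this diff:

import httpx

resp = httpx.post(
    "http://localhost:8006/api/credits/admin/add_credits",  # placeholder host/port
    headers={"Authorization": "Bearer <admin-jwt>"},  # placeholder admin token
    json={"user_id": "user-123", "amount": 500, "comments": "Promo grant"},
)
print(resp.json())  # {"new_balance": ..., "transaction_key": "..."}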

View File

@@ -0,0 +1,16 @@
from pydantic import BaseModel
from backend.data.model import UserTransaction
from backend.server.model import Pagination
class UserHistoryResponse(BaseModel):
"""Response model for listings with version history"""
history: list[UserTransaction]
pagination: Pagination
class AddUserCreditsResponse(BaseModel):
new_balance: int
transaction_key: str

View File

@@ -1,4 +1,5 @@
import logging
import tempfile
import typing
import autogpt_libs.auth.depends
@@ -9,6 +10,7 @@ import prisma.enums
import backend.server.v2.store.db
import backend.server.v2.store.exceptions
import backend.server.v2.store.model
import backend.util.json
logger = logging.getLogger(__name__)
@@ -98,3 +100,47 @@ async def review_submission(
status_code=500,
content={"detail": "An error occurred while reviewing the submission"},
)
@router.get(
"/submissions/download/{store_listing_version_id}",
tags=["store", "admin"],
dependencies=[fastapi.Depends(autogpt_libs.auth.depends.requires_admin_user)],
)
async def admin_download_agent_file(
user: typing.Annotated[
autogpt_libs.auth.models.User,
fastapi.Depends(autogpt_libs.auth.depends.requires_admin_user),
],
store_listing_version_id: str = fastapi.Path(
..., description="The ID of the agent to download"
),
) -> fastapi.responses.FileResponse:
"""
Download the agent file by streaming its content.
Args:
store_listing_version_id (str): The ID of the agent to download
Returns:
FileResponse: A JSON file response containing the agent's graph data.
Raises:
HTTPException: If the agent is not found or an unexpected error occurs.
"""
graph_data = await backend.server.v2.store.db.get_agent(
user_id=user.user_id,
store_listing_version_id=store_listing_version_id,
)
file_name = f"agent_{graph_data.id}_v{graph_data.version or 'latest'}.json"
# Sending graph as a stream (similar to marketplace v1)
with tempfile.NamedTemporaryFile(
mode="w", suffix=".json", delete=False
) as tmp_file:
tmp_file.write(backend.util.json.dumps(graph_data))
tmp_file.flush()
return fastapi.responses.FileResponse(
tmp_file.name, filename=file_name, media_type="application/json"
)

View File

@@ -13,12 +13,17 @@ import backend.server.v2.library.model as library_model
import backend.server.v2.store.exceptions as store_exceptions
import backend.server.v2.store.image_gen as store_image_gen
import backend.server.v2.store.media as store_media
from backend.data import db
from backend.data import graph as graph_db
from backend.data.db import locked_transaction
from backend.data.includes import library_agent_include
from backend.integrations.creds_manager import IntegrationCredentialsManager
from backend.integrations.webhooks.graph_lifecycle_hooks import on_graph_activate
from backend.util.settings import Config
logger = logging.getLogger(__name__)
config = Config()
integration_creds_manager = IntegrationCredentialsManager()
async def list_library_agents(
@@ -170,6 +175,44 @@ async def get_library_agent(id: str, user_id: str) -> library_model.LibraryAgent
raise store_exceptions.DatabaseError("Failed to fetch library agent") from e
async def get_library_agent_by_store_version_id(
store_listing_version_id: str,
user_id: str,
):
"""
Get the library agent metadata for a given store listing version ID and user ID.
"""
logger.debug(
f"Getting library agent for store listing ID: {store_listing_version_id}"
)
store_listing_version = (
await prisma.models.StoreListingVersion.prisma().find_unique(
where={"id": store_listing_version_id},
)
)
if not store_listing_version:
logger.warning(f"Store listing version not found: {store_listing_version_id}")
raise store_exceptions.AgentNotFoundError(
f"Store listing version {store_listing_version_id} not found or invalid"
)
# Check if user already has this agent
agent = await prisma.models.LibraryAgent.prisma().find_first(
where={
"userId": user_id,
"agentGraphId": store_listing_version.agentGraphId,
"agentGraphVersion": store_listing_version.agentGraphVersion,
"isDeleted": False,
},
include={"AgentGraph": True},
)
if agent:
return library_model.LibraryAgent.from_db(agent)
else:
return None
async def add_generated_agent_image(
graph: backend.data.graph.GraphModel,
library_agent_id: str,
@@ -206,7 +249,7 @@ async def add_generated_agent_image(
async def create_library_agent(
graph: backend.data.graph.GraphModel,
user_id: str,
- ) -> prisma.models.LibraryAgent:
+ ) -> library_model.LibraryAgent:
"""
Adds an agent to the user's library (LibraryAgent table).
@@ -227,7 +270,7 @@ async def create_library_agent(
)
try:
-        return await prisma.models.LibraryAgent.prisma().create(
+        agent = await prisma.models.LibraryAgent.prisma().create(
data=prisma.types.LibraryAgentCreateInput(
isCreatedByUser=(user_id == graph.user_id),
useGraphIsActiveVersion=True,
@@ -238,8 +281,10 @@ async def create_library_agent(
"graphVersionId": {"id": graph.id, "version": graph.version}
}
},
-            )
+            ),
+            include={"AgentGraph": True},
         )
+        return library_model.LibraryAgent.from_db(agent)
except prisma.errors.PrismaError as e:
logger.error(f"Database error creating agent in library: {e}")
raise store_exceptions.DatabaseError("Failed to create agent in library") from e
@@ -390,11 +435,6 @@ async def add_store_agent_to_library(
)
graph = store_listing_version.AgentGraph
-    if graph.userId == user_id:
-        logger.warning(
-            f"User #{user_id} attempted to add their own agent to their library"
-        )
-        raise store_exceptions.DatabaseError("Cannot add own agent to library")
# Check if user already has this agent
existing_library_agent = (
@@ -404,7 +444,7 @@ async def add_store_agent_to_library(
"agentGraphId": graph.id,
"agentGraphVersion": graph.version,
},
-            include=library_agent_include(user_id),
+            include={"AgentGraph": True},
)
)
if existing_library_agent:
@@ -662,3 +702,47 @@ async def delete_preset(user_id: str, preset_id: str) -> None:
except prisma.errors.PrismaError as e:
logger.error(f"Database error deleting preset: {e}")
raise store_exceptions.DatabaseError("Failed to delete preset") from e
async def fork_library_agent(library_agent_id: str, user_id: str):
"""
Clones a library agent and its underlying graph and nodes (with new ids) for the given user.
Args:
library_agent_id: The ID of the library agent to fork.
user_id: The ID of the user who owns the library agent.
Returns:
The forked LibraryAgent.
Raises:
DatabaseError: If there's an error during the forking process.
"""
logger.debug(f"Forking library agent {library_agent_id} for user {user_id}")
try:
async with db.locked_transaction(f"usr_trx_{user_id}-fork_agent"):
# Fetch the original agent
original_agent = await get_library_agent(library_agent_id, user_id)
# Check if user owns the library agent
# TODO: once we have open/closed sourced agents this needs to be enabled ~kcze
# + update library/agents/[id]/page.tsx agent actions
# if not original_agent.can_access_graph:
# raise store_exceptions.DatabaseError(
# f"User {user_id} cannot access library agent graph {library_agent_id}"
# )
# Fork the underlying graph and nodes
new_graph = await graph_db.fork_graph(
original_agent.graph_id, original_agent.graph_version, user_id
)
new_graph = await on_graph_activate(
new_graph,
get_credentials=lambda id: integration_creds_manager.get(user_id, id),
)
# Create a library agent for the new graph
return await create_library_agent(new_graph, user_id)
except prisma.errors.PrismaError as e:
logger.error(f"Database error cloning library agent: {e}")
raise store_exceptions.DatabaseError("Failed to fork library agent") from e

View File

@@ -22,11 +22,13 @@ async def test_agent_preset_from_db():
userId="test-user-123",
isDeleted=False,
InputPresets=[
-            prisma.models.AgentNodeExecutionInputOutput(
-                id="input-123",
-                time=datetime.datetime.now(),
-                name="input1",
-                data=prisma.Json({"type": "string", "value": "test value"}),
+            prisma.models.AgentNodeExecutionInputOutput.model_validate(
+                {
+                    "id": "input-123",
+                    "time": datetime.datetime.now(),
+                    "name": "input1",
+                    "data": '{"type": "string", "value": "test value"}',
+                }
             )
],
)

View File

@@ -85,6 +85,30 @@ async def get_library_agent(
return await library_db.get_library_agent(id=library_agent_id, user_id=user_id)
@router.get(
"/marketplace/{store_listing_version_id}/",
tags=["store, library"],
response_model=library_model.LibraryAgent | None,
)
async def get_library_agent_by_store_listing_version_id(
store_listing_version_id: str,
user_id: str = Depends(autogpt_auth_lib.depends.get_user_id),
):
"""
Get Library Agent from Store Listing Version ID.
"""
try:
return await library_db.get_library_agent_by_store_version_id(
store_listing_version_id, user_id
)
except Exception as e:
logger.error(f"Could not fetch library agent from store version ID: {e}")
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to add agent to library",
) from e
@router.post(
"",
status_code=status.HTTP_201_CREATED,
@@ -190,3 +214,14 @@ async def update_library_agent(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Failed to update library agent",
) from e
@router.post("/{library_agent_id}/fork")
async def fork_library_agent(
library_agent_id: str,
user_id: str = Depends(autogpt_auth_lib.depends.get_user_id),
) -> library_model.LibraryAgent:
return await library_db.fork_library_agent(
library_agent_id=library_agent_id,
user_id=user_id,
)

View File

@@ -793,6 +793,7 @@ async def create_store_version(
changes_summary=changes_summary,
version=next_version,
)
except prisma.errors.PrismaError as e:
raise backend.server.v2.store.exceptions.DatabaseError(
"Failed to create new store version"
@@ -966,7 +967,7 @@ async def get_my_agents(
library_agents = await prisma.models.LibraryAgent.prisma().find_many(
where=search_filter,
order=[{"agentGraphVersion": "desc"}],
order=[{"updatedAt": "desc"}],
skip=(page - 1) * page_size,
take=page_size,
include={"AgentGraph": True},
@@ -1361,3 +1362,31 @@ async def get_admin_listings_with_versions(
page_size=page_size,
),
)
async def get_agent_as_admin(
user_id: str | None,
store_listing_version_id: str,
) -> GraphModel:
"""Get agent using the version ID and store listing version ID."""
store_listing_version = (
await prisma.models.StoreListingVersion.prisma().find_unique(
where={"id": store_listing_version_id}
)
)
if not store_listing_version:
raise ValueError(f"Store listing version {store_listing_version_id} not found")
graph = await backend.data.graph.get_graph_as_admin(
user_id=user_id,
graph_id=store_listing_version.agentGraphId,
version=store_listing_version.agentGraphVersion,
for_export=True,
)
if not graph:
raise ValueError(
f"Agent {store_listing_version.agentGraphId} v{store_listing_version.agentGraphVersion} not found"
)
return graph

View File

@@ -3,9 +3,9 @@ import io
import logging
from enum import Enum
- import replicate
- import replicate.exceptions
 from prisma.models import AgentGraph
+ from replicate.client import Client as ReplicateClient
+ from replicate.exceptions import ReplicateError
from replicate.helpers import FileOutput
from backend.blocks.ideogram import (
@@ -117,7 +117,7 @@ async def generate_agent_image_v1(agent: Graph | AgentGraph) -> io.BytesIO:
prompt = f"Create a visually engaging app store thumbnail for the AI agent that highlights what it does in a clear and captivating way:\n- **Name**: {agent.name}\n- **Description**: {agent.description}\nFocus on showcasing its core functionality with an appealing design."
# Set up Replicate client
-        client = replicate.Client(api_token=settings.secrets.replicate_api_key)
+        client = ReplicateClient(api_token=settings.secrets.replicate_api_key)
# Model parameters
input_data = {
@@ -158,7 +158,7 @@ async def generate_agent_image_v1(agent: Graph | AgentGraph) -> io.BytesIO:
return io.BytesIO(image_bytes)
-    except replicate.exceptions.ReplicateError as e:
+    except ReplicateError as e:
if e.status == 401:
raise RuntimeError("Invalid Replicate API token") from e
raise RuntimeError(f"Replicate API error: {str(e)}") from e

View File

@@ -4,20 +4,7 @@ from typing import List
import prisma.enums
import pydantic
- class Pagination(pydantic.BaseModel):
-     total_items: int = pydantic.Field(
-         description="Total number of items.", examples=[42]
-     )
-     total_pages: int = pydantic.Field(
-         description="Total number of pages.", examples=[97]
-     )
-     current_page: int = pydantic.Field(
-         description="Current page number.", examples=[1]
-     )
-     page_size: int = pydantic.Field(
-         description="Number of items per page.", examples=[25]
-     )
+ from backend.server.model import Pagination
class MyAgent(pydantic.BaseModel):

View File

@@ -0,0 +1,30 @@
from typing import Optional
from pydantic import BaseModel, Field
class TurnstileVerifyRequest(BaseModel):
"""Request model for verifying a Turnstile token."""
token: str = Field(description="The Turnstile token to verify")
action: Optional[str] = Field(
default=None, description="The action that the user is attempting to perform"
)
class TurnstileVerifyResponse(BaseModel):
"""Response model for the Turnstile verification endpoint."""
success: bool = Field(description="Whether the token verification was successful")
error: Optional[str] = Field(
default=None, description="Error message if verification failed"
)
challenge_timestamp: Optional[str] = Field(
default=None, description="Timestamp of the challenge (ISO format)"
)
hostname: Optional[str] = Field(
default=None, description="Hostname of the site where the challenge was solved"
)
action: Optional[str] = Field(
default=None, description="The action associated with this verification"
)

View File

@@ -0,0 +1,108 @@
import logging
import aiohttp
from fastapi import APIRouter
from backend.util.settings import Settings
from .models import TurnstileVerifyRequest, TurnstileVerifyResponse
logger = logging.getLogger(__name__)
router = APIRouter()
settings = Settings()
@router.post("/verify", response_model=TurnstileVerifyResponse)
async def verify_turnstile_token(
request: TurnstileVerifyRequest,
) -> TurnstileVerifyResponse:
"""
Verify a Cloudflare Turnstile token.
This endpoint verifies a token returned by the Cloudflare Turnstile challenge
on the client side. It returns whether the verification was successful.
"""
logger.info(f"Verifying Turnstile token for action: {request.action}")
return await verify_token(request)
async def verify_token(request: TurnstileVerifyRequest) -> TurnstileVerifyResponse:
"""
Verify a Cloudflare Turnstile token by making a request to the Cloudflare API.
"""
# Get the secret key from settings
turnstile_secret_key = settings.secrets.turnstile_secret_key
turnstile_verify_url = settings.secrets.turnstile_verify_url
if not turnstile_secret_key:
logger.error("Turnstile secret key is not configured")
return TurnstileVerifyResponse(
success=False,
error="CONFIGURATION_ERROR",
challenge_timestamp=None,
hostname=None,
action=None,
)
try:
async with aiohttp.ClientSession() as session:
payload = {
"secret": turnstile_secret_key,
"response": request.token,
}
if request.action:
payload["action"] = request.action
logger.debug(f"Verifying Turnstile token with action: {request.action}")
async with session.post(
turnstile_verify_url,
data=payload,
timeout=aiohttp.ClientTimeout(total=10),
) as response:
if response.status != 200:
error_text = await response.text()
logger.error(f"Turnstile API error: {error_text}")
return TurnstileVerifyResponse(
success=False,
error=f"API_ERROR: {response.status}",
challenge_timestamp=None,
hostname=None,
action=None,
)
data = await response.json()
logger.debug(f"Turnstile API response: {data}")
# Parse the response and return a structured object
return TurnstileVerifyResponse(
success=data.get("success", False),
error=(
data.get("error-codes", None)[0]
if data.get("error-codes")
else None
),
challenge_timestamp=data.get("challenge_timestamp"),
hostname=data.get("hostname"),
action=data.get("action"),
)
except aiohttp.ClientError as e:
logger.error(f"Connection error to Turnstile API: {str(e)}")
return TurnstileVerifyResponse(
success=False,
error=f"CONNECTION_ERROR: {str(e)}",
challenge_timestamp=None,
hostname=None,
action=None,
)
except Exception as e:
logger.error(f"Unexpected error in Turnstile verification: {str(e)}")
return TurnstileVerifyResponse(
success=False,
error=f"UNEXPECTED_ERROR: {str(e)}",
challenge_timestamp=None,
hostname=None,
action=None,
)
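For reference, a hedged client call to the new endpoint: the token would come from the Turnstile widget in the browser, and the host, port, and token values here are placeholders (the router is mounted under `/api/turnstile` in `rest_api.py`):

import httpx

resp = httpx.post(
    "http://localhost:8006/api/turnstile/verify",  # placeholder host/port
    json={"token": "<turnstile-token>", "action": "login"},
)
data = resp.json()
print(data["success"], data.get("error"))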

View File

@@ -6,7 +6,6 @@ from typing import Protocol
import uvicorn
from autogpt_libs.auth import parse_jwt_token
from autogpt_libs.logging.utils import generate_uvicorn_config
- from autogpt_libs.utils.cache import thread_cached
from fastapi import Depends, FastAPI, WebSocket, WebSocketDisconnect
from starlette.middleware.cors import CORSMiddleware
@@ -19,7 +18,7 @@ from backend.server.model import (
WSSubscribeGraphExecutionRequest,
WSSubscribeGraphExecutionsRequest,
)
- from backend.util.service import AppProcess, get_service_client
+ from backend.util.service import AppProcess
from backend.util.settings import AppEnvironment, Config, Settings
logger = logging.getLogger(__name__)
@@ -46,13 +45,6 @@ def get_connection_manager():
return _connection_manager
- @thread_cached
- def get_db_client():
-     from backend.executor import DatabaseManager
-
-     return get_service_client(DatabaseManager)
async def event_broadcaster(manager: ConnectionManager):
try:
event_queue = AsyncRedisExecutionEventBus()

View File

@@ -1,3 +1,5 @@
from logging import Logger
from backend.util.settings import AppEnvironment, BehaveAs, Settings
settings = Settings()
@@ -18,3 +20,47 @@ def configure_logging():
# Silence httpx logger
logging.getLogger("httpx").setLevel(logging.WARNING)
class TruncatedLogger:
def __init__(
self,
logger: Logger,
prefix: str = "",
metadata: dict | None = None,
max_length: int = 1000,
):
self.logger = logger
self.metadata = metadata or {}
self.max_length = max_length
self.prefix = prefix
def info(self, msg: str, **extra):
msg = self._wrap(msg, **extra)
self.logger.info(msg, extra=self._get_metadata(**extra))
def warning(self, msg: str, **extra):
msg = self._wrap(msg, **extra)
self.logger.warning(msg, extra=self._get_metadata(**extra))
def error(self, msg: str, **extra):
msg = self._wrap(msg, **extra)
self.logger.error(msg, extra=self._get_metadata(**extra))
def debug(self, msg: str, **extra):
msg = self._wrap(msg, **extra)
self.logger.debug(msg, extra=self._get_metadata(**extra))
def exception(self, msg: str, **extra):
msg = self._wrap(msg, **extra)
self.logger.exception(msg, extra=self._get_metadata(**extra))
def _get_metadata(self, **extra):
metadata = {**self.metadata, **extra}
return {"json_fields": metadata} if metadata else {}
def _wrap(self, msg: str, **extra):
    extra_msg = str(extra or "")
    # Truncate to the configured max_length rather than a hardcoded 1000
    if len(extra_msg) > self.max_length:
        extra_msg = extra_msg[: self.max_length] + "..."
    return f"{self.prefix} {msg} {extra_msg}"

View File

@@ -1,6 +1,8 @@
import asyncio
import logging
import sentry_sdk
from pydantic import SecretStr
from sentry_sdk.integrations.anthropic import AnthropicIntegration
from sentry_sdk.integrations.logging import LoggingIntegration
@@ -22,3 +24,43 @@ def sentry_init():
),
],
)
def sentry_capture_error(error: Exception):
sentry_sdk.capture_exception(error)
sentry_sdk.flush()
def discord_send_alert(content: str):
from backend.blocks.discord import SendDiscordMessageBlock
from backend.data.model import APIKeyCredentials, CredentialsMetaInput, ProviderName
from backend.util.settings import Settings
settings = Settings()
creds = APIKeyCredentials(
provider="discord",
api_key=SecretStr(settings.secrets.discord_bot_token),
title="Provide Discord Bot Token for the platform alert",
expires_at=None,
)
try:
loop = asyncio.get_event_loop()
except RuntimeError:
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
return SendDiscordMessageBlock().run_once(
SendDiscordMessageBlock.Input(
credentials=CredentialsMetaInput(
id=creds.id,
title=creds.title,
type=creds.type,
provider=ProviderName.DISCORD,
),
message_content=content,
channel_name=settings.config.platform_alert_discord_channel,
),
"status",
credentials=creds,
)

View File

@@ -3,7 +3,7 @@ import os
import signal
import sys
from abc import ABC, abstractmethod
from multiprocessing import Process, set_start_method
from multiprocessing import Process, get_all_start_methods, set_start_method
from typing import Optional
from backend.util.logging import configure_logging
@@ -30,7 +30,12 @@ class AppProcess(ABC):
process: Optional[Process] = None
cleaned_up = False
set_start_method("spawn", force=True)
if "forkserver" in get_all_start_methods():
set_start_method("forkserver", force=True)
else:
logger.warning("Forkserver start method is not available. Using spawn instead.")
set_start_method("spawn", force=True)
configure_logging()
sentry_init()

View File

@@ -2,8 +2,9 @@ import ipaddress
import re
import socket
import ssl
- from typing import Callable
- from urllib.parse import quote, urljoin, urlparse, urlunparse
+ from typing import Callable, Optional
+ from urllib.parse import ParseResult as URL
+ from urllib.parse import quote, urljoin, urlparse
import idna
import requests as req
@@ -44,17 +45,15 @@ def _is_ip_blocked(ip: str) -> bool:
return any(ip_addr in network for network in BLOCKED_IP_NETWORKS)
- def _remove_insecure_headers(headers: dict, old_url: str, new_url: str) -> dict:
+ def _remove_insecure_headers(headers: dict, old_url: URL, new_url: URL) -> dict:
"""
Removes sensitive headers (Authorization, Proxy-Authorization, Cookie)
if the scheme/host/port of new_url differ from old_url.
"""
-    old_parsed = urlparse(old_url)
-    new_parsed = urlparse(new_url)
     if (
-        (old_parsed.scheme != new_parsed.scheme)
-        or (old_parsed.hostname != new_parsed.hostname)
-        or (old_parsed.port != new_parsed.port)
+        (old_url.scheme != new_url.scheme)
+        or (old_url.hostname != new_url.hostname)
+        or (old_url.port != new_url.port)
):
headers.pop("Authorization", None)
headers.pop("Proxy-Authorization", None)
@@ -81,19 +80,16 @@ class HostSSLAdapter(HTTPAdapter):
)
- def validate_url(
-     url: str,
-     trusted_origins: list[str],
-     enable_dns_rebinding: bool = True,
- ) -> tuple[str, str]:
+ def validate_url(url: str, trusted_origins: list[str]) -> tuple[URL, bool, list[str]]:
     """
     Validates the URL to prevent SSRF attacks by ensuring it does not point
     to a private, link-local, or otherwise blocked IP address — unless
     the hostname is explicitly trusted.
-    Returns a tuple of:
-    - pinned_url: a URL that has the netloc replaced with the validated IP
-    - ascii_hostname: the original ASCII hostname (IDNA-decoded) for use in the Host header
+    Returns:
+        url: The validated, canonicalized, parsed URL
+        is_trusted: Boolean indicating if the hostname is in trusted_origins
+        ip_addresses: List of IP addresses for the host; empty if the host is trusted
     """
# Canonicalize URL
url = url.strip("/ ").replace("\\", "/")
@@ -122,45 +118,56 @@ def validate_url(
if not HOSTNAME_REGEX.match(ascii_hostname):
raise ValueError("Hostname contains invalid characters.")
-    # If hostname is trusted, skip IP-based checks but still return pinned URL
-    if ascii_hostname in trusted_origins:
-        pinned_netloc = ascii_hostname
-        if parsed.port:
-            pinned_netloc += f":{parsed.port}"
-        pinned_url = urlunparse(
-            (
-                parsed.scheme,
-                pinned_netloc,
-                quote(parsed.path, safe="/%:@"),
-                parsed.params,
-                parsed.query,
-                parsed.fragment,
-            )
-        )
-        return pinned_url, ascii_hostname
-
-    # Resolve all IP addresses for the hostname
-    try:
-        ip_list = [str(res[4][0]) for res in socket.getaddrinfo(ascii_hostname, None)]
-        ipv4 = [ip for ip in ip_list if ":" not in ip]
-        ipv6 = [ip for ip in ip_list if ":" in ip]
-        ip_addresses = ipv4 + ipv6  # Prefer IPv4 over IPv6
-    except socket.gaierror:
-        raise ValueError(f"Unable to resolve IP address for hostname {ascii_hostname}")
+    # Check if hostname is trusted
+    is_trusted = ascii_hostname in trusted_origins
+
+    # If not trusted, validate IP addresses
+    ip_addresses: list[str] = []
+    if not is_trusted:
+        # Resolve all IP addresses for the hostname
+        ip_addresses = _resolve_host(ascii_hostname)
# Block any IP address that belongs to a blocked range
for ip_str in ip_addresses:
if _is_ip_blocked(ip_str):
raise ValueError(
f"Access to blocked or private IP address {ip_str} "
f"for hostname {ascii_hostname} is not allowed."
)
return (
URL(
parsed.scheme,
ascii_hostname,
quote(parsed.path, safe="/%:@"),
parsed.params,
parsed.query,
parsed.fragment,
),
is_trusted,
ip_addresses,
)
def pin_url(url: URL, ip_addresses: Optional[list[str]] = None) -> URL:
"""
Pins a URL to a specific IP address to prevent DNS rebinding attacks.
Args:
url: The original URL
ip_addresses: List of IP addresses corresponding to the URL's host
Returns:
pinned_url: The URL with hostname replaced with IP address
"""
if not url.hostname:
raise ValueError(f"URL has no hostname: {url}")
     if not ip_addresses:
-        raise ValueError(f"No IP addresses found for {ascii_hostname}")
+        # Resolve all IP addresses for the hostname
+        ip_addresses = _resolve_host(url.hostname)

-    # Block any IP address that belongs to a blocked range
-    for ip_str in ip_addresses:
-        if _is_ip_blocked(ip_str):
-            raise ValueError(
-                f"Access to blocked or private IP address {ip_str} "
-                f"for hostname {ascii_hostname} is not allowed."
-            )
-    # Pin to the first valid IP (for SSRF defense).
+    # Pin to the first valid IP (for SSRF defense)
pinned_ip = ip_addresses[0]
# If it's IPv6, bracket it
@@ -169,24 +176,31 @@ def validate_url(
else:
pinned_netloc = pinned_ip
-    if parsed.port:
-        pinned_netloc += f":{parsed.port}"
+    if url.port:
+        pinned_netloc += f":{url.port}"
-    if not enable_dns_rebinding:
-        pinned_netloc = ascii_hostname
-    pinned_url = urlunparse(
-        (
-            parsed.scheme,
-            pinned_netloc,
-            quote(parsed.path, safe="/%:@"),
-            parsed.params,
-            parsed.query,
-            parsed.fragment,
-        )
-    )
-    return pinned_url, ascii_hostname  # (pinned_url, original_hostname)
+    return URL(
+        url.scheme,
+        pinned_netloc,
+        url.path,
+        url.params,
+        url.query,
+        url.fragment,
+    )
def _resolve_host(hostname: str) -> list[str]:
try:
ip_list = [str(res[4][0]) for res in socket.getaddrinfo(hostname, None)]
ipv4 = [ip for ip in ip_list if ":" not in ip]
ipv6 = [ip for ip in ip_list if ":" in ip]
ip_addresses = ipv4 + ipv6 # Prefer IPv4 over IPv6
except socket.gaierror:
raise ValueError(f"Unable to resolve IP address for hostname {hostname}")
if not ip_addresses:
raise ValueError(f"No IP addresses found for {hostname}")
return ip_addresses
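Taken together, the new contract splits validation from pinning. A sketch of how the pieces are meant to compose (example.com and the resolved IP are illustrative):

# validate_url() canonicalizes and, for untrusted hosts, resolves and blocklists IPs.
url, is_trusted, ips = validate_url("https://example.com/path", trusted_origins=[])
if not is_trusted:
    # pin_url() swaps the netloc for the first resolved IP, so the connection
    # cannot be redirected by a second DNS answer (DNS rebinding).
    url = pin_url(url, ips)
# The request is then sent to url.geturl() with "Host: example.com" and an
# SNI adapter, so TLS verification still happens against the real hostname.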
class Requests:
@@ -200,7 +214,7 @@ class Requests:
self,
trusted_origins: list[str] | None = None,
raise_for_status: bool = True,
-        extra_url_validator: Callable[[str], str] | None = None,
+        extra_url_validator: Callable[[URL], URL] | None = None,
extra_headers: dict[str, str] | None = None,
):
self.trusted_origins = []
@@ -224,45 +238,52 @@ class Requests:
*args,
**kwargs,
) -> req.Response:
-        # Validate URL and get pinned URL + original hostname
-        pinned_url, original_hostname = validate_url(url, self.trusted_origins)
+        # Validate URL and get trust status
+        url, is_trusted, ip_addresses = validate_url(url, self.trusted_origins)
# Apply any extra user-defined validation/transformation
if self.extra_url_validator is not None:
-            pinned_url = self.extra_url_validator(pinned_url)
+            url = self.extra_url_validator(url)
# Pin the URL if untrusted
hostname = url.hostname
original_url = url.geturl()
if not is_trusted:
url = pin_url(url, ip_addresses)
# Merge any extra headers
headers = dict(headers) if headers else {}
if self.extra_headers is not None:
headers.update(self.extra_headers)
-        # Force the Host header to the original hostname
-        headers["Host"] = original_hostname
-
-        # Create a fresh session & mount our HostSSLAdapter if pinned to IP
         session = req.Session()
-        pinned_parsed = urlparse(pinned_url)
-
-        # If pinned_url netloc is an IP (not in trusted_origins),
-        # then we attach the custom SNI adapter:
-        if pinned_parsed.hostname and pinned_parsed.hostname != original_hostname:
-            # That means we definitely pinned to an IP
-            mount_prefix = f"{pinned_parsed.scheme}://{pinned_parsed.hostname}"
-            if pinned_parsed.port:
-                mount_prefix += f":{pinned_parsed.port}"
-            adapter = HostSSLAdapter(ssl_hostname=original_hostname)
+        # If untrusted, the hostname in the URL is replaced with the corresponding
+        # IP address, and we need to override the Host header with the actual hostname.
+        if url.hostname != hostname:
+            headers["Host"] = hostname
+
+            # If hostname was untrusted and we replaced it by (pinned it to) its IP,
+            # we also need to attach a custom SNI adapter to make SSL work:
+            adapter = HostSSLAdapter(ssl_hostname=hostname)
             session.mount("https://", adapter)
# Perform the request with redirects disabled for manual handling
response = session.request(
method,
-            pinned_url,
+            url.geturl(),
headers=headers,
allow_redirects=False,
*args,
**kwargs,
)
# Replace response URLs with the original host for clearer error messages
if url.hostname != hostname:
response.url = original_url
if response.request is not None:
response.request.url = original_url
if self.raise_for_status:
response.raise_for_status()
@@ -277,13 +298,13 @@ class Requests:
# The base URL is the pinned_url we just used
# so that relative redirects resolve correctly.
-            new_url = urljoin(pinned_url, location)
+            redirect_url = urlparse(urljoin(url.geturl(), location))
             # Carry forward the same headers but update Host
-            new_headers = _remove_insecure_headers(dict(headers), url, new_url)
+            new_headers = _remove_insecure_headers(headers, url, redirect_url)
return self.request(
method,
-                new_url,
+                redirect_url.geturl(),
headers=new_headers,
allow_redirects=allow_redirects,
max_redirects=max_redirects - 1,

View File

@@ -2,6 +2,7 @@ import asyncio
import logging
import os
import threading
import time
from functools import wraps
from uuid import uuid4
@@ -73,3 +74,31 @@ def conn_retry(
return async_wrapper if is_coroutine else sync_wrapper
return decorator
func_retry = retry(
reraise=False,
stop=stop_after_attempt(5),
wait=wait_exponential(multiplier=1, min=1, max=30),
)
def continuous_retry(*, retry_delay: float = 1.0):
def decorator(func):
@wraps(func)
def wrapper(*args, **kwargs):
while True:
try:
return func(*args, **kwargs)
except Exception as exc:
logger.exception(
"%s failed with %s — retrying in %.2f s",
func.__name__,
exc,
retry_delay,
)
time.sleep(retry_delay)
return wrapper
return decorator
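A small usage sketch for `continuous_retry` (the flaky function is contrived): the wrapper swallows each exception, logs it, sleeps for `retry_delay`, and re-enters the function until it returns.

attempts = 0

@continuous_retry(retry_delay=0.1)
def flaky_task() -> str:
    global attempts
    attempts += 1
    if attempts < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(flaky_task())  # logs two failures, then prints "ok"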

View File

@@ -5,8 +5,10 @@ import os
import threading
import time
from abc import ABC, abstractmethod
from functools import cached_property, update_wrapper
from typing import (
Any,
Awaitable,
Callable,
Concatenate,
Coroutine,
@@ -42,24 +44,15 @@ api_call_timeout = config.rpc_client_call_timeout
P = ParamSpec("P")
R = TypeVar("R")
EXPOSED_FLAG = "__exposed__"
def expose(func: C) -> C:
func = getattr(func, "__func__", func)
setattr(func, "__exposed__", True)
setattr(func, EXPOSED_FLAG, True)
return func
- def exposed_run_and_wait(
-     f: Callable[P, Coroutine[None, None, R]]
- ) -> Callable[Concatenate[object, P], R]:
-     # TODO:
-     # This function lies about its return type to make the DynamicClient
-     # call the function synchronously, fix this when DynamicClient can choose
-     # to call a function synchronously or asynchronously.
-     return expose(f)  # type: ignore
# --------------------------------------------------
# AppService for IPC service based on HTTP request through FastAPI
# --------------------------------------------------
@@ -203,7 +196,7 @@ class AppService(BaseAppService, ABC):
# Register the exposed API routes.
for attr_name, attr in vars(type(self)).items():
if getattr(attr, "__exposed__", False):
if getattr(attr, EXPOSED_FLAG, False):
route_path = f"/{attr_name}"
self.fastapi_app.add_api_route(
route_path,
@@ -234,31 +227,53 @@ class AppService(BaseAppService, ABC):
AS = TypeVar("AS", bound=AppService)
def close_service_client(client: Any) -> None:
if hasattr(client, "close"):
client.close()
else:
logger.warning(f"Client {client} is not closable")
class AppServiceClient(ABC):
@classmethod
@abstractmethod
def get_service_type(cls) -> Type[AppService]:
pass
def health_check(self):
pass
def close(self):
pass
@conn_retry("FastAPI client", "Creating service client", max_retry=api_comm_retry)
ASC = TypeVar("ASC", bound=AppServiceClient)
@conn_retry("AppService client", "Creating service client", max_retry=api_comm_retry)
def get_service_client(
-     service_type: Type[AS],
+     service_client_type: Type[ASC],
      call_timeout: int | None = api_call_timeout,
- ) -> AS:
+     health_check: bool = True,
+ ) -> ASC:
class DynamicClient:
def __init__(self):
service_type = service_client_type.get_service_type()
host = service_type.get_host()
port = service_type.get_port()
self.base_url = f"http://{host}:{port}".rstrip("/")
-            self.client = httpx.Client(
-                base_url=self.base_url,
-                timeout=call_timeout,
-            )
+        @cached_property
+        def sync_client(self) -> httpx.Client:
+            return httpx.Client(
+                base_url=self.base_url,
+                timeout=call_timeout,
+            )

-        def _call_method(self, method_name: str, **kwargs) -> Any:
+        @cached_property
+        def async_client(self) -> httpx.AsyncClient:
+            return httpx.AsyncClient(
+                base_url=self.base_url,
+                timeout=call_timeout,
+            )

+        def _handle_call_method_response(
+            self, response: httpx.Response, method_name: str
+        ) -> Any:
             try:
-                response = self.client.post(method_name, json=to_dict(kwargs))
response.raise_for_status()
return response.json()
except httpx.HTTPStatusError as e:
@@ -269,36 +284,103 @@ def get_service_client(
*(error.args or [str(e)])
)
def _call_method_sync(self, method_name: str, **kwargs) -> Any:
return self._handle_call_method_response(
method_name=method_name,
response=self.sync_client.post(method_name, json=to_dict(kwargs)),
)
async def _call_method_async(self, method_name: str, **kwargs) -> Any:
return self._handle_call_method_response(
method_name=method_name,
response=await self.async_client.post(
method_name, json=to_dict(kwargs)
),
)
async def aclose(self):
self.sync_client.close()
await self.async_client.aclose()
def close(self):
-            self.client.close()
+            self.sync_client.close()
def _get_params(self, signature: inspect.Signature, *args, **kwargs) -> dict:
if args:
arg_names = list(signature.parameters.keys())
if arg_names[0] in ("self", "cls"):
arg_names = arg_names[1:]
kwargs.update(dict(zip(arg_names, args)))
return kwargs
def _get_return(self, expected_return: TypeAdapter | None, result: Any) -> Any:
if expected_return:
return expected_return.validate_python(result)
return result
def __getattr__(self, name: str) -> Callable[..., Any]:
-            # Try to get the original function from the service type.
-            orig_func = getattr(service_type, name, None)
-            if orig_func is None:
-                raise AttributeError(f"Method {name} not found in {service_type}")
+            original_func = getattr(service_client_type, name, None)
+            if original_func is None:
+                raise AttributeError(
+                    f"Method {name} not found in {service_client_type}"
+                )
+            else:
+                name = original_func.__name__

-            sig = inspect.signature(orig_func)
+            sig = inspect.signature(original_func)
ret_ann = sig.return_annotation
if ret_ann != inspect.Signature.empty:
expected_return = TypeAdapter(ret_ann)
else:
expected_return = None
def method(*args, **kwargs) -> Any:
if args:
arg_names = list(sig.parameters.keys())
if arg_names[0] in ("self", "cls"):
arg_names = arg_names[1:]
kwargs.update(dict(zip(arg_names, args)))
result = self._call_method(name, **kwargs)
if expected_return:
return expected_return.validate_python(result)
return result
if inspect.iscoroutinefunction(original_func):
return method
async def async_method(*args, **kwargs) -> Any:
params = self._get_params(sig, *args, **kwargs)
result = await self._call_method_async(name, **params)
return self._get_return(expected_return, result)
client = cast(AS, DynamicClient())
client.health_check()
return async_method
else:
return cast(AS, client)
def sync_method(*args, **kwargs) -> Any:
params = self._get_params(sig, *args, **kwargs)
result = self._call_method_sync(name, **params)
return self._get_return(expected_return, result)
return sync_method
client = cast(ASC, DynamicClient())
if health_check:
client.health_check()
return client
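Read as a whole, the new client is a thin dynamic proxy: `__getattr__` looks up the called method on the service client stub, posts the keyword arguments as JSON to the service's HTTP endpoint, and validates the JSON response against the method's return annotation. A minimal standalone sketch of the same idea, assuming a plain POST-per-method convention (the `EchoService` class, client name, and URL below are illustrative, not from this diff):

import inspect
from typing import Any, Callable

import httpx
from pydantic import TypeAdapter


class EchoService:
    """Declares the remote interface; bodies never run on the client side."""

    def add(self, a: int, b: int) -> int: ...


class MiniDynamicClient:
    def __init__(self, base_url: str):
        self._http = httpx.Client(base_url=base_url)

    def __getattr__(self, name: str) -> Callable[..., Any]:
        func = getattr(EchoService, name, None)
        if func is None:
            raise AttributeError(f"Method {name} not found in {EchoService}")
        sig = inspect.signature(func)
        ret_ann = sig.return_annotation
        adapter = (
            TypeAdapter(ret_ann) if ret_ann is not inspect.Signature.empty else None
        )

        def call(*args: Any, **kwargs: Any) -> Any:
            # Map positional args onto parameter names, skipping `self`.
            names = [p for p in sig.parameters if p not in ("self", "cls")]
            kwargs.update(dict(zip(names, args)))
            response = self._http.post(name, json=kwargs)
            response.raise_for_status()
            result = response.json()
            # Re-validate the JSON payload into the annotated return type.
            return adapter.validate_python(result) if adapter else result

        return call


# client = MiniDynamicClient("http://localhost:8005")
# client.add(1, 2)  # POSTs {"a": 1, "b": 2} to http://localhost:8005/add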
+def endpoint_to_sync(
+    func: Callable[Concatenate[Any, P], Awaitable[R]],
+) -> Callable[Concatenate[Any, P], R]:
+    """
+    Produce a *typed* stub that **looks** synchronous to the typechecker.
+    """
+
+    def _stub(*args: P.args, **kwargs: P.kwargs) -> R:  # pragma: no cover
+        raise RuntimeError("should be intercepted by __getattr__")
+
+    update_wrapper(_stub, func)
+    return cast(Callable[Concatenate[Any, P], R], _stub)
+
+
+def endpoint_to_async(
+    func: Callable[Concatenate[Any, P], R],
+) -> Callable[Concatenate[Any, P], Awaitable[R]]:
+    """
+    The async mirror of `endpoint_to_sync`.
+    """
+
+    async def _stub(*args: P.args, **kwargs: P.kwargs) -> R:  # pragma: no cover
+        raise RuntimeError("should be intercepted by __getattr__")
+
+    update_wrapper(_stub, func)
+    return cast(Callable[Concatenate[Any, P], Awaitable[R]], _stub)
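These stubs exist purely for the type checker; their bodies can never legitimately run, because `get_service_client` returns a `DynamicClient` instance (cast to the client type), so every attribute lookup falls through to `__getattr__`. Roughly how a client class would use them, in a hedged sketch that ignores the `get_service_type()` plumbing (`MyService` and its methods are invented for illustration):

class MyService:
    async def get_status(self, run_id: str) -> str: ...
    def get_port(self) -> int: ...


class MyServiceClient:
    # Type checkers see a plain sync method returning `str`; at runtime the
    # DynamicClient returned by get_service_client serves the call over HTTP.
    get_status = endpoint_to_sync(MyService.get_status)
    # The mirror image: a sync endpoint exposed as awaitable.
    get_port = endpoint_to_async(MyService.get_port)


# client = get_service_client(MyServiceClient)
# client.get_status("some-run-id")  # sync call, validated str result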


@@ -117,6 +117,18 @@ class Config(UpdateTrackingModel["Config"], BaseSettings):
         default=1,
         description="Cost per execution in cents after each threshold.",
     )
+    execution_counter_expiration_time: int = Field(
+        default=60 * 60 * 24,
+        description="Time in seconds after which the execution counter is reset.",
+    )
+    execution_late_notification_threshold_secs: int = Field(
+        default=5 * 60,
+        description="Time in seconds after which the execution stuck on QUEUED status is considered late.",
+    )
+    execution_late_notification_checkrange_secs: int = Field(
+        default=60 * 60,
+        description="Time in seconds for how far back to check for the late executions.",
+    )

     model_config = SettingsConfigDict(
         env_file=".env",
@@ -137,10 +149,6 @@ class Config(UpdateTrackingModel["Config"], BaseSettings):
         default=8002,
         description="The port for execution manager daemon to run on",
     )
-    execution_manager_loop_max_retry: int = Field(
-        default=5,
-        description="The maximum number of retries for the execution manager loop",
-    )

     execution_scheduler_port: int = Field(
         default=8003,
@@ -231,6 +239,10 @@ class Config(UpdateTrackingModel["Config"], BaseSettings):
         default=True,
         description="Whether to enable the agent input subtype blocks",
     )
+    platform_alert_discord_channel: str = Field(
+        default="local-alerts",
+        description="The Discord channel for the platform",
+    )

     @field_validator("platform_base_url", "frontend_base_url")
     @classmethod
@@ -342,6 +354,16 @@ class Secrets(UpdateTrackingModel["Secrets"], BaseSettings):
         description="The secret key to use for the unsubscribe user by token",
     )

+    # Cloudflare Turnstile credentials
+    turnstile_secret_key: str = Field(
+        default="",
+        description="Cloudflare Turnstile backend secret key",
+    )
+    turnstile_verify_url: str = Field(
+        default="https://challenges.cloudflare.com/turnstile/v0/siteverify",
+        description="Cloudflare Turnstile verify URL",
+    )
+
     # OAuth server credentials for integrations
     # --8<-- [start:OAuthServerCredentialsExample]
     github_client_id: str = Field(default="", description="GitHub OAuth client ID")
@@ -366,6 +388,7 @@ class Secrets(UpdateTrackingModel["Secrets"], BaseSettings):
     anthropic_api_key: str = Field(default="", description="Anthropic API key")
     groq_api_key: str = Field(default="", description="Groq API key")
     open_router_api_key: str = Field(default="", description="Open Router API Key")
+    llama_api_key: str = Field(default="", description="Llama API Key")
     reddit_client_id: str = Field(default="", description="Reddit client ID")
     reddit_client_secret: str = Field(default="", description="Reddit client secret")
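All of these knobs follow the same pydantic-settings pattern: a typed `Field` with a default that any matching environment variable (or `.env` entry) overrides, case-insensitively. A minimal sketch of that mechanism using one of the new fields (the `AppConfig` class is a stand-in, not the platform's actual `Config`):

from pydantic import Field
from pydantic_settings import BaseSettings, SettingsConfigDict


class AppConfig(BaseSettings):
    execution_late_notification_threshold_secs: int = Field(
        default=5 * 60,
        description="Time in seconds after which a QUEUED execution is considered late.",
    )

    model_config = SettingsConfigDict(env_file=".env", extra="ignore")


# EXECUTION_LATE_NOTIFICATION_THRESHOLD_SECS=120 python check.py
config = AppConfig()
print(config.execution_late_notification_threshold_secs)  # 120 if overridden, else 300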


@@ -25,7 +25,7 @@ class SpinTestServer:
         self.db_api = DatabaseManager()
         self.exec_manager = ExecutionManager()
         self.agent_server = AgentServer()
-        self.scheduler = Scheduler()
+        self.scheduler = Scheduler(register_system_tasks=False)
         self.notif_manager = NotificationManager()

     @staticmethod
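Passing `register_system_tasks=False` keeps the scheduler's recurring platform jobs out of the test harness. The flag presumably gates registration in the constructor, along these lines (a sketch of the pattern only; the real `Scheduler` internals are not shown in this diff):

class SchedulerSketch:
    def __init__(self, register_system_tasks: bool = True):
        self.jobs: list[str] = []
        if register_system_tasks:
            # e.g. the late-execution monitor configured above
            self.jobs.append("report_late_executions")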


@@ -0,0 +1,7 @@
-- AlterTable
ALTER TABLE "AgentGraph"
ADD COLUMN "forkedFromId" TEXT,
ADD COLUMN "forkedFromVersion" INTEGER;
-- AddForeignKey
ALTER TABLE "AgentGraph" ADD CONSTRAINT "AgentGraph_forkedFromId_forkedFromVersion_fkey" FOREIGN KEY ("forkedFromId", "forkedFromVersion") REFERENCES "AgentGraph"("id", "version") ON DELETE SET NULL ON UPDATE CASCADE;
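The nullable, self-referencing foreign key records fork provenance without constraining the original graph: deleting the parent simply nulls the link (`ON DELETE SET NULL`). Assuming the Prisma Python client is regenerated with the new fields, fork lookups would read roughly like this (a sketch; the accessor and field names follow prisma-client-py conventions but are unverified):

from prisma import Prisma


async def list_forks(db: Prisma, graph_id: str):
    # All graphs recorded as forked from any version of `graph_id`.
    return await db.agentgraph.find_many(where={"forkedFromId": graph_id})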


@@ -0,0 +1,5 @@
-- CreateIndex
CREATE INDEX "AgentGraphExecution_createdAt_idx" ON "AgentGraphExecution"("createdAt");
-- CreateIndex
CREATE INDEX "AgentNodeExecution_addedTime_idx" ON "AgentNodeExecution"("addedTime");
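Both indexes back time-window scans over execution rows, such as the late-execution check whose threshold and check range were added to `Config` above. The query shape they serve looks roughly like this (a sketch only; the `executionStatus` column and value are assumptions, not taken from this diff):

LATE_EXECUTIONS_SQL = """
SELECT "id"
FROM "AgentGraphExecution"
WHERE "createdAt" BETWEEN $1 AND $2  -- the checkrange window
  AND "executionStatus" = 'QUEUED'   -- still queued past the threshold
"""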


@@ -0,0 +1,9 @@
-- Rename 'data' input to 'inputs' on all Agent Executor nodes
UPDATE "AgentNode" AS node
SET "constantInput" = jsonb_set(
"constantInput",
'{inputs}',
"constantInput"->'data'
) - 'data'
WHERE node."agentBlockId" = 'e189baac-8c20-45a1-94a7-55177ea42565'
AND node."constantInput" ? 'data';
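`jsonb_set` copies the old `data` value under the new `inputs` key, and the trailing `- 'data'` drops the old key, all in one statement. In Python terms the per-row transformation is simply (a sketch for intuition, not code from the migration):

def rename_data_to_inputs(constant_input: dict) -> dict:
    if "data" in constant_input:
        constant_input = {**constant_input, "inputs": constant_input["data"]}
        del constant_input["data"]
    return constant_input


assert rename_data_to_inputs({"data": {"x": 1}}) == {"inputs": {"x": 1}}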


@@ -178,20 +178,20 @@ files = [
[[package]]
name = "anthropic"
version = "0.49.0"
version = "0.51.0"
description = "The official Python library for the anthropic API"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "anthropic-0.49.0-py3-none-any.whl", hash = "sha256:bbc17ad4e7094988d2fa86b87753ded8dce12498f4b85fe5810f208f454a8375"},
{file = "anthropic-0.49.0.tar.gz", hash = "sha256:c09e885b0f674b9119b4f296d8508907f6cff0009bc20d5cf6b35936c40b4398"},
{file = "anthropic-0.51.0-py3-none-any.whl", hash = "sha256:b8b47d482c9aa1f81b923555cebb687c2730309a20d01be554730c8302e0f62a"},
{file = "anthropic-0.51.0.tar.gz", hash = "sha256:6f824451277992af079554430d5b2c8ff5bc059cc2c968cdc3f06824437da201"},
]
[package.dependencies]
anyio = ">=3.5.0,<5"
distro = ">=1.7.0,<2"
httpx = ">=0.23.0,<1"
httpx = ">=0.25.0,<1"
jiter = ">=0.4.0,<1"
pydantic = ">=1.9.0,<3"
sniffio = "*"
@@ -199,7 +199,7 @@ typing-extensions = ">=4.10,<5"
[package.extras]
bedrock = ["boto3 (>=1.28.57)", "botocore (>=1.31.57)"]
vertex = ["google-auth (>=2,<3)"]
vertex = ["google-auth[requests] (>=2,<3)"]
[[package]]
name = "anyio"
@@ -298,68 +298,18 @@ develop = true
[package.dependencies]
colorama = "^0.4.6"
expiringdict = "^1.2.2"
google-cloud-logging = "^3.11.4"
pydantic = "^2.11.1"
pydantic-settings = "^2.8.1"
google-cloud-logging = "^3.12.1"
pydantic = "^2.11.4"
pydantic-settings = "^2.9.1"
pyjwt = "^2.10.1"
pytest-asyncio = "^0.26.0"
pytest-mock = "^3.14.0"
supabase = "^2.15.0"
supabase = "^2.15.1"
[package.source]
type = "directory"
url = "../autogpt_libs"
[[package]]
name = "azure-common"
version = "1.1.28"
description = "Microsoft Azure Client Library for Python (Common)"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "azure-common-1.1.28.zip", hash = "sha256:4ac0cd3214e36b6a1b6a442686722a5d8cc449603aa833f3f0f40bda836704a3"},
{file = "azure_common-1.1.28-py2.py3-none-any.whl", hash = "sha256:5c12d3dcf4ec20599ca6b0d3e09e86e146353d443e7fcc050c9a19c1f9df20ad"},
]
[[package]]
name = "azure-core"
version = "1.32.0"
description = "Microsoft Azure Core Library for Python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "azure_core-1.32.0-py3-none-any.whl", hash = "sha256:eac191a0efb23bfa83fddf321b27b122b4ec847befa3091fa736a5c32c50d7b4"},
{file = "azure_core-1.32.0.tar.gz", hash = "sha256:22b3c35d6b2dae14990f6c1be2912bf23ffe50b220e708a28ab1bb92b1c730e5"},
]
[package.dependencies]
requests = ">=2.21.0"
six = ">=1.11.0"
typing-extensions = ">=4.6.0"
[package.extras]
aio = ["aiohttp (>=3.0)"]
[[package]]
name = "azure-search-documents"
version = "11.5.2"
description = "Microsoft Azure Cognitive Search Client Library for Python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "azure_search_documents-11.5.2-py3-none-any.whl", hash = "sha256:c949d011008a4b0bcee3db91132741b4e4d50ddb3f7e2f48944d949d4b413b11"},
{file = "azure_search_documents-11.5.2.tar.gz", hash = "sha256:98977dd1fa4978d3b7d8891a0856b3becb6f02cc07ff2e1ea40b9c7254ada315"},
]
[package.dependencies]
azure-common = ">=1.1"
azure-core = ">=1.28.0"
isodate = ">=0.6.0"
typing-extensions = ">=4.6.0"
[[package]]
name = "backoff"
version = "2.2.1"
@@ -727,14 +677,14 @@ rapidfuzz = ">=3.0.0,<4.0.0"
[[package]]
name = "click"
version = "8.1.8"
version = "8.2.0"
description = "Composable command line interface toolkit"
optional = false
python-versions = ">=3.7"
python-versions = ">=3.10"
groups = ["main", "dev"]
files = [
{file = "click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2"},
{file = "click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a"},
{file = "click-8.2.0-py3-none-any.whl", hash = "sha256:6b303f0b2aa85f1cb4e5303078fadcbcd4e476f114fab9b5007005711839325c"},
{file = "click-8.2.0.tar.gz", hash = "sha256:f5452aeddd9988eefa20f90f05ab66f17fce1ee2a36907fd30b05bbb5953814d"},
]
[package.dependencies]
@@ -1008,14 +958,14 @@ pgp = ["gpg"]
[[package]]
name = "e2b"
version = "1.0.5"
version = "1.4.0"
description = "E2B SDK that give agents cloud environments"
optional = false
python-versions = "<4.0,>=3.8"
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "e2b-1.0.5-py3-none-any.whl", hash = "sha256:a71bdec46f33d3e38e87d475d7fd2939bd7b6b753b819c9639ca211cd375b79e"},
{file = "e2b-1.0.5.tar.gz", hash = "sha256:43c82705af7b7d4415c2510ff77dab4dc075351e0b769d6adf8e0d7bb4868d13"},
{file = "e2b-1.4.0-py3-none-any.whl", hash = "sha256:a489015ece78ecabfdc281463ed495b6e4adf8c66278bb7312069d8ded21ab52"},
{file = "e2b-1.4.0.tar.gz", hash = "sha256:6a4596d4f91df32340bdbfac429591980a8dedd7ac509aae73f1bbc128175245"},
]
[package.dependencies]
@@ -1023,25 +973,25 @@ attrs = ">=23.2.0"
httpcore = ">=1.0.5,<2.0.0"
httpx = ">=0.27.0,<1.0.0"
packaging = ">=24.1"
protobuf = ">=3.20.0,<6.0.0"
protobuf = ">=5.29.4,<6.0.0"
python-dateutil = ">=2.8.2"
typing-extensions = ">=4.1.0"
[[package]]
name = "e2b-code-interpreter"
version = "1.1.1"
version = "1.5.0"
description = "E2B Code Interpreter - Stateful code execution"
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "e2b_code_interpreter-1.1.1-py3-none-any.whl", hash = "sha256:f56450b192456f24df89b9159d1067d50c7133d587ab12116144638969409578"},
{file = "e2b_code_interpreter-1.1.1.tar.gz", hash = "sha256:b13091f75fc127ad3a268b8746e5da996c6734f432e606fcd4f3897a5b1c2bf0"},
{file = "e2b_code_interpreter-1.5.0-py3-none-any.whl", hash = "sha256:299f5641a3754264a07f8edc3cccb744d6b009f10dc9285789a9352e24989a9b"},
{file = "e2b_code_interpreter-1.5.0.tar.gz", hash = "sha256:cd6028b6f20c4231e88a002de86484b9d4a99ea588b5be183b9ec7189a0f3cf6"},
]
[package.dependencies]
attrs = ">=21.3.0"
e2b = ">=1.0.4,<2.0.0"
e2b = ">=1.4.0,<2.0.0"
httpx = ">=0.20.0,<1.0.0"
[[package]]
@@ -1344,14 +1294,14 @@ grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0.dev0)"]
[[package]]
name = "google-api-python-client"
version = "2.166.0"
version = "2.169.0"
description = "Google API Client Library for Python"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "google_api_python_client-2.166.0-py2.py3-none-any.whl", hash = "sha256:dd8cc74d9fc18538ab05cbd2e93cb4f82382f910c5f6945db06c91f1deae6e45"},
{file = "google_api_python_client-2.166.0.tar.gz", hash = "sha256:b8cf843bd9d736c134aef76cf1dc7a47c9283a2ef24267b97207b9dd43b30ef7"},
{file = "google_api_python_client-2.169.0-py3-none-any.whl", hash = "sha256:dae3e882dc0e6f28e60cf09c1f13fedfd881db84f824dd418aa9e44def2fe00d"},
{file = "google_api_python_client-2.169.0.tar.gz", hash = "sha256:0585bb97bd5f5bf3ed8d4bf624593e4c5a14d06c811d1952b07a1f94b4d12c51"},
]
[package.dependencies]
@@ -1404,14 +1354,14 @@ httplib2 = ">=0.19.0"
[[package]]
name = "google-auth-oauthlib"
version = "1.2.1"
version = "1.2.2"
description = "Google Authentication Library"
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "google_auth_oauthlib-1.2.1-py2.py3-none-any.whl", hash = "sha256:2d58a27262d55aa1b87678c3ba7142a080098cbc2024f903c62355deb235d91f"},
{file = "google_auth_oauthlib-1.2.1.tar.gz", hash = "sha256:afd0cad092a2eaa53cd8e8298557d6de1034c6cb4a740500b5357b648af97263"},
{file = "google_auth_oauthlib-1.2.2-py3-none-any.whl", hash = "sha256:fd619506f4b3908b5df17b65f39ca8d66ea56986e5472eb5978fd8f3786f00a2"},
{file = "google_auth_oauthlib-1.2.2.tar.gz", hash = "sha256:11046fb8d3348b296302dd939ace8af0a724042e8029c1b872d87fabc9f41684"},
]
[package.dependencies]
@@ -1441,19 +1391,19 @@ protobuf = ">=3.20.2,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4
[[package]]
name = "google-cloud-audit-log"
version = "0.3.0"
version = "0.3.2"
description = "Google Cloud Audit Protos"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "google_cloud_audit_log-0.3.0-py2.py3-none-any.whl", hash = "sha256:8340793120a1d5aa143605def8704ecdcead15106f754ef1381ae3bab533722f"},
{file = "google_cloud_audit_log-0.3.0.tar.gz", hash = "sha256:901428b257020d8c1d1133e0fa004164a555e5a395c7ca3cdbb8486513df3a65"},
{file = "google_cloud_audit_log-0.3.2-py3-none-any.whl", hash = "sha256:daaedfb947a0d77f524e1bd2b560242ab4836fe1afd6b06b92f152b9658554ed"},
{file = "google_cloud_audit_log-0.3.2.tar.gz", hash = "sha256:2598f1533a7d7cdd6c7bf448c12e5519c1d53162d78784e10bcdd1df67791bc3"},
]
[package.dependencies]
googleapis-common-protos = ">=1.56.2,<2.0dev"
protobuf = ">=3.20.2,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<6.0.0dev"
googleapis-common-protos = ">=1.56.2,<2.0.0"
protobuf = ">=3.20.2,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<7.0.0"
[[package]]
name = "google-cloud-core"
@@ -1476,29 +1426,29 @@ grpc = ["grpcio (>=1.38.0,<2.0dev)", "grpcio-status (>=1.38.0,<2.0.dev0)"]
[[package]]
name = "google-cloud-logging"
version = "3.11.4"
version = "3.12.1"
description = "Stackdriver Logging API client library"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "google_cloud_logging-3.11.4-py2.py3-none-any.whl", hash = "sha256:1d465ac62df29fb94bba4d6b4891035e57d573d84541dd8a40eebbc74422b2f0"},
{file = "google_cloud_logging-3.11.4.tar.gz", hash = "sha256:32305d989323f3c58603044e2ac5d9cf23e9465ede511bbe90b4309270d3195c"},
{file = "google_cloud_logging-3.12.1-py2.py3-none-any.whl", hash = "sha256:6817878af76ec4e7568976772839ab2c43ddfd18fbbf2ce32b13ef549cd5a862"},
{file = "google_cloud_logging-3.12.1.tar.gz", hash = "sha256:36efc823985055b203904e83e1c8f9f999b3c64270bcda39d57386ca4effd678"},
]
[package.dependencies]
google-api-core = {version = ">=1.34.1,<2.0.dev0 || >=2.11.dev0,<3.0.0dev", extras = ["grpc"]}
google-auth = ">=2.14.1,<2.24.0 || >2.24.0,<2.25.0 || >2.25.0,<3.0.0dev"
google-cloud-appengine-logging = ">=0.1.3,<2.0.0dev"
google-cloud-audit-log = ">=0.2.4,<1.0.0dev"
google-cloud-core = ">=2.0.0,<3.0.0dev"
grpc-google-iam-v1 = ">=0.12.4,<1.0.0dev"
google-api-core = {version = ">=1.34.1,<2.0.dev0 || >=2.11.dev0,<3.0.0", extras = ["grpc"]}
google-auth = ">=2.14.1,<2.24.0 || >2.24.0,<2.25.0 || >2.25.0,<3.0.0"
google-cloud-appengine-logging = ">=0.1.3,<2.0.0"
google-cloud-audit-log = ">=0.3.1,<1.0.0"
google-cloud-core = ">=2.0.0,<3.0.0"
grpc-google-iam-v1 = ">=0.12.4,<1.0.0"
opentelemetry-api = ">=1.9.0"
proto-plus = [
{version = ">=1.22.2,<2.0.0dev", markers = "python_version >= \"3.11\""},
{version = ">=1.22.0,<2.0.0dev", markers = "python_version < \"3.11\""},
{version = ">=1.22.2,<2.0.0", markers = "python_version >= \"3.11\""},
{version = ">=1.22.0,<2.0.0", markers = "python_version < \"3.11\""},
]
protobuf = ">=3.20.2,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<6.0.0dev"
protobuf = ">=3.20.2,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<7.0.0"
[[package]]
name = "google-cloud-storage"
@@ -1739,14 +1689,14 @@ test = ["objgraph", "psutil"]
[[package]]
name = "groq"
version = "0.20.0"
version = "0.24.0"
description = "The official Python library for the groq API"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "groq-0.20.0-py3-none-any.whl", hash = "sha256:c27b89903eb2b77f94ed95837ff3cadfc8c9e670953b1c5e5e2e855fea54b6c5"},
{file = "groq-0.20.0.tar.gz", hash = "sha256:2a201d41cae768c53d411dabcfea2333e2e138df22d909ed555ece426f1e016f"},
{file = "groq-0.24.0-py3-none-any.whl", hash = "sha256:0020e6b0b2b267263c9eb7c318deef13c12f399c6525734200b11d777b00088e"},
{file = "groq-0.24.0.tar.gz", hash = "sha256:e821559de8a77fb81d2585b3faec80ff923d6d64fd52339b33f6c94997d6f7f5"},
]
[package.dependencies]
@@ -2214,18 +2164,6 @@ files = [
{file = "installer-0.7.0.tar.gz", hash = "sha256:a26d3e3116289bb08216e0d0f7d925fcef0b0194eedfa0c944bcaaa106c4b631"},
]
[[package]]
name = "isodate"
version = "0.7.2"
description = "An ISO 8601 date/time/duration parser and formatter"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "isodate-0.7.2-py3-none-any.whl", hash = "sha256:28009937d8031054830160fce6d409ed342816b543597cece116d966c6d99e15"},
{file = "isodate-0.7.2.tar.gz", hash = "sha256:4cd1aa0f43ca76f4a6c6c0292a85f40b35ec2e43e315b59f06e6d32171a953e6"},
]
[[package]]
name = "isort"
version = "5.13.2"
@@ -2519,14 +2457,14 @@ urllib3 = ">=1.26.0,<3"
[[package]]
name = "launchdarkly-server-sdk"
version = "9.10.0"
version = "9.11.0"
description = "LaunchDarkly SDK for Python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "launchdarkly_server_sdk-9.10.0-py3-none-any.whl", hash = "sha256:691f3e20453149ab141ed3bce1f2422f3d84215920e365e778508c15cb4bc243"},
{file = "launchdarkly_server_sdk-9.10.0.tar.gz", hash = "sha256:4228d86b1b8a1986cedb5a49ae9dfa7d55c53cfa1e9796c4e2aad725337896f0"},
{file = "launchdarkly_server_sdk-9.11.0-py3-none-any.whl", hash = "sha256:2409d298b3614165d755a325d102f8bf5948da5876268eb161443864b18a6ae5"},
{file = "launchdarkly_server_sdk-9.11.0.tar.gz", hash = "sha256:5106f0c574c529108217fe00b862329c381f5f415dde0ef62a7106f42ec64e22"},
]
[package.dependencies]
@@ -2628,21 +2566,19 @@ files = [
[[package]]
name = "mem0ai"
version = "0.1.80"
version = "0.1.98"
description = "Long-term memory for AI Agents"
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "mem0ai-0.1.80-py3-none-any.whl", hash = "sha256:3974d2bc1e64b4cb3389b756f13c4ed35aa5c950a9f62b33e719ec7311576d65"},
{file = "mem0ai-0.1.80.tar.gz", hash = "sha256:15593f3b22741558cd177d61e052db6044184149455f29fe7ceb50019515df8c"},
{file = "mem0ai-0.1.98-py3-none-any.whl", hash = "sha256:c53b8113c2430ed46327f0ce38febd096cd76ff778ecd43764df9adba5c4f6e7"},
{file = "mem0ai-0.1.98.tar.gz", hash = "sha256:a46b77bdfa5997844f164715393be3707489d7d11c0a97d513c03cabe43f9013"},
]
[package.dependencies]
azure-search-documents = ">=11.5.0,<12.0.0"
openai = ">=1.33.0,<2.0.0"
posthog = ">=3.5.0,<4.0.0"
psycopg2-binary = ">=2.9.10,<3.0.0"
pydantic = ">=2.7.3,<3.0.0"
pytz = ">=2024.1,<2025.0"
qdrant-client = ">=1.9.1,<2.0.0"
@@ -2988,14 +2924,14 @@ signedtoken = ["cryptography (>=3.0.0)", "pyjwt (>=2.0.0,<3)"]
[[package]]
name = "ollama"
version = "0.4.7"
version = "0.4.8"
description = "The official Python client for Ollama."
optional = false
python-versions = "<4.0,>=3.8"
groups = ["main"]
files = [
{file = "ollama-0.4.7-py3-none-any.whl", hash = "sha256:85505663cca67a83707be5fb3aeff0ea72e67846cea5985529d8eca4366564a1"},
{file = "ollama-0.4.7.tar.gz", hash = "sha256:891dcbe54f55397d82d289c459de0ea897e103b86a3f1fad0fdb1895922a75ff"},
{file = "ollama-0.4.8-py3-none-any.whl", hash = "sha256:04312af2c5e72449aaebac4a2776f52ef010877c554103419d3f36066fe8af4c"},
{file = "ollama-0.4.8.tar.gz", hash = "sha256:1121439d49b96fa8339842965d0616eba5deb9f8c790786cdf4c0b3df4833802"},
]
[package.dependencies]
@@ -3004,14 +2940,14 @@ pydantic = ">=2.9.0,<3.0.0"
[[package]]
name = "openai"
version = "1.70.0"
version = "1.78.1"
description = "The official Python library for the openai API"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "openai-1.70.0-py3-none-any.whl", hash = "sha256:f6438d053fd8b2e05fd6bef70871e832d9bbdf55e119d0ac5b92726f1ae6f614"},
{file = "openai-1.70.0.tar.gz", hash = "sha256:e52a8d54c3efeb08cf58539b5b21a5abef25368b5432965e4de88cdf4e091b2b"},
{file = "openai-1.78.1-py3-none-any.whl", hash = "sha256:7368bf147ca499804cc408fe68cdb6866a060f38dec961bbc97b04f9d917907e"},
{file = "openai-1.78.1.tar.gz", hash = "sha256:8b26b364531b100df1b961d03560042e5f5be11301d7d49a6cd1a2b9af824dca"},
]
[package.dependencies]
@@ -3026,7 +2962,7 @@ typing-extensions = ">=4.11,<5"
[package.extras]
datalib = ["numpy (>=1)", "pandas (>=1.2.3)", "pandas-stubs (>=1.1.0.11)"]
realtime = ["websockets (>=13,<15)"]
realtime = ["websockets (>=13,<16)"]
voice-helpers = ["numpy (>=2.0.2)", "sounddevice (>=0.5.1)"]
[[package]]
@@ -3337,14 +3273,14 @@ testing = ["pytest", "pytest-benchmark"]
[[package]]
name = "poethepoet"
version = "0.33.1"
version = "0.34.0"
description = "A task runner that works well with poetry."
optional = false
python-versions = ">=3.9"
groups = ["dev"]
files = [
{file = "poethepoet-0.33.1-py3-none-any.whl", hash = "sha256:b86d80a81b2ca4e4ce8e8f716cc6004a1a1cdead027778bc07d1c26cb3664770"},
{file = "poethepoet-0.33.1.tar.gz", hash = "sha256:8775e09b64f773278b5483659ff238a708723491efadeedd1c2cbf773558cb4c"},
{file = "poethepoet-0.34.0-py3-none-any.whl", hash = "sha256:c472d6f0fdb341b48d346f4ccd49779840c15b30dfd6bc6347a80d6274b5e34e"},
{file = "poethepoet-0.34.0.tar.gz", hash = "sha256:86203acce555bbfe45cb6ccac61ba8b16a5784264484195874da457ddabf5850"},
]
[package.dependencies]
@@ -3357,14 +3293,14 @@ poetry-plugin = ["poetry (>=1.2.0,<3.0.0) ; python_version < \"4.0\""]
[[package]]
name = "poetry"
version = "2.1.2"
version = "2.1.3"
description = "Python dependency management and packaging made easy."
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "poetry-2.1.2-py3-none-any.whl", hash = "sha256:df7dfe7e5f9cd50ed3b8d1a013afcc379645f66d7e9aa43728689e34fb016216"},
{file = "poetry-2.1.2.tar.gz", hash = "sha256:6a0694645ee24ba93cb94254db66e47971344562ddd5578e82bf35e572bc546d"},
{file = "poetry-2.1.3-py3-none-any.whl", hash = "sha256:7054d3f97ccce7f31961ead16250407c4577bfe57e2037a190ae2913fc40a20c"},
{file = "poetry-2.1.3.tar.gz", hash = "sha256:f2c9bd6790b19475976d88ea4553bcc3533c0dc73f740edc4fffe9e2add50594"},
]
[package.dependencies]
@@ -3380,7 +3316,7 @@ packaging = ">=24.0"
pbs-installer = {version = ">=2025.1.6,<2026.0.0", extras = ["download", "install"]}
pkginfo = ">=1.12,<2.0"
platformdirs = ">=3.0.0,<5"
poetry-core = "2.1.2"
poetry-core = "2.1.3"
pyproject-hooks = ">=1.0.0,<2.0.0"
requests = ">=2.26,<3.0"
requests-toolbelt = ">=1.0.0,<2.0.0"
@@ -3393,14 +3329,14 @@ xattr = {version = ">=1.0.0,<2.0.0", markers = "sys_platform == \"darwin\""}
[[package]]
name = "poetry-core"
version = "2.1.2"
version = "2.1.3"
description = "Poetry PEP 517 Build Backend"
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "poetry_core-2.1.2-py3-none-any.whl", hash = "sha256:ecb1e8f7d4f071a21cd0feb8c19bd1aec80de6fb0e82aa9d809a591e544431b4"},
{file = "poetry_core-2.1.2.tar.gz", hash = "sha256:f9dbbbd0ebf9755476a1d57f04b30e9aecf71ca9dc2fcd4b17aba92c0002aa04"},
{file = "poetry_core-2.1.3-py3-none-any.whl", hash = "sha256:2c704f05016698a54ca1d327f46ce2426d72eaca6ff614132c8477c292266771"},
{file = "poetry_core-2.1.3.tar.gz", hash = "sha256:0522a015477ed622c89aad56a477a57813cace0c8e7ff2a2906b7ef4a2e296a4"},
]
[[package]]
@@ -3568,6 +3504,21 @@ files = [
[package.dependencies]
tqdm = "*"
[[package]]
name = "prometheus-client"
version = "0.21.1"
description = "Python client for the Prometheus monitoring system."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "prometheus_client-0.21.1-py3-none-any.whl", hash = "sha256:594b45c410d6f4f8888940fe80b5cc2521b305a1fafe1c58609ef715a001f301"},
{file = "prometheus_client-0.21.1.tar.gz", hash = "sha256:252505a722ac04b0456be05c05f75f45d760c2911ffc45f2a06bcaed9f3ae3fb"},
]
[package.extras]
twisted = ["twisted"]
[[package]]
name = "propcache"
version = "0.2.1"
@@ -3680,23 +3631,23 @@ testing = ["google-api-core (>=1.31.5)"]
[[package]]
name = "protobuf"
version = "5.29.3"
version = "5.29.4"
description = ""
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "protobuf-5.29.3-cp310-abi3-win32.whl", hash = "sha256:3ea51771449e1035f26069c4c7fd51fba990d07bc55ba80701c78f886bf9c888"},
{file = "protobuf-5.29.3-cp310-abi3-win_amd64.whl", hash = "sha256:a4fa6f80816a9a0678429e84973f2f98cbc218cca434abe8db2ad0bffc98503a"},
{file = "protobuf-5.29.3-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:a8434404bbf139aa9e1300dbf989667a83d42ddda9153d8ab76e0d5dcaca484e"},
{file = "protobuf-5.29.3-cp38-abi3-manylinux2014_aarch64.whl", hash = "sha256:daaf63f70f25e8689c072cfad4334ca0ac1d1e05a92fc15c54eb9cf23c3efd84"},
{file = "protobuf-5.29.3-cp38-abi3-manylinux2014_x86_64.whl", hash = "sha256:c027e08a08be10b67c06bf2370b99c811c466398c357e615ca88c91c07f0910f"},
{file = "protobuf-5.29.3-cp38-cp38-win32.whl", hash = "sha256:84a57163a0ccef3f96e4b6a20516cedcf5bb3a95a657131c5c3ac62200d23252"},
{file = "protobuf-5.29.3-cp38-cp38-win_amd64.whl", hash = "sha256:b89c115d877892a512f79a8114564fb435943b59067615894c3b13cd3e1fa107"},
{file = "protobuf-5.29.3-cp39-cp39-win32.whl", hash = "sha256:0eb32bfa5219fc8d4111803e9a690658aa2e6366384fd0851064b963b6d1f2a7"},
{file = "protobuf-5.29.3-cp39-cp39-win_amd64.whl", hash = "sha256:6ce8cc3389a20693bfde6c6562e03474c40851b44975c9b2bf6df7d8c4f864da"},
{file = "protobuf-5.29.3-py3-none-any.whl", hash = "sha256:0a18ed4a24198528f2333802eb075e59dea9d679ab7a6c5efb017a59004d849f"},
{file = "protobuf-5.29.3.tar.gz", hash = "sha256:5da0f41edaf117bde316404bad1a486cb4ededf8e4a54891296f648e8e076620"},
{file = "protobuf-5.29.4-cp310-abi3-win32.whl", hash = "sha256:13eb236f8eb9ec34e63fc8b1d6efd2777d062fa6aaa68268fb67cf77f6839ad7"},
{file = "protobuf-5.29.4-cp310-abi3-win_amd64.whl", hash = "sha256:bcefcdf3976233f8a502d265eb65ea740c989bacc6c30a58290ed0e519eb4b8d"},
{file = "protobuf-5.29.4-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:307ecba1d852ec237e9ba668e087326a67564ef83e45a0189a772ede9e854dd0"},
{file = "protobuf-5.29.4-cp38-abi3-manylinux2014_aarch64.whl", hash = "sha256:aec4962f9ea93c431d5714ed1be1c93f13e1a8618e70035ba2b0564d9e633f2e"},
{file = "protobuf-5.29.4-cp38-abi3-manylinux2014_x86_64.whl", hash = "sha256:d7d3f7d1d5a66ed4942d4fefb12ac4b14a29028b209d4bfb25c68ae172059922"},
{file = "protobuf-5.29.4-cp38-cp38-win32.whl", hash = "sha256:1832f0515b62d12d8e6ffc078d7e9eb06969aa6dc13c13e1036e39d73bebc2de"},
{file = "protobuf-5.29.4-cp38-cp38-win_amd64.whl", hash = "sha256:476cb7b14914c780605a8cf62e38c2a85f8caff2e28a6a0bad827ec7d6c85d68"},
{file = "protobuf-5.29.4-cp39-cp39-win32.whl", hash = "sha256:fd32223020cb25a2cc100366f1dedc904e2d71d9322403224cdde5fdced0dabe"},
{file = "protobuf-5.29.4-cp39-cp39-win_amd64.whl", hash = "sha256:678974e1e3a9b975b8bc2447fca458db5f93a2fb6b0c8db46b6675b5b5346812"},
{file = "protobuf-5.29.4-py3-none-any.whl", hash = "sha256:3fde11b505e1597f71b875ef2fc52062b6a9740e5f7c8997ce878b6009145862"},
{file = "protobuf-5.29.4.tar.gz", hash = "sha256:4f1dfcd7997b31ef8f53ec82781ff434a28bf71d9102ddde14d076adcfc78c99"},
]
[[package]]
@@ -3854,20 +3805,20 @@ files = [
[[package]]
name = "pydantic"
version = "2.11.1"
version = "2.11.4"
description = "Data validation using Python type hints"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pydantic-2.11.1-py3-none-any.whl", hash = "sha256:5b6c415eee9f8123a14d859be0c84363fec6b1feb6b688d6435801230b56e0b8"},
{file = "pydantic-2.11.1.tar.gz", hash = "sha256:442557d2910e75c991c39f4b4ab18963d57b9b55122c8b2a9cd176d8c29ce968"},
{file = "pydantic-2.11.4-py3-none-any.whl", hash = "sha256:d9615eaa9ac5a063471da949c8fc16376a84afb5024688b3ff885693506764eb"},
{file = "pydantic-2.11.4.tar.gz", hash = "sha256:32738d19d63a226a52eed76645a98ee07c1f410ee41d93b4afbfa85ed8111c2d"},
]
[package.dependencies]
annotated-types = ">=0.6.0"
email-validator = {version = ">=2.0.0", optional = true, markers = "extra == \"email\""}
pydantic-core = "2.33.0"
pydantic-core = "2.33.2"
typing-extensions = ">=4.12.2"
typing-inspection = ">=0.4.0"
@@ -3877,111 +3828,111 @@ timezone = ["tzdata ; python_version >= \"3.9\" and platform_system == \"Windows
[[package]]
name = "pydantic-core"
version = "2.33.0"
version = "2.33.2"
description = "Core functionality for Pydantic validation and serialization"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pydantic_core-2.33.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:71dffba8fe9ddff628c68f3abd845e91b028361d43c5f8e7b3f8b91d7d85413e"},
{file = "pydantic_core-2.33.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:abaeec1be6ed535a5d7ffc2e6c390083c425832b20efd621562fbb5bff6dc518"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:759871f00e26ad3709efc773ac37b4d571de065f9dfb1778012908bcc36b3a73"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dcfebee69cd5e1c0b76a17e17e347c84b00acebb8dd8edb22d4a03e88e82a207"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1b1262b912435a501fa04cd213720609e2cefa723a07c92017d18693e69bf00b"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4726f1f3f42d6a25678c67da3f0b10f148f5655813c5aca54b0d1742ba821b8f"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e790954b5093dff1e3a9a2523fddc4e79722d6f07993b4cd5547825c3cbf97b5"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:34e7fb3abe375b5c4e64fab75733d605dda0f59827752debc99c17cb2d5f3276"},
{file = "pydantic_core-2.33.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:ecb158fb9b9091b515213bed3061eb7deb1d3b4e02327c27a0ea714ff46b0760"},
{file = "pydantic_core-2.33.0-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:4d9149e7528af8bbd76cc055967e6e04617dcb2a2afdaa3dea899406c5521faa"},
{file = "pydantic_core-2.33.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:e81a295adccf73477220e15ff79235ca9dcbcee4be459eb9d4ce9a2763b8386c"},
{file = "pydantic_core-2.33.0-cp310-cp310-win32.whl", hash = "sha256:f22dab23cdbce2005f26a8f0c71698457861f97fc6318c75814a50c75e87d025"},
{file = "pydantic_core-2.33.0-cp310-cp310-win_amd64.whl", hash = "sha256:9cb2390355ba084c1ad49485d18449b4242da344dea3e0fe10babd1f0db7dcfc"},
{file = "pydantic_core-2.33.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:a608a75846804271cf9c83e40bbb4dab2ac614d33c6fd5b0c6187f53f5c593ef"},
{file = "pydantic_core-2.33.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e1c69aa459f5609dec2fa0652d495353accf3eda5bdb18782bc5a2ae45c9273a"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b9ec80eb5a5f45a2211793f1c4aeddff0c3761d1c70d684965c1807e923a588b"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e925819a98318d17251776bd3d6aa9f3ff77b965762155bdad15d1a9265c4cfd"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5bf68bb859799e9cec3d9dd8323c40c00a254aabb56fe08f907e437005932f2b"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1b2ea72dea0825949a045fa4071f6d5b3d7620d2a208335207793cf29c5a182d"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1583539533160186ac546b49f5cde9ffc928062c96920f58bd95de32ffd7bffd"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:23c3e77bf8a7317612e5c26a3b084c7edeb9552d645742a54a5867635b4f2453"},
{file = "pydantic_core-2.33.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a7a7f2a3f628d2f7ef11cb6188bcf0b9e1558151d511b974dfea10a49afe192b"},
{file = "pydantic_core-2.33.0-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:f1fb026c575e16f673c61c7b86144517705865173f3d0907040ac30c4f9f5915"},
{file = "pydantic_core-2.33.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:635702b2fed997e0ac256b2cfbdb4dd0bf7c56b5d8fba8ef03489c03b3eb40e2"},
{file = "pydantic_core-2.33.0-cp311-cp311-win32.whl", hash = "sha256:07b4ced28fccae3f00626eaa0c4001aa9ec140a29501770a88dbbb0966019a86"},
{file = "pydantic_core-2.33.0-cp311-cp311-win_amd64.whl", hash = "sha256:4927564be53239a87770a5f86bdc272b8d1fbb87ab7783ad70255b4ab01aa25b"},
{file = "pydantic_core-2.33.0-cp311-cp311-win_arm64.whl", hash = "sha256:69297418ad644d521ea3e1aa2e14a2a422726167e9ad22b89e8f1130d68e1e9a"},
{file = "pydantic_core-2.33.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:6c32a40712e3662bebe524abe8abb757f2fa2000028d64cc5a1006016c06af43"},
{file = "pydantic_core-2.33.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8ec86b5baa36f0a0bfb37db86c7d52652f8e8aa076ab745ef7725784183c3fdd"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4deac83a8cc1d09e40683be0bc6d1fa4cde8df0a9bf0cda5693f9b0569ac01b6"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:175ab598fb457a9aee63206a1993874badf3ed9a456e0654273e56f00747bbd6"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5f36afd0d56a6c42cf4e8465b6441cf546ed69d3a4ec92724cc9c8c61bd6ecf4"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0a98257451164666afafc7cbf5fb00d613e33f7e7ebb322fbcd99345695a9a61"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ecc6d02d69b54a2eb83ebcc6f29df04957f734bcf309d346b4f83354d8376862"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:1a69b7596c6603afd049ce7f3835bcf57dd3892fc7279f0ddf987bebed8caa5a"},
{file = "pydantic_core-2.33.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ea30239c148b6ef41364c6f51d103c2988965b643d62e10b233b5efdca8c0099"},
{file = "pydantic_core-2.33.0-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:abfa44cf2f7f7d7a199be6c6ec141c9024063205545aa09304349781b9a125e6"},
{file = "pydantic_core-2.33.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:20d4275f3c4659d92048c70797e5fdc396c6e4446caf517ba5cad2db60cd39d3"},
{file = "pydantic_core-2.33.0-cp312-cp312-win32.whl", hash = "sha256:918f2013d7eadea1d88d1a35fd4a1e16aaf90343eb446f91cb091ce7f9b431a2"},
{file = "pydantic_core-2.33.0-cp312-cp312-win_amd64.whl", hash = "sha256:aec79acc183865bad120b0190afac467c20b15289050648b876b07777e67ea48"},
{file = "pydantic_core-2.33.0-cp312-cp312-win_arm64.whl", hash = "sha256:5461934e895968655225dfa8b3be79e7e927e95d4bd6c2d40edd2fa7052e71b6"},
{file = "pydantic_core-2.33.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f00e8b59e1fc8f09d05594aa7d2b726f1b277ca6155fc84c0396db1b373c4555"},
{file = "pydantic_core-2.33.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1a73be93ecef45786d7d95b0c5e9b294faf35629d03d5b145b09b81258c7cd6d"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ff48a55be9da6930254565ff5238d71d5e9cd8c5487a191cb85df3bdb8c77365"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:26a4ea04195638dcd8c53dadb545d70badba51735b1594810e9768c2c0b4a5da"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:41d698dcbe12b60661f0632b543dbb119e6ba088103b364ff65e951610cb7ce0"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ae62032ef513fe6281ef0009e30838a01057b832dc265da32c10469622613885"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f225f3a3995dbbc26affc191d0443c6c4aa71b83358fd4c2b7d63e2f6f0336f9"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5bdd36b362f419c78d09630cbaebc64913f66f62bda6d42d5fbb08da8cc4f181"},
{file = "pydantic_core-2.33.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:2a0147c0bef783fd9abc9f016d66edb6cac466dc54a17ec5f5ada08ff65caf5d"},
{file = "pydantic_core-2.33.0-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:c860773a0f205926172c6644c394e02c25421dc9a456deff16f64c0e299487d3"},
{file = "pydantic_core-2.33.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:138d31e3f90087f42aa6286fb640f3c7a8eb7bdae829418265e7e7474bd2574b"},
{file = "pydantic_core-2.33.0-cp313-cp313-win32.whl", hash = "sha256:d20cbb9d3e95114325780f3cfe990f3ecae24de7a2d75f978783878cce2ad585"},
{file = "pydantic_core-2.33.0-cp313-cp313-win_amd64.whl", hash = "sha256:ca1103d70306489e3d006b0f79db8ca5dd3c977f6f13b2c59ff745249431a606"},
{file = "pydantic_core-2.33.0-cp313-cp313-win_arm64.whl", hash = "sha256:6291797cad239285275558e0a27872da735b05c75d5237bbade8736f80e4c225"},
{file = "pydantic_core-2.33.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:7b79af799630af263eca9ec87db519426d8c9b3be35016eddad1832bac812d87"},
{file = "pydantic_core-2.33.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eabf946a4739b5237f4f56d77fa6668263bc466d06a8036c055587c130a46f7b"},
{file = "pydantic_core-2.33.0-cp313-cp313t-win_amd64.whl", hash = "sha256:8a1d581e8cdbb857b0e0e81df98603376c1a5c34dc5e54039dcc00f043df81e7"},
{file = "pydantic_core-2.33.0-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:7c9c84749f5787781c1c45bb99f433402e484e515b40675a5d121ea14711cf61"},
{file = "pydantic_core-2.33.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:64672fa888595a959cfeff957a654e947e65bbe1d7d82f550417cbd6898a1d6b"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:26bc7367c0961dec292244ef2549afa396e72e28cc24706210bd44d947582c59"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ce72d46eb201ca43994303025bd54d8a35a3fc2a3495fac653d6eb7205ce04f4"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:14229c1504287533dbf6b1fc56f752ce2b4e9694022ae7509631ce346158de11"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:085d8985b1c1e48ef271e98a658f562f29d89bda98bf120502283efbc87313eb"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:31860fbda80d8f6828e84b4a4d129fd9c4535996b8249cfb8c720dc2a1a00bb8"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f200b2f20856b5a6c3a35f0d4e344019f805e363416e609e9b47c552d35fd5ea"},
{file = "pydantic_core-2.33.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:5f72914cfd1d0176e58ddc05c7a47674ef4222c8253bf70322923e73e14a4ac3"},
{file = "pydantic_core-2.33.0-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:91301a0980a1d4530d4ba7e6a739ca1a6b31341252cb709948e0aca0860ce0ae"},
{file = "pydantic_core-2.33.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:7419241e17c7fbe5074ba79143d5523270e04f86f1b3a0dff8df490f84c8273a"},
{file = "pydantic_core-2.33.0-cp39-cp39-win32.whl", hash = "sha256:7a25493320203005d2a4dac76d1b7d953cb49bce6d459d9ae38e30dd9f29bc9c"},
{file = "pydantic_core-2.33.0-cp39-cp39-win_amd64.whl", hash = "sha256:82a4eba92b7ca8af1b7d5ef5f3d9647eee94d1f74d21ca7c21e3a2b92e008358"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:e2762c568596332fdab56b07060c8ab8362c56cf2a339ee54e491cd503612c50"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:5bf637300ff35d4f59c006fff201c510b2b5e745b07125458a5389af3c0dff8c"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:62c151ce3d59ed56ebd7ce9ce5986a409a85db697d25fc232f8e81f195aa39a1"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9ee65f0cc652261744fd07f2c6e6901c914aa6c5ff4dcfaf1136bc394d0dd26b"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:024d136ae44d233e6322027bbf356712b3940bee816e6c948ce4b90f18471b3d"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:e37f10f6d4bc67c58fbd727108ae1d8b92b397355e68519f1e4a7babb1473442"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:502ed542e0d958bd12e7c3e9a015bce57deaf50eaa8c2e1c439b512cb9db1e3a"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:715c62af74c236bf386825c0fdfa08d092ab0f191eb5b4580d11c3189af9d330"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:bccc06fa0372151f37f6b69834181aa9eb57cf8665ed36405fb45fbf6cac3bae"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5d8dc9f63a26f7259b57f46a7aab5af86b2ad6fbe48487500bb1f4b27e051e4c"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:30369e54d6d0113d2aa5aee7a90d17f225c13d87902ace8fcd7bbf99b19124db"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f3eb479354c62067afa62f53bb387827bee2f75c9c79ef25eef6ab84d4b1ae3b"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0310524c833d91403c960b8a3cf9f46c282eadd6afd276c8c5edc617bd705dc9"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:eddb18a00bbb855325db27b4c2a89a4ba491cd6a0bd6d852b225172a1f54b36c"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:ade5dbcf8d9ef8f4b28e682d0b29f3008df9842bb5ac48ac2c17bc55771cc976"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:2c0afd34f928383e3fd25740f2050dbac9d077e7ba5adbaa2227f4d4f3c8da5c"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:7da333f21cd9df51d5731513a6d39319892947604924ddf2e24a4612975fb936"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:4b6d77c75a57f041c5ee915ff0b0bb58eabb78728b69ed967bc5b780e8f701b8"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:ba95691cf25f63df53c1d342413b41bd7762d9acb425df8858d7efa616c0870e"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:4f1ab031feb8676f6bd7c85abec86e2935850bf19b84432c64e3e239bffeb1ec"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:58c1151827eef98b83d49b6ca6065575876a02d2211f259fb1a6b7757bd24dd8"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a66d931ea2c1464b738ace44b7334ab32a2fd50be023d863935eb00f42be1778"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0bcf0bab28995d483f6c8d7db25e0d05c3efa5cebfd7f56474359e7137f39856"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:89670d7a0045acb52be0566df5bc8b114ac967c662c06cf5e0c606e4aadc964b"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:b716294e721d8060908dbebe32639b01bfe61b15f9f57bcc18ca9a0e00d9520b"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fc53e05c16697ff0c1c7c2b98e45e131d4bfb78068fffff92a82d169cbb4c7b7"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:68504959253303d3ae9406b634997a2123a0b0c1da86459abbd0ffc921695eac"},
{file = "pydantic_core-2.33.0.tar.gz", hash = "sha256:40eb8af662ba409c3cbf4a8150ad32ae73514cd7cb1f1a2113af39763dd616b3"},
{file = "pydantic_core-2.33.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2b3d326aaef0c0399d9afffeb6367d5e26ddc24d351dbc9c636840ac355dc5d8"},
{file = "pydantic_core-2.33.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e5b2671f05ba48b94cb90ce55d8bdcaaedb8ba00cc5359f6810fc918713983d"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0069c9acc3f3981b9ff4cdfaf088e98d83440a4c7ea1bc07460af3d4dc22e72d"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d53b22f2032c42eaaf025f7c40c2e3b94568ae077a606f006d206a463bc69572"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0405262705a123b7ce9f0b92f123334d67b70fd1f20a9372b907ce1080c7ba02"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4b25d91e288e2c4e0662b8038a28c6a07eaac3e196cfc4ff69de4ea3db992a1b"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6bdfe4b3789761f3bcb4b1ddf33355a71079858958e3a552f16d5af19768fef2"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:efec8db3266b76ef9607c2c4c419bdb06bf335ae433b80816089ea7585816f6a"},
{file = "pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:031c57d67ca86902726e0fae2214ce6770bbe2f710dc33063187a68744a5ecac"},
{file = "pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:f8de619080e944347f5f20de29a975c2d815d9ddd8be9b9b7268e2e3ef68605a"},
{file = "pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:73662edf539e72a9440129f231ed3757faab89630d291b784ca99237fb94db2b"},
{file = "pydantic_core-2.33.2-cp310-cp310-win32.whl", hash = "sha256:0a39979dcbb70998b0e505fb1556a1d550a0781463ce84ebf915ba293ccb7e22"},
{file = "pydantic_core-2.33.2-cp310-cp310-win_amd64.whl", hash = "sha256:b0379a2b24882fef529ec3b4987cb5d003b9cda32256024e6fe1586ac45fc640"},
{file = "pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7"},
{file = "pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e"},
{file = "pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d"},
{file = "pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30"},
{file = "pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf"},
{file = "pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51"},
{file = "pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab"},
{file = "pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65"},
{file = "pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc"},
{file = "pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b"},
{file = "pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1"},
{file = "pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6"},
{file = "pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea"},
{file = "pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290"},
{file = "pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2"},
{file = "pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab"},
{file = "pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f"},
{file = "pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56"},
{file = "pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5"},
{file = "pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e"},
{file = "pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162"},
{file = "pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849"},
{file = "pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9"},
{file = "pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9"},
{file = "pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac"},
{file = "pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5"},
{file = "pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9"},
{file = "pydantic_core-2.33.2-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a2b911a5b90e0374d03813674bf0a5fbbb7741570dcd4b4e85a2e48d17def29d"},
{file = "pydantic_core-2.33.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:6fa6dfc3e4d1f734a34710f391ae822e0a8eb8559a85c6979e14e65ee6ba2954"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c54c939ee22dc8e2d545da79fc5381f1c020d6d3141d3bd747eab59164dc89fb"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:53a57d2ed685940a504248187d5685e49eb5eef0f696853647bf37c418c538f7"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:09fb9dd6571aacd023fe6aaca316bd01cf60ab27240d7eb39ebd66a3a15293b4"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0e6116757f7959a712db11f3e9c0a99ade00a5bbedae83cb801985aa154f071b"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d55ab81c57b8ff8548c3e4947f119551253f4e3787a7bbc0b6b3ca47498a9d3"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c20c462aa4434b33a2661701b861604913f912254e441ab8d78d30485736115a"},
{file = "pydantic_core-2.33.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:44857c3227d3fb5e753d5fe4a3420d6376fa594b07b621e220cd93703fe21782"},
{file = "pydantic_core-2.33.2-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:eb9b459ca4df0e5c87deb59d37377461a538852765293f9e6ee834f0435a93b9"},
{file = "pydantic_core-2.33.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:9fcd347d2cc5c23b06de6d3b7b8275be558a0c90549495c699e379a80bf8379e"},
{file = "pydantic_core-2.33.2-cp39-cp39-win32.whl", hash = "sha256:83aa99b1285bc8f038941ddf598501a86f1536789740991d7d8756e34f1e74d9"},
{file = "pydantic_core-2.33.2-cp39-cp39-win_amd64.whl", hash = "sha256:f481959862f57f29601ccced557cc2e817bce7533ab8e01a797a48b49c9692b3"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5c4aa4e82353f65e548c476b37e64189783aa5384903bfea4f41580f255fddfa"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d946c8bf0d5c24bf4fe333af284c59a19358aa3ec18cb3dc4370080da1e8ad29"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:87b31b6846e361ef83fedb187bb5b4372d0da3f7e28d85415efa92d6125d6e6d"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa9d91b338f2df0508606f7009fde642391425189bba6d8c653afd80fd6bb64e"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2058a32994f1fde4ca0480ab9d1e75a0e8c87c22b53a3ae66554f9af78f2fe8c"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:0e03262ab796d986f978f79c943fc5f620381be7287148b8010b4097f79a39ec"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1a8695a8d00c73e50bff9dfda4d540b7dee29ff9b8053e38380426a85ef10052"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fa754d1850735a0b0e03bcffd9d4b4343eb417e47196e4485d9cca326073a42c"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:a11c8d26a50bfab49002947d3d237abe4d9e4b5bdc8846a63537b6488e197808"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:87acbfcf8e90ca885206e98359d7dca4bcbb35abdc0ff66672a293e1d7a19101"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:7f92c15cd1e97d4b12acd1cc9004fa092578acfa57b67ad5e43a197175d01a64"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d3f26877a748dc4251cfcfda9dfb5f13fcb034f5308388066bcfe9031b63ae7d"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dac89aea9af8cd672fa7b510e7b8c33b0bba9a43186680550ccf23020f32d535"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:970919794d126ba8645f3837ab6046fb4e72bbc057b3709144066204c19a455d"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:3eb3fe62804e8f859c49ed20a8451342de53ed764150cb14ca71357c765dc2a6"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:3abcd9392a36025e3bd55f9bd38d908bd17962cc49bc6da8e7e96285336e2bca"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:3a1c81334778f9e3af2f8aeb7a960736e5cab1dfebfb26aabca09afd2906c039"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:2807668ba86cb38c6817ad9bc66215ab8584d1d304030ce4f0887336f28a5e27"},
{file = "pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc"},
]
[package.dependencies]
@@ -3989,22 +3940,25 @@ typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
[[package]]
name = "pydantic-settings"
version = "2.8.1"
version = "2.9.1"
description = "Settings management using Pydantic"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pydantic_settings-2.8.1-py3-none-any.whl", hash = "sha256:81942d5ac3d905f7f3ee1a70df5dfb62d5569c12f51a5a647defc1c3d9ee2e9c"},
{file = "pydantic_settings-2.8.1.tar.gz", hash = "sha256:d5c663dfbe9db9d5e1c646b2e161da12f0d734d422ee56f567d0ea2cee4e8585"},
{file = "pydantic_settings-2.9.1-py3-none-any.whl", hash = "sha256:59b4f431b1defb26fe620c71a7d3968a710d719f5f4cdbbdb7926edeb770f6ef"},
{file = "pydantic_settings-2.9.1.tar.gz", hash = "sha256:c509bf79d27563add44e8446233359004ed85066cd096d8b510f715e6ef5d268"},
]
[package.dependencies]
pydantic = ">=2.7.0"
python-dotenv = ">=0.21.0"
typing-inspection = ">=0.4.0"
[package.extras]
aws-secrets-manager = ["boto3 (>=1.35.0)", "boto3-stubs[secretsmanager]"]
azure-key-vault = ["azure-identity (>=1.16.0)", "azure-keyvault-secrets (>=4.8.0)"]
gcp-secret-manager = ["google-cloud-secret-manager (>=2.23.1)"]
toml = ["tomli (>=2.0.1)"]
yaml = ["pyyaml (>=6.0.1)"]
@@ -4079,14 +4033,14 @@ files = [
[[package]]
name = "pyright"
version = "1.1.398"
version = "1.1.400"
description = "Command line wrapper for pyright"
optional = false
python-versions = ">=3.7"
groups = ["dev"]
files = [
{file = "pyright-1.1.398-py3-none-any.whl", hash = "sha256:0a70bfd007d9ea7de1cf9740e1ad1a40a122592cfe22a3f6791b06162ad08753"},
{file = "pyright-1.1.398.tar.gz", hash = "sha256:357a13edd9be8082dc73be51190913e475fa41a6efb6ec0d4b7aab3bc11638d8"},
{file = "pyright-1.1.400-py3-none-any.whl", hash = "sha256:c80d04f98b5a4358ad3a35e241dbf2a408eee33a40779df365644f8054d2517e"},
{file = "pyright-1.1.400.tar.gz", hash = "sha256:b8a3ba40481aa47ba08ffb3228e821d22f7d391f83609211335858bf05686bdb"},
]
[package.dependencies]
@@ -4628,14 +4582,14 @@ files = [
[[package]]
name = "replicate"
version = "1.0.4"
version = "1.0.6"
description = "Python client for Replicate"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "replicate-1.0.4-py3-none-any.whl", hash = "sha256:f568f6271ff715067901b6094c23c37373bbcfd7de0ff9b85e9c9ead567e09e7"},
{file = "replicate-1.0.4.tar.gz", hash = "sha256:f718601863ef1f419aa7dcdab1ea8770ba5489b571b86edf840cd506d68758ef"},
{file = "replicate-1.0.6-py3-none-any.whl", hash = "sha256:d544f837dc7e9dc3b3c1df60a145c7d6f362d6719b719793a44a4be28837103d"},
{file = "replicate-1.0.6.tar.gz", hash = "sha256:b8a0f1649ed4146c3d624e22a418b8c6decce9346cffc110c90fde5995c46e60"},
]
[package.dependencies]
@@ -4830,30 +4784,30 @@ pyasn1 = ">=0.1.3"
[[package]]
name = "ruff"
version = "0.11.2"
version = "0.11.10"
description = "An extremely fast Python linter and code formatter, written in Rust."
optional = false
python-versions = ">=3.7"
groups = ["dev"]
files = [
{file = "ruff-0.11.2-py3-none-linux_armv6l.whl", hash = "sha256:c69e20ea49e973f3afec2c06376eb56045709f0212615c1adb0eda35e8a4e477"},
{file = "ruff-0.11.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:2c5424cc1c4eb1d8ecabe6d4f1b70470b4f24a0c0171356290b1953ad8f0e272"},
{file = "ruff-0.11.2-py3-none-macosx_11_0_arm64.whl", hash = "sha256:ecf20854cc73f42171eedb66f006a43d0a21bfb98a2523a809931cda569552d9"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0c543bf65d5d27240321604cee0633a70c6c25c9a2f2492efa9f6d4b8e4199bb"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:20967168cc21195db5830b9224be0e964cc9c8ecf3b5a9e3ce19876e8d3a96e3"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:955a9ce63483999d9f0b8f0b4a3ad669e53484232853054cc8b9d51ab4c5de74"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:86b3a27c38b8fce73bcd262b0de32e9a6801b76d52cdb3ae4c914515f0cef608"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a3b66a03b248c9fcd9d64d445bafdf1589326bee6fc5c8e92d7562e58883e30f"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0397c2672db015be5aa3d4dac54c69aa012429097ff219392c018e21f5085147"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:869bcf3f9abf6457fbe39b5a37333aa4eecc52a3b99c98827ccc371a8e5b6f1b"},
{file = "ruff-0.11.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:2a2b50ca35457ba785cd8c93ebbe529467594087b527a08d487cf0ee7b3087e9"},
{file = "ruff-0.11.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:7c69c74bf53ddcfbc22e6eb2f31211df7f65054bfc1f72288fc71e5f82db3eab"},
{file = "ruff-0.11.2-py3-none-musllinux_1_2_i686.whl", hash = "sha256:6e8fb75e14560f7cf53b15bbc55baf5ecbe373dd5f3aab96ff7aa7777edd7630"},
{file = "ruff-0.11.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:842a472d7b4d6f5924e9297aa38149e5dcb1e628773b70e6387ae2c97a63c58f"},
{file = "ruff-0.11.2-py3-none-win32.whl", hash = "sha256:aca01ccd0eb5eb7156b324cfaa088586f06a86d9e5314b0eb330cb48415097cc"},
{file = "ruff-0.11.2-py3-none-win_amd64.whl", hash = "sha256:3170150172a8f994136c0c66f494edf199a0bbea7a409f649e4bc8f4d7084080"},
{file = "ruff-0.11.2-py3-none-win_arm64.whl", hash = "sha256:52933095158ff328f4c77af3d74f0379e34fd52f175144cefc1b192e7ccd32b4"},
{file = "ruff-0.11.2.tar.gz", hash = "sha256:ec47591497d5a1050175bdf4e1a4e6272cddff7da88a2ad595e1e326041d8d94"},
{file = "ruff-0.11.10-py3-none-linux_armv6l.whl", hash = "sha256:859a7bfa7bc8888abbea31ef8a2b411714e6a80f0d173c2a82f9041ed6b50f58"},
{file = "ruff-0.11.10-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:968220a57e09ea5e4fd48ed1c646419961a0570727c7e069842edd018ee8afed"},
{file = "ruff-0.11.10-py3-none-macosx_11_0_arm64.whl", hash = "sha256:1067245bad978e7aa7b22f67113ecc6eb241dca0d9b696144256c3a879663bca"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f4854fd09c7aed5b1590e996a81aeff0c9ff51378b084eb5a0b9cd9518e6cff2"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b4564e9f99168c0f9195a0fd5fa5928004b33b377137f978055e40008a082c5"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5b6a9cc5b62c03cc1fea0044ed8576379dbaf751d5503d718c973d5418483641"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:607ecbb6f03e44c9e0a93aedacb17b4eb4f3563d00e8b474298a201622677947"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7b3a522fa389402cd2137df9ddefe848f727250535c70dafa840badffb56b7a4"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2f071b0deed7e9245d5820dac235cbdd4ef99d7b12ff04c330a241ad3534319f"},
{file = "ruff-0.11.10-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a60e3a0a617eafba1f2e4186d827759d65348fa53708ca547e384db28406a0b"},
{file = "ruff-0.11.10-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:da8ec977eaa4b7bf75470fb575bea2cb41a0e07c7ea9d5a0a97d13dbca697bf2"},
{file = "ruff-0.11.10-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:ddf8967e08227d1bd95cc0851ef80d2ad9c7c0c5aab1eba31db49cf0a7b99523"},
{file = "ruff-0.11.10-py3-none-musllinux_1_2_i686.whl", hash = "sha256:5a94acf798a82db188f6f36575d80609072b032105d114b0f98661e1679c9125"},
{file = "ruff-0.11.10-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:3afead355f1d16d95630df28d4ba17fb2cb9c8dfac8d21ced14984121f639bad"},
{file = "ruff-0.11.10-py3-none-win32.whl", hash = "sha256:dc061a98d32a97211af7e7f3fa1d4ca2fcf919fb96c28f39551f35fc55bdbc19"},
{file = "ruff-0.11.10-py3-none-win_amd64.whl", hash = "sha256:5cc725fbb4d25b0f185cb42df07ab6b76c4489b4bfb740a175f3a59c70e8a224"},
{file = "ruff-0.11.10-py3-none-win_arm64.whl", hash = "sha256:ef69637b35fb8b210743926778d0e45e1bffa850a7c61e428c6b971549b5f5d1"},
{file = "ruff-0.11.10.tar.gz", hash = "sha256:d522fb204b4959909ecac47da02830daec102eeb100fb50ea9554818d47a5fa6"},
]
[[package]]
@@ -4887,14 +4841,14 @@ files = [
[[package]]
name = "sentry-sdk"
version = "2.25.1"
version = "2.28.0"
description = "Python client for Sentry (https://sentry.io)"
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "sentry_sdk-2.25.1-py2.py3-none-any.whl", hash = "sha256:60b016d0772789454dc55a284a6a44212044d4a16d9f8448725effee97aaf7f6"},
{file = "sentry_sdk-2.25.1.tar.gz", hash = "sha256:f9041b7054a7cf12d41eadabe6458ce7c6d6eea7a97cfe1b760b6692e9562cf0"},
{file = "sentry_sdk-2.28.0-py2.py3-none-any.whl", hash = "sha256:51496e6cb3cb625b99c8e08907c67a9112360259b0ef08470e532c3ab184a232"},
{file = "sentry_sdk-2.28.0.tar.gz", hash = "sha256:14d2b73bc93afaf2a9412490329099e6217761cbab13b6ee8bc0e82927e1504e"},
]
[package.dependencies]
@@ -5181,14 +5135,14 @@ typing-extensions = {version = ">=4.5.0", markers = "python_version >= \"3.7\""}
[[package]]
name = "supabase"
version = "2.15.0"
version = "2.15.1"
description = "Supabase client for Python."
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "supabase-2.15.0-py3-none-any.whl", hash = "sha256:a665c7ab6c8ad1d80609ab62ad657f66fdaf38070ec9e0db5c7887fd72b109c0"},
{file = "supabase-2.15.0.tar.gz", hash = "sha256:2e66289ad74ae9c4cb04a69f9de00cd2ce880cd890de23269a40ac5b69151d26"},
{file = "supabase-2.15.1-py3-none-any.whl", hash = "sha256:749299cdd74ecf528f52045c1e60d9dba81cc2054656f754c0ca7fba0dd34827"},
{file = "supabase-2.15.1.tar.gz", hash = "sha256:66e847dab9346062aa6a25b4e81ac786b972c5d4299827c57d1d5bd6a0346070"},
]
[package.dependencies]
@@ -5217,14 +5171,14 @@ strenum = ">=0.4.15,<0.5.0"
[[package]]
name = "tenacity"
version = "9.0.0"
version = "9.1.2"
description = "Retry code until it succeeds"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "tenacity-9.0.0-py3-none-any.whl", hash = "sha256:93de0c98785b27fcf659856aa9f54bfbd399e29969b0621bc7f762bd441b4539"},
{file = "tenacity-9.0.0.tar.gz", hash = "sha256:807f37ca97d62aa361264d497b0e31e92b8027044942bfa756160d908320d73b"},
{file = "tenacity-9.1.2-py3-none-any.whl", hash = "sha256:f77bf36710d8b73a50b2dd155c97b870017ad21afe6ab300326b0371b3b05138"},
{file = "tenacity-9.1.2.tar.gz", hash = "sha256:1169d376c297e7de388d18b4481760d478b0e99a777cad3a9c86e556f4b697cb"},
]
[package.extras]
@@ -5536,14 +5490,14 @@ zstd = ["zstandard (>=0.18.0)"]
[[package]]
name = "uvicorn"
version = "0.34.0"
version = "0.34.2"
description = "The lightning-fast ASGI server."
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "uvicorn-0.34.0-py3-none-any.whl", hash = "sha256:023dc038422502fa28a09c7a30bf2b6991512da7dcdb8fd35fe57cfc154126f4"},
{file = "uvicorn-0.34.0.tar.gz", hash = "sha256:404051050cd7e905de2c9a7e61790943440b3416f49cb409f965d9dcd0fa73e9"},
{file = "uvicorn-0.34.2-py3-none-any.whl", hash = "sha256:deb49af569084536d269fe0a6d67e3754f104cf03aba7c11c40f01aadf33c403"},
{file = "uvicorn-0.34.2.tar.gz", hash = "sha256:0e929828f6186353a80b58ea719861d2629d766293b6d19baf086ba31d4f3328"},
]
[package.dependencies]
@@ -6310,4 +6264,4 @@ cffi = ["cffi (>=1.11)"]
[metadata]
lock-version = "2.1"
python-versions = ">=3.10,<3.13"
content-hash = "781f77ec77cfce78b34fb57063dcc81df8e9c5a4be9a644033a0c197e0063730"
content-hash = "f0d0aae83b885e97413e0effe5f61dd24d50b6ff77f243d855053e7588877f35"


@@ -10,56 +10,57 @@ packages = [{ include = "backend", format = "sdist" }]
[tool.poetry.dependencies]
python = ">=3.10,<3.13"
aio-pika = "^9.5.5"
anthropic = "^0.49.0"
anthropic = "^0.51.0"
apscheduler = "^3.11.0"
autogpt-libs = { path = "../autogpt_libs", develop = true }
bleach = { extras = ["css"], version = "^6.2.0" }
click = "^8.1.7"
click = "^8.2.0"
cryptography = "^43.0"
discord-py = "^2.5.2"
e2b-code-interpreter = "^1.1.1"
e2b-code-interpreter = "^1.5.0"
fastapi = "^0.115.12"
feedparser = "^6.0.11"
flake8 = "^7.2.0"
google-api-python-client = "^2.166.0"
google-auth-oauthlib = "^1.2.1"
google-api-python-client = "^2.169.0"
google-auth-oauthlib = "^1.2.2"
google-cloud-storage = "^3.1.0"
googlemaps = "^4.10.0"
gravitasml = "^0.1.3"
groq = "^0.20.0"
groq = "^0.24.0"
jinja2 = "^3.1.6"
jsonref = "^1.1.0"
jsonschema = "^4.22.0"
launchdarkly-server-sdk = "^9.10.0"
mem0ai = "^0.1.80"
launchdarkly-server-sdk = "^9.11.0"
mem0ai = "^0.1.98"
moviepy = "^2.1.2"
ollama = "^0.4.1"
openai = "^1.70.0"
ollama = "^0.4.8"
openai = "^1.78.1"
pika = "^1.3.2"
pinecone = "^5.3.1"
poetry = "^2.1.2"
poetry = "^2.1.3"
postmarker = "^1.0"
praw = "~7.8.1"
prisma = "^0.15.0"
prometheus-client = "^0.21.1"
psutil = "^7.0.0"
psycopg2-binary = "^2.9.10"
pydantic = { extras = ["email"], version = "^2.11.1" }
pydantic-settings = "^2.8.1"
pydantic = { extras = ["email"], version = "^2.11.4" }
pydantic-settings = "^2.9.1"
pytest = "^8.3.5"
pytest-asyncio = "^0.26.0"
python-dotenv = "^1.1.0"
python-multipart = "^0.0.20"
redis = "^5.2.0"
replicate = "^1.0.4"
sentry-sdk = {extras = ["anthropic", "fastapi", "launchdarkly", "openai", "sqlalchemy"], version = "^2.25.1"}
replicate = "^1.0.6"
sentry-sdk = {extras = ["anthropic", "fastapi", "launchdarkly", "openai", "sqlalchemy"], version = "^2.28.0"}
sqlalchemy = "^2.0.40"
strenum = "^0.4.9"
stripe = "^11.5.0"
supabase = "2.15.0"
tenacity = "^9.0.0"
supabase = "2.15.1"
tenacity = "^9.1.2"
todoist-api-python = "^2.1.7"
tweepy = "^4.14.0"
uvicorn = { extras = ["standard"], version = "^0.34.0" }
uvicorn = { extras = ["standard"], version = "^0.34.2" }
websockets = "^14.2"
youtube-transcript-api = "^0.6.2"
zerobouncesdk = "^1.1.1"
@@ -71,12 +72,12 @@ black = "^24.10.0"
faker = "^33.3.1"
httpx = "^0.28.1"
isort = "^5.13.2"
poethepoet = "^0.33.1"
pyright = "^1.1.398"
poethepoet = "^0.34.0"
pyright = "^1.1.400"
pytest-mock = "^3.14.0"
pytest-watcher = "^0.4.2"
requests = "^2.32.3"
ruff = "^0.11.2"
ruff = "^0.11.10"
# NOTE: please insert new dependencies in their alphabetical location
[build-system]
@@ -87,6 +88,7 @@ build-backend = "poetry.core.masonry.api"
app = "backend.app:main"
rest = "backend.rest:main"
ws = "backend.ws:main"
scheduler = "backend.scheduler:main"
executor = "backend.exec:main"
cli = "backend.cli:main"
format = "linter:format"


@@ -118,6 +118,11 @@ model AgentGraph {
// This allows us to delete user data along with the agent, which may be in use by other users
User User @relation(fields: [userId], references: [id], onDelete: Cascade)
forkedFromId String?
forkedFromVersion Int?
forkedFrom AgentGraph? @relation("AgentGraphForks", fields: [forkedFromId, forkedFromVersion], references: [id, version])
forks AgentGraph[] @relation("AgentGraphForks")
Nodes AgentNode[]
Executions AgentGraphExecution[]
@@ -347,6 +352,7 @@ model AgentGraphExecution {
@@index([agentGraphId, agentGraphVersion])
@@index([userId])
@@index([createdAt])
}
// This model describes the execution of an AgentNode.
@@ -373,6 +379,7 @@ model AgentNodeExecution {
@@index([agentGraphExecutionId])
@@index([agentNodeId])
@@index([addedTime])
}
// This model describes the output of an AgentNodeExecution.


@@ -6,10 +6,10 @@ from prisma.models import CreditTransaction
from backend.blocks.llm import AITextGeneratorBlock
from backend.data.block import get_block
from backend.data.credit import BetaUserCredit
from backend.data.credit import BetaUserCredit, UsageTransactionMetadata
from backend.data.execution import NodeExecutionEntry
from backend.data.user import DEFAULT_USER_ID
from backend.executor.utils import UsageTransactionMetadata, block_usage_cost
from backend.executor.utils import block_usage_cost
from backend.integrations.credentials_store import openai_credentials
from backend.util.test import SpinTestServer
@@ -34,7 +34,7 @@ async def spend_credits(entry: NodeExecutionEntry) -> int:
if not block:
raise RuntimeError(f"Block {entry.block_id} not found")
cost, matching_filter = block_usage_cost(block=block, input_data=entry.data)
cost, matching_filter = block_usage_cost(block=block, input_data=entry.inputs)
await user_credit.spend_credits(
entry.user_id,
cost,
@@ -46,6 +46,7 @@ async def spend_credits(entry: NodeExecutionEntry) -> int:
block_id=entry.block_id,
block=entry.block_id,
input=matching_filter,
reason=f"Ran block {entry.block_id} {block.name}",
),
)
@@ -66,7 +67,7 @@ async def test_block_credit_usage(server: SpinTestServer):
graph_exec_id="test_graph_exec",
node_exec_id="test_node_exec",
block_id=AITextGeneratorBlock().id,
data={
inputs={
"model": "gpt-4-turbo",
"credentials": {
"id": openai_credentials.id,
@@ -86,7 +87,7 @@ async def test_block_credit_usage(server: SpinTestServer):
graph_exec_id="test_graph_exec",
node_exec_id="test_node_exec",
block_id=AITextGeneratorBlock().id,
data={"model": "gpt-4-turbo", "api_key": "owned_api_key"},
inputs={"model": "gpt-4-turbo", "api_key": "owned_api_key"},
),
)
assert spending_amount_2 == 0


@@ -1,7 +1,7 @@
import pytest
from backend.data import db
from backend.executor import Scheduler
from backend.executor.scheduler import SchedulerClient
from backend.server.model import CreateGraph
from backend.usecases.sample import create_test_graph, create_test_user
from backend.util.service import get_service_client
@@ -17,11 +17,11 @@ async def test_agent_schedule(server: SpinTestServer):
user_id=test_user.id,
)
scheduler = get_service_client(Scheduler)
schedules = scheduler.get_execution_schedules(test_graph.id, test_user.id)
scheduler = get_service_client(SchedulerClient)
schedules = await scheduler.get_execution_schedules(test_graph.id, test_user.id)
assert len(schedules) == 0
schedule = scheduler.add_execution_schedule(
schedule = await scheduler.add_execution_schedule(
graph_id=test_graph.id,
user_id=test_user.id,
graph_version=1,
@@ -30,10 +30,12 @@ async def test_agent_schedule(server: SpinTestServer):
)
assert schedule
schedules = scheduler.get_execution_schedules(test_graph.id, test_user.id)
schedules = await scheduler.get_execution_schedules(test_graph.id, test_user.id)
assert len(schedules) == 1
assert schedules[0].cron == "0 0 * * *"
scheduler.delete_schedule(schedule.id, user_id=test_user.id)
schedules = scheduler.get_execution_schedules(test_graph.id, user_id=test_user.id)
await scheduler.delete_schedule(schedule.id, user_id=test_user.id)
schedules = await scheduler.get_execution_schedules(
test_graph.id, user_id=test_user.id
)
assert len(schedules) == 0
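
The hunk above migrates the scheduler test from the synchronous `Scheduler` service to the async `SchedulerClient`. A minimal sketch of the new call pattern, using only the method names and signatures visible in the diff (the helper function itself is hypothetical):

```python
import asyncio

from backend.executor.scheduler import SchedulerClient
from backend.util.service import get_service_client


async def clear_schedules(graph_id: str, user_id: str) -> None:
    # Resolving the service client stays synchronous; only the RPC methods
    # are now coroutines and must be awaited.
    scheduler = get_service_client(SchedulerClient)
    for schedule in await scheduler.get_execution_schedules(graph_id, user_id):
        await scheduler.delete_schedule(schedule.id, user_id=user_id)


# Example invocation (IDs are placeholders):
# asyncio.run(clear_schedules("<graph-id>", "<user-id>"))
```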


@@ -1,10 +1,10 @@
import pytest
from backend.util.request import validate_url
from backend.util.request import pin_url, validate_url
@pytest.mark.parametrize(
"url, trusted_origins, expected_value, should_raise",
"raw_url, trusted_origins, expected_value, should_raise",
[
# Rejected IP ranges
("localhost", [], None, True),
@@ -55,14 +55,14 @@ from backend.util.request import validate_url
],
)
def test_validate_url_no_dns_rebinding(
url, trusted_origins, expected_value, should_raise
raw_url: str, trusted_origins: list[str], expected_value: str, should_raise: bool
):
if should_raise:
with pytest.raises(ValueError):
validate_url(url, trusted_origins, enable_dns_rebinding=False)
validate_url(raw_url, trusted_origins)
else:
url, host = validate_url(url, trusted_origins, enable_dns_rebinding=False)
assert url == expected_value
validated_url, _, _ = validate_url(raw_url, trusted_origins)
assert validated_url.geturl() == expected_value
@pytest.mark.parametrize(
@@ -79,7 +79,11 @@ def test_validate_url_no_dns_rebinding(
],
)
def test_dns_rebinding_fix(
monkeypatch, hostname, resolved_ips, expect_error, expected_ip
monkeypatch,
hostname: str,
resolved_ips: list[str],
expect_error: bool,
expected_ip: str,
):
"""
Tests that validate_url pins the first valid public IP address, and rejects
@@ -96,11 +100,13 @@ def test_dns_rebinding_fix(
if expect_error:
# If any IP is blocked, we expect a ValueError
with pytest.raises(ValueError):
validate_url(hostname, [])
url, _, ip_addresses = validate_url(hostname, [])
pin_url(url, ip_addresses)
else:
pinned_url, ascii_hostname = validate_url(hostname, [])
url, _, ip_addresses = validate_url(hostname, [])
pinned_url = pin_url(url, ip_addresses).geturl()
# The pinned_url should contain the first valid IP
assert pinned_url.startswith("http://") or pinned_url.startswith("https://")
assert expected_ip in pinned_url
# The ascii_hostname should match our original hostname after IDNA encoding
assert ascii_hostname == hostname
# The unpinned URL's hostname should match our original IDNA encoded hostname
assert url.hostname == hostname
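
The hunk above splits the DNS-rebinding protection into two steps: `validate_url()` now returns the parsed URL, the hostname, and the resolved IP addresses, and a separate `pin_url()` rewrites the URL onto a vetted public IP. A minimal sketch of how a caller might chain the two, based solely on the call pattern in the test (the wrapper function is hypothetical, and the unused middle element of the tuple is read as the hostname):

```python
from backend.util.request import pin_url, validate_url


def resolve_pinned_url(raw_url: str, trusted_origins: list[str]) -> str:
    # validate_url raises ValueError for blocked or private hosts and returns
    # the parsed URL plus the IP addresses it resolved to.
    url, _host, ip_addresses = validate_url(raw_url, trusted_origins)
    # pin_url swaps the hostname for the first valid public IP, so a second
    # DNS lookup cannot be rebound to an internal address.
    return pin_url(url, ip_addresses).geturl()
```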

Some files were not shown because too many files have changed in this diff.