Compare commits

..

227 Commits

Author SHA1 Message Date
Abhimanyu Yadav
54bbafc431 Merge branch 'dev' into ci-chromatic 2025-04-22 20:26:42 +05:30
Krzysztof Czerwinski
c80d357149 feat(frontend): Use route groups (#9855)
Navbar sometimes disappears outside `/onboarding`.

### Changes 🏗️

This PR solves the problem of the Navbar disappearing outside `/onboarding`
by introducing the `app/(platform)` route group.

- Move all routes requiring Navbar to `app/(platform)`
- Move `<Navbar>` to `app/(platform)/layout.tsx`
- Move `/onboarding` to `app/(no-navbar)/`
- Remove pathname injection to header from middleware and stop relying
on it to hide the navbar

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Common routes work properly
2025-04-22 09:10:12 +00:00
Abhimanyu Yadav
5662783624 Merge branch 'dev' into ci-chromatic 2025-04-22 10:29:24 +05:30
Zamil Majdy
20d39f6d44 fix(platform): Fix Google Maps API Key setting through env (#9848)
Setting the Google Maps API key through the environment has never worked on the
platform.

### Changes 🏗️

Set the default API key from the environment variable.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Test GoogleMapsBlock
2025-04-22 03:00:47 +07:00
Bently
d5b82c01e0 feat(backend): Adds latest llm models (#9856)
This PR adds the following models:
- OpenAI's O3: https://platform.openai.com/docs/models/o3
- OpenAI's GPT 4.1: https://platform.openai.com/docs/models/gpt-4.1
- Anthropic's Claude 3.7: https://www.anthropic.com/news/claude-3-7-sonnet
- Google's Gemini 2.5 Pro: https://openrouter.ai/google/gemini-2.5-pro-preview-03-25
2025-04-21 19:26:21 +00:00
Abhimanyu Yadav
69b8d96516 fix(library/run): Replace credits with cents (#9845)
Replacing credits with cents (100 credits = $1).

I haven’t touched anything internally, just changed the UI.

Everything is working great.

On the frontend, there’s no other place where we use credits instead of
dollars.

![Screenshot 2025-04-19 at 11 36
00 AM](https://github.com/user-attachments/assets/de799b5c-094e-4c96-a7da-273ce60b2125)
<img width="1503" alt="Screenshot 2025-04-19 at 11 33 24 AM"
src="https://github.com/user-attachments/assets/87d7e218-f8f5-4e2e-92ef-70c81735db6b"
/>
2025-04-21 12:31:48 +00:00
Krzysztof Czerwinski
67af77e179 fix(backend): Fix migrate llm models in existing agents (#9810)
https://github.com/Significant-Gravitas/AutoGPT/pull/9452 was throwing
`operator does not exist: text ? unknown` on deployed dev, so the
function call was commented out as a hotfix.
This PR fixes and re-enables the llm model migration function.

### Changes 🏗️

- Uncomment and fix `migrate_llm_models` function

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Migrate nodes with non-existing models
  - [x] Don't migrate nodes without any model or with correct models

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-04-19 12:52:36 +00:00
Abhimanyu Yadav
2a92970a5f fix(marketplace/library): Removing white borders from Avatar (#9818)
There are some white borders around the avatar in the store card, but
they are not present in the design, so I'm removing them.

![Screenshot 2025-04-15 at 3 58
05 PM](https://github.com/user-attachments/assets/f8c98076-9cc3-46f1-b4f3-41d4e48f6127)
2025-04-19 05:36:36 +00:00
Zamil Majdy
9052ee7b95 fix(backend): Clear RabbitMQ connection cache on execution-manager retry 2025-04-19 07:50:04 +02:00
Zamil Majdy
c783f64b33 fix(backend): Handle add execution API request failure (#9838)
There are cases where publishing an agent execution fails, making the
execution appear to be stuck in a queue even though it was never actually
enqueued in the first place.

### Changes 🏗️

On publishing failure, we set the graph & starting node execution status
to FAILED and let the UI bubble up the error so the user can try again.
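For illustration, a minimal sketch of this failure handling; the function and
model names are assumptions, not the platform's actual code:

```
def add_graph_execution(queue, graph_exec, db):
    try:
        queue.publish(graph_exec.to_message())
    except Exception:
        # Publishing failed: the run was never enqueued, so don't leave it looking queued.
        db.set_graph_execution_status(graph_exec.id, "FAILED")
        db.set_node_execution_status(graph_exec.starting_node_exec_id, "FAILED")
        raise  # bubble the error up so the UI can surface it and the user can retry
```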

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Normal add execution flow
2025-04-18 18:35:43 +00:00
Zamil Majdy
055a231aed feat(backend): Add retry mechanism for pika publish_message (#9839)
For unknown reasons, publishing a message can sometimes fail due to the
connection being broken: the message queue suddenly becoming unavailable, the
connection simply breaking, the connection being reset, etc.

### Changes 🏗️

Adding a tenacity retry on AMQP or ConnectionError, which hopefully can
alleviate the issue.
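As a rough sketch of what such a retry looks like with tenacity (the exact
exception types and backoff parameters used in the PR are not shown here):

```
import pika.exceptions
from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_exponential

@retry(
    retry=retry_if_exception_type((pika.exceptions.AMQPError, ConnectionError)),
    stop=stop_after_attempt(5),
    wait=wait_exponential(multiplier=0.5, max=10),
)
def publish_message(channel, exchange: str, routing_key: str, body: bytes) -> None:
    # Retried automatically when the broker connection breaks mid-publish.
    channel.basic_publish(exchange=exchange, routing_key=routing_key, body=body)
```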

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Simple add execution
2025-04-18 17:56:27 +00:00
Reinier van der Leer
417d7732af feat(platform/library): Add credentials UX on /library/agents/[id] (#9789)
- Resolves #9771
- ... in a non-persistent way, so it won't work for webhook-triggered
agents
    For webhooks: #9541

### Changes 🏗️

Frontend:
- Add credentials inputs in Library "New run" screen (based on
`graph.credentials_input_schema`)
- Refactor `CredentialsInput` and `useCredentials` to not rely on XYFlow
context

- Unsplit lists of saved credentials in `CredentialsProvider` state

- Move logic that was being executed at component render to `useEffect`
hooks in `CredentialsInput`

Backend:
- Implement logic to aggregate credentials input requirements to one per
provider per graph
- Add `BaseGraph.credentials_input_schema` (JSON schema) computed field
    Underlying added logic:
- `BaseGraph._credentials_input_schema` - makes a `BlockSchema` from a
graph's aggregated credentials inputs
- `BaseGraph.aggregate_credentials_inputs()` - aggregates a graph's
nodes' credentials inputs using `CredentialsFieldInfo.combine(..)`
- `BlockSchema.get_credentials_fields_info() -> dict[str,
CredentialsFieldInfo]`
- `CredentialsFieldInfo` model (created from
`_CredentialsFieldSchemaExtra`)

- Implement logic to inject explicitly passed credentials into graph
execution
  - Add `credentials_inputs` parameter to `execute_graph` endpoint
- Add `graph_credentials_input` parameter to
`.executor.utils.add_graph_execution(..)`
  - Implement `.executor.utils.make_node_credentials_input_map(..)`
  - Amend `.executor.utils.construct_node_execution_input`
  - Add `GraphExecutionEntry.node_credentials_input_map` attribute
  - Amend validation to allow injecting credentials
    - Amend `GraphModel._validate_graph(..)`
    - Amend `.executor.utils._validate_node_input_credentials`
- Add `node_credentials_map` parameter to
`ExecutionManager.add_execution(..)`
    - Amend execution validation to handle side-loaded credentials
    - Add `GraphExecutionEntry.node_execution_map` attribute
- Add mechanism to inject passed credentials into node execution data
- Add credentials injection mechanism to node execution queueing logic
in `Executor._on_graph_execution(..)`

- Replace boilerplate logic in `v1.execute_graph` endpoint with call to
existing `.executor.utils.add_graph_execution(..)`
- Replace calls to `.server.routers.v1.execute_graph` with
`add_graph_execution`

Also:
- Address tech debt in `GraphModel._validate_graph(..)`
- Fix type checking in `BaseGraph._generate_schema(..)`

#### TODO
- [ ] ~~Make "Run again" work with credentials in
`AgentRunDetailsView`~~
- [ ] Prohibit saving a graph if it has nodes with missing discriminator
value for discriminated credentials inputs

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  - [ ] ...
2025-04-18 14:27:13 +00:00
Krzysztof Czerwinski
f16a398a8e feat(frontend): Update completed task group design in Wallet (#9820)
This redesigns how the task group is displayed when finished, for both the
expanded and folded states.

### Changes 🏗️

- Folded state now displays `Done` badge and hides tasks
- Expanded state shows only task names and hides details and video

Screenshot:
1. Expanded unfinished group
2. Expanded finished group
3. Folded finished group

<img width="463" alt="Screenshot 2025-04-15 at 2 05 31 PM"
src="https://github.com/user-attachments/assets/40152073-fc0e-47c2-9fd4-a6b0161280e6"
/>

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Finished group displays correctly
  - [x] Unfinished group displays correctly
2025-04-18 09:45:35 +00:00
Krzysztof Czerwinski
e8bbd945f2 feat(frontend): Wallet top-up and auto-refill (#9819)
### Changes 🏗️

- Add top-up and auto-refill tabs in the Wallet
- Add shadcn `tabs` component
- Disable increase/decrease spinner buttons on number inputs across the
Platform (moved CSS from `customnode.css` to `globals.css`)

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Incorrect values are detected properly
  - [x] Top-up works
  - [x] Setting auto-refill works
2025-04-18 09:44:54 +00:00
Krzysztof Czerwinski
d1730d7b1d fix(frontend): Fix onboarding agent execution (#9822)
Onboarding executes the original agent graph directly without waiting for the
marketplace agent to be added to the user's library.

### Changes 🏗️

- Execute the library agent only after it has been added to the library

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Onboarding agent executes properly
2025-04-18 09:36:40 +00:00
Krzysztof Czerwinski
8ea64327a1 fix(backend): Fix array types in database (#9828)
Array fields in `schema.prisma` are non-nullable, but generated
migrations don’t add `NOT NULL` constraints. This causes existing rows
to get `NULL` values when new array columns are added, breaking schema
expectations and leading to bugs.

### Changes 🏗️

- Backfill all `NULL` rows on non-nullable array columns to empty arrays
- Set `NOT NULL` constraint on all array columns

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Existing `NULL` rows are properly backfilled
  - [x] Existing arrays are not set to default empty arrays
  - [x] Affected columns became non-nullable in the db
2025-04-18 07:43:54 +00:00
Bently
3cf30c22fb update(docs): Remove outdated submodule command from docs (#9836)
### Changes 🏗️

Updates to the setup docs to remove the old unneeded ``git submodule
update --init --recursive --progress`` command + some other small tweaks
around it
2025-04-17 16:45:07 +00:00
Reinier van der Leer
05c670eef9 fix(frontend/library): Prevent execution updates mixing between library agents (#9835)
If the websocket doesn't disconnect when the user switches to viewing a
different agent, they aren't unsubscribed. If execution updates *from a
different agent* are adopted into the page state, that can cause
crashes.

### Changes 🏗️

- Filter incoming execution updates by `graph_id`

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [ ] I have tested my changes according to the test plan:
- Go to an agent and initiate a run that will take a while (long enough
to navigate to a different agent)
  - Navigate: Library -> [another agent]
- [ ] Runs from the first agent don't show up in the runs list of the
other agent
2025-04-17 14:11:09 +00:00
Zamil Majdy
f6a4b036c7 fix(block): Disable LLM blocks parallel tool calls (#9834)
SmartDecisionBlock sometimes tries to be smart by making multiple tool calls
at once, which our platform does not support yet.

### Changes 🏗️

Disable parallel tool calls in LLM blocks for the OpenAI & OpenRouter
providers.
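For reference, the OpenAI-style chat API exposes this as a per-request flag;
the snippet below is a hedged sketch with a placeholder tool definition, not
the blocks' actual code:

```
from openai import OpenAI

client = OpenAI()
tools = [{  # placeholder tool definition, not the block's real tool schema
    "type": "function",
    "function": {"name": "choose_branch", "parameters": {"type": "object", "properties": {}}},
}]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Pick exactly one branch."}],
    tools=tools,
    parallel_tool_calls=False,  # at most one tool call per response
)
```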

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Tested SmartDecisionBlock & AITextGeneratorBlock
2025-04-17 12:58:05 +00:00
Zamil Majdy
c43924cd4e feat(backend): Add RabbitMQ connection cleanup on executor shutdown hook 2025-04-17 01:28:15 +02:00
Zamil Majdy
e3846c22bd fix(backend): Avoid multithreaded pika access (#9832)
### Changes 🏗️

Avoid other threads accessing the channel within the same process.
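One common way to keep a pika channel confined to a single thread of use is to
serialize access with a lock; this is an illustrative pattern, not necessarily
the exact mechanism used in this PR:

```
import threading

import pika

_channel_lock = threading.Lock()
_connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
_channel = _connection.channel()

def publish(exchange: str, routing_key: str, body: bytes) -> None:
    # Only one thread may touch the (non-thread-safe) channel at a time.
    with _channel_lock:
        _channel.basic_publish(exchange=exchange, routing_key=routing_key, body=body)
```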

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Manual agent runs
2025-04-16 22:06:07 +00:00
Toran Bruce Richards
9a7a838418 fix(backend): Change node output logging type from info to debug (#9831)

### Changes 🏗️
This PR simply changes the logging level of node outputs in the agent.py file
from info to debug.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  - [x] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [x] `.env.example` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my
changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Bentlybro <Github@bentlybro.com>
2025-04-16 20:45:51 +00:00
Toran Bruce Richards
d61d815208 fix(logging): Change node data logging to debug level from info (#9830)

### Changes 🏗️
This change simply lowers the logging level of node inputs and outputs to
debug. It is needed because logging all node data currently produces log
entries that are too large for the logger, which prevents nodes from running.


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  - [x] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [x] `.env.example` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my
changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-04-16 19:22:52 +00:00
Zamil Majdy
44e3770003 fix(backend): Fix execution manager message consuming pattern (#9829)
We have seen instances where the executor gets stuck in a failing
message-consuming loop due to the upstream RabbitMQ being down. The
current message-consuming pattern is not optimal for handling this.

### Changes 🏗️

* Add a retry limit to the execution loop.
* Use `basic_consume` instead of `basic_get` for handling message
consumption.
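A minimal sketch of the push-based `basic_consume` pattern with a blocking pika
connection; the queue name and handler are assumptions, not the executor's
actual identifiers:

```
import pika

def on_execution_request(channel, method, properties, body) -> None:
    print("received execution request:", body)  # stand-in for the real execution handler
    channel.basic_ack(delivery_tag=method.delivery_tag)

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="graph_executions", durable=True)  # queue name is an assumption
channel.basic_qos(prefetch_count=1)  # hand each worker one message at a time
channel.basic_consume(queue="graph_executions", on_message_callback=on_execution_request)
channel.start_consuming()  # broker pushes messages as they arrive, instead of polling with basic_get
```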

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Run agents and cancel them
2025-04-16 22:54:26 +07:00
Zamil Majdy
c0ee71fb27 fix(frontend/builder): Fix key-value pair input for any non-string types (#9826)
- Resolves #9823 

The key-value pair inputs, like those used in CreateDictionaryBlock, are
assumed to be either a numeric or a string type.
When the type is `any`, values were arbitrarily assumed to be numeric.

### Changes 🏗️

Only convert values to numbers when the key-value pair input is explicitly
defined to do so.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Tried two different key-value pair input: AiTextGenerator &
CreateDictionary
2025-04-16 11:10:50 +00:00
Zamil Majdy
71cdc18674 fix(backend): Fix cancel_execution can only work once (#9825)
### Changes 🏗️

The recent execution-cancellation fix turns out to only work on the first
request.
This PR fixes it by reworking how `thread_cached` works on async
functions.
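A sketch of one classic failure mode consistent with the symptom (works once,
then breaks): caching the coroutine object instead of its awaited result. This
is illustrative only, not the repo's actual `thread_cached` implementation:

```
import asyncio
import threading
from functools import wraps

def thread_cached(func):
    # Per-thread memoization that also works for async functions (illustrative only).
    local = threading.local()
    is_async = asyncio.iscoroutinefunction(func)

    def get_cache():
        cache = getattr(local, "cache", None)
        if cache is None:
            cache = local.cache = {}
        return cache

    @wraps(func)
    async def async_wrapper(*args, **kwargs):
        cache, key = get_cache(), (args, tuple(sorted(kwargs.items())))
        if key not in cache:
            # Cache the awaited *result*; caching the coroutine object itself would
            # fail on the second call, since a coroutine can only be awaited once.
            cache[key] = await func(*args, **kwargs)
        return cache[key]

    @wraps(func)
    def sync_wrapper(*args, **kwargs):
        cache, key = get_cache(), (args, tuple(sorted(kwargs.items())))
        if key not in cache:
            cache[key] = func(*args, **kwargs)
        return cache[key]

    return async_wrapper if is_async else sync_wrapper
```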

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Cancel agent executions multiple times
2025-04-16 10:33:49 +00:00
Zamil Majdy
dc9348ec26 fix(frontend): Fix Input value mixup on Library page (#9821)
### Changes 🏗️

Fix this broken behavior:
input data mix-up caused by running two different executions of the same
agent with the same input.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Run agent with old user
- [x] Running two different executions of the same agent with the same
input.
2025-04-16 09:31:07 +00:00
Zamil Majdy
3ccbc31705 Revert: fix(frontend): Fix Input value mixup on Library page & broken marketplace on no onboarding data 2025-04-15 21:28:43 +02:00
Zamil Majdy
7cf0c6fe46 fix(frontend): Fix Input value mixup on Library page & broken marketplace on no onboarding data 2025-04-15 21:25:25 +02:00
Zamil Majdy
c69faa2a94 fix(frontend): Fix Input value mixup on Library page & broken marketplace on no onboarding data 2025-04-15 21:24:39 +02:00
Nicholas Tindle
0c9dbbbe24 Merge branch 'master' into dev 2025-04-15 12:00:02 -05:00
Nicholas Tindle
3e0742f9c5 Spike/infra pooling (#9812)
Swap to pooled Supabase connections rather than depending on a fixed number
of max open connections

### Changes 🏗️
Adds direct connect URL to be used throughout the system

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Test thoroughly all of the endpoints in the dev env with switched
infra matching pr
  - [x] Follow the new release plan tests
  - [x] Follow the old release plan tests

#### For configuration changes:
- [x] `.env.example` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my
changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>configuration changes</summary>

- Change how we connect to the database to use direct when configured
and database URL when not
  - update prisma for this
  - have default matching database and default
</details>
2025-04-15 15:40:15 +00:00
Krzysztof Czerwinski
d791cdea76 feat(platform): Onboarding Phase 2 (#9736)
### Changes 🏗️

- Update onboarding to give user rewards for completing steps
- Remove `canvas-confetti` lib and add `party-js` instead; the former
didn't allow playing confetti from a component
- Add onboarding videos in `frontend/public/onboarding/`
- Remove Balance (`CreditsCard.tsx`) and add an openable `Wallet.tsx` (with an
accompanying `WalletTaskGroup.tsx`) instead, which displays grouped
onboarding tasks with descriptions and short instructional videos
- Further relevant updates to `useOnboarding`, `types.ts`
- Implement onboarding rewards
- Add `onboarding_reward` function in `credit.py` that is used to safely
reward the user for finished onboarding tasks - the transaction key is
deterministic, so the same user won't be rewarded twice for the same
step (see the sketch after this list).
  - Add `reward_user` in `onboarding.py`
- Update `UserOnboarding` model and add a migration
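A minimal sketch of the idempotent-reward idea, assuming a credit-transaction
table with a unique transaction key; all names are illustrative, not the PR's
actual code:

```
import hashlib

class DuplicateTransaction(Exception):
    pass  # raised by the (hypothetical) storage layer on a unique-key violation

def onboarding_reward(db, user_id: str, step: str, amount_cents: int) -> None:
    # Deterministic key: the same (user, step) pair always maps to the same key,
    # so a retry hits the unique constraint instead of rewarding the user twice.
    tx_key = hashlib.sha256(f"onboarding:{user_id}:{step}".encode()).hexdigest()
    try:
        db.add_credit_transaction(user_id=user_id, amount=amount_cents, key=tx_key)
    except DuplicateTransaction:
        pass  # this step was already rewarded; nothing to do
```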

<img width="464" alt="Screenshot 2025-04-05 at 6 06 29 PM"
src="https://github.com/user-attachments/assets/fca8d09e-0139-466b-b679-d24117ad01f0"
/>

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Onboarding works
  - [x] Tasks can be completed
  - [x] Rewards are added correctly for all completed tasks
2025-04-12 10:56:59 +00:00
Zamil Majdy
bb92226f5d feat(backend): Remove RPC service from Agent Executor (#9804)
Currently the execution task is not properly distributed between
executors because we need to send the execution request to the execution
server.

The execution manager now accepts the execution request from the message
queue. Thus, we can remove the synchronous RPC system from this service,
let the system focus on executing the agent, and not spare any process
for the HTTP API interface.

This will also reduce the risk of the execution service being too busy
and not able to accept any add execution requests.

### Changes 🏗️

* Remove the RPC system in Agent Executor
* Allow the cancellation of an execution that is still waiting in the
queue (by preventing it from being executed).
* Make a unified helper for adding an execution request to the system
and move other execution-related helper functions into
`executor/utils.py`.
* Remove non-db connections (redis / rabbitmq) in Database Manager and
let the clients manage these themselves.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Existing CI, some agent runs
2025-04-11 19:03:47 +00:00
Zamil Majdy
f7ca5ac1ba feat(backend/executor): Move execution queue + cancel mechanism to RabbitMQ (#9759)
The graph execution queue is not disk-persisted; when the executor dies,
the executions are lost.

The scope of this issue is migrating the execution queue from an
inter-process queue to a RabbitMQ message queue. A sync client should be
used for this.

- Resolves #9746
- Resolves #9714

### Changes 🏗️

Move the execution manager from multiprocess.Queue to a persisted
RabbitMQ queue.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Execute agents.

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-04-11 14:15:39 +00:00
Abhimanyu Yadav
4621a95bf3 fix(marketplace): Fix small UI bugs (#9800)
Resolving the bugs listed below
- #9796 
- #9797 
- #9798 
- #8998 
- #9799 

### Changes I have made 
- Removed border and set border-radius to `24px` in FeaturedCard
- Removed `white` background from breadcrumbs
- Changed distance between featured section arrow from `28px` to `12px`
- Added `1.5rem` spacing and changed color to `gray-200` on the
creator’s page separator
- Removed focus ring from the Search Library input
- And some small UI changes on marketplace

### Screenshots

<img width="658" alt="Screenshot 2025-04-10 at 3 26 56 PM"
src="https://github.com/user-attachments/assets/22bef6f0-19b9-42a6-8227-fedca33141ba"
/>

<img width="505" alt="Screenshot 2025-04-10 at 3 27 07 PM"
src="https://github.com/user-attachments/assets/2a5409a1-94c6-4d15-a35d-e4ed9b075055"
/>

<img width="1373" alt="Screenshot 2025-04-10 at 3 28 39 PM"
src="https://github.com/user-attachments/assets/046ea726-2a98-4000-abc8-9139fffe80dc"
/>

<img width="368" alt="Screenshot 2025-04-10 at 3 29 07 PM"
src="https://github.com/user-attachments/assets/4e0510ad-f535-4760-a703-651766ff522b"
/>
2025-04-11 13:09:35 +00:00
Abhimanyu Yadav
8d8a6e450f fix(marketplace): Render newline in marketplace description text (#9808)
- fix #9177 

Add `whitespace-pre-line` tailwind property to allow newline rendering
in marketplace description text

### Before

![Screenshot 2025-04-11 at 10 32
23 AM](https://github.com/user-attachments/assets/b07f58b6-218e-4b33-a018-93757e59cd8d)

### After

![Screenshot 2025-04-11 at 10 32
59 AM](https://github.com/user-attachments/assets/f1086ee4-aef3-491a-ba81-cf681086f67b)
2025-04-11 10:50:32 +00:00
Reinier van der Leer
8ea3bfabc4 fix(backend/db): Fix unchecked Prisma statements (#9805) 2025-04-10 23:04:42 +02:00
Nicholas Tindle
cda07e81d1 feat(frontend, backend): track sentry environment on frontend + sentry init in app services (#9773)
We want to be able to filter errors according to where they occur in
Sentry, so we need to track and include that data. We are also not
logging everything from app services correctly, so this fixes that up.

### Changes 🏗️

- Adds env tracking for frontend
- adds sentry init in app service spawn

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Tested by running and making sure all events + logs are inserted
into sentry correctly
2025-04-10 16:28:07 +01:00
Abhimanyu Yadav
6156fbb731 fix(marketplace): Fixing margins between headers, divider and content (#9757)
- fix #9003 
- fix #8969 
- fix #8970 

Adding correct margins in between headers, divider and content.

### Changes made

- Remove any vertical padding or margin from the section.
- Add top and bottom margins to the separator, so the spacing between
sections is handled only by the separator.
- Also, add a size prop in AvatarFallback because its size is currently
broken. It’s not able to extract the size properly from the className.
2025-04-10 16:28:01 +01:00
Nicholas Tindle
2ca18d77a4 feat(frontend, backend): track sentry environment on frontend + sentry init in app services (#9773)
We want to be able to filter errors according to where they occur in
Sentry, so we need to track and include that data. We are also not
logging everything from app services correctly, so this fixes that up.

### Changes 🏗️

- Adds env tracking for frontend
- adds sentry init in app service spawn

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Tested by running and making sure all events + logs are inserted
into sentry correctly
2025-04-10 14:34:26 +00:00
Abhimanyu Yadav
3e6d9bf963 fix(marketplace): Fixing margins between headers, divider and content (#9757)
- fix #9003 
- fix #8969 
- fix #8970 

Adding correct margins in between headers, divider and content.

### Changes made

- Remove any vertical padding or margin from the section.
- Add top and bottom margins to the separator, so the spacing between
sections is handled only by the separator.
- Also, add a size prop in AvatarFallback because its size is currently
broken. It’s not able to extract the size properly from the className.
2025-04-10 13:04:16 +00:00
Abhimanyu Yadav
07a09d802c fix(marketplace): Fix store card style (#9769)
- fix #9222 
- fix #9221 
- fix #8966

### Changes made
- Standardized the height of store cards.
- Corrected spacing and responsiveness behavior.
- Removed horizontal margin and max-width from the featured section.
- Fixed the aspect ratio of the agent image in the store card.
- Now, a normal desktop screen displays 3 columns of agents instead of
4.

<img width="1512" alt="Screenshot 2025-04-07 at 7 09 40 AM"
src="https://github.com/user-attachments/assets/50d3b5c9-4e7c-456e-b5f1-7c0093509bd3"
/>
2025-04-10 12:01:42 +01:00
Reinier van der Leer
353396110c refactor(backend): Clean up Library & Store DB schema (#9774)
Distilled from #9541 to reduce the scope of that PR.

- Part of #9307

-  Blocks #9786
  -  Blocks #9541

### Changes 🏗️

- Fix `LibraryAgent` schema (for #9786)
- Fix relationships between `LibraryAgent`, `AgentGraph`, and
`AgentPreset`
  - Impose uniqueness constraint on `LibraryAgent`

- Rename things that are called `agent` that actually refer to a
`graph`/`agentGraph`
- Fix singular/plural forms in DB schema
- Simplify reference names of closely related objects (e.g.
`AgentGraph.AgentGraphExecutions` -> `AgentGraph.Executions`)

- Eliminate use of `# type: ignore` in DB statements
  - Add `typed` and `typed_cast` utilities to `backend.util.type`

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] CI static type checking (with all risky `# type: ignore` removed)
  - [x] Check that column references in views are updated
2025-04-10 10:40:25 +00:00
Abhimanyu Yadav
70890dee43 fix(marketplace): Fix store card style (#9769)
- fix #9222 
- fix #9221 
- fix #8966

### Changes made
- Standardized the height of store cards.
- Corrected spacing and responsiveness behavior.
- Removed horizontal margin and max-width from the featured section.
- Fixed the aspect ratio of the agent image in the store card.
- Now, a normal desktop screen displays 3 columns of agents instead of
4.

<img width="1512" alt="Screenshot 2025-04-07 at 7 09 40 AM"
src="https://github.com/user-attachments/assets/50d3b5c9-4e7c-456e-b5f1-7c0093509bd3"
/>
2025-04-10 10:31:14 +00:00
Nicholas Tindle
62361ccc48 feat: deep copy the schema (#9794)
We were duplicating placeholder values across all agents 😨 

### Changes 🏗️

Deep copies the schema instead
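A generic sketch of the pitfall (not the PR's actual code): with a shallow
copy, every consumer of the schema shares the same nested default objects, so
editing one placeholder value mutates all of them; a deep copy gives each
consumer an independent structure:

```
import copy

base_schema = {"properties": {"prompt": {"type": "string", "default": ""}}}

shallow = dict(base_schema)        # nested dicts are still shared with the original
deep = copy.deepcopy(base_schema)  # fully independent copy

shallow["properties"]["prompt"]["default"] = "oops"
assert base_schema["properties"]["prompt"]["default"] == "oops"  # leaked into the original
assert deep["properties"]["prompt"]["default"] == ""             # unaffected
```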

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Test the broken agent in dev
2025-04-09 21:32:50 +00:00
Reinier van der Leer
755a80c87a fix(blocks): Fix block I/O value sharing (#9793)
- Resolves #9792

### Changes 🏗️

- Replace all `default=[]` -> `default_factory=list`
- Replace all `default={}` -> `default_factory=dict`
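The sketch below shows the general Python pitfall this pattern guards against
(a single mutable default object shared across uses), under the assumption
that the block field defaults were being shared in a similar way; it is not
the blocks' actual code:

```
def add_output_shared(value, outputs=[]):   # one list, created once at definition time
    outputs.append(value)
    return outputs

def add_output_fresh(value, outputs=None):  # factory-style: a new list per call
    outputs = [] if outputs is None else outputs
    outputs.append(value)
    return outputs

assert add_output_shared("a") == ["a"]
assert add_output_shared("b") == ["a", "b"]  # state leaked from the previous call
assert add_output_fresh("a") == ["a"]
assert add_output_fresh("b") == ["b"]        # independent every call
```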

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  - [ ] CI

---------

Co-authored-by: Krzysztof Czerwinski <kpczerwinski@gmail.com>
2025-04-09 19:15:41 +00:00
Bentlybro
2a6676a5b8 Merge branch 'master' into dev 2025-04-09 14:49:09 +01:00
Reinier van der Leer
5a83b233f8 fix(backend): Add required method cleanup to MainApp
The absence of this method caused type checking errors.
2025-04-09 13:09:21 +02:00
Reinier van der Leer
cb1a3703ad fix(ci): Fix linter exit code on failure (#9777)
The linter currently exits with exit code 0 even if linting fails. This
makes the CI linter permissive which isn't good.

Changes:
- Make linter exit with an error code if a linting step fails
- Fix existing formatting issues
2025-04-09 11:30:04 +02:00
Bently
91f62c47f9 feat(backend): Add new llama 4 maverick & scout models (#9788)
This PR is to add the new [Meta: Llama 4
Maverick](https://openrouter.ai/meta-llama/llama-4-maverick) and [Meta:
Llama 4 Scout](https://openrouter.ai/meta-llama/llama-4-scout) models
via [OpenRouter](https://openrouter.ai/)


### Changes 🏗️

Added the model names to ``llm.py``
```
    META_LLAMA_4_SCOUT = "meta-llama/llama-4-scout"
    META_LLAMA_4_MAVERICK = "meta-llama/llama-4-maverick"
```
and the model metadata
```
    LlmModel.META_LLAMA_4_SCOUT: ModelMetadata("open_router", 131072, 131072),
    LlmModel.META_LLAMA_4_MAVERICK: ModelMetadata("open_router", 1048576, 1000000),
```

and I have added the model prices to ``block_cost_config.py``
```
    LlmModel.META_LLAMA_4_SCOUT: 1,
    LlmModel.META_LLAMA_4_MAVERICK: 1,
```

### Checklist 📋

- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Open the build page and place an AI text block, open the model
select and scroll to the bottom and select either of the 2 models
  - [x] test them with a prompt and wait for a reply!
2025-04-08 21:59:53 +00:00
Zamil Majdy
7fedb5e2fd refactor(backend): Un-share resource initializations from AppService + Remove Pyro (#9750)
This is a prerequisite infra change for
https://github.com/Significant-Gravitas/AutoGPT/issues/9714.

We will need a service where we can maintain our own client (db, redis,
rabbitmq, be it async/sync) and configure our own cadence of
initialization and cleanup.

While refactoring the service.py, an option to use Pyro as an RPC
protocol is also removed.

### Changes 🏗️

* Decouple resource initialization and cleanup from the parent
AppService logic.
* Removed Pyro.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] CI
2025-04-08 19:47:22 +00:00
Nicholas Tindle
d316ed23d4 [Snyk] Security upgrade next from 14.2.25 to 14.2.26 (#9767)
![snyk-top-banner](https://res.cloudinary.com/snyk/image/upload/r-d/scm-platform/snyk-pull-requests/pr-banner-default.svg)

### Snyk has created this PR to fix 1 vulnerability in the yarn
dependencies of this project.

#### Snyk changed the following file(s):

- `autogpt_platform/frontend/package.json`
- `autogpt_platform/frontend/yarn.lock`


#### Note for
[zero-installs](https://yarnpkg.com/features/zero-installs) users

If you are using the Yarn feature
[zero-installs](https://yarnpkg.com/features/zero-installs) that was
introduced in Yarn V2, note that this PR does not update the
`.yarn/cache/` directory meaning this code cannot be pulled and
immediately developed on as one would expect for a zero-install project
- you will need to run `yarn` to update the contents of the
`./yarn/cache` directory.
If you are not using zero-install you can ignore this as your flow
should likely be unchanged.




#### Vulnerabilities that will be fixed with an upgrade:

|  | Issue | Score |
|:-------------------------:|:-------------------------|:-------------------------|
| ![medium severity](https://res.cloudinary.com/snyk/image/upload/w_20,h_20/v1561977819/icon/m.png 'medium severity') | Information Exposure <br/>[SNYK-JS-NEXT-9634163](https://snyk.io/vuln/SNYK-JS-NEXT-9634163) | **601** |




---

> [!IMPORTANT]
>
> - Check the changes in this PR to ensure they won't cause issues with
your project.
> - Max score is 1000. Note that the real score may have changed since
the PR was raised.
> - This PR was automatically created by Snyk using the credentials of a
real user.

---

**Note:** _You are seeing this because you or someone else with access
to this repository has authorized Snyk to open fix PRs._

For more information:
🧐 [View latest project
report](https://app.snyk.io/org/significant-gravitas/project/3d924968-0cf3-4767-9609-501fa4962856?utm_source&#x3D;github&amp;utm_medium&#x3D;referral&amp;page&#x3D;fix-pr)
📜 [Customise PR
templates](https://docs.snyk.io/scan-using-snyk/pull-requests/snyk-fix-pull-or-merge-requests/customize-pr-templates?utm_source=github&utm_content=fix-pr-template)
🛠 [Adjust project
settings](https://app.snyk.io/org/significant-gravitas/project/3d924968-0cf3-4767-9609-501fa4962856?utm_source&#x3D;github&amp;utm_medium&#x3D;referral&amp;page&#x3D;fix-pr/settings)
📚 [Read about Snyk's upgrade
logic](https://docs.snyk.io/scan-with-snyk/snyk-open-source/manage-vulnerabilities/upgrade-package-versions-to-fix-vulnerabilities?utm_source=github&utm_content=fix-pr-template)

---

**Learn how to fix vulnerabilities with free interactive lessons:**

🦉 [Learn about vulnerability in an interactive lesson of Snyk
Learn.](https://learn.snyk.io/?loc&#x3D;fix-pr)


---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
Co-authored-by: snyk-bot <snyk-bot@snyk.io>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-04-07 17:15:50 +00:00
Nicholas Tindle
3c14861d8e fix(backend): reduce log level for retrying connection (#9765)
Now that we are trying to use Sentry more, cleaning up some errors ->
warnings is a good idea

### Changes 🏗️
- reduces log level of retry to warning

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] check it comes through in sentry
2025-04-07 17:00:18 +00:00
Nicholas Tindle
074a00ce86 fix(backend): ProviderName behavior when loading secrets (#9764)
We got this error in
Sentry: [AUTOGPT-SERVER-33P](https://significant-gravitas.sentry.io/issues/6462614597/events/bb4871d796b04e759ade55197498cff9/)
```
Level: Error
'Secrets' object has no attribute 'ProviderName.GOOGLE_client_id'
```

### Changes 🏗️
- Follows the pattern used when accessing these in
`_get_provider_oauth_handler` in the router
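A hedged sketch of the failure and the fix; the `Secrets` attribute name is
inferred from the error message, not copied from the actual codebase:

```
from enum import Enum

class ProviderName(str, Enum):
    GOOGLE = "google"

class Secrets:
    google_client_id = "..."  # secrets are keyed by the provider's string value

provider = ProviderName.GOOGLE

# Broken: interpolating the enum member yields "ProviderName.GOOGLE", producing
# the attribute name seen in the error message.
# getattr(Secrets, f"{provider}_client_id")  -> AttributeError

# Fixed: use the enum's value, matching how `_get_provider_oauth_handler` builds the name.
client_id = getattr(Secrets, f"{provider.value}_client_id")
```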

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Test to make sure getting works
2025-04-07 16:59:48 +00:00
Krzysztof Czerwinski
0aeaaa7801 fix(frontend): Fill defaults from schema to hardcodedValues in CustomNode.tsx (#9772)
Fix https://github.com/Significant-Gravitas/AutoGPT/pull/9632

### Changes 🏗️

- Set default values from input schema to `hardcodedValues` in
`CustomNode.tsx`

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Default values are correctly applied to newly created node
2025-04-07 16:53:57 +00:00
Abhimanyu Yadav
2e5a770f35 fix(marketplace): Fix typography of heading in marketplace (#9737)
- fix #8956

### Changes:
- Updated line height from 28px to 36px for improved readability.
- Ensured that all section headings (“Featured agents”, “Top agents”,
“Featured creators”, and “Become a creator”) now have a uniform style.
- Verified that font-poppins is correctly set in the Tailwind config
file and layout.tsx.
- Color changed from #282828 to #262626

### Scope:
- This PR only includes typography-related adjustments.

![Screenshot 2025-04-02 at 5 29
05 PM](https://github.com/user-attachments/assets/e27b0d52-d8c7-4921-ae18-e3f75264e74d)
2025-04-07 10:28:28 +00:00
Abhimanyu Yadav
8b2265c996 feat(frontend): Add advanced block search with relevance ranking (#9711)
- fix #9425 

- Enhancing the functionality of searching blocks on the build page

Currently, it only performs exact matching on the block name and
description. I added a scoring mechanism for searching.

- The scoring algorithm works as follows:
     - Returns 1 if no query (all blocks match equally)
     - Normalized query for case-insensitive matching
- Returns 3 for exact substring matches in block name (highest priority)
- Returns 2 when all query words appear in the block name (regardless of
order)
- Returns 1.X for blocks with names similar to query using Jaro-Winkler
distance (X is similarity score)
- Returns 0.5 when all query words appear in the block description
(lowest priority)
     - Returns 0 for no match

Higher scores will appear first in search results.

> I have used an external library for Jaro-Winkler distance -
[link](https://www.npmjs.com/package/jaro-winkler)
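For reference, a Python transcription of the ranking described above (the
change itself is frontend TypeScript using the `jaro-winkler` npm package;
`jellyfish` stands in for that dependency here, and the similarity threshold
is an assumption):

```
import jellyfish

def block_search_score(query: str, name: str, description: str) -> float:
    if not query:
        return 1.0  # no query: all blocks match equally
    q, n, d = query.lower(), name.lower(), description.lower()
    if q in n:
        return 3.0  # exact substring match in the block name (highest priority)
    words = q.split()
    if all(w in n for w in words):
        return 2.0  # all query words appear in the name, in any order
    similarity = jellyfish.jaro_winkler_similarity(q, n)
    if similarity > 0.7:         # threshold chosen for illustration
        return 1.0 + similarity  # 1.X, where X is the similarity score
    if all(w in d for w in words):
        return 0.5  # all query words appear in the description (lowest priority)
    return 0.0      # no match
```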

Before
![Screenshot 2025-03-28 at 12 09
24 PM](https://github.com/user-attachments/assets/e135c007-cd9a-4692-88fc-3ad42b097c22)

After
![Screenshot 2025-03-28 at 12 09
17 PM](https://github.com/user-attachments/assets/28cd01c1-0d8e-44fa-8e04-ba9796118ba3)
2025-04-07 08:54:00 +00:00
Krzysztof Czerwinski
73d43312d1 feat(frontend): Use TypeBasedInput for onboarding agent input (#9762)
### Changes 🏗️

- Use the same code as in Library to display inputs for onboarding agent
- Fixes bug that crashes frontend when showing onboarding inputs
- Remove no longer needed `OnboardingAgentInput` component

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] All input types display correctly
  - [x] Onboarding agent runs

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-04-04 16:54:58 +00:00
Zamil Majdy
3771a0924c fix(backend): Update deprecated code caused by upgrades (#9758)
This series of upgrades:
https://github.com/significant-gravitas/autogpt/pull/9727
https://github.com/Significant-Gravitas/AutoGPT/pull/9728
https://github.com/Significant-Gravitas/AutoGPT/pull/9560

These caused some code in the repo to become deprecated; this PR addresses
those deprecations.

### Changes 🏗️

Fix the pydantic config, `Field` usage, use of the proper prisma
`CreateInput` type, and pytest loop-scope.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] CI, manual test on running some agents.

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-04-04 16:34:40 +00:00
Nicholas Tindle
4397746a87 feat(backend): baseline sentry logging (#9756)
Sentry just released logs, so let's enrich our details there too

### Changes 🏗️
- Adds sentry logging
- Adds dependencies tracking all of our sentry integrations
- Adds environment tracking to sentry

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Tested to make sure events show up in sentry with the correct
environment logging
2025-04-04 15:31:52 +00:00
Nicholas Tindle
2e871b0761 fix(frontend): bad handling on error prompts (#9754)

I oopsed and had an extra unneeded parameter (as @majdyz pointed out)
that wasn't respected everywhere it was used.

### Changes 🏗️

- Remove parameter
- update all the places AuthFeedback is called

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  - [ ] Test all pages with authfeedback on it

Co-authored-by: Bently <tomnoon9@gmail.com>
2025-04-03 22:16:39 +00:00
Reinier van der Leer
8ceb03ce1a feat(frontend/library): Add "Open in builder" run action (#9755)
- Resolves #9730

### Changes 🏗️

- feat: Add "Open in builder" run action

- refactor: Add `ActionButtonGroup` to replace boilerplate code in
`AgentRunDetailsView`, `AgentRunDraftView`, `AgentScheduleDetailsView`
  - feat: Add link support to `ActionButtonGroup`, `ButtonAction`

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Go to `/library/agents/[id]`
    - [x] "Run again" button works
    - [x] "Open in builder" button-link works
2025-04-03 21:34:33 +00:00
Bently
ce98925d58 update(docs): Remove outdated tutorial video from docs & readme (#9753)
This is to remove the outdated tutorial video from docs & readme and
add a direct link to the docs in the readme

### Changes 🏗️

Remove video link from readme.md
Remove video link from
https://github.com/Significant-Gravitas/AutoGPT/blob/dev/docs/content/platform/getting-started.md
Add direct link to docs in readme.md
2025-04-03 19:16:54 +00:00
Reinier van der Leer
1fc984f7fd feat(platform/library): Add real-time "Steps" count to agent run view (#9740)
- Resolves #9731

### Changes 🏗️

- feat: Add "Steps" showing `node_execution_count` to agent run view
  - Add `GraphExecutionMeta.stats.node_exec_count` attribute

- feat(backend/executor): Send graph execution update after *every* node
execution (instead of only I/O node executions)
  - Update graph execution stats after every node execution

- refactor: Move `GraphExecutionMeta` stats into sub-object
(`cost`, `duration`, `total_run_time` -> `stats.cost`, `stats.duration`,
`stats.node_exec_time`)

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - View an agent run with 1+ steps on `/library/agents/[id]`
    - [x] "Info" section layout doesn't break
    - [x] Number of steps is shown
  - Initiate a new agent run
    - [x] "Steps" increments in real time during execution
2025-04-03 18:58:21 +00:00
Madura Herath
d0d610720c docs(platform): Add WSL 2 recommendation for Docker on Windows (#9749)
In this pull request, the following changes have been made in response
to Issue #9190:

Documentation Changes:

- Added a note to the AutoGPT documentation regarding Docker
installation on Windows.

- Specifically, the note advises users to opt for WSL2 (Windows
Subsystem for Linux version 2) instead of Hyper-V during Docker setup to
prevent issues with Supabase, such as the "unhealthy" status for
supabase-db.

---------

Co-authored-by: Madura Herath <madurah@verdentra.com>
Co-authored-by: Bently <tomnoon9@gmail.com>
2025-04-03 18:35:07 +00:00
Reinier van der Leer
77a44b1213 fix(platform/library): Fix UX for webhook-triggered runs (#9680)
- Resolves #9679

### Changes 🏗️

Frontend:
- Fix crash on `payload` graph input
- Fix crash on object type agent I/O values
- Hide "+ New run" if `graph.webhook_id` is set

Backend:
- Add computed field `webhook_id` to `GraphModel`
  - Add computed property `webhook_input_node` to `GraphModel`
- Refactor:
  - Move `Node.webhook_id` -> `NodeModel.webhook_id`
  - Move `NodeModel.block` -> `Node.block` (computed property)
  - Replace `get_block(node.block_id)` with `node.block` where sensible

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Create and run a simple graph
  - [x] Create a graph with a webhook trigger and ensure it works
- [x] Check out the runs of a webhook-triggered graph and ensure the
page works
2025-04-03 17:31:02 +00:00
Nicholas Tindle
7179f9cea0 feat(backend, libs): Tell uvicorn to use our logger + always log to stdout+stderr (#9742)

Uvicorn's logs and our logs were ending up in different places; this PR
ensures uvicorn uses our logging config, not its own.

### Changes 🏗️
- Clears uvicorn's loggers for rest, ws
- Always log to stdout/stderr, and additionally log to GCP when appropriate
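A hedged sketch of the idea: drop uvicorn's own logging config and let its
records flow through the application's handlers; the app path and handler
setup are illustrative, not the repo's actual logging module:

```
import logging
import sys

import uvicorn

logging.basicConfig(level=logging.INFO, handlers=[logging.StreamHandler(sys.stdout)])

for name in ("uvicorn", "uvicorn.error", "uvicorn.access"):
    logging.getLogger(name).handlers.clear()  # drop uvicorn's default handlers
    logging.getLogger(name).propagate = True  # bubble records up to our root logger

# log_config=None prevents uvicorn from installing its own logging configuration.
uvicorn.run("app:api", host="0.0.0.0", port=8000, log_config=None)
```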

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Test all possible variants of the log cloud vs not and ensure that
uvicorn logs show up in the same place that rest of the system logs do
for all

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-04-03 16:42:20 +00:00
Reinier van der Leer
698af4e16a refactor(frontend): Clean up graph import & export logic (#9717)
- Resolves #9716
- Builds on the work done in #9627

### Changes 🏗️

- Remove `safeCopyGraph`; export directly from backend instead
- Explicitly name sanitization functions for *importing* graphs; move to
`@/lib/autogpt-server-api/utils`
- Amend `BackendAPI.getGraph(..)` to delete `.user_id` if `for_export ==
true`

Out-of-scope improvements:
- Add missing `user_id` to frontend `Graph` types
- Add `UserID` branded type for `User.id` + all `user_id` properties

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- Create and configure an agent with the Publish To Medium block, a
block that uses credentials, and a webhook trigger
  - Go to `/monitoring` and click the agent you just created
    - [x] -> "Export" button should work
      - [x] -> Exported file contains no credentials or secrets
      - [x] -> Exported file contains no user IDs
      - [x] -> Exported file contains no webhook IDs
2025-04-03 16:17:25 +00:00
Abhimanyu Yadav
7085d88b2c fix(marketplace): Add 58px bottom padding to creator page agents section on large screens (#9738)
- fix #9000 

Currently, we have a 32px bottom padding on the Creator’s page on larger
screens. I have added an extra 58px to make it 90px.
2025-04-03 16:03:50 +00:00
Abhimanyu Yadav
4a82edb0c3 fix(marketplace): Fix margin between divider and section on creators page (#9744)
- fix #8998 

Replace padding with margin top and update UI spacing from 32px to 25px
2025-04-03 16:02:03 +00:00
Abhimanyu Yadav
0fc423fd55 fix(marketplace): Fix margin between arrows and carousel (#9745)
- fix #8958 

Currently, the arrow button and carousel have a 16px margin, and the
button is placed 12px below the top of the container. This makes the
spacing appear to be 28px. Therefore, place the button and indicator at
the top of the container.
2025-04-03 16:01:27 +00:00
Abhimanyu Yadav
adb3263211 fix(marketplace): Reduce margin between search bar and chips to 20px (#9748)
- fix #8955 

Reduce the margin between the search bar and chips from 24px to 20px.
2025-04-03 16:00:28 +00:00
Abhimanyu Yadav
3b5feb2c25 fix(marketplace): Fix store card typography (#9739)
- fix #8965 

### Changes Made:
- **Title**: Increased line height from 20px to 32px.
- **Creator Name:**
   - Changed font to Geist Sans.
   - Updated font size to 20px and leading to 28px.
- **Description**: Applied Geist Sans font.
- **Stats Line:** Applied Geist Sans font.
   - Font Configuration Fix:

> Previously, we were using font-gist, which is not defined in the
tailwind config file, hence it was updated to use font-sans instead.

I have also fixed the height and width of the profile picture in the
creator card in this PR. The issue is linked below:
- #9314

![Screenshot 2025-04-02 at 6 32
10 PM](https://github.com/user-attachments/assets/1c2d9779-0a5e-4269-b3d2-37526a0949d3)

The margin is perfectly set to 24px; only the height and width of the
image need to be changed.

---------

Co-authored-by: Bently <tomnoon9@gmail.com>
2025-04-03 15:55:20 +00:00
Nicholas Tindle
6f3da1b7d0 refactor(backend): move the router files for postmark to not the v2 folder (#9597)
One of the pull request review notes from when these were first made is
that they don't belong in the v2 folder. This pr fixes where they are.

### Changes 🏗️
- Moves from v2 to routers for the postmark tooling

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Check that linting and tests pass

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-04-03 15:52:22 +00:00
Nicholas Tindle
6b7c8d5234 fix(backend): handle notification service errors more elegently (#9734)
We have logged 272k timeout errors in the past week from the event loop.
Don't raise those as errors.

Also, while diagnosing this, we found that some items were inserted into
batches with incomplete datasets, so handle that too.

### Changes 🏗️
- Handle timeout errors explicitly 
- Add better messaging for other error types
- Add filtering for queueing bad messages
- Add filtering for reading bad batches

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Pull dev db
  - [x] Test new code to check stability + error reduction

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-04-03 14:57:06 +00:00
Reinier van der Leer
8e912a016f fix(ci/backend): Use Poetry version from lockfile (#9729)
Currently, our CI always uses the latest version of Poetry. This causes
issues with the lockfile check whenever a new Poetry version is
released, especially if that new version has different lockfile
generation behavior.

This new mechanism determines the Poetry version to use as follows:
- Get Poetry version from backend/poetry.lock in the current branch
- Get Poetry version from backend/poetry.lock on the base branch
- Use the newest version out of the two found versions

This way, we don't automatically update to new Poetry versions, but it
is still possible to update to newer versions through pull requests.
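
As a rough sketch of the idea (not the actual workflow step; it assumes the lockfile carries Poetry's usual `@generated by Poetry X.Y.Z` header comment):

```python
import re
from pathlib import Path

_HEADER_RE = re.compile(r"@generated by Poetry (\d+\.\d+\.\d+)")


def poetry_version_from_lockfile(path: Path) -> tuple[int, ...] | None:
    """Read the Poetry version recorded in a poetry.lock header, if any."""
    for line in path.read_text().splitlines()[:5]:
        match = _HEADER_RE.search(line)
        if match:
            return tuple(int(part) for part in match.group(1).split("."))
    return None


def pick_poetry_version(branch_lock: Path, base_lock: Path) -> tuple[int, ...] | None:
    """Use the newest version found on the PR branch and the base branch."""
    versions = [
        v
        for v in (poetry_version_from_lockfile(branch_lock),
                  poetry_version_from_lockfile(base_lock))
        if v is not None
    ]
    return max(versions) if versions else None
```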
2025-04-03 12:43:10 +00:00
Reinier van der Leer
824da5e58c rename autogpt_platform license file 2025-04-03 14:10:44 +02:00
Zamil Majdy
378f49a2d9 fix(frontend): Fix toggle input label & time picker margin 2025-04-03 15:46:51 +04:00
Zamil Majdy
ad303d69d1 fix(frontend): Add border on opened select input-button 2025-04-03 11:44:08 +04:00
Zamil Majdy
200e5814b3 fix(backend): Cleanup service on service closure (#9735)
The cleanup command was only called on SIGTERM, making it possible for
the service to close without being cleaned up and risking connections not
being proactively closed when the service is unused.

### Changes 🏗️

Call the cleanup command in the service's `finally` block (see the sketch below).
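
A minimal sketch of the pattern (with `run_service` and `cleanup` as placeholder names):

```python
import logging

logger = logging.getLogger("service")


async def run_with_cleanup(run_service, cleanup) -> None:
    """Run the service and always clean up, not only on SIGTERM."""
    try:
        await run_service()
    finally:
        # Reached on normal exit, exceptions, and cancellation alike, so
        # connections are proactively closed whenever the service stops.
        logger.info("Service stopped, running cleanup")
        await cleanup()
```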

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Run the service, stop it, see the log is printed (locally)
2025-04-02 04:21:40 +00:00
Nicholas Tindle
d879df062e feat(blocks): add a generic webhook block (#9584)
<!-- Clearly explain the need for these changes: -->
I want to be able to insert data into the graph via a webhook from
various services without making a provider-specific webhook for things
like Discord, Slack, uptime bots, etc.

### Changes 🏗️
- Adds a generic webhook block that others can use (see the sketch below)
<!-- Concisely describe all of the changes made in this pull request:
-->
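
Not the actual block implementation, but a hedged sketch of the idea using FastAPI: one generic endpoint accepts whatever JSON a service sends and forwards it into the graph (`trigger_graph` is a hypothetical stand-in for the real execution hook):

```python
from typing import Any

from fastapi import APIRouter, Request

router = APIRouter()


async def trigger_graph(graph_id: str, data: dict[str, Any]) -> None:
    """Hypothetical hook into graph execution; stands in for the real one."""
    raise NotImplementedError


@router.post("/webhooks/generic/{graph_id}")
async def generic_webhook(graph_id: str, request: Request) -> dict[str, str]:
    # Accept whatever JSON the external service (Discord, Slack, an uptime
    # bot, ...) sends, with no provider-specific parsing.
    payload = await request.json()
    await trigger_graph(graph_id, {"payload": payload})
    return {"status": "accepted"}
```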

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Test the endpoint that is generated with a graph, making sure to
pass data and consts to it
2025-04-02 03:22:59 +00:00
dependabot[bot]
6e595e6e28 chore(frontend/deps): bump @sentry/nextjs from 8.54.0 to 9.6.0 in /autogpt_platform/frontend (#9646)
Bumps [@sentry/nextjs](https://github.com/getsentry/sentry-javascript)
from 8.54.0 to 9.6.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-javascript/releases"><code>@​sentry/nextjs</code>'s
releases</a>.</em></p>
<blockquote>
<h2>9.6.0</h2>
<h3>Important Changes</h3>
<ul>
<li>
<p><strong>feat(tanstackstart): Add
<code>@sentry/tanstackstart-react</code> package and make
<code>@sentry/tanstackstart</code> package a utility package (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15629">#15629</a>)</strong></p>
<p>Since TanStack Start is supposed to be a generic framework that
supports libraries like React and Solid, the
<code>@sentry/tanstackstart</code> SDK package was renamed to
<code>@sentry/tanstackstart-react</code> to reflect that the SDK is
specifically intended to be used for React TanStack Start applications.
Note that the TanStack Start SDK is still in alpha status and may be
subject to breaking changes in non-major package updates.</p>
</li>
</ul>
<h3>Other Changes</h3>
<ul>
<li>feat(astro): Accept all vite-plugin options (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15638">#15638</a>)</li>
<li>feat(deps): bump <code>@​sentry/webpack-plugin</code> from 3.2.1 to
3.2.2 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15627">#15627</a>)</li>
<li>feat(tanstackstart): Refine initial API (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15574">#15574</a>)</li>
<li>fix(core): Ensure <code>fill</code> only patches functions (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15632">#15632</a>)</li>
<li>fix(nextjs): Consider <code>pageExtensions</code> when looking for
instrumentation file (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15701">#15701</a>)</li>
<li>fix(remix): Null-check <code>options</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15610">#15610</a>)</li>
<li>fix(sveltekit): Correctly parse angle bracket type assertions for
auto instrumentation (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15578">#15578</a>)</li>
<li>fix(sveltekit): Guard process variable (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15605">#15605</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/angelikatyborska"><code>@​angelikatyborska</code></a>
and <a
href="https://github.com/nwalters512"><code>@​nwalters512</code></a>.
Thank you for your contributions!</p>
<h2>Bundle size 📦</h2>
<table>
<thead>
<tr>
<th>Path</th>
<th>Size</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>@​sentry/browser</code></td>
<td>23.15 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> - with treeshaking flags</td>
<td>22.94 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing)</td>
<td>36.21 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay)</td>
<td>73.39 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay) - with
treeshaking flags</td>
<td>66.8 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay with
Canvas)</td>
<td>78.01 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay, Feedback)</td>
<td>90.57 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Feedback)</td>
<td>40.3 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. sendFeedback)</td>
<td>27.79 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. FeedbackAsync)</td>
<td>32.58 KB</td>
</tr>
<tr>
<td><code>@​sentry/react</code></td>
<td>24.97 KB</td>
</tr>
<tr>
<td><code>@​sentry/react</code> (incl. Tracing)</td>
<td>38.1 KB</td>
</tr>
<tr>
<td><code>@​sentry/vue</code></td>
<td>27.4 KB</td>
</tr>
<tr>
<td><code>@​sentry/vue</code> (incl. Tracing)</td>
<td>37.9 KB</td>
</tr>
<tr>
<td><code>@​sentry/svelte</code></td>
<td>23.18 KB</td>
</tr>
<tr>
<td>CDN Bundle</td>
<td>24.36 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing)</td>
<td>36.26 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay)</td>
<td>71.27 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay, Feedback)</td>
<td>76.45 KB</td>
</tr>
<tr>
<td>CDN Bundle - uncompressed</td>
<td>71.19 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing) - uncompressed</td>
<td>107.57 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay) - uncompressed</td>
<td>218.84 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay, Feedback) - uncompressed</td>
<td>231.4 KB</td>
</tr>
<tr>
<td><code>@​sentry/nextjs</code> (client)</td>
<td>39.27 KB</td>
</tr>
<tr>
<td><code>@​sentry/sveltekit</code> (client)</td>
<td>36.63 KB</td>
</tr>
</tbody>
</table>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-javascript/blob/9.6.0/CHANGELOG.md"><code>@​sentry/nextjs</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>9.6.0</h2>
<h3>Important Changes</h3>
<ul>
<li>
<p><strong>feat(tanstackstart): Add
<code>@sentry/tanstackstart-react</code> package and make
<code>@sentry/tanstackstart</code> package a utility package (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15629">#15629</a>)</strong></p>
<p>Since TanStack Start is supposed to be a generic framework that
supports libraries like React and Solid, the
<code>@sentry/tanstackstart</code> SDK package was renamed to
<code>@sentry/tanstackstart-react</code> to reflect that the SDK is
specifically intended to be used for React TanStack Start applications.
Note that the TanStack Start SDK is still in alpha status and may be
subject to breaking changes in non-major package updates.</p>
</li>
</ul>
<h3>Other Changes</h3>
<ul>
<li>feat(astro): Accept all vite-plugin options (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15638">#15638</a>)</li>
<li>feat(deps): bump <code>@​sentry/webpack-plugin</code> from 3.2.1 to
3.2.2 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15627">#15627</a>)</li>
<li>feat(tanstackstart): Refine initial API (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15574">#15574</a>)</li>
<li>fix(core): Ensure <code>fill</code> only patches functions (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15632">#15632</a>)</li>
<li>fix(nextjs): Consider <code>pageExtensions</code> when looking for
instrumentation file (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15701">#15701</a>)</li>
<li>fix(remix): Null-check <code>options</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15610">#15610</a>)</li>
<li>fix(sveltekit): Correctly parse angle bracket type assertions for
auto instrumentation (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15578">#15578</a>)</li>
<li>fix(sveltekit): Guard process variable (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15605">#15605</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/angelikatyborska"><code>@​angelikatyborska</code></a>
and <a
href="https://github.com/nwalters512"><code>@​nwalters512</code></a>.
Thank you for your contributions!</p>
<h2>9.5.0</h2>
<h3>Important Changes</h3>
<p>We found some issues with the new feedback screenshot annotation
where screenshots are not being generated properly. Due to this issue,
we are reverting the feature.</p>
<ul>
<li>Revert &quot;feat(feedback) Allowing annotation via highlighting
&amp; masking (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15484">#15484</a>)&quot;
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15609">#15609</a>)</li>
</ul>
<h3>Other Changes</h3>
<ul>
<li>Add cloudflare adapter detection and path generation (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15603">#15603</a>)</li>
<li>deps(nextjs): Bump rollup to <code>4.34.9</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15589">#15589</a>)</li>
<li>feat(bun): Automatically add performance integrations (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15586">#15586</a>)</li>
<li>feat(replay): Bump rrweb to 2.34.0 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15580">#15580</a>)</li>
<li>fix(browser): Call original function on early return from patched
history API (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15576">#15576</a>)</li>
<li>fix(nestjs): Copy metadata in custom decorators (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15598">#15598</a>)</li>
<li>fix(react-router): Fix config type import (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15583">#15583</a>)</li>
<li>fix(remix): Use correct types export for
<code>@sentry/remix/cloudflare</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15599">#15599</a>)</li>
<li>fix(vue): Attach Pinia state only once per event (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15588">#15588</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/msurdi-a8c"><code>@​msurdi-a8c</code></a>, <a
href="https://github.com/namoscato"><code>@​namoscato</code></a>, and <a
href="https://github.com/rileyg98"><code>@​rileyg98</code></a>. Thank
you for your contributions!</p>
<h2>9.4.0</h2>
<ul>
<li>feat(core): Add types for logs protocol and envelope (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15530">#15530</a>)</li>
<li>feat(deps): Bump <code>@sentry/cli</code> from 2.41.1 to 2.42.2 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15510">#15510</a>)</li>
<li>feat(deps): Bump <code>@sentry/webpack-plugin</code> from 3.1.2 to
3.2.1 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15512">#15512</a>)</li>
<li>feat(feedback) Allowing annotation via highlighting &amp; masking
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15484">#15484</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="6ec4602781"><code>6ec4602</code></a>
release: 9.6.0</li>
<li><a
href="5ba80bc5fd"><code>5ba80bc</code></a>
Merge pull request <a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15703">#15703</a>
from getsentry/prepare-release/9.6.0</li>
<li><a
href="8dc6e50597"><code>8dc6e50</code></a>
Remove unnecessary changelog item</li>
<li><a
href="7889768035"><code>7889768</code></a>
meta(changelog): Update changelog for 9.6.0</li>
<li><a
href="2b5526565c"><code>2b55265</code></a>
fix(nextjs): Consider <code>pageExtensions</code> when looking for
instrumentation file ...</li>
<li><a
href="7d88266a6e"><code>7d88266</code></a>
chore(ci): Remove <code>type</code> from canary failure template (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15698">#15698</a>)</li>
<li><a
href="48ed271b6d"><code>48ed271</code></a>
chore(deps): bump esbuild from 0.20.0 to 0.25.0 in
/dev-packages/e2e-tests/te...</li>
<li><a
href="e15988c2ad"><code>e15988c</code></a>
chore: Add external contributor to CHANGELOG.md (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15642">#15642</a>)</li>
<li><a
href="5c4cab7b34"><code>5c4cab7</code></a>
chore(deps): Deduplicate <code>@babel</code> dependencies (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15639">#15639</a>)</li>
<li><a
href="ce1ced8172"><code>ce1ced8</code></a>
chore: Add external contributor to CHANGELOG.md (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15640">#15640</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/getsentry/sentry-javascript/compare/8.54.0...9.6.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@sentry/nextjs&package-manager=npm_and_yarn&previous-version=8.54.0&new-version=9.6.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-02 02:51:36 +00:00
dependabot[bot]
05af4a24ce chore(libs/deps): bump the production-dependencies group across 1 directory with 4 updates (#9727)
Bumps the production-dependencies group with 4 updates in the
/autogpt_platform/autogpt_libs directory:
[pydantic](https://github.com/pydantic/pydantic),
[pydantic-settings](https://github.com/pydantic/pydantic-settings),
[pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) and
[supabase](https://github.com/supabase/supabase-py).

Updates `pydantic` from 2.10.6 to 2.11.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/releases">pydantic's
releases</a>.</em></p>
<blockquote>
<h2>v2.11.1 2025-03-28</h2>
<!-- raw HTML omitted -->
<h2>What's Changed</h2>
<h3>Fixes</h3>
<ul>
<li>Do not override <code>'definitions-ref'</code> schemas containing
serialization schemas or metadata by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11644">pydantic/pydantic#11644</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic/compare/v2.11.0...v2.11.1">https://github.com/pydantic/pydantic/compare/v2.11.0...v2.11.1</a></p>
<h2>v2.11.0 2025-03-27</h2>
<!-- raw HTML omitted -->
<h2>What's Changed</h2>
<h3>Packaging</h3>
<ul>
<li>Re-enable memray related tests on Python 3.12+ by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11191">pydantic/pydantic#11191</a></li>
<li>Bump astral-sh/setup-uv from 4 to 5 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11205">pydantic/pydantic#11205</a></li>
<li>Add a <code>check_pydantic_core_version()</code> function by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11324">pydantic/pydantic#11324</a></li>
<li>Remove <code>greenlet</code> development dependency by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11351">pydantic/pydantic#11351</a></li>
<li>Bump ruff from 0.9.2 to 0.9.5 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11407">pydantic/pydantic#11407</a></li>
<li>Improve release automation process by <a
href="https://github.com/austinyu"><code>@​austinyu</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11427">pydantic/pydantic#11427</a></li>
<li>Bump dawidd6/action-download-artifact from 8 to 9 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11513">pydantic/pydantic#11513</a></li>
<li>Bump <code>pydantic-core</code> to v2.32.0 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11567">pydantic/pydantic#11567</a></li>
</ul>
<h3>New Features</h3>
<ul>
<li>Support unsubstituted type variables with both a default and a bound
or constraints by <a
href="https://github.com/FyZzyss"><code>@​FyZzyss</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10789">pydantic/pydantic#10789</a></li>
<li>Add a <code>default_factory_takes_validated_data</code> property to
<code>FieldInfo</code> by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11034">pydantic/pydantic#11034</a></li>
<li>Raise a better error when a generic alias is used inside
<code>type[]</code> by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11088">pydantic/pydantic#11088</a></li>
<li>Properly support PEP 695 generics syntax by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11189">pydantic/pydantic#11189</a></li>
<li>Properly support type variable defaults by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11332">pydantic/pydantic#11332</a></li>
<li>Add support for validating v6, v7, v8 UUIDs by <a
href="https://github.com/astei"><code>@​astei</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11436">pydantic/pydantic#11436</a></li>
<li>Improve alias configuration APIs by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11468">pydantic/pydantic#11468</a></li>
<li>Add experimental support for free threading by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11516">pydantic/pydantic#11516</a></li>
<li>Add <code>encoded_string()</code> method to the URL types by <a
href="https://github.com/YassinNouh21"><code>@​YassinNouh21</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11580">pydantic/pydantic#11580</a></li>
<li>Add support for <code>defer_build</code> with
<code>@validate_call</code> decorator by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11584">pydantic/pydantic#11584</a></li>
<li>Allow <code>@with_config</code> decorator to be used with keyword
arguments by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11608">pydantic/pydantic#11608</a></li>
<li>Simplify customization of default value inclusion in JSON Schema
generation by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11634">pydantic/pydantic#11634</a></li>
<li>Add <code>generate_arguments_schema()</code> function by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11572">pydantic/pydantic#11572</a></li>
</ul>
<h3>Changes</h3>
<ul>
<li>Rework <code>create_model</code> field definitions format by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11032">pydantic/pydantic#11032</a></li>
<li>Raise a deprecation warning when a field is annotated as final with
a default value by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11168">pydantic/pydantic#11168</a></li>
<li>Deprecate accessing <code>model_fields</code> and
<code>model_computed_fields</code> on instances by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11169">pydantic/pydantic#11169</a></li>
<li>Move core schema generation logic for path types inside the
<code>GenerateSchema</code> class by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10846">pydantic/pydantic#10846</a></li>
<li>Move <code>Mapping</code> schema gen to <code>GenerateSchema</code>
to complete removal of <code>prepare_annotations_for_known_type</code>
workaround by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11247">pydantic/pydantic#11247</a></li>
<li>Remove Python 3.8 Support by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11258">pydantic/pydantic#11258</a></li>
<li>Optimize calls to <code>get_type_ref</code> by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10863">pydantic/pydantic#10863</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/blob/main/HISTORY.md">pydantic's
changelog</a>.</em></p>
<blockquote>
<h2>v2.11.1 (2025-03-28)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.11.1">GitHub
release</a></p>
<h3>What's Changed</h3>
<h4>Fixes</h4>
<ul>
<li>Do not override <code>'definitions-ref'</code> schemas containing
serialization schemas or metadata by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11644">#11644</a></li>
</ul>
<h2>v2.11.0 (2025-03-27)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.11.0">GitHub
release</a></p>
<h3>What's Changed</h3>
<p>Pydantic v2.11 is a version strongly focused on build time
performance of Pydantic models (and core schema generation in general).
See the <a
href="https://pydantic.dev/articles/pydantic-v2-11-release">blog
post</a> for more details.</p>
<h4>Packaging</h4>
<ul>
<li>Bump <code>pydantic-core</code> to v2.33.0 by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11631">#11631</a></li>
</ul>
<h4>New Features</h4>
<ul>
<li>Add <code>encoded_string()</code> method to the URL types by <a
href="https://github.com/YassinNouh21"><code>@​YassinNouh21</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11580">#11580</a></li>
<li>Add support for <code>defer_build</code> with
<code>@validate_call</code> decorator by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11584">#11584</a></li>
<li>Allow <code>@with_config</code> decorator to be used with keyword
arguments by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11608">#11608</a></li>
<li>Simplify customization of default value inclusion in JSON Schema
generation by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11634">#11634</a></li>
<li>Add <code>generate_arguments_schema()</code> function by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11572">#11572</a></li>
</ul>
<h4>Fixes</h4>
<ul>
<li>Allow generic typed dictionaries to be used for unpacked variadic
keyword parameters by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11571">#11571</a></li>
<li>Fix runtime error when computing model string representation
involving cached properties and self-referenced models by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11579">#11579</a></li>
<li>Preserve other steps when using the ellipsis in the pipeline API by
<a href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11626">#11626</a></li>
<li>Fix deferred discriminator application logic by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11591">#11591</a></li>
</ul>
<h3>New Contributors</h3>
<ul>
<li><a href="https://github.com/cmenon12"><code>@​cmenon12</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11562">#11562</a></li>
<li><a href="https://github.com/Jeukoh"><code>@​Jeukoh</code></a> made
their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11611">#11611</a></li>
</ul>
<h2>v2.11.0b2 (2025-03-17)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.11.0b2">GitHub
release</a></p>
<h3>What's Changed</h3>
<h4>Packaging</h4>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="6c38dc93f4"><code>6c38dc9</code></a>
Prepare release v2.11.1 (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11648">#11648</a>)</li>
<li><a
href="1dcddac2c5"><code>1dcddac</code></a>
Do not override <code>'definitions-ref'</code> schemas containing
serialization schemas ...</li>
<li><a
href="024fdae2b5"><code>024fdae</code></a>
Fix small typos (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11643">#11643</a>)</li>
<li><a
href="58e61fa3c6"><code>58e61fa</code></a>
Prepare release v2.11.0 (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11635">#11635</a>)</li>
<li><a
href="e2c2e811e3"><code>e2c2e81</code></a>
Add <code>generate_arguments_schema()</code> experimental function (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11572">#11572</a>)</li>
<li><a
href="72bea3f22f"><code>72bea3f</code></a>
Add <code>mkdocs-llmstxt</code> documentation plugin (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11632">#11632</a>)</li>
<li><a
href="fcba83291a"><code>fcba832</code></a>
Simplify customization of default value inclusion in JSON Schema
generation (...</li>
<li><a
href="6f11161524"><code>6f11161</code></a>
Add support for extra keys validation for models (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11578">#11578</a>)</li>
<li><a
href="7917b11bd2"><code>7917b11</code></a>
Disable third-party workflow issue report (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11629">#11629</a>)</li>
<li><a
href="f5226d2946"><code>f5226d2</code></a>
Bump <code>pydantic-core</code> to v2.33.0 (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11631">#11631</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pydantic/pydantic/compare/v2.10.6...v2.11.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `pydantic-settings` from 2.7.1 to 2.8.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic-settings/releases">pydantic-settings's
releases</a>.</em></p>
<blockquote>
<h2>v2.8.1</h2>
<h2>What's Changed</h2>
<ul>
<li>Fix for init source kwarg alias resolution. by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/550">pydantic/pydantic-settings#550</a></li>
<li>Revert usage of positional only argument in
<code>BaseSettings.__init__</code> by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/557">pydantic/pydantic-settings#557</a></li>
<li>Revert use of <code>object</code> instead of <code>Any</code> by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/559">pydantic/pydantic-settings#559</a></li>
<li>Prepare release 2.8.1 by <a
href="https://github.com/hramezani"><code>@​hramezani</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/558">pydantic/pydantic-settings#558</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic-settings/compare/v2.8.0...v2.8.1">https://github.com/pydantic/pydantic-settings/compare/v2.8.0...v2.8.1</a></p>
<h2>v2.8.0</h2>
<h2>What's Changed</h2>
<ul>
<li>CLI support for optional and variadic positional args by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/519">pydantic/pydantic-settings#519</a></li>
<li>Improve env_prefix config doc by <a
href="https://github.com/hramezani"><code>@​hramezani</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/523">pydantic/pydantic-settings#523</a></li>
<li>Add env_nested_max_split setting by <a
href="https://github.com/gsakkis"><code>@​gsakkis</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/534">pydantic/pydantic-settings#534</a></li>
<li>Avoid using <code>Any</code> in <code>BaseSettings</code> signature
to avoid mypy errors by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/529">pydantic/pydantic-settings#529</a></li>
<li>Asynchronous CLI methods in CliApp by <a
href="https://github.com/KanchiShimono"><code>@​KanchiShimono</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/533">pydantic/pydantic-settings#533</a></li>
<li>Don't explode env vars if env_nested_delimiter is empty by <a
href="https://github.com/gsakkis"><code>@​gsakkis</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/540">pydantic/pydantic-settings#540</a></li>
<li>Prepare release 2.8.0 by <a
href="https://github.com/hramezani"><code>@​hramezani</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/541">pydantic/pydantic-settings#541</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/gsakkis"><code>@​gsakkis</code></a> made
their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/534">pydantic/pydantic-settings#534</a></li>
<li><a
href="https://github.com/KanchiShimono"><code>@​KanchiShimono</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/533">pydantic/pydantic-settings#533</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic-settings/compare/v2.7.1...v2.8.0">https://github.com/pydantic/pydantic-settings/compare/v2.7.1...v2.8.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="5f33b62056"><code>5f33b62</code></a>
Prepare release 2.8.1 (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/558">#558</a>)</li>
<li><a
href="fa64a4eebb"><code>fa64a4e</code></a>
Revert use of <code>object</code> instead of <code>Any</code> (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/559">#559</a>)</li>
<li><a
href="21e6b23cb7"><code>21e6b23</code></a>
Revert usage of positional only argument in
<code>BaseSettings.__init__</code> (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/557">#557</a>)</li>
<li><a
href="1a4f3f43f9"><code>1a4f3f4</code></a>
Fix for init source kwarg alias resolution. (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/550">#550</a>)</li>
<li><a
href="f76c7fef4e"><code>f76c7fe</code></a>
Prepare release 2.8.0 (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/541">#541</a>)</li>
<li><a
href="4b6fd3d096"><code>4b6fd3d</code></a>
Don't explode env vars if env_nested_delimiter is empty (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/540">#540</a>)</li>
<li><a
href="7835118fbd"><code>7835118</code></a>
Asynchronous CLI methods in CliApp (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/533">#533</a>)</li>
<li><a
href="537f7514aa"><code>537f751</code></a>
Avoid using <code>Any</code> in <code>BaseSettings</code> signature to
avoid mypy errors (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/529">#529</a>)</li>
<li><a
href="ccf99b2d78"><code>ccf99b2</code></a>
Add env_nested_max_split setting (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/534">#534</a>)</li>
<li><a
href="65929cd1f5"><code>65929cd</code></a>
Improve env_prefix config doc (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/523">#523</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pydantic/pydantic-settings/compare/v2.7.1...v2.8.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `pytest-asyncio` from 0.25.3 to 0.26.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pytest-dev/pytest-asyncio/releases">pytest-asyncio's
releases</a>.</em></p>
<blockquote>
<h2>pytest-asyncio 0.26.0</h2>
<ul>
<li>Adds configuration option that sets default event loop scope for all
tests <a
href="https://redirect.github.com/pytest-dev/pytest-asyncio/issues/793">#793</a></li>
<li>Improved type annotations for <code>pytest_asyncio.fixture</code> <a
href="https://redirect.github.com/pytest-dev/pytest-asyncio/pull/1045">#1045</a></li>
<li>Added <code>typing-extensions</code> as additional dependency for
Python <code>&lt;3.10</code> <a
href="https://redirect.github.com/pytest-dev/pytest-asyncio/pull/1045">#1045</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4f8ce457b0"><code>4f8ce45</code></a>
docs: Prepare release of v0.26.0.</li>
<li><a
href="498e8a7786"><code>498e8a7</code></a>
Build(deps): Bump attrs from 25.1.0 to 25.3.0 in
/dependencies/default</li>
<li><a
href="01c22ffb63"><code>01c22ff</code></a>
build: Update project metadata to use SPDX license identifier</li>
<li><a
href="78191c98ed"><code>78191c9</code></a>
[pre-commit.ci] pre-commit autoupdate</li>
<li><a
href="9a455516ea"><code>9a45551</code></a>
Build(deps): Bump hypothesis in /dependencies/default</li>
<li><a
href="6680409439"><code>6680409</code></a>
Build(deps): Bump coverage from 7.7.0 to 7.7.1 in
/dependencies/default</li>
<li><a
href="aa82c574fe"><code>aa82c57</code></a>
Build(deps): Bump iniconfig from 2.0.0 to 2.1.0 in
/dependencies/default</li>
<li><a
href="cca587ea4f"><code>cca587e</code></a>
[pre-commit.ci] pre-commit autoupdate</li>
<li><a
href="5d90b29621"><code>5d90b29</code></a>
Build(deps): Bump hypothesis in /dependencies/default</li>
<li><a
href="c2622628b6"><code>c262262</code></a>
Build(deps): Bump coverage from 7.6.12 to 7.7.0 in
/dependencies/default</li>
<li>Additional commits viewable in <a
href="https://github.com/pytest-dev/pytest-asyncio/compare/v0.25.3...v0.26.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `supabase` from 2.13.0 to 2.15.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-py/releases">supabase's
releases</a>.</em></p>
<blockquote>
<h2>v2.15.0</h2>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.14.0...v2.15.0">2.15.0</a>
(2025-03-26)</h2>
<h3>Features</h3>
<ul>
<li><strong>postgrest:</strong> bump postgrest from 0.19.3 to 1.0.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1074">#1074</a>)
(<a
href="5e59df6bfa">5e59df6</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>auth:</strong> bump gotrue from 2.11.4 to 2.12.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1087">#1087</a>)
(<a
href="da3ed9cdd7">da3ed9c</a>)</li>
<li><strong>functions:</strong> bump supafunc from 0.9.3 to 0.9.4 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1088">#1088</a>)
(<a
href="0340c8eeb0">0340c8e</a>)</li>
<li><strong>postgrest:</strong> bump postgrest from 1.0.0 to 1.0.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1083">#1083</a>)
(<a
href="44d2ca56eb">44d2ca5</a>)</li>
<li><strong>realtime:</strong> bump realtime from 2.4.0 to 2.4.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1066">#1066</a>)
(<a
href="1f92945a13">1f92945</a>)</li>
<li><strong>realtime:</strong> bump realtime from 2.4.1 to 2.4.2 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1089">#1089</a>)
(<a
href="7816d7f40e">7816d7f</a>)</li>
<li>schema method should use postgres method directly (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1082">#1082</a>)
(<a
href="b9923249d9">b992324</a>)</li>
</ul>
<h2>v2.14.0</h2>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.13.0...v2.14.0">2.14.0</a>
(2025-03-20)</h2>
<h3>Features</h3>
<ul>
<li><strong>realtime:</strong> bump realtime from 2.3.0 to 2.4.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1059">#1059</a>)
(<a
href="9cdf7fa462">9cdf7fa</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>auth:</strong> bump gotrue from 2.11.3 to 2.11.4 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1060">#1060</a>)
(<a
href="a8600fd9e3">a8600fd</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-py/blob/main/CHANGELOG.md">supabase's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.14.0...v2.15.0">2.15.0</a>
(2025-03-26)</h2>
<h3>Features</h3>
<ul>
<li><strong>postgrest:</strong> bump postgrest from 0.19.3 to 1.0.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1074">#1074</a>)
(<a
href="5e59df6bfa">5e59df6</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>auth:</strong> bump gotrue from 2.11.4 to 2.12.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1087">#1087</a>)
(<a
href="da3ed9cdd7">da3ed9c</a>)</li>
<li><strong>functions:</strong> bump supafunc from 0.9.3 to 0.9.4 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1088">#1088</a>)
(<a
href="0340c8eeb0">0340c8e</a>)</li>
<li><strong>postgrest:</strong> bump postgrest from 1.0.0 to 1.0.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1083">#1083</a>)
(<a
href="44d2ca56eb">44d2ca5</a>)</li>
<li><strong>realtime:</strong> bump realtime from 2.4.0 to 2.4.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1066">#1066</a>)
(<a
href="1f92945a13">1f92945</a>)</li>
<li><strong>realtime:</strong> bump realtime from 2.4.1 to 2.4.2 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1089">#1089</a>)
(<a
href="7816d7f40e">7816d7f</a>)</li>
<li>schema method should use postgres method directly (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1082">#1082</a>)
(<a
href="b9923249d9">b992324</a>)</li>
</ul>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.13.0...v2.14.0">2.14.0</a>
(2025-03-20)</h2>
<h3>Features</h3>
<ul>
<li><strong>realtime:</strong> bump realtime from 2.3.0 to 2.4.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1059">#1059</a>)
(<a
href="9cdf7fa462">9cdf7fa</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>auth:</strong> bump gotrue from 2.11.3 to 2.11.4 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1060">#1060</a>)
(<a
href="a8600fd9e3">a8600fd</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="2fa8891e78"><code>2fa8891</code></a>
chore(main): release 2.15.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1080">#1080</a>)</li>
<li><a
href="7816d7f40e"><code>7816d7f</code></a>
fix(realtime): bump realtime from 2.4.1 to 2.4.2 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1089">#1089</a>)</li>
<li><a
href="0340c8eeb0"><code>0340c8e</code></a>
fix(functions): bump supafunc from 0.9.3 to 0.9.4 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1088">#1088</a>)</li>
<li><a
href="da3ed9cdd7"><code>da3ed9c</code></a>
fix(auth): bump gotrue from 2.11.4 to 2.12.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1087">#1087</a>)</li>
<li><a
href="44d2ca56eb"><code>44d2ca5</code></a>
fix(postgrest): bump postgrest from 1.0.0 to 1.0.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1083">#1083</a>)</li>
<li><a
href="b9923249d9"><code>b992324</code></a>
fix: schema method should use postgres method directly (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1082">#1082</a>)</li>
<li><a
href="5e59df6bfa"><code>5e59df6</code></a>
feat(postgrest): bump postgrest from 0.19.3 to 1.0.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1074">#1074</a>)</li>
<li><a
href="36858ee02d"><code>36858ee</code></a>
chore(deps-dev): bump pytest from 8.3.4 to 8.3.5 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1070">#1070</a>)</li>
<li><a
href="9589770fa3"><code>9589770</code></a>
chore(deps-dev): bump commitizen from 4.2.2 to 4.4.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1069">#1069</a>)</li>
<li><a
href="58246bc5c8"><code>58246bc</code></a>
chore(deps-dev): bump isort from 6.0.0 to 6.0.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1065">#1065</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/supabase/supabase-py/compare/v2.13.0...v2.15.0">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-04-02 02:08:49 +00:00
dependabot[bot]
c8836953bf chore(backend/deps-dev): bump the development-dependencies group across 1 directory with 5 updates (#9560)
Bumps the development-dependencies group with 5 updates in the
/autogpt_platform/backend directory:

| Package | From | To |
| --- | --- | --- |
| [aiohappyeyeballs](https://github.com/aio-libs/aiohappyeyeballs) |
`2.4.4` | `2.4.6` |
| [httpx](https://github.com/encode/httpx) | `0.27.2` | `0.28.1` |
| [poethepoet](https://github.com/nat-n/poethepoet) | `0.32.1` |
`0.33.0` |
| [pyright](https://github.com/RobertCraigie/pyright-python) |
`1.1.392.post0` | `1.1.396` |
| [ruff](https://github.com/astral-sh/ruff) | `0.9.3` | `0.9.9` |


Updates `aiohappyeyeballs` from 2.4.4 to 2.4.6
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/aio-libs/aiohappyeyeballs/releases">aiohappyeyeballs's
releases</a>.</em></p>
<blockquote>
<h2>v2.4.6 (2025-02-07)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Ensure all timers are cancelled when after staggered race finishes
(<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/pull/136">#136</a>,
<a
href="f75891d897"><code>f75891d</code></a>)</li>
</ul>
<hr />
<p><strong>Detailed Changes</strong>: <a
href="https://github.com/aio-libs/aiohappyeyeballs/compare/v2.4.5...v2.4.6">v2.4.5...v2.4.6</a></p>
<h2>v2.4.5 (2025-02-07)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Keep classifiers in project to avoid automatic enrichment (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/pull/134">#134</a>,
<a
href="99edb20e9d"><code>99edb20</code></a>)</li>
</ul>
<p>Co-authored-by: J. Nick Koston <a
href="mailto:nick@koston.org">nick@koston.org</a></p>
<ul>
<li>Move classifiers to prevent recalculation by Poetry (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/pull/131">#131</a>,
<a
href="66e1c90ae8"><code>66e1c90</code></a>)</li>
</ul>
<p>Co-authored-by: Martin Styk <a
href="mailto:martin.styk@oracle.com">martin.styk@oracle.com</a></p>
<p>Co-authored-by: J. Nick Koston <a
href="mailto:nick@koston.org">nick@koston.org</a></p>
<hr />
<p><strong>Detailed Changes</strong>: <a
href="https://github.com/aio-libs/aiohappyeyeballs/compare/v2.4.4...v2.4.5">v2.4.4...v2.4.5</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/aio-libs/aiohappyeyeballs/blob/main/CHANGELOG.md">aiohappyeyeballs's
changelog</a>.</em></p>
<blockquote>
<h2>v2.4.6 (2025-02-07)</h2>
<h3>Bug fixes</h3>
<ul>
<li>Ensure all timers are cancelled when after staggered race finishes
(<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/136">#136</a>)
(<a
href="f75891d897"><code>f75891d</code></a>)</li>
</ul>
<h2>v2.4.5 (2025-02-07)</h2>
<h3>Bug fixes</h3>
<ul>
<li>Keep classifiers in project to avoid automatic enrichment (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/134">#134</a>)
(<a
href="99edb20e9d"><code>99edb20</code></a>)</li>
<li>Move classifiers to prevent recalculation by poetry (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/131">#131</a>)
(<a
href="66e1c90ae8"><code>66e1c90</code></a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="f18ad492a3"><code>f18ad49</code></a>
2.4.6</li>
<li><a
href="f75891d897"><code>f75891d</code></a>
fix: ensure all timers are cancelled when after staggered race finishes
(<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/136">#136</a>)</li>
<li><a
href="cbc674d409"><code>cbc674d</code></a>
2.4.5</li>
<li><a
href="99edb20e9d"><code>99edb20</code></a>
fix: keep classifiers in project to avoid automatic enrichment (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/134">#134</a>)</li>
<li><a
href="9baf0b340e"><code>9baf0b3</code></a>
chore(deps-ci): bump the github-actions group with 9 updates (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/135">#135</a>)</li>
<li><a
href="678eab0dd4"><code>678eab0</code></a>
chore: update dependabot.yml to include GHA (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/133">#133</a>)</li>
<li><a
href="66e1c90ae8"><code>66e1c90</code></a>
fix: move classifiers to prevent recalculation by Poetry (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/131">#131</a>)</li>
<li><a
href="850640e0f7"><code>850640e</code></a>
chore: migrate to poetry 2.0 (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/132">#132</a>)</li>
<li><a
href="75ec0dcabc"><code>75ec0dc</code></a>
chore(pre-commit.ci): pre-commit autoupdate (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/129">#129</a>)</li>
<li><a
href="7d7f1180f2"><code>7d7f118</code></a>
chore(pre-commit.ci): pre-commit autoupdate (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/128">#128</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/aio-libs/aiohappyeyeballs/compare/v2.4.4...v2.4.6">compare
view</a></li>
</ul>
</details>
<br />

Updates `httpx` from 0.27.2 to 0.28.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/encode/httpx/releases">httpx's
releases</a>.</em></p>
<blockquote>
<h2>Version 0.28.1</h2>
<h2>0.28.1 (6th December, 2024)</h2>
<ul>
<li>Fix SSL case where <code>verify=False</code> together with client
side certificates.</li>
</ul>
<h2>Version 0.28.0</h2>
<h2>0.28.0 (28th November, 2024)</h2>
<p>The 0.28 release includes a limited set of deprecations.</p>
<p><strong>Deprecations</strong>:</p>
<p>We are working towards a simplified SSL configuration API.</p>
<p><em>For users of the standard <code>verify=True</code> or
<code>verify=False</code> cases, or
<code>verify=&lt;ssl_context&gt;</code> case this should require no
changes. The following cases have been deprecated...</em></p>
<ul>
<li>The <code>verify</code> argument as a string argument is now
deprecated and will raise warnings.</li>
<li>The <code>cert</code> argument is now deprecated and will raise
warnings.</li>
</ul>
<p>Our revised <a
href="https://github.com/encode/httpx/blob/HEAD/docs/advanced/ssl.md">SSL
documentation</a> covers how to implement the same behaviour with a more
constrained API.</p>
<p><strong>The following changes are also included</strong>:</p>
<ul>
<li>The deprecated <code>proxies</code> argument has now been
removed.</li>
<li>The deprecated <code>app</code> argument has now been removed.</li>
<li>JSON request bodies use a compact representation. (<a
href="https://redirect.github.com/encode/httpx/issues/3363">#3363</a>)</li>
<li>Review URL percent escape sets, based on WHATWG spec. (<a
href="https://redirect.github.com/encode/httpx/issues/3371">#3371</a>,
<a
href="https://redirect.github.com/encode/httpx/issues/3373">#3373</a>)</li>
<li>Ensure <code>certifi</code> and <code>httpcore</code> are only
imported if required. (<a
href="https://redirect.github.com/encode/httpx/issues/3377">#3377</a>)</li>
<li>Treat <code>socks5h</code> as a valid proxy scheme. (<a
href="https://redirect.github.com/encode/httpx/issues/3178">#3178</a>)</li>
<li>Cleanup <code>Request()</code> method signature in line with
<code>client.request()</code> and <code>httpx.request()</code>. (<a
href="https://redirect.github.com/encode/httpx/issues/3378">#3378</a>)</li>
<li>Bugfix: When passing <code>params={}</code>, always strictly update
rather than merge with an existing querystring. (<a
href="https://redirect.github.com/encode/httpx/issues/3364">#3364</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/encode/httpx/blob/master/CHANGELOG.md">httpx's
changelog</a>.</em></p>
<blockquote>
<h2>0.28.1 (6th December, 2024)</h2>
<ul>
<li>Fix SSL case where <code>verify=False</code> together with client
side certificates.</li>
</ul>
<h2>0.28.0 (28th November, 2024)</h2>
<p>Be aware that the default <em>JSON request bodies now use a more
compact representation</em>. This is generally considered a prefered
style, tho may require updates to test suites.</p>
<p>The 0.28 release includes a limited set of deprecations...</p>
<p><strong>Deprecations</strong>:</p>
<p>We are working towards a simplified SSL configuration API.</p>
<p><em>For users of the standard <code>verify=True</code> or
<code>verify=False</code> cases, or
<code>verify=&lt;ssl_context&gt;</code> case this should require no
changes. The following cases have been deprecated...</em></p>
<ul>
<li>The <code>verify</code> argument as a string argument is now
deprecated and will raise warnings.</li>
<li>The <code>cert</code> argument is now deprecated and will raise
warnings.</li>
</ul>
<p>Our revised <a
href="https://github.com/encode/httpx/blob/master/docs/advanced/ssl.md">SSL
documentation</a> covers how to implement the same behaviour with a more
constrained API.</p>
<p><strong>The following changes are also included</strong>:</p>
<ul>
<li>The deprecated <code>proxies</code> argument has now been
removed.</li>
<li>The deprecated <code>app</code> argument has now been removed.</li>
<li>JSON request bodies use a compact representation. (<a
href="https://redirect.github.com/encode/httpx/issues/3363">#3363</a>)</li>
<li>Review URL percent escape sets, based on WHATWG spec. (<a
href="https://redirect.github.com/encode/httpx/issues/3371">#3371</a>,
<a
href="https://redirect.github.com/encode/httpx/issues/3373">#3373</a>)</li>
<li>Ensure <code>certifi</code> and <code>httpcore</code> are only
imported if required. (<a
href="https://redirect.github.com/encode/httpx/issues/3377">#3377</a>)</li>
<li>Treat <code>socks5h</code> as a valid proxy scheme. (<a
href="https://redirect.github.com/encode/httpx/issues/3178">#3178</a>)</li>
<li>Cleanup <code>Request()</code> method signature in line with
<code>client.request()</code> and <code>httpx.request()</code>. (<a
href="https://redirect.github.com/encode/httpx/issues/3378">#3378</a>)</li>
<li>Bugfix: When passing <code>params={}</code>, always strictly update
rather than merge with an existing querystring. (<a
href="https://redirect.github.com/encode/httpx/issues/3364">#3364</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="26d48e0634"><code>26d48e0</code></a>
Version 0.28.1 (<a
href="https://redirect.github.com/encode/httpx/issues/3445">#3445</a>)</li>
<li><a
href="89599a9541"><code>89599a9</code></a>
Fix <code>verify=False</code>, <code>cert=...</code> case. (<a
href="https://redirect.github.com/encode/httpx/issues/3442">#3442</a>)</li>
<li><a
href="8ecb86f0d7"><code>8ecb86f</code></a>
Add test for request params behavior changes (<a
href="https://redirect.github.com/encode/httpx/issues/3364">#3364</a>)
(<a
href="https://redirect.github.com/encode/httpx/issues/3440">#3440</a>)</li>
<li><a
href="0cb7e5a2e7"><code>0cb7e5a</code></a>
Bump the python-packages group with 11 updates (<a
href="https://redirect.github.com/encode/httpx/issues/3434">#3434</a>)</li>
<li><a
href="15e21e9ea3"><code>15e21e9</code></a>
Updating deprecated docstring Client() class (<a
href="https://redirect.github.com/encode/httpx/issues/3426">#3426</a>)</li>
<li><a
href="80960fa319"><code>80960fa</code></a>
Version 0.28.0. (<a
href="https://redirect.github.com/encode/httpx/issues/3419">#3419</a>)</li>
<li><a
href="a33c87852b"><code>a33c878</code></a>
Fix <code>extensions</code> type annotation. (<a
href="https://redirect.github.com/encode/httpx/issues/3380">#3380</a>)</li>
<li><a
href="ce7e14da27"><code>ce7e14d</code></a>
Error on verify as str. (<a
href="https://redirect.github.com/encode/httpx/issues/3418">#3418</a>)</li>
<li><a
href="47f4a96ffa"><code>47f4a96</code></a>
Handle empty zstd responses (<a
href="https://redirect.github.com/encode/httpx/issues/3412">#3412</a>)</li>
<li><a
href="189fc4bcbe"><code>189fc4b</code></a>
Update CHANGELOG.md, fix typo(s) (<a
href="https://redirect.github.com/encode/httpx/issues/3406">#3406</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/encode/httpx/compare/0.27.2...0.28.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `poethepoet` from 0.32.1 to 0.33.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/nat-n/poethepoet/releases">poethepoet's
releases</a>.</em></p>
<blockquote>
<h2>0.33.0</h2>
<h2>Enhancements</h2>
<ul>
<li>Implemented first version of UvExecutor by <a
href="https://github.com/AKuederle"><code>@​AKuederle</code></a> in <a
href="https://redirect.github.com/nat-n/poethepoet/pull/271">nat-n/poethepoet#271</a></li>
<li>Support displaying help for a single task by <a
href="https://github.com/nat-n"><code>@​nat-n</code></a> in <a
href="https://redirect.github.com/nat-n/poethepoet/pull/280">nat-n/poethepoet#280</a></li>
</ul>
<h2>Fixes</h2>
<ul>
<li>Fix argument parsing issues in poetry 2.0 plugin by <a
href="https://github.com/nat-n"><code>@​nat-n</code></a> in <a
href="https://redirect.github.com/nat-n/poethepoet/pull/277">nat-n/poethepoet#277</a></li>
<li>Use <code>python3</code> or <code>sys.executable</code> if
<code>python</code> is not on the path by <a
href="https://github.com/nat-n"><code>@​nat-n</code></a> in <a
href="https://redirect.github.com/nat-n/poethepoet/pull/278">nat-n/poethepoet#278</a></li>
<li>Tighten poetry-core dependency for non-wheel based installation
methods</li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/AKuederle"><code>@​AKuederle</code></a>
made their first contribution in <a
href="https://redirect.github.com/nat-n/poethepoet/pull/271">nat-n/poethepoet#271</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/nat-n/poethepoet/compare/v0.32.2...v0.33.0">https://github.com/nat-n/poethepoet/compare/v0.32.2...v0.33.0</a></p>
<h2>0.32.2</h2>
<h2>Fixes</h2>
<ul>
<li>Improve detection of poetry 2.0 projects via the build-system table
by <a href="https://github.com/nat-n"><code>@​nat-n</code></a> in <a
href="https://redirect.github.com/nat-n/poethepoet/pull/274">nat-n/poethepoet#274</a></li>
<li>Fix usage without Poetry doc link in the readme by <a
href="https://github.com/johnthagen"><code>@​johnthagen</code></a> in <a
href="https://redirect.github.com/nat-n/poethepoet/pull/273">nat-n/poethepoet#273</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/johnthagen"><code>@​johnthagen</code></a> made
their first contribution in <a
href="https://redirect.github.com/nat-n/poethepoet/pull/273">nat-n/poethepoet#273</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/nat-n/poethepoet/compare/v0.32.1...v0.32.2">https://github.com/nat-n/poethepoet/compare/v0.32.1...v0.32.2</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/nat-n/poethepoet/compare/v0.32.1...v0.33.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `pyright` from 1.1.392.post0 to 1.1.396
<details>
<summary>Commits</summary>
<ul>
<li><a
href="5617c6c57f"><code>5617c6c</code></a>
[pyright updated to 1.1.396] Update Version (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/338">#338</a>)</li>
<li><a
href="72e863b737"><code>72e863b</code></a>
chore(ci): remove invalid reviewers (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/336">#336</a>)</li>
<li><a
href="74b6b556d9"><code>74b6b55</code></a>
[pyright updated to 1.1.395] Update Version (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/335">#335</a>)</li>
<li><a
href="70eb305a67"><code>70eb305</code></a>
[pyright updated to 1.1.394] Update Version (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/333">#333</a>)</li>
<li><a
href="c82fac2803"><code>c82fac2</code></a>
[pyright updated to 1.1.393] Update Version (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/332">#332</a>)</li>
<li>See full diff in <a
href="https://github.com/RobertCraigie/pyright-python/compare/v1.1.392.post0...v1.1.396">compare
view</a></li>
</ul>
</details>
<br />

Updates `ruff` from 0.9.3 to 0.9.9
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.9.9</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>Fix caching of unsupported-syntax errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16425">#16425</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Only show unsupported-syntax errors in editors when preview mode is
enabled (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16429">#16429</a>)</li>
</ul>
<h2>Contributors</h2>
<ul>
<li><a
href="https://github.com/InSyncWithFoo"><code>@​InSyncWithFoo</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a
href="https://github.com/dhruvmanila"><code>@​dhruvmanila</code></a></li>
<li><a href="https://github.com/ntBre"><code>@​ntBre</code></a></li>
</ul>
<h2>Install ruff 0.9.9</h2>
<h3>Install prebuilt binaries via shell script</h3>
<pre lang="sh"><code>curl --proto '=https' --tlsv1.2 -LsSf
https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-installer.sh
| sh
</code></pre>
<h3>Install prebuilt binaries via powershell script</h3>
<pre lang="sh"><code>powershell -ExecutionPolicy ByPass -c &quot;irm
https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-installer.ps1
| iex&quot;
</code></pre>
<h2>Download ruff 0.9.9</h2>
<table>
<thead>
<tr>
<th>File</th>
<th>Platform</th>
<th>Checksum</th>
</tr>
</thead>
<tbody>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-apple-darwin.tar.gz">ruff-aarch64-apple-darwin.tar.gz</a></td>
<td>Apple Silicon macOS</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-apple-darwin.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-apple-darwin.tar.gz">ruff-x86_64-apple-darwin.tar.gz</a></td>
<td>Intel macOS</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-apple-darwin.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-pc-windows-msvc.zip">ruff-aarch64-pc-windows-msvc.zip</a></td>
<td>ARM64 Windows</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-pc-windows-msvc.zip.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-i686-pc-windows-msvc.zip">ruff-i686-pc-windows-msvc.zip</a></td>
<td>x86 Windows</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-i686-pc-windows-msvc.zip.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-pc-windows-msvc.zip">ruff-x86_64-pc-windows-msvc.zip</a></td>
<td>x64 Windows</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-pc-windows-msvc.zip.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-unknown-linux-gnu.tar.gz">ruff-aarch64-unknown-linux-gnu.tar.gz</a></td>
<td>ARM64 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-i686-unknown-linux-gnu.tar.gz">ruff-i686-unknown-linux-gnu.tar.gz</a></td>
<td>x86 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-i686-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-powerpc64-unknown-linux-gnu.tar.gz">ruff-powerpc64-unknown-linux-gnu.tar.gz</a></td>
<td>PPC64 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-powerpc64-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-powerpc64le-unknown-linux-gnu.tar.gz">ruff-powerpc64le-unknown-linux-gnu.tar.gz</a></td>
<td>PPC64LE Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-powerpc64le-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-s390x-unknown-linux-gnu.tar.gz">ruff-s390x-unknown-linux-gnu.tar.gz</a></td>
<td>S390x Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-s390x-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-unknown-linux-gnu.tar.gz">ruff-x86_64-unknown-linux-gnu.tar.gz</a></td>
<td>x64 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-armv7-unknown-linux-gnueabihf.tar.gz">ruff-armv7-unknown-linux-gnueabihf.tar.gz</a></td>
<td>ARMv7 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-armv7-unknown-linux-gnueabihf.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-unknown-linux-musl.tar.gz">ruff-aarch64-unknown-linux-musl.tar.gz</a></td>
<td>ARM64 MUSL Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-unknown-linux-musl.tar.gz.sha256">checksum</a></td>
</tr>
</tbody>
</table>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.9.9</h2>
<h3>Preview features</h3>
<ul>
<li>Fix caching of unsupported-syntax errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16425">#16425</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Only show unsupported-syntax errors in editors when preview mode is
enabled (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16429">#16429</a>)</li>
</ul>
<h2>0.9.8</h2>
<h3>Preview features</h3>
<ul>
<li>Start detecting version-related syntax errors in the parser (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16090">#16090</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>pylint</code>] Mark fix unsafe (<code>PLW1507</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16343">#16343</a>)</li>
<li>[<code>pylint</code>] Catch <code>case np.nan</code>/<code>case
math.nan</code> in <code>match</code> statements (<code>PLW0177</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/16378">#16378</a>)</li>
<li>[<code>ruff</code>] Add more Pydantic models variants to the list of
default copy semantics (<code>RUF012</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16291">#16291</a>)</li>
</ul>
<h3>Server</h3>
<ul>
<li>Avoid indexing the project if <code>configurationPreference</code>
is <code>editorOnly</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16381">#16381</a>)</li>
<li>Avoid unnecessary info at non-trace server log level (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16389">#16389</a>)</li>
<li>Expand <code>ruff.configuration</code> to allow inline config (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16296">#16296</a>)</li>
<li>Notify users for invalid client settings (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16361">#16361</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>Add <code>per-file-target-version</code> option (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16257">#16257</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>[<code>refurb</code>] Do not consider docstring(s)
(<code>FURB156</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16391">#16391</a>)</li>
<li>[<code>flake8-self</code>] Ignore attribute accesses on
instance-like variables (<code>SLF001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16149">#16149</a>)</li>
<li>[<code>pylint</code>] Fix false positives, add missing methods, and
support positional-only parameters (<code>PLE0302</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16263">#16263</a>)</li>
<li>[<code>flake8-pyi</code>] Mark <code>PYI030</code> fix unsafe when
comments are deleted (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16322">#16322</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Fix example for <code>S611</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16316">#16316</a>)</li>
<li>Normalize inconsistent markdown headings in docstrings (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16364">#16364</a>)</li>
<li>Document MSRV policy (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16384">#16384</a>)</li>
</ul>
<h2>0.9.7</h2>
<h3>Preview features</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="091d0af2ab"><code>091d0af</code></a>
Bump version to Ruff 0.9.9 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16434">#16434</a>)</li>
<li><a
href="3d72138740"><code>3d72138</code></a>
Check <code>LinterSettings::preview</code> for version-related syntax
errors (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16429">#16429</a>)</li>
<li><a
href="4a23756024"><code>4a23756</code></a>
Avoid caching files with unsupported syntax errors (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16425">#16425</a>)</li>
<li><a
href="af62f7932b"><code>af62f79</code></a>
Prioritize &quot;bug&quot; label for changelog sections (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16433">#16433</a>)</li>
<li><a
href="0ced8d053c"><code>0ced8d0</code></a>
[<code>flake8-copyright</code>] Add links to applicable options
(<code>CPY001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16421">#16421</a>)</li>
<li><a
href="a8e171f82c"><code>a8e171f</code></a>
Fix string-length limit in documentation for PYI054 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16432">#16432</a>)</li>
<li><a
href="cf83584abb"><code>cf83584</code></a>
Show version-related syntax errors in the playground (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16419">#16419</a>)</li>
<li><a
href="764aa0e6a1"><code>764aa0e</code></a>
Allow passing <code>ParseOptions</code> to inline tests (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16357">#16357</a>)</li>
<li><a
href="568cf88c6c"><code>568cf88</code></a>
Bump version to 0.9.8 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16414">#16414</a>)</li>
<li><a
href="040071bbc5"><code>040071b</code></a>
[red-knot] Ignore surrounding whitespace when looking for `&lt;!--
snapshot-diag...</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.9.3...0.9.9">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-04-02 00:39:57 +00:00
dependabot[bot]
142fa2af16 chore(backend/deps): bump the production-dependencies group across 1 directory with 20 updates (#9728)
Bumps the production-dependencies group with 20 updates in the
/autogpt_platform/backend directory:

| Package | From | To |
| --- | --- | --- |
| [aio-pika](https://github.com/mosquito/aio-pika) | `9.5.4` | `9.5.5` |
| [anthropic](https://github.com/anthropics/anthropic-sdk-python) | `0.45.2` | `0.49.0` |
| [discord-py](https://github.com/Rapptz/discord.py) | `2.4.0` | `2.5.2` |
| [e2b-code-interpreter](https://github.com/e2b-dev/code-interpreter) | `1.0.5` | `1.1.1` |
| [fastapi](https://github.com/fastapi/fastapi) | `0.115.8` | `0.115.12` |
| [flake8](https://github.com/pycqa/flake8) | `7.1.1` | `7.2.0` |
| [google-api-python-client](https://github.com/googleapis/google-api-python-client) | `2.160.0` | `2.166.0` |
| [google-cloud-storage](https://github.com/googleapis/python-storage) | `3.0.0` | `3.1.0` |
| [groq](https://github.com/groq/groq-python) | `0.18.0` | `0.20.0` |
| [jinja2](https://github.com/pallets/jinja) | `3.1.5` | `3.1.6` |
| [launchdarkly-server-sdk](https://github.com/launchdarkly/python-server-sdk) | `9.9.0` | `9.10.0` |
| [mem0ai](https://github.com/mem0ai/mem0) | `0.1.48` | `0.1.80` |
| [openai](https://github.com/openai/openai-python) | `1.61.1` | `1.69.0` |
| [pydantic](https://github.com/pydantic/pydantic) | `2.10.6` | `2.11.1` |
| [pydantic-settings](https://github.com/pydantic/pydantic-settings) | `2.7.1` | `2.8.1` |
| [pytest](https://github.com/pytest-dev/pytest) | `8.3.4` | `8.3.5` |
| [python-dotenv](https://github.com/theskumar/python-dotenv) | `1.0.1` | `1.1.0` |
| [sentry-sdk](https://github.com/getsentry/sentry-python) | `2.20.0` | `2.24.1` |
| [sqlalchemy](https://github.com/sqlalchemy/sqlalchemy) | `2.0.37` | `2.0.40` |
| [supabase](https://github.com/supabase/supabase-py) | `2.13.0` | `2.15.0` |


Updates `aio-pika` from 9.5.4 to 9.5.5
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/mosquito/aio-pika/blob/master/CHANGELOG.md">aio-pika's
changelog</a>.</em></p>
<blockquote>
<h2>9.5.5</h2>
<ul>
<li>Replace WeakSet with set for robust channels tracking <a
href="https://redirect.github.com/mosquito/aio-pika/issues/666">#666</a>
by shushpanov</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="0d442ba73d"><code>0d442ba</code></a>
Bump to 9.5.5</li>
<li><a
href="7796f83e85"><code>7796f83</code></a>
Merge pull request <a
href="https://redirect.github.com/mosquito/aio-pika/issues/666">#666</a>
from shushpanov/use_set_instead_of_week_set</li>
<li><a
href="3a94dbdaaa"><code>3a94dbd</code></a>
Currently, <code>RobustChannel</code> uses <code>WeakSet</code> to track
exchanges and queues for r...</li>
<li>See full diff in <a
href="https://github.com/mosquito/aio-pika/compare/9.5.4...9.5.5">compare
view</a></li>
</ul>
</details>
<br />

Updates `anthropic` from 0.45.2 to 0.49.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/anthropics/anthropic-sdk-python/releases">anthropic's
releases</a>.</em></p>
<blockquote>
<h2>v0.49.0</h2>
<h2>0.49.0 (2025-02-28)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.48.0...v0.49.0">v0.48.0...v0.49.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> add support for disabling tool calls (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/888">#888</a>)
(<a
href="bfde3d2978">bfde3d2</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>docs:</strong> update client docstring (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/887">#887</a>)
(<a
href="4d3ec5ec5b">4d3ec5e</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>update URLs from stainlessapi.com to stainless.com (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/885">#885</a>)
(<a
href="312364b9b5">312364b</a>)</li>
</ul>
<h2>v0.48.0</h2>
<h2>0.48.0 (2025-02-27)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.47.2...v0.48.0">v0.47.2...v0.48.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> add URL source blocks for images and PDFs (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/884">#884</a>)
(<a
href="e6b3a70ffb">e6b3a70</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>add thinking examples (<a
href="f46324863d">f463248</a>)</li>
</ul>
<h2>v0.47.2</h2>
<h2>0.47.2 (2025-02-25)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.47.1...v0.47.2">v0.47.1...v0.47.2</a></p>
<h3>Bug Fixes</h3>
<ul>
<li><strong>beta:</strong> add thinking to beta.messages.stream (<a
href="69e3db1de0">69e3db1</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> properly set
<strong>pydantic_private</strong> (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/879">#879</a>)
(<a
href="3537a3bb22">3537a3b</a>)</li>
</ul>
<h2>v0.47.1</h2>
<h2>0.47.1 (2025-02-24)</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/anthropics/anthropic-sdk-python/blob/main/CHANGELOG.md">anthropic's
changelog</a>.</em></p>
<blockquote>
<h2>0.49.0 (2025-02-28)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.48.0...v0.49.0">v0.48.0...v0.49.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> add support for disabling tool calls (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/888">#888</a>)
(<a
href="bfde3d2978">bfde3d2</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>docs:</strong> update client docstring (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/887">#887</a>)
(<a
href="4d3ec5ec5b">4d3ec5e</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>update URLs from stainlessapi.com to stainless.com (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/885">#885</a>)
(<a
href="312364b9b5">312364b</a>)</li>
</ul>
<h2>0.48.0 (2025-02-27)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.47.2...v0.48.0">v0.47.2...v0.48.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> add URL source blocks for images and PDFs (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/884">#884</a>)
(<a
href="e6b3a70ffb">e6b3a70</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>add thinking examples (<a
href="f46324863d">f463248</a>)</li>
</ul>
<h2>0.47.2 (2025-02-25)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.47.1...v0.47.2">v0.47.1...v0.47.2</a></p>
<h3>Bug Fixes</h3>
<ul>
<li><strong>beta:</strong> add thinking to beta.messages.stream (<a
href="69e3db1de0">69e3db1</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> properly set
<strong>pydantic_private</strong> (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/879">#879</a>)
(<a
href="3537a3bb22">3537a3b</a>)</li>
</ul>
<h2>0.47.1 (2025-02-24)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.47.0...v0.47.1">v0.47.0...v0.47.1</a></p>
<h3>Chores</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="8b244157a7"><code>8b24415</code></a>
release: 0.49.0</li>
<li><a
href="5e605db5db"><code>5e605db</code></a>
feat(api): add support for disabling tool calls (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/888">#888</a>)</li>
<li><a
href="810f434ec1"><code>810f434</code></a>
chore(docs): update client docstring (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/887">#887</a>)</li>
<li><a
href="859993cf66"><code>859993c</code></a>
docs: update URLs from stainlessapi.com to stainless.com (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/885">#885</a>)</li>
<li><a
href="6c08e05ab4"><code>6c08e05</code></a>
release: 0.48.0</li>
<li><a
href="90481732c9"><code>9048173</code></a>
feat(api): add URL source blocks for images and PDFs (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/884">#884</a>)</li>
<li><a
href="b5aaa3caca"><code>b5aaa3c</code></a>
docs: add thinking examples</li>
<li><a
href="599f2b9a95"><code>599f2b9</code></a>
release: 0.47.2</li>
<li><a
href="8fe5f5ce50"><code>8fe5f5c</code></a>
fix(beta): add thinking to beta.messages.stream</li>
<li><a
href="7e49d854c7"><code>7e49d85</code></a>
chore(internal): properly set <strong>pydantic_private</strong> (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/879">#879</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.45.2...v0.49.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `discord-py` from 2.4.0 to 2.5.2
<details>
<summary>Commits</summary>
<ul>
<li><a
href="d2a6ccf715"><code>d2a6ccf</code></a>
Version bump to v2.5.2</li>
<li><a
href="f4bce1caf0"><code>f4bce1c</code></a>
Add changelog for v2.5.2</li>
<li><a
href="8594dd1b30"><code>8594dd1</code></a>
Fix embed media flags regression</li>
<li><a
href="2f8b2624f1"><code>2f8b262</code></a>
Fix improper class in audit log docs</li>
<li><a
href="973bb5089f"><code>973bb50</code></a>
Version bump for development</li>
<li><a
href="73f261d536"><code>73f261d</code></a>
Version bump to v2.5.1</li>
<li><a
href="6b0a6eea66"><code>6b0a6ee</code></a>
Add v2.5.1 changelog</li>
<li><a
href="cab4732b7e"><code>cab4732</code></a>
Make embed flags required and add them to all media fields</li>
<li><a
href="de5720e659"><code>de5720e</code></a>
Fix attachment is_spoiler() and is_voice_message()</li>
<li><a
href="fbe2b358fc"><code>fbe2b35</code></a>
Add note about NotFound for Messageable.send</li>
<li>Additional commits viewable in <a
href="https://github.com/Rapptz/discord.py/compare/v2.4.0...v2.5.2">compare
view</a></li>
</ul>
</details>
<br />

Updates `e2b-code-interpreter` from 1.0.5 to 1.1.1
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3fe798e18b"><code>3fe798e</code></a>
Fix <code>to_json</code> method for charts (<a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/68">#68</a>)</li>
<li><a
href="1c4ac6ce6c"><code>1c4ac6c</code></a>
Merge pull request <a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/67">#67</a>
from e2b-dev/send-notification-to-releases-channel</li>
<li><a
href="6c0bb19dd5"><code>6c0bb19</code></a>
Send releases notification to dedicated channel</li>
<li><a
href="916390e04b"><code>916390e</code></a>
[skip ci] Release new versions</li>
<li><a
href="3bf76d77ae"><code>3bf76d7</code></a>
Merge pull request <a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/66">#66</a>
from e2b-dev/fix-sdk-gen-non-esm-import</li>
<li><a
href="30f3b24c9c"><code>30f3b24</code></a>
Merge branch 'main' into fix-sdk-gen-non-esm-import</li>
<li><a
href="455d71794c"><code>455d717</code></a>
update pnpm lockfile</li>
<li><a
href="418c069163"><code>418c069</code></a>
Merge pull request <a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/65">#65</a>
from e2b-dev/fix-sdk-gen-non-esm-import</li>
<li><a
href="c30f7e9934"><code>c30f7e9</code></a>
pin typdoc and typedoc-markdown to non-breaking versions in js-sdk</li>
<li><a
href="f651795436"><code>f651795</code></a>
Merge pull request <a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/64">#64</a>
from e2b-dev/mlejva-patch-1</li>
<li>Additional commits viewable in <a
href="https://github.com/e2b-dev/code-interpreter/compare/@e2b/code-interpreter-python@1.0.5...@e2b/code-interpreter-python@1.1.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `fastapi` from 0.115.8 to 0.115.12
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/fastapi/fastapi/releases">fastapi's
releases</a>.</em></p>
<blockquote>
<h2>0.115.12</h2>
<h3>Fixes</h3>
<ul>
<li>🐛 Fix <code>convert_underscores=False</code> for header Pydantic
models. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13515">#13515</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h3>Docs</h3>
<ul>
<li>📝 Update <code>docs/en/docs/tutorial/middleware.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13444">#13444</a>
by <a
href="https://github.com/Rishat-F"><code>@​Rishat-F</code></a>.</li>
<li>👥 Update FastAPI People - Experts. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13493">#13493</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h3>Translations</h3>
<ul>
<li>🌐 Add Ukrainian translation for
<code>docs/uk/docs/tutorial/metadata.md</code> page. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13459">#13459</a>
by <a
href="https://github.com/valentinDruzhinin"><code>@​valentinDruzhinin</code></a>.</li>
<li>🌐 Add Ukrainian translation for
<code>docs/uk/docs/tutorial/response-status-code.md</code> page. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13462">#13462</a>
by <a
href="https://github.com/valentinDruzhinin"><code>@​valentinDruzhinin</code></a>.</li>
<li>🌐 Add Ukrainian translation for
<code>docs/uk/docs/tutorial/cookie-param-models.md</code> page. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13460">#13460</a>
by <a
href="https://github.com/valentinDruzhinin"><code>@​valentinDruzhinin</code></a>.</li>
<li>🌐 Add Ukrainian translation for
<code>docs/uk/docs/tutorial/header-param-models.md</code> page. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13461">#13461</a>
by <a
href="https://github.com/valentinDruzhinin"><code>@​valentinDruzhinin</code></a>.</li>
<li>🌐 Add Japanese translation for
<code>docs/ja/docs/virtual-environments.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13304">#13304</a>
by <a
href="https://github.com/k94-ishi"><code>@​k94-ishi</code></a>.</li>
<li>🌐 Add Korean translation for
<code>docs/ko/docs/tutorial/security/oauth2-jwt.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13333">#13333</a>
by <a href="https://github.com/yes0ng"><code>@​yes0ng</code></a>.</li>
<li>🌐 Add Vietnamese translation for
<code>docs/vi/docs/deployment/cloud.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13407">#13407</a>
by <a href="https://github.com/ptt3199"><code>@​ptt3199</code></a>.</li>
</ul>
<h3>Internal</h3>
<ul>
<li>⬆ Bump pydantic-ai from 0.0.15 to 0.0.30. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13438">#13438</a>
by <a
href="https://github.com/apps/dependabot"><code>@​dependabot[bot]</code></a>.</li>
<li>⬆ Bump sqlmodel from 0.0.22 to 0.0.23. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13437">#13437</a>
by <a
href="https://github.com/apps/dependabot"><code>@​dependabot[bot]</code></a>.</li>
<li>⬆ Bump black from 24.10.0 to 25.1.0. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13436">#13436</a>
by <a
href="https://github.com/apps/dependabot"><code>@​dependabot[bot]</code></a>.</li>
<li>⬆ Bump ruff to 0.9.4. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13299">#13299</a>
by <a
href="https://github.com/apps/dependabot"><code>@​dependabot[bot]</code></a>.</li>
<li>🔧 Update sponsors: pause TestDriven. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13446">#13446</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h2>0.115.11</h2>
<h3>Fixes</h3>
<ul>
<li>🐛 Add docs examples and tests (support) for <code>Annotated</code>
custom validations, like <code>AfterValidator</code>, revert <a
href="https://redirect.github.com/fastapi/fastapi/pull/13440">#13440</a>.
PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13442">#13442</a>
by <a href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.
<ul>
<li>New docs: <a
href="https://fastapi.tiangolo.com/tutorial/query-params-str-validations/#custom-validation">Query
Parameters and String Validations - Custom Validation</a>.</li>
</ul>
</li>
</ul>
<h3>Translations</h3>
<ul>
<li>🌐 Add Russian translation for
<code>docs/ru/docs/tutorial/middleware.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13412">#13412</a>
by <a href="https://github.com/alv2017"><code>@​alv2017</code></a>.</li>
</ul>
<h3>Internal</h3>
<ul>
<li>👥 Update FastAPI GitHub topic repositories. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13439">#13439</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li>👥 Update FastAPI People - Contributors and Translators. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13432">#13432</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li>👥 Update FastAPI People - Sponsors. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13433">#13433</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h2>0.115.10</h2>
<h3>Fixes</h3>
<ul>
<li>♻️ Update internal annotation usage for compatibility with Pydantic
2.11. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13314">#13314</a>
by <a href="https://github.com/Viicos"><code>@​Viicos</code></a>.</li>
</ul>
<h3>Upgrades</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="628c34e0ca"><code>628c34e</code></a>
🔖 Release version 0.115.12</li>
<li><a
href="8e76d4e5f4"><code>8e76d4e</code></a>
📝 Update release notes</li>
<li><a
href="2537d9d1c2"><code>2537d9d</code></a>
🐛 Fix <code>convert_underscores=False</code> for header Pydantic models
(<a
href="https://redirect.github.com/fastapi/fastapi/issues/13515">#13515</a>)</li>
<li><a
href="c08a3e8f22"><code>c08a3e8</code></a>
📝 Update release notes</li>
<li><a
href="241de23b68"><code>241de23</code></a>
📝 Update <code>docs/en/docs/tutorial/middleware.md</code> (<a
href="https://redirect.github.com/fastapi/fastapi/issues/13444">#13444</a>)</li>
<li><a
href="4e40e1e85d"><code>4e40e1e</code></a>
📝 Update release notes</li>
<li><a
href="ecf6e7eec2"><code>ecf6e7e</code></a>
🌐 Add Ukrainian translation for
<code>docs/uk/docs/tutorial/metadata.md</code> page (<a
href="https://redirect.github.com/fastapi/fastapi/issues/13">#13</a>...</li>
<li><a
href="3afd733753"><code>3afd733</code></a>
📝 Update release notes</li>
<li><a
href="8557a88d16"><code>8557a88</code></a>
🌐 Add Ukrainian translation for
`docs/uk/docs/tutorial/response-status-code.m...</li>
<li><a
href="e4c1dd799d"><code>e4c1dd7</code></a>
📝 Update release notes</li>
<li>Additional commits viewable in <a
href="https://github.com/fastapi/fastapi/compare/0.115.8...0.115.12">compare
view</a></li>
</ul>
</details>
<br />

Updates `flake8` from 7.1.1 to 7.2.0
<details>
<summary>Commits</summary>
<ul>
<li><a
href="16f5f28a38"><code>16f5f28</code></a>
Release 7.2.0</li>
<li><a
href="ebad305769"><code>ebad305</code></a>
Merge pull request <a
href="https://redirect.github.com/pycqa/flake8/issues/1974">#1974</a>
from PyCQA/update-plugins</li>
<li><a
href="d56d569ce4"><code>d56d569</code></a>
update versions of pycodestyle / pyflakes</li>
<li><a
href="a7e8f6250c"><code>a7e8f62</code></a>
Merge pull request <a
href="https://redirect.github.com/pycqa/flake8/issues/1973">#1973</a>
from PyCQA/py39-plus</li>
<li><a
href="9d55ccdb72"><code>9d55ccd</code></a>
py39+</li>
<li><a
href="e492aeb385"><code>e492aeb</code></a>
Merge pull request <a
href="https://redirect.github.com/pycqa/flake8/issues/1967">#1967</a>
from PyCQA/unnecessary-mocks</li>
<li><a
href="fa2ed7145c"><code>fa2ed71</code></a>
remove a few unnecessary mocks in test_checker_manager</li>
<li><a
href="fffee8ba9d"><code>fffee8b</code></a>
Release 7.1.2</li>
<li><a
href="19001f77f3"><code>19001f7</code></a>
Merge pull request <a
href="https://redirect.github.com/pycqa/flake8/issues/1966">#1966</a>
from PyCQA/limit-procs-to-file-count</li>
<li><a
href="f35737a32d"><code>f35737a</code></a>
avoid starting unnecessary processes when file count is limited</li>
<li>See full diff in <a
href="https://github.com/pycqa/flake8/compare/7.1.1...7.2.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `google-api-python-client` from 2.160.0 to 2.166.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/google-api-python-client/releases">google-api-python-client's
releases</a>.</em></p>
<blockquote>
<h2>v2.166.0</h2>
<h2><a
href="https://github.com/googleapis/google-api-python-client/compare/v2.165.0...v2.166.0">2.166.0</a>
(2025-03-25)</h2>
<h3>Features</h3>
<ul>
<li><strong>aiplatform:</strong> Update the api <a href="9d050cee8d">9d050cee8d</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>alloydb:</strong> Update the api <a href="db87ff7dae">db87ff7dae</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>analyticshub:</strong> Update the api <a href="0716538951">0716538951</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>apigee:</strong> Update the api <a href="2fb0b5170e">2fb0b5170e</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>bigqueryreservation:</strong> Update the api <a href="98c07716c1">98c07716c1</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>bigquery:</strong> Update the api <a href="0f85078845">0f85078845</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>civicinfo:</strong> Update the api <a href="f4a8692800">f4a8692800</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>compute:</strong> Update the api <a href="daa99db3ac">daa99db3ac</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>contactcenterinsights:</strong> Update the api <a href="0ca2138859">0ca2138859</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>container:</strong> Update the api <a href="969054e90e">969054e90e</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>dataplex:</strong> Update the api <a href="b1e4a4fa3a">b1e4a4fa3a</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>dataproc:</strong> Update the api <a href="ab21a62281">ab21a62281</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>datastream:</strong> Update the api <a href="77b0d5e5a7">77b0d5e5a7</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>dialogflow:</strong> Update the api <a href="cc1fce237a">cc1fce237a</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>discoveryengine:</strong> Update the api <a href="32191c2064">32191c2064</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>displayvideo:</strong> Update the api <a href="76088b5c22">76088b5c22</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>documentai:</strong> Update the api <a href="79b0b5264c">79b0b5264c</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>gkebackup:</strong> Update the api <a href="0ad6b20463">0ad6b20463</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>integrations:</strong> Update the api <a href="3786649a17">3786649a17</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>managedkafka:</strong> Update the api <a href="7e80d5a8e7">7e80d5a8e7</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>merchantapi:</strong> Update the api <a href="54e2633d6c">54e2633d6c</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>monitoring:</strong> Update the api <a href="cecd16cb74">cecd16cb74</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>netapp:</strong> Update the api <a href="c2afd5c9b6">c2afd5c9b6</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>networkconnectivity:</strong> Update the api <a href="cabd98e33c">cabd98e33c</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>networkservices:</strong> Update the api <a href="8fb80bc60f">8fb80bc60f</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>notebooks:</strong> Update the api <a href="5012558735">5012558735</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>oracledatabase:</strong> Update the api <a href="c892cd5c07">c892cd5c07</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>pubsub:</strong> Update the api <a href="6bf4e2d990">6bf4e2d990</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>securitycenter:</strong> Update the api <a href="5a7dfccd9b">5a7dfccd9b</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>verifiedaccess:</strong> Update the api <a href="d58429ee48">d58429ee48</a> (<a href="722da7de01">722da7d</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>chat:</strong> Update the api <a href="eceac9d703">eceac9d703</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>storage:</strong> Update the api <a href="56ff88eecd">56ff88eecd</a> (<a href="722da7de01">722da7d</a>)</li>
<li><strong>sts:</strong> Update the api <a href="63ec516264">63ec516264</a> (<a href="722da7de01">722da7d</a>)</li>
</ul>
<h2>v2.165.0</h2>
<h2><a
href="https://github.com/googleapis/google-api-python-client/compare/v2.164.0...v2.165.0">2.165.0</a>
(2025-03-18)</h2>
<h3>Features</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7633383ffc"><code>7633383</code></a>
chore(main): release 2.166.0 (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2586">#2586</a>)</li>
<li><a
href="722da7de01"><code>722da7d</code></a>
chore: Update discovery artifacts (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2585">#2585</a>)</li>
<li><a
href="e9fb04c74d"><code>e9fb04c</code></a>
chore(main): release 2.165.0 (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2581">#2581</a>)</li>
<li><a
href="935c167ae7"><code>935c167</code></a>
chore: Update discovery artifacts (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2582">#2582</a>)</li>
<li><a
href="21847efba0"><code>21847ef</code></a>
fix: resolve issue where pre-release versions of dependencies are
installed (...</li>
<li><a
href="0b1875f676"><code>0b1875f</code></a>
chore(main): release 2.164.0 (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2578">#2578</a>)</li>
<li><a
href="390e213906"><code>390e213</code></a>
chore: Update discovery artifacts (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2577">#2577</a>)</li>
<li><a
href="df40ac60f2"><code>df40ac6</code></a>
chore: remove unused files (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2575">#2575</a>)</li>
<li><a
href="6bf97861c4"><code>6bf9786</code></a>
chore(main): release 2.163.0 (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2572">#2572</a>)</li>
<li><a
href="8bc64e5e1a"><code>8bc64e5</code></a>
chore: Update discovery artifacts (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2571">#2571</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/googleapis/google-api-python-client/compare/v2.160.0...v2.166.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `google-cloud-storage` from 3.0.0 to 3.1.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/python-storage/releases">google-cloud-storage's
releases</a>.</em></p>
<blockquote>
<h2>v3.1.0</h2>
<h2><a
href="https://github.com/googleapis/python-storage/compare/v3.0.0...v3.1.0">3.1.0</a>
(2025-02-27)</h2>
<h3>Features</h3>
<ul>
<li>Add api_key argument to Client constructor (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1441">#1441</a>)
(<a
href="c869e15ec5">c869e15</a>)</li>
<li>Add Bucket.move_blob() for HNS-enabled buckets (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1431">#1431</a>)
(<a
href="24c000fb7b">24c000f</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/python-storage/blob/main/CHANGELOG.md">google-cloud-storage's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/googleapis/python-storage/compare/v3.0.0...v3.1.0">3.1.0</a>
(2025-02-27)</h2>
<h3>Features</h3>
<ul>
<li>Add api_key argument to Client constructor (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1441">#1441</a>)
(<a
href="c869e15ec5">c869e15</a>)</li>
<li>Add Bucket.move_blob() for HNS-enabled buckets (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1431">#1431</a>)
(<a
href="24c000fb7b">24c000f</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="aa7afdff7e"><code>aa7afdf</code></a>
chore(main): release 3.1.0 (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1435">#1435</a>)</li>
<li><a
href="c869e15ec5"><code>c869e15</code></a>
Feat: Add api_key argument to Client constructor (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1441">#1441</a>)</li>
<li><a
href="b58d3190c9"><code>b58d319</code></a>
chore(deps): bump virtualenv from 20.26.3 to 20.26.6 in /.kokoro (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1412">#1412</a>)</li>
<li><a
href="0378b44400"><code>0378b44</code></a>
chore: move create_trace_span context manager within (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1443">#1443</a>)</li>
<li><a
href="511b6f5c2b"><code>511b6f5</code></a>
chore(python): conditionally load credentials in .kokoro/build.sh (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1440">#1440</a>)</li>
<li><a
href="b08aa0b131"><code>b08aa0b</code></a>
chore: set gcs-sdk-team as CODEOWNER (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1442">#1442</a>)</li>
<li><a
href="24c000fb7b"><code>24c000f</code></a>
feat: add Bucket.move_blob() for HNS-enabled buckets (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1431">#1431</a>)</li>
<li>See full diff in <a
href="https://github.com/googleapis/python-storage/compare/v3.0.0...v3.1.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `groq` from 0.18.0 to 0.20.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/groq/groq-python/releases">groq's
releases</a>.</em></p>
<blockquote>
<h2>v0.20.0</h2>
<h2>0.20.0 (2025-03-19)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.19.0...v0.20.0">v0.19.0...v0.20.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> Add speech endpoint (<a
href="https://redirect.github.com/groq/groq-python/issues/219">#219</a>)
(<a
href="f150801968">f150801</a>)</li>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/218">#218</a>)
(<a
href="c124862e24">c124862</a>)</li>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/220">#220</a>)
(<a
href="f4eeb8d8be">f4eeb8d</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>ci:</strong> ensure pip is always available (<a
href="https://redirect.github.com/groq/groq-python/issues/216">#216</a>)
(<a
href="085166c129">085166c</a>)</li>
<li><strong>ci:</strong> remove publishing patch (<a
href="https://redirect.github.com/groq/groq-python/issues/217">#217</a>)
(<a
href="fb579e87a3">fb579e8</a>)</li>
<li><strong>types:</strong> handle more discriminated union shapes (<a
href="https://redirect.github.com/groq/groq-python/issues/215">#215</a>)
(<a
href="5c72e94d51">5c72e94</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> bump rye to 0.44.0 (<a
href="https://redirect.github.com/groq/groq-python/issues/214">#214</a>)
(<a
href="66feae21c5">66feae2</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/213">#213</a>)
(<a
href="7a1627444b">7a16274</a>)</li>
<li><strong>internal:</strong> remove extra empty newlines (<a
href="https://redirect.github.com/groq/groq-python/issues/211">#211</a>)
(<a
href="4187fa110f">4187fa1</a>)</li>
</ul>
<h2>v0.19.0</h2>
<h2>0.19.0 (2025-03-11)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.18.0...v0.19.0">v0.18.0...v0.19.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> manual updates (<a
href="https://redirect.github.com/groq/groq-python/issues/209">#209</a>)
(<a
href="15e2dca833">15e2dca</a>)</li>
<li><strong>client:</strong> allow passing <code>NotGiven</code> for
body (<a
href="https://redirect.github.com/groq/groq-python/issues/200">#200</a>)
(<a
href="afa6c0fc01">afa6c0f</a>)</li>
<li><strong>client:</strong> send <code>X-Stainless-Read-Timeout</code>
header (<a
href="https://redirect.github.com/groq/groq-python/issues/193">#193</a>)
(<a
href="e8911a43d6">e8911a4</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>add reasoning field to ChoiceDelta class (<a
href="edfee3b6c5">edfee3b</a>)</li>
<li>asyncify on non-asyncio runtimes (<a
href="https://redirect.github.com/groq/groq-python/issues/198">#198</a>)
(<a
href="49387fe83c">49387fe</a>)</li>
<li><strong>client:</strong> mark some request bodies as optional (<a
href="afa6c0fc01">afa6c0f</a>)</li>
<li>GitHub Terraform: Create/Update .github/workflows/stale.yaml [skip
ci] (<a
href="662763a5ea">662763a</a>)</li>
<li>GitHub Terraform: Create/Update .github/workflows/stale.yaml [skip
ci] (<a
href="5298ec1a8c">5298ec1</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>api:</strong> remove chat_completion_chunk to force a
rebuild of it (<a
href="https://redirect.github.com/groq/groq-python/issues/208">#208</a>)
(<a
href="01fb0d14e4">01fb0d1</a>)</li>
<li><strong>docs:</strong> update client docstring (<a
href="https://redirect.github.com/groq/groq-python/issues/204">#204</a>)
(<a
href="a0f45996ff">a0f4599</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/groq/groq-python/blob/main/CHANGELOG.md">groq's
changelog</a>.</em></p>
<blockquote>
<h2>0.20.0 (2025-03-19)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.19.0...v0.20.0">v0.19.0...v0.20.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> Add speech endpoint (<a
href="https://redirect.github.com/groq/groq-python/issues/219">#219</a>)
(<a
href="f150801968">f150801</a>)</li>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/218">#218</a>)
(<a
href="c124862e24">c124862</a>)</li>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/220">#220</a>)
(<a
href="f4eeb8d8be">f4eeb8d</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>ci:</strong> ensure pip is always available (<a
href="https://redirect.github.com/groq/groq-python/issues/216">#216</a>)
(<a
href="085166c129">085166c</a>)</li>
<li><strong>ci:</strong> remove publishing patch (<a
href="https://redirect.github.com/groq/groq-python/issues/217">#217</a>)
(<a
href="fb579e87a3">fb579e8</a>)</li>
<li><strong>types:</strong> handle more discriminated union shapes (<a
href="https://redirect.github.com/groq/groq-python/issues/215">#215</a>)
(<a
href="5c72e94d51">5c72e94</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> bump rye to 0.44.0 (<a
href="https://redirect.github.com/groq/groq-python/issues/214">#214</a>)
(<a
href="66feae21c5">66feae2</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/213">#213</a>)
(<a
href="7a1627444b">7a16274</a>)</li>
<li><strong>internal:</strong> remove extra empty newlines (<a
href="https://redirect.github.com/groq/groq-python/issues/211">#211</a>)
(<a
href="4187fa110f">4187fa1</a>)</li>
</ul>
<h2>0.19.0 (2025-03-11)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.18.0...v0.19.0">v0.18.0...v0.19.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> manual updates (<a
href="https://redirect.github.com/groq/groq-python/issues/209">#209</a>)
(<a
href="15e2dca833">15e2dca</a>)</li>
<li><strong>client:</strong> allow passing <code>NotGiven</code> for
body (<a
href="https://redirect.github.com/groq/groq-python/issues/200">#200</a>)
(<a
href="afa6c0fc01">afa6c0f</a>)</li>
<li><strong>client:</strong> send <code>X-Stainless-Read-Timeout</code>
header (<a
href="https://redirect.github.com/groq/groq-python/issues/193">#193</a>)
(<a
href="e8911a43d6">e8911a4</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>add reasoning field to ChoiceDelta class (<a
href="edfee3b6c5">edfee3b</a>)</li>
<li>asyncify on non-asyncio runtimes (<a
href="https://redirect.github.com/groq/groq-python/issues/198">#198</a>)
(<a
href="49387fe83c">49387fe</a>)</li>
<li><strong>client:</strong> mark some request bodies as optional (<a
href="afa6c0fc01">afa6c0f</a>)</li>
<li>GitHub Terraform: Create/Update .github/workflows/stale.yaml [skip
ci] (<a
href="662763a5ea">662763a</a>)</li>
<li>GitHub Terraform: Create/Update .github/workflows/stale.yaml [skip
ci] (<a
href="5298ec1a8c">5298ec1</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>api:</strong> remove chat_completion_chunk to force a
rebuild of it (<a
href="https://redirect.github.com/groq/groq-python/issues/208">#208</a>)
(<a
href="01fb0d14e4">01fb0d1</a>)</li>
<li><strong>docs:</strong> update client docstring (<a
href="https://redirect.github.com/groq/groq-python/issues/204">#204</a>)
(<a
href="a0f45996ff">a0f4599</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/199">#199</a>)
(<a
href="de2ac71d68">de2ac71</a>)</li>
<li><strong>internal:</strong> fix devcontainers setup (<a
href="https://redirect.github.com/groq/groq-python/issues/201">#201</a>)
(<a
href="af101ee282">af101ee</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9f14aacde8"><code>9f14aac</code></a>
release: 0.20.0 (<a
href="https://redirect.github.com/groq/groq-python/issues/212">#212</a>)</li>
<li><a
href="90be0841aa"><code>90be084</code></a>
release: 0.19.0 (<a
href="https://redirect.github.com/groq/groq-python/issues/194">#194</a>)</li>
<li><a
href="662763a5ea"><code>662763a</code></a>
fix: GitHub Terraform: Create/Update .github/workflows/stale.yaml [skip
ci]</li>
<li><a
href="5298ec1a8c"><code>5298ec1</code></a>
fix: GitHub Terraform: Create/Update .github/workflows/stale.yaml [skip
ci]</li>
<li>See full diff in <a
href="https://github.com/groq/groq-python/compare/v0.18.0...v0.20.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `jinja2` from 3.1.5 to 3.1.6
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pallets/jinja/releases">jinja2's
releases</a>.</em></p>
<blockquote>
<h2>3.1.6</h2>
<p>This is the Jinja 3.1.6 security release, which fixes security issues
but does not otherwise change behavior and should not result in breaking
changes compared to the latest feature release.</p>
<p>PyPI: <a
href="https://pypi.org/project/Jinja2/3.1.6/">https://pypi.org/project/Jinja2/3.1.6/</a>
Changes: <a
href="https://jinja.palletsprojects.com/en/stable/changes/#version-3-1-6">https://jinja.palletsprojects.com/en/stable/changes/#version-3-1-6</a></p>
<ul>
<li>The <code>|attr</code> filter does not bypass the environment's
attribute lookup, allowing the sandbox to apply its checks. <a
href="https://github.com/pallets/jinja/security/advisories/GHSA-cpwx-vrp4-4pq7">https://github.com/pallets/jinja/security/advisories/GHSA-cpwx-vrp4-4pq7</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pallets/jinja/blob/main/CHANGES.rst">jinja2's
changelog</a>.</em></p>
<blockquote>
<h2>Version 3.1.6</h2>
<p>Released 2025-03-05</p>
<ul>
<li>The <code>|attr</code> filter does not bypass the environment's
attribute lookup,
allowing the sandbox to apply its checks.
:ghsa:<code>cpwx-vrp4-4pq7</code></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="15206881c0"><code>1520688</code></a>
release version 3.1.6</li>
<li><a
href="90457bbf33"><code>90457bb</code></a>
Merge commit from fork</li>
<li><a
href="065334d1ee"><code>065334d</code></a>
attr filter uses env.getattr</li>
<li><a
href="033c20015c"><code>033c200</code></a>
start version 3.1.6</li>
<li><a
href="bc68d4efa9"><code>bc68d4e</code></a>
use global contributing guide (<a
href="https://redirect.github.com/pallets/jinja/issues/2070">#2070</a>)</li>
<li><a
href="247de5e0c5"><code>247de5e</code></a>
use global contributing guide</li>
<li><a
href="ab8218c7a1"><code>ab8218c</code></a>
use project advisory link instead of global</li>
<li><a
href="b4ffc8ff29"><code>b4ffc8f</code></a>
release version 3.1.5 (<a
href="https://redirect.github.com/pallets/jinja/issues/2066">#2066</a>)</li>
<li>See full diff in <a
href="https://github.com/pallets/jinja/compare/3.1.5...3.1.6">compare
view</a></li>
</ul>
</details>
<br />
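
For context on the advisory above, here is a minimal, illustrative sketch (not part of the dependency bump itself) of the `|attr` filter inside Jinja's `SandboxedEnvironment`, where the environment's attribute checks apply and underscore-prefixed attributes are withheld:

```python
from jinja2.sandbox import SandboxedEnvironment

class Config:
    public = "ok"
    _secret = "should not leak"

env = SandboxedEnvironment()
# A safe attribute renders normally.
print(env.from_string("{{ cfg|attr('public') }}").render(cfg=Config()))
# Underscore-prefixed attributes are unsafe under the sandbox, so the |attr
# lookup yields an undefined value instead of the real attribute.
print(env.from_string("{{ cfg|attr('_secret') }}").render(cfg=Config()))
```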

Updates `launchdarkly-server-sdk` from 9.9.0 to 9.10.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/launchdarkly/python-server-sdk/releases">launchdarkly-server-sdk's
releases</a>.</em></p>
<blockquote>
<h2>v9.10.0</h2>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.9.0...9.10.0">9.10.0</a>
(2025-03-13)</h2>
<h3>Features</h3>
<ul>
<li>Inline context for custom and migration op events (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/327">#327</a>)
(<a
href="ecfd56cc91">ecfd56c</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/launchdarkly/python-server-sdk/blob/main/CHANGELOG.md">launchdarkly-server-sdk's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.9.0...9.10.0">9.10.0</a>
(2025-03-13)</h2>
<h3>Features</h3>
<ul>
<li>Inline context for custom and migration op events (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/327">#327</a>)
(<a
href="ecfd56cc91">ecfd56c</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b7145ea004"><code>b7145ea</code></a>
chore(main): release 9.10.0 (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/328">#328</a>)</li>
<li><a
href="ecfd56cc91"><code>ecfd56c</code></a>
feat: Inline context for custom and migration op events (<a
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/327">#327</a>)</li>
<li>See full diff in <a
href="https://github.com/launchdarkly/python-server-sdk/compare/9.9.0...9.10.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `mem0ai` from 0.1.48 to 0.1.80
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/mem0ai/mem0/releases">mem0ai's
releases</a>.</em></p>
<blockquote>
<h2>0.1.80</h2>
<h2>What's Changed</h2>
<ul>
<li>[Feature] Add support for hybrid search for pinecone vector database
by <a href="https://github.com/deshraj"><code>@​deshraj</code></a> in <a
href="https://redirect.github.com/embedchain/embedchain/pull/1259">embedchain/embedchain#1259</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/embedchain/embedchain/compare/0.1.79...0.1.80">https://github.com/embedchain/embedchain/compare/0.1.79...0.1.80</a></p>
<h2>v0.1.80</h2>
<h2>What's Changed</h2>
<ul>
<li>Update for faiss doc by <a
href="https://github.com/Dev-Khant"><code>@​Dev-Khant</code></a> in <a
href="https://redirect.github.com/mem0ai/mem0/pull/2464">mem0ai/mem0#2464</a></li>
<li>Add support for procedural memory by <a
href="https://github.com/deshraj"><code>@​deshraj</code></a> in <a
href="https://redirect.github.com/mem0ai/mem0/pull/2460">mem0ai/mem0#2460</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/mem0ai/mem0/compare/v0.1.79...v0.1.80">https://github.com/mem0ai/mem0/compare/v0.1.79...v0.1.80</a></p>
<h2>0.1.79</h2>
<h2>What's Changed</h2>
<ul>
<li>[Bug fix] Fix vertex ai integration issue by <a
href="https://github.com/deshraj"><code>@​deshraj</code></a> in <a
href="https://redirect.github.com/embedchain/embedchain/pull/1257">embedchain/embedchain#1257</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/embedchain/embedchain/compare/0.1.78...0.1.79">https://github.com/embedchain/embedchain/compare/0.1.78...0.1.79</a></p>
<h2>v0.1.79</h2>
<h2>What's Changed</h2>
<ul>
<li>update changelog by <a
href="https://github.com/Dev-Khant"><code>@​Dev-Khant</code></a> in <a
href="https://redirect.github.com/mem0ai/mem0/pull/2462">mem0ai/mem0#2462</a></li>
<li>bump version -&gt; 0.1.79 by <a
href="https://github.com/Dev-Khant"><code>@​Dev-Khant</code></a> in <a
href="https://redirect.github.com/mem0ai/mem0/pull/2463">mem0ai/mem0#2...

_Description has been truncated_

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-04-01 22:55:58 +00:00
dependabot[bot]
1b3c465f0d chore(backend/deps): bump psutil from 6.1.1 to 7.0.0 in /autogpt_platform/backend (#9686)
Bumps [psutil](https://github.com/giampaolo/psutil) from 6.1.1 to 7.0.0.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/giampaolo/psutil/blob/master/HISTORY.rst">psutil's
changelog</a>.</em></p>
<blockquote>
<h1>7.0.0</h1>
<p>2025-02-13</p>
<p><strong>Enhancements</strong></p>
<ul>
<li>669_, [Windows]: <code>net_if_addrs()</code>_ also returns the
<code>broadcast</code> address
instead of <code>None</code>.</li>
<li>2480_: Python 2.7 is no longer supported. Latest version supporting
Python
2.7 is psutil 6.1.X. Install it with: <code>pip2 install
psutil==6.1.*</code>.</li>
<li>2490_: removed long deprecated <code>Process.memory_info_ex()</code>
method. It was
deprecated in psutil 4.0.0, released 8 years ago. Substitute is
<code>Process.memory_full_info()</code>.</li>
</ul>
<p><strong>Bug fixes</strong></p>
<ul>
<li>2496_, [Linux]: Avoid segfault (a cPython bug) on
<code>Process.memory_maps()</code>
for processes that use hundreds of GBs of memory.</li>
<li>2502_, [macOS]: <code>virtual_memory()</code>_ now relies on
<code>host_statistics64</code>
instead of <code>host_statistics</code>. This is the same approach used
by <code>vm_stat</code>
CLI tool, and should grant more accurate results.</li>
</ul>
<p><strong>Compatibility notes</strong></p>
<ul>
<li>2480_: Python 2.7 is no longer supported.</li>
<li>2490_: removed long deprecated <code>Process.memory_info_ex()</code>
method.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="ea5b55605f"><code>ea5b556</code></a>
pre-release</li>
<li><a
href="d6e28b7a83"><code>d6e28b7</code></a>
try to fix tests</li>
<li><a
href="104bb3228b"><code>104bb32</code></a>
test cpu_times() for process children</li>
<li><a
href="16c091b380"><code>16c091b</code></a>
test cpu_times() for process children</li>
<li><a
href="eee09da72a"><code>eee09da</code></a>
[OSX] proc.c: Fix goo.gl link in comment for source reference (<a
href="https://redirect.github.com/giampaolo/psutil/issues/2505">#2505</a>)</li>
<li><a
href="17e27801e6"><code>17e2780</code></a>
ci: build aarch64 wheel on GHA aarch64 runner (<a
href="https://redirect.github.com/giampaolo/psutil/issues/2503">#2503</a>)</li>
<li><a
href="1ba8667c89"><code>1ba8667</code></a>
pin black version to 24.X, because new 25.X breaks style</li>
<li><a
href="9c114a5137"><code>9c114a5</code></a>
[OSX] use <code>host_statistics64</code> to get memory metrics (<a
href="https://redirect.github.com/giampaolo/psutil/issues/2502">#2502</a>)</li>
<li><a
href="08d7d43894"><code>08d7d43</code></a>
pin black version to 24.X, because new 25.X breaks style</li>
<li><a
href="a509e5aa18"><code>a509e5a</code></a>
669 windows broadcast addr (<a
href="https://redirect.github.com/giampaolo/psutil/issues/2501">#2501</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/giampaolo/psutil/compare/release-6.1.1...release-7.0.0">compare
view</a></li>
</ul>
</details>
<br />
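
One migration implied by the changelog above: `Process.memory_info_ex()` is removed in 7.0.0, with `Process.memory_full_info()` as the stated substitute. A minimal sketch:

```python
import psutil

proc = psutil.Process()          # the current process
mem = proc.memory_full_info()    # replacement for the removed memory_info_ex()
print(mem.rss, mem.vms)
```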


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=psutil&package-manager=pip&previous-version=6.1.1&new-version=7.0.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-04-01 21:49:47 +00:00
Nicholas Tindle
f23b7543b3 feat: different signup error message (#9704)
<!-- Clearly explain the need for these changes: -->

We keep showing local users error messages that are just not relevant

### Changes 🏗️
Swaps the error-messaging logic to depend on the behavior of the specific
platform the user is on
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Test with auth container down (simulates incorrect setup) and
validate that error message is shown
  - [x] Try normal path
2025-04-01 20:23:23 +00:00
Zamil Majdy
77b18b00c7 feat(frontend): Implement UI for Agent Input subtypes (#9700)
- Follow-up to #9657

<img width="280" alt="image"
src="https://github.com/user-attachments/assets/2f3cd683-db63-485f-8914-5654c34f1a4c"
/>

<img width="520" alt="image"
src="https://github.com/user-attachments/assets/de7e7cb9-61d4-4071-aea8-393ff5200c54"
/>

### Changes 🏗️

* Implement the input UI for Agent Input subtypes.
* Refactor node-input-component, extra out data type decision logic,
share it with runner/library input.
* Add `format` field for short-text, long-text, and mediafile type.
* Unify UI data type enum.

Out of scope:
- Styling for these inputs.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Use all the available agent input subtypes in an agent and run it

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-04-01 20:21:46 +00:00
Abhimanyu Yadav
dbb85baf4c fix(frontend): Fix date picker ux (#9715)
- fix #9315

What have I changed?

- Allowed the user to select the month and year using a dropdown.
- Removed the "Prev" and "Next" buttons for month navigation.
- Fixed the "Today" date design.

<img width="847" alt="Screenshot 2025-03-28 at 6 28 20 PM"
src="https://github.com/user-attachments/assets/740bddfd-e0a2-4799-8325-d52dec31a512"
/>

---------

Co-authored-by: Bently <tomnoon9@gmail.com>
2025-04-01 08:42:04 +00:00
Abhimanyu Yadav
7440f71527 fix(frontend): Sort agents by last edited date in publish dialog (#9724)
- fix #9189 

Currently, the list of agents on the "Publish Agent" dialog is random. I
have sorted them so that the latest edited ones appear first, similar to
the library page.

Co-authored-by: Bently <tomnoon9@gmail.com>
2025-04-01 03:26:07 +00:00
Abhimanyu Yadav
c6089bb6a6 fix(frontend): Remove animation from search bar on library agent page (#9707)
- fix #9523 

Removing the animation from the input search bar and adding the same
behavior as the Google search bar.

![Screenshot 2025-03-28 at 10 20
57 AM](https://github.com/user-attachments/assets/cee009e9-3a81-41b1-9023-503aa040fee4)

---------

Co-authored-by: Bently <tomnoon9@gmail.com>
2025-03-31 12:57:37 +00:00
Abhimanyu Yadav
c71d06a082 fix(frontend): Add extra padding bottom on library agent page (#9706)
- fix #9705 

Adding extra padding so the banner doesn’t cut below the cards.

![Screenshot 2025-03-28 at 9 30
49 AM](https://github.com/user-attachments/assets/d1990dda-4d16-430b-823c-a6338e57d99c)

---------

Co-authored-by: Bently <tomnoon9@gmail.com>
2025-03-31 12:57:19 +00:00
Reinier van der Leer
babcb41f43 refactor(libs): Remove print statements (#9718)
Remove the debug print statements in the logging module.

Every time an app process is started, it prints:
```
Console logging enabled
```
or similar, depending on the logging config.
2025-03-31 10:46:06 +00:00
Reinier van der Leer
abcacacc06 fix(ci): Update lockfiles 2025-03-31 12:16:37 +02:00
Reinier van der Leer
1f2af18388 feat(platform/library): Real-time execution updates (#9695)
- Resolves #8782

### Changes 🏗️

- feat(frontend/library): Use WS subscription to get real-time execution
updates
- feat(backend/ws_api): Send `GraphExecutionUpdate` on all new agent I/O
- Include agent I/O in `GraphExecutionUpdate` (by subclassing
`GraphExecution`)
    - Add `IO_BLOCK_IDs` to `.blocks.io`
- feat(backend/ws_api): Add `subscribe_graph_executions` method to
WebSocket API

- feat(backend): Withhold `GraphExecution.node_executions` from requests
by non-graph-owners
  - Split `GraphExecutionWithNodes` off of `GraphExecution`
- Use `GraphExecution` as much as possible, as it's a much cheaper query
than `GraphExecutionWithNodes`
  - refactor(frontend): Make `GraphExecution.node_executions` optional

- fix(frontend): Parse dates in responses of `/executions` and
`/graphs/{graph_id}/executions`

- refactor(frontend/library): Move sorting logic for agent runs list
from `AgentRunsPage` to `AgentRunsSelectorList`

- refactor(backend/ws_api): Clean up message handler implementations

- refactor(backend/tests): Use `.data.execution.get_graph_execution(..)`
directly instead of `AgentServer.test_get_graph_run_results(..)`

Out-of-scope changes:
- refactor(backend): Remove unnecessary query include from
`.data.graph.get_graph_metadata(..)`

Demo:


https://github.com/user-attachments/assets/8ea6225d-7334-49cb-a522-83f153d840da

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Go to `/library/agents/[id]` for an agent with inputs and outputs
    - Draft and run a new run
      - [x] -> should appear in the list of runs at the top
      - [x] -> should be selected as soon as the request finishes
      - [x] -> new I/O should appear as it is generated
- [x] -> status should be updated in real-time (both in list and in
adjacent details view)
    - Click "Run again"
      - [x] -> should appear in the list of runs at the top
      - [x] -> should be selected as soon as the request finishes
      - [x] -> new I/O should appear as it is generated
- [x] -> status should be updated in real-time (both in list and in
adjacent details view)
- Click "Open in builder" under "Agent actions"; run the agent from the
builder
      - [x] -> should work the same as before
        - [x] -> node I/O should appear in real-time
        - [x] -> node execution statuses should update in real-time
2025-03-28 12:19:14 +00:00
Swifty
8974a0f9e5 fix(platform): Fixs to allow formatting and tests to work from sub command (#9703)
This pull request includes several changes to improve the backend
functionality and configuration of the `autogpt_platform`. The most
important changes involve adding a RabbitMQ service for testing,
enhancing logging configuration, updating the linter script to handle
errors gracefully, and modifying test configurations.

Backend configuration improvements:

*
[`autogpt_platform/backend/docker-compose.test.yaml`](diffhunk://#diff-f6a211ff1c6d96d19adb5641ee287258a6af8d72a99e33dafb4a334094205a43R29-R43):
Added RabbitMQ service configuration for testing, including health
checks and environment variables.
*
[`autogpt_platform/backend/.env.example`](diffhunk://#diff-62020caf1b9a15e0e3b9b3b1b69d5f6464bf7643f62354cbbaabf755d57b6064R191-R192):
Added a section delimiter for optional API keys, used to find where the
optional keys section ends when auto-generating integrations.

Error handling and logging enhancements:

*
[`autogpt_platform/backend/linter.py`](diffhunk://#diff-0787e3ef718ac9963df64d9ab1d8e7a3b35dc4ab0cb874c65da6c2901e1e4991R3):
Updated the `run` function to handle `subprocess.CalledProcessError`
exceptions, print error output to `stderr`, and avoid raising a
stack trace when it shouldn't.
[[1]](diffhunk://#diff-0787e3ef718ac9963df64d9ab1d8e7a3b35dc4ab0cb874c65da6c2901e1e4991R3)
[[2]](diffhunk://#diff-0787e3ef718ac9963df64d9ab1d8e7a3b35dc4ab0cb874c65da6c2901e1e4991L13-R23)

Testing configuration updates:

*
[`autogpt_platform/backend/pyproject.toml`](diffhunk://#diff-26ebebd91da791c6484f07d9d91484a66f52836708f5294b24365603438b880cR111):
Added `asyncio_default_fixture_loop_scope` to pytest configuration for
better control over asyncio fixtures.
*
[`autogpt_platform/backend/run_tests.py`](diffhunk://#diff-f09930577243a4ef5213bf6191a3c500a4b8d3dcfee2d4b452cf7ce66b3c494fL55):
Removed the `postgres-test` service from the test setup script, as we
need all of the docker services up for the tests to run.
2025-03-28 09:39:43 +01:00
Zamil Majdy
071ae3cb1f feat(backend): Make agent store data to be publicly accessible by non authenticated user (#9710)
This PR publicly exposes all the agents listed in the store to the
internet.

### Changes 🏗️

Remove the auth requirement for downloading an agent from the store.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Accessing
`http://localhost:8006/api/store/download/agents/{agentId}` without
authorization key.
2025-03-28 07:11:48 +00:00
Zamil Majdy
c6703dd891 fix(backend): Skip updating status of already terminated graph (#9696)
When we are cancelling a running graph execution, it's possible that the
graph is already terminated.
We need to allow this process to proceed and update the rest of its node
executions to terminated.

### Changes 🏗️

Instead of erroring out on the graph execution status update, we proceed
with updating the node execution statuses.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Stop an already terminated graph
2025-03-26 04:19:37 +00:00
Reinier van der Leer
6e0af09c3d Merge branch 'master' into dev 2025-03-25 17:00:53 +01:00
Reinier van der Leer
9077323b89 fix(backend): Filter Redis messages by user ID (#9697) 2025-03-25 16:56:16 +01:00
Zamil Majdy
33299070d3 feat(frontend/library): Add toast on agent execution request failure (#9689)
Currently, when an agent execution request fails, the front-end
does not display any feedback to the user.
The scope of this change is to provide that feedback.

### Changes 🏗️

* Extracted `useToastOnFail` from `credits` page into a unified helper
method.
* Uses `useToastOnFail` on agent execution requests on library pages.

<img width="1000" alt="image"
src="https://github.com/user-attachments/assets/2daa0597-eb93-457d-8887-0f00c4db89ac"
/>
<img width="1000" alt="image"
src="https://github.com/user-attachments/assets/1a541c98-fb95-424f-8ffe-972332b3ce01"
/>


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Run agent with invalid input 

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-03-25 13:51:17 +00:00
Reinier van der Leer
87f87500cb refactor(backend): Improve error message on unmatched webhook ingress (#9694)
- Resolves #9693

### Changes 🏗️

- Catch the DB error and log a descriptive error message
- Add `NotFoundError` to `backend.util.exceptions`

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [ ] ~~I have tested my changes according to the test plan:~~
- Low-stakes change, high effort to test: we'll see if it works from the
production logs
2025-03-25 13:36:37 +00:00
Reinier van der Leer
1162ec1474 refactor(backend): Reorganize & clean up execution update system (#9663)
- Prep work for #8782
- Prep work for #8779

### Changes 🏗️

- refactor(platform): Differentiate graph/node execution events
- fix(platform): Subscribe to execution updates by `graph_exec_id`
instead of `graph_id`+`graph_version`
- refactor(backend): Move all execution related models and functions
from `.data.graph` to `.data.execution`
- refactor(backend): Reorganize & refactor `.data.execution`

- fix(libs): Remove `load_dotenv` in `.auth.config` to fix test config
issues
- dx: Bump version of `black` in pre-commit config to v24.10.0 to match
poetry.lock

- Other minor refactoring in both frontend and backend

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Run an agent in the builder
    - [x] -> works normally, node I/O is updated in real time
  - Run an agent in the library
    - [x] -> works normally

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-03-25 13:14:04 +01:00
Krzysztof Czerwinski
37f212e950 Stop migrating models 2025-03-25 11:46:00 +00:00
Krzysztof Czerwinski
58bb4f92b7 feat(platform): Onboarding updates (#9636)
This is a follow up to
https://github.com/Significant-Gravitas/AutoGPT/pull/9511 fixing some
issues and updating onboarding.

### Changes 🏗️

- Update `UserOnboarding` data
  - Update schema and add migration
- Change `step` in `UserOnboarding` to `completedSteps` array with
`OnboardingStep` enum
- Remove `isCompleted`: this is now inferred from `completedSteps`
values
- Don't onboard if <2 marketplace agents; that prevents self-host
onboarding
- Add endpoints:
- `is_onboarding_enabled`: to check if users should be onboarded (not if
they finished onboarding); now check if there are at least 2 marketplace
agents
- `get_store_agent`: returns `StoreAgentDetails` for given
`store_listing_version_id`
  - `get_graph_meta_by_store_listing_version_id`: returns `GraphMeta`
- Add the agent to the Library just before running it, not when it is chosen,
and remove the code responsible for removing agents that weren't run
- Move onboarding to `OnboardingProvider` (it'll be needed globally for
Phase 2)
- Multiple fixes, renames for clarity

### Checklist 📋

- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Don't onboard if less than 2 marketplace agents
  - [x] Avoid non-input and credentials agents
  - [x] Onboarding works and can be finished
  - [x] Onboarding resumes
  - [x] Onboarding agent runs correctly

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-03-25 10:40:40 +00:00
Krzysztof Czerwinski
b7ca8d9c30 feat(backend): Migrate old models in existing agents (#9452)
Some existing nodes use models that no longer exist as values on
`LlmModel` enum.

### Changes 🏗️

- Update models for all blocks with `LlmModel` fields that do not exist
in `LlmModel` enum to `gpt-4o`, directly in `AgentNode->constantInput`
db column, on server startup
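
A hedged sketch of the migration rule (the enum values and function name below are placeholders; the real code rewrites the `AgentNode->constantInput` jsonb column at startup):

```python
import json

VALID_MODELS = {"gpt-4o", "gpt-4o-mini"}   # stand-in for the actual LlmModel enum values
FALLBACK_MODEL = "gpt-4o"

def migrate_constant_input(constant_input: str) -> str:
    """Rewrite a node's constantInput JSON if its model no longer exists."""
    data = json.loads(constant_input)
    model = data.get("model")
    if model is not None and model not in VALID_MODELS:
        data["model"] = FALLBACK_MODEL     # only non-existent models are rewritten
    return json.dumps(data)
```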

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Updates wrong models to `gpt-4o` for all affected `AgentNode`s
  - [x] Doesn't update correct models
  - [x] Doesn't insert model when unnecessary
  - [x] Doesn't break other values in jsonb
2025-03-25 09:38:11 +00:00
Zamil Majdy
66ebe4376e fix(backend): Increase block request security; Prevent DNS rebinding & open redirect attack (#9688)
The current block web request utility has logic to keep the system from
firing requests at blocklisted IPs.
However, the current logic is still prone to a few security issues:

* DNS rebinding attack: there is no guarantee that the resolved IP stays
the same between the IP-checking and request-firing steps.
* Open redirect: sensitive request headers are still propagated across
web redirects.

### Changes 🏗️

* Use IP pinning when firing the web request.
* Strip `Authorization`, `Proxy-Authorization`, and `Cookie` headers upon web
redirects (sketched below).
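
A minimal sketch of the header-stripping half of this change, assuming an illustrative helper name (the actual `backend.util.request` code also pins the resolved IP for the whole request):

```python
# Illustrative helper; the actual backend.util.request implementation differs.
SENSITIVE_HEADERS = {"authorization", "proxy-authorization", "cookie"}

def strip_sensitive_headers(headers: dict[str, str]) -> dict[str, str]:
    """Drop credential-bearing headers before following a redirect."""
    return {k: v for k, v in headers.items() if k.lower() not in SENSITIVE_HEADERS}

print(strip_sensitive_headers({"Authorization": "Bearer x", "Accept": "application/json"}))
# -> {'Accept': 'application/json'}
```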

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Test the web request block, add more tests with different
validation scenarios.

(cherry picked from commit f0df4c9174)
2025-03-25 13:51:20 +07:00
Zamil Majdy
f0df4c9174 fix(backend): Increase block request security; Prevent DNS rebinding & open redirect attack (#9688)
The current block web request utility has logic to keep the system from
firing requests at blocklisted IPs.
However, the current logic is still prone to a few security issues:

* DNS rebinding attack: there is no guarantee that the resolved IP stays
the same between the IP-checking and request-firing steps.
* Open redirect: sensitive request headers are still propagated across
web redirects.

### Changes 🏗️

* Use IP pinning when firing the web request.
* Strip `Authorization`, `Proxy-Authorization`, and `Cookie` headers upon web
redirects.


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Test the web request block, add more tests with different
validation scenarios.
2025-03-25 13:09:47 +07:00
Toran Bruce Richards
2e9ca70ce2 Update CONTRIBUTING.md 2025-03-24 18:11:56 +00:00
Reinier van der Leer
4ca1a453c9 refactor(backend): Defer loading of .blocks and .integrations.webhooks on module init (#9664)
Currently, an import statement like `from backend.blocks.basic import
AgentInputBlock` will initialize `backend.blocks` and thereby load all
other blocks. This has quite high potential to cause circular import
issues, and it's bad for performance in cases where we don't want to
load all blocks (yet).
The same goes for `backend.integrations.webhooks`.

### Changes 🏗️

- Change `__init__.py` of `backend.blocks` and
`backend.integrations.webhooks` to cached loader functions rather than
init-time code
- Change type of `BlockWebhookConfig.provider` to `ProviderName`
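
A rough sketch of the cached-loader pattern, with assumed names (`load_all_blocks`, the `Block` class-name suffix); the real `backend.blocks` loader differs in detail:

```python
import functools
import importlib
import inspect
import pkgutil

@functools.cache
def load_all_blocks(package_name: str = "backend.blocks") -> dict[str, type]:
    """Import the package's block modules on first call only; later calls hit the cache."""
    package = importlib.import_module(package_name)
    blocks: dict[str, type] = {}
    for mod_info in pkgutil.iter_modules(package.__path__):
        module = importlib.import_module(f"{package_name}.{mod_info.name}")
        for name, obj in inspect.getmembers(module, inspect.isclass):
            if name.endswith("Block"):
                blocks[name] = obj
    return blocks
```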

<!-- test edit to check that this doesn't break anything -->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Set up and use an agent with a webhook-triggered block

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-03-24 15:44:45 +00:00
Zamil Majdy
6f48515863 fix(blocks): Disable and provide toggle for Agent Input Block subtypes (#9677)
Agent Input Block subtypes do not have a proper input UI yet on the
library & run input page.

### Changes 🏗️

Provide a toggle to enable these blocks and set it to False by default.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-03-24 12:33:54 +00:00
Nicholas Tindle
7ba566e768 feat(frontend/backend): admin agent review table (#9634)
<!-- Clearly explain the need for these changes: -->
We need an admin agent approval UI for handling the submissions to the
marketplace

### Changes 🏗️
- Adds routes to the admin routes list
- Fixes the db query for submitting new versions of existing agents
- Add models for responses that include version details
- add the admin pages for agent
- Adds the Admin Agent Data Table
- Add all the new endpoints to the client.ts
Models changes
- convert the Submission status to an enum
- remove is_approved from models which was left incorrectly
- Add StoreListingWithVersions
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Test the admin dashboard for
    - [x] Reject
    - [x] Accept
    - [x] Updating listing
    - [x] More version submissions

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-03-24 07:52:35 +00:00
Zamil Majdy
26984a7338 feat(backend): Add capability to charge based on block execution count (#9661)
Blocks that are not defined in the block cost table are effectively free. This
lack of cost control makes it hard to enforce quotas. The scope of this change
is to provide a way to charge executions in real time, based on the number of
blocks executed.

### Changes 🏗️

* Add execution charge logic based on the number of blocks executed,
controlled by these two configurations (sketched below):
* `execution_cost_count_threshold`: the execution is charged each time
the executed-block count reaches a multiple of this number.
* `execution_cost_per_threshold`: the amount charged at each threshold
multiple.
* Put the charging logic in the graph execution flow (as opposed to the
node level) so it runs serially and an insufficient-funds error is
guaranteed to stop the graph execution.
* Moved cost calculation logic into backend/executor/util.py
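
A hedged sketch of the threshold rule above; the function name is illustrative, and the real charging code lives in the executor:

```python
def threshold_charge(executed_block_count: int,
                     execution_cost_count_threshold: int,
                     execution_cost_per_threshold: int) -> int:
    """Charge once every time the executed-block count hits a multiple of the threshold."""
    if execution_cost_count_threshold <= 0:
        return 0
    if executed_block_count % execution_cost_count_threshold == 0:
        return execution_cost_per_threshold
    return 0

# e.g. with a threshold of 100 and a cost of 1, blocks #100, #200, #300, ...
# each trigger a 1-credit charge.
print(threshold_charge(200, 100, 1))  # -> 1
```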

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Execute graph with configured threshold & cost and test the
balance being deducted on that.
  - [x] Existing cost calculation is still being done without any issue.
  - [x] Low balance stop the whole graph execution.
2025-03-24 07:26:33 +00:00
Zamil Majdy
5b118fc939 fix(blocks): Fix failing block test AgentToggleInputBlock 2025-03-24 12:31:16 +07:00
Zamil Majdy
ed48d1c04f fix(blocks): Change title of placeholder_values on AgentDropdownInputBlock 2025-03-24 12:21:20 +07:00
Zamil Majdy
8d87c08b8c fix(blocks): Set AgentInputBlocks default value to None and make it a non-advanced field 2025-03-24 12:15:59 +07:00
Zamil Majdy
e49cb43b49 feat(block): Add agent input block subtypes (#9657)
### Changes 🏗️

Added these types of input blocks:
* TextShort
* TextLong
* Number
* Date
* Time
* FileUpload
* Dropdown
* Toggle

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Test in respective block codes.
2025-03-24 02:57:10 +00:00
Nicholas Tindle
56663c5fe9 feat(backend): add backend support for store listings submissions (#9628)
<!-- Clearly explain the need for these changes: -->
The store listing and submissions were previously just a best guess
without much implementation. This updates the database models, queries,
and related code to be based on discussion around what the process
should look like. It also adds and updates the relevant routers for this
change.

### Changes 🏗️
Store Listing
- change isApproved to hasApprovedVersion
- Move slug into store listing
- mark an active version in store listing

Store Version
- Move submissions into version
- make name optional
- have state transition timestamps for submitted and approved/rejected
- added a changes field
- added internal comments and clarified review comments field

SubmissionStatus
- Fixed DAFT to DRAFT

StoreListingSubmission
- Dropped table

Graph
- Used more modern format for the params for prisma -- no other changes

Added migrations for all the model movements

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Use the store codepaths from the release testplan doc as the test
plan (claude I can't publish the testplan but I am a maintainer lol,
trust me here my guy, you're supposed to be lenient)
  - [x] Check the db is used as appropriate following the rules

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-03-23 23:30:54 +00:00
Nicholas Tindle
a537fd0891 [Snyk] Security upgrade next from 14.2.23 to 14.2.25 (#9672)
![snyk-top-banner](https://redirect.github.com/andygongea/OWASP-Benchmark/assets/818805/c518c423-16fe-447e-b67f-ad5a49b5d123)

### Snyk has created this PR to fix 1 vulnerabilities in the yarn
dependencies of this project.

#### Snyk changed the following file(s):

- `autogpt_platform/frontend/package.json`
- `autogpt_platform/frontend/yarn.lock`


#### Note for
[zero-installs](https://yarnpkg.com/features/zero-installs) users

If you are using the Yarn feature
[zero-installs](https://yarnpkg.com/features/zero-installs) that was
introduced in Yarn V2, note that this PR does not update the
`.yarn/cache/` directory meaning this code cannot be pulled and
immediately developed on as one would expect for a zero-install project
- you will need to run `yarn` to update the contents of the
`./yarn/cache` directory.
If you are not using zero-install you can ignore this as your flow
should likely be unchanged.




#### Vulnerabilities that will be fixed with an upgrade:

| | Issue | Score |
|:---:|:---|:---|
| ![critical severity](https://res.cloudinary.com/snyk/image/upload/w_20,h_20/v1561977819/icon/c.png 'critical severity') | Improper Authorization<br/>[SNYK-JS-NEXT-9508709](https://snyk.io/vuln/SNYK-JS-NEXT-9508709) | **751** |




---

> [!IMPORTANT]
>
> - Check the changes in this PR to ensure they won't cause issues with
your project.
> - Max score is 1000. Note that the real score may have changed since
the PR was raised.
> - This PR was automatically created by Snyk using the credentials of a
real user.

---

**Note:** _You are seeing this because you or someone else with access
to this repository has authorized Snyk to open fix PRs._

For more information:
🧐 [View latest project
report](https://app.snyk.io/org/significant-gravitas/project/3d924968-0cf3-4767-9609-501fa4962856?utm_source&#x3D;github&amp;utm_medium&#x3D;referral&amp;page&#x3D;fix-pr)
📜 [Customise PR
templates](https://docs.snyk.io/scan-using-snyk/pull-requests/snyk-fix-pull-or-merge-requests/customize-pr-templates?utm_source=github&utm_content=fix-pr-template)
🛠 [Adjust project
settings](https://app.snyk.io/org/significant-gravitas/project/3d924968-0cf3-4767-9609-501fa4962856?utm_source&#x3D;github&amp;utm_medium&#x3D;referral&amp;page&#x3D;fix-pr/settings)
📚 [Read about Snyk's upgrade
logic](https://docs.snyk.io/scan-with-snyk/snyk-open-source/manage-vulnerabilities/upgrade-package-versions-to-fix-vulnerabilities?utm_source=github&utm_content=fix-pr-template)

---

**Learn how to fix vulnerabilities with free interactive lessons:**

🦉 [Improper
Authorization](https://learn.snyk.io/lesson/broken-function-level-authorization/?loc&#x3D;fix-pr)

[//]: #
'snyk:metadata:{"customTemplate":{"variablesUsed":[],"fieldsUsed":[]},"dependencies":[{"name":"next","from":"14.2.23","to":"14.2.25"}],"env":"prod","issuesToFix":["SNYK-JS-NEXT-9508709"],"prId":"94e06f17-2ccd-4771-9964-d7bfc8b52658","prPublicId":"94e06f17-2ccd-4771-9964-d7bfc8b52658","packageManager":"yarn","priorityScoreList":[751],"projectPublicId":"3d924968-0cf3-4767-9609-501fa4962856","projectUrl":"https://app.snyk.io/org/significant-gravitas/project/3d924968-0cf3-4767-9609-501fa4962856?utm_source=github&utm_medium=referral&page=fix-pr","prType":"fix","templateFieldSources":{"branchName":"default","commitMessage":"default","description":"default","title":"default"},"templateVariants":["updated-fix-title","priorityScore"],"type":"auto","upgrade":["SNYK-JS-NEXT-9508709"],"vulns":["SNYK-JS-NEXT-9508709"],"patch":[],"isBreakingChange":false,"remediationStrategy":"vuln"}'

Co-authored-by: snyk-bot <snyk-bot@snyk.io>
2025-03-23 23:09:47 +00:00
Krzysztof Czerwinski
d694ccd50f fix(frontend): Fill defaults from schema to hardcodedValues at node creation (#9632)
Defaults need to be handled as special cases every time there's no
`hardcodedValues` in node data. This causes multiple issues such as
`useCredentials` not taking into account default model and requiring
user to manually switch model back and forth.

### Changes 🏗️

- Set default values from `inputSchema` to `hardcodedValues` when new
node is placed.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Newly placed node has defaults set as `hardcodedValues`
- [x] AI Blocks: Model is recognised, node shows price and credentials
work correctly without the need to switch

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-03-23 22:48:28 +00:00
Zamil Majdy
f01b31873f feat(backend): Avoid loading all node executions on large continuous agent (#9667)
A graph that processes tens of thousands of nodes will cripple the
system since the API tries to load all of them and dump them into the
browser. The scope of this change is to avoid such an issue by only
returning the last 1000 node executions.

### Changes 🏗️

* Return only 1000 node executions from `AgentNodeExecutions` reference.
* Unify the include clause (and its format) for fetching `AgentNodeExecutions`
in one place.
* Fix & optimize `cancel_execution` logic to always set both the graph &
node execution status in batch.
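
A hedged illustration of the capping rule; the `added_time` key and function below are assumptions, not the actual Prisma include clause:

```python
def latest_node_executions(node_executions: list[dict], limit: int = 1000) -> list[dict]:
    """Keep only the newest `limit` entries, sorted newest-first by an assumed 'added_time' key."""
    return sorted(node_executions, key=lambda ne: ne["added_time"], reverse=True)[:limit]
```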

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Execute a graph in a loop that executes 10000 nodes, it should
only display the last 1000 nodes when refreshed. Cancelling the graph
should also not cripple the server.
2025-03-21 12:48:35 +00:00
Zamil Majdy
5411e18bd0 feat(backend): Mark starting nodes as QUEUED instead of INCOMPLETE during the initial execution (#9665)
Having the starting nodes of the execution marked as incomplete misled
the users.

### Changes 🏗️

Mark the starting nodes during the executions as QUEUED instead of
INCOMPLETE.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Executed the graph, the incomplete initial starting node is no
more.
2025-03-21 11:59:27 +00:00
Zamil Majdy
b85f6196aa fix(frontend): Fix unreliable websocket connection for node execution update (#9666)
The current execution update mechanism is unreliable: once you lose the
WebSocket connection, you receive no further updates.

### Changes 🏗️

Fix web socket re-connection logic.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Run the app and execute an agent, then restart the API server, and
re-execute the app without refreshing the page.
2025-03-21 11:50:20 +00:00
Zamil Majdy
a1ac7b18f9 feat(backend): Avoid connecting the same host and falling-back to defined api_host (#9668)
### Changes 🏗️

Avoid connecting to the same host and falling back to the defined api_host.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [ ] I have tested my changes according to the test plan:
- [ ] Define a custom DBMANAGER_HOST, RestService should access the
`db_manager` service using localhost.
2025-03-21 11:41:01 +00:00
Zamil Majdy
42232f55e8 feat(platform): Use a single DB manager across the system (#9662)
DB Manager uses DB connections, and multiple instances of it hog the
very limited database connection quota. We need this service to be a
unified place to control the limited DB connections.

### Changes 🏗️

Consolidate all database manager usage in a single place, currently
running on the REST service.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-03-21 09:24:08 +00:00
Reinier van der Leer
9a661b5101 fix(backend/ws): Add user_id to websocket event subscription key (#9660)
- Add `user_id` to WS subscription key
- Add error catching to WS message handler
2025-03-20 17:54:04 +01:00
Zamil Majdy
90b147ff51 Merge branch 'dev' of github.com:Significant-Gravitas/AutoGPT into dev 2025-03-19 23:39:51 +07:00
Reinier van der Leer
6e4fbb0cb5 fix(frontend/library): Truncate agent card title and description (#9658)
- Resolves #9631

### Changes 🏗️

- Truncate library agent card title (2 lines) and description (3 lines)
- Make "See runs" and "Open in builder" stick to bottom of card
regardless of other content
- Reduce number of grid columns (4 -> 3) in `lg` layout on `/library` to
give items more horizontal space

![screenshot of library agent grid with the applied
changes](https://github.com/user-attachments/assets/b27d5c97-33b8-4708-9f8c-fc67aad899c9)


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Visually test the changes made on different screen sizes
2025-03-19 23:39:09 +07:00
Reinier van der Leer
df6203343d fix(frontend/library): Improve agent I/O rendering (#9656)
- Related to #8784

### Changes 🏗️

- feat(frontend/library): Improve agent output styling & fix content
overflow issue
- fix(frontend/library): Fix overlap between content and inset button of
expandable input fields (#9650)
- fix(backend): Unbreak loading graph executions with missing inputs

![screenshot of restyled Output
section](https://github.com/user-attachments/assets/97836158-5735-4d01-94dd-16e3fb6999c6)

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- Run an agent with at least one input *not* filled out; view this run
in the Library
    - [x] -> page should load normally
    - [x] -> agent inputs should load and show normally
- Run an agent that generates long output; view this run in the Library
- [x] -> output should not overflow its container or stretch the page
layout
    - [x] -> visually check that the output section looks slick
2025-03-19 23:39:09 +07:00
Toran Bruce Richards
93238dc78c fix(blocks): SendWebRequestBlock to properly handle HTTP error responses (#9655)
### Issue
The SendWebRequestBlock currently fails to properly route HTTP error
responses (4xx, 5xx) to their designated output pins (`client_error` and
`server_error`). Instead, these errors are being sent to the default
"Error" pin, breaking expected workflows that depend on proper error
handling.

### Root Cause
The underlying issue is that our custom `requests` module from
`backend.util.request` appears to automatically raise exceptions for
error status codes (similar to how `raise_for_status()` works in the
standard requests library). When these exceptions are thrown, the
block's conditional logic for handling different status codes is
bypassed entirely.

### Changes
This PR adds proper exception handling to catch HTTP errors raised by
the requests module and routes them to the appropriate output pins:
- Added a try-except block to capture `requests.exceptions.HTTPError`
- Extract status code and response data from the caught exception
- Yield to the proper pin based on the status code (4xx → client_error,
5xx → server_error)
- Maintain consistent behavior with the original design intent

### Additional Context
This change maintains backward compatibility while ensuring the block
behaves according to its documented functionality. Users can now
properly handle 4xx and 5xx errors in their workflows as originally
intended.
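
A hedged sketch of the routing described above; the pin names mirror the block's outputs, but the helper itself is illustrative rather than the block's actual code:

```python
import requests

def run_request(url: str):
    """Yield (pin_name, payload) pairs, routing HTTP errors by status class."""
    try:
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        yield "response", response.text
    except requests.exceptions.HTTPError as e:
        status = e.response.status_code
        payload = {"status": status, "body": e.response.text}
        if 400 <= status < 500:
            yield "client_error", payload   # 4xx -> client_error pin
        else:
            yield "server_error", payload   # 5xx -> server_error pin
    except requests.exceptions.RequestException as e:
        yield "error", str(e)               # network-level failures keep using the generic error pin
```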

<!-- Clearly explain the need for these changes: -->
### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Test the block with new changes and old and ensure expected
behavior

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-03-19 23:39:09 +07:00
Nicholas Tindle
b6260b5ce9 fix(backend): drastically increase batching time for the agent run (#9654)
<!-- Clearly explain the need for these changes: -->

We accidentally sent several emails within 10 minutes, and we're going to get
blocked for spam if we keep it up.

### Changes 🏗️
- Moves the batching timer from 1 minute to 60 minutes
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Test via batching a series of notifications over an hour
2025-03-19 23:39:09 +07:00
Reinier van der Leer
9c31c79898 feat(frontend/library): Make agent input fields expandable (#9650)
- Resolves #9622

### Changes 🏗️

- Add pop-out button + modal to input fields in Agent Run Draft view on
`/library/agents/[id]`
- Fix `icon`-variant button styling

![the expand button on the input
fields](https://github.com/user-attachments/assets/00be33fe-44d1-490a-9cab-9696df8f6e6f)
![the expanded input modal that
appears](https://github.com/user-attachments/assets/787f33b9-d884-467b-b99b-dcbec8a1d059)

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Go to an agent's page -> click "+ New run"
    - [x] -> pop-out button should show on all input fields
- Enter a value in one of the inputs; click the pop-out button on that
input
    - [x] -> input modal with large text field should open
- [x] -> the value you just entered should be present in the modal's
text field
  - Edit the value & click "Save"
    - [x] -> the modal should close
- [x] -> the value in the corresponding input field should be updated
2025-03-19 23:39:09 +07:00
Zamil Majdy
f8a6c9e67f fix(block): Revert custom get_missing_links method on AddToListBlock 2025-03-19 23:39:09 +07:00
Zamil Majdy
ff9a5cc638 fix(block): Avoid infinite loop execution on AddToListBlock self-loop (#9629)
### Changes 🏗️

<img width="757" alt="image"
src="https://github.com/user-attachments/assets/909aab58-24c7-42ec-9580-ac3e9f32057e"
/>

Since a self-loop is now allowed for AddToListBlock, providing an entry
pin using a static output would cause infinite execution.
This PR prevents that scenario from being allowed.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Described above
2025-03-19 23:39:09 +07:00
Zamil Majdy
4e6144803b fix(platform): Fix possible db-config permission denied when running two different Supabase versions (#9652)
The change in https://github.com/Significant-Gravitas/AutoGPT/pull/9620
introduces a breaking change in the database volume content; however,
the database's volume location does not change, so switching between the
two versions causes a clash.

### Changes 🏗️

Renamed db-config named volume to supabase-config.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] CI
2025-03-19 23:39:09 +07:00
Reinier van der Leer
2c92122721 feat(platform/library): Add icons to primary agent run action buttons (#9651)
- Resolves #9612

### Changes 🏗️

- Add icon to "Run" button in run draft view
- Add icons "Stop run" and "Run again" buttons in run view

!["Run"
button](https://github.com/user-attachments/assets/da863753-6cb2-4cea-aa00-c313b606d198)
!["Run again"
button](https://github.com/user-attachments/assets/79958187-05dd-494e-a3a1-e9745db0d2d4)
!["Stop run"
button](https://github.com/user-attachments/assets/ad37ec3a-3c0b-493b-b548-e6b902eb8bda)


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Purely visual changes, no functional test needed.
    Technical changes are covered by the type checker.
2025-03-19 23:39:09 +07:00
Nicholas Tindle
9b19d1959e feat(frontend): break out the sidebar into a reusable component + use it for admin page (#9618)
<!-- Clearly explain the need for these changes: -->
We need a sidebar for the admin page; we might as well make the sidebar a
reusable component and reuse it there!

### Changes 🏗️
- Extracts the agptui sidebar into a more reusable component
- Updates the usage of that sidebar in the settings page
- Uses that same sidebar for the admin page

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Test the old sidebar
  - [x] Test the new sidebar for admin
2025-03-19 23:39:09 +07:00
Zamil Majdy
17f3a19bc3 feat(backend): Support sub-agent on export/import agent feature (#9640)
Agents that use Agent blocks should be seamlessly downloadable from the
marketplace to a file and importable from a file.

Requirements:
* A recursive export process that exports all the required agents to a
single file, no matter how many layers deep (taking care of potential
loops).
* An import process that expects and extracts several agents from a
single file into your library at once.

Considerations:
We need to ensure the reference IDs in the Agent Blocks are updated to
match the imported sub-agent IDs, to prevent broken references.
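
A minimal sketch of what these requirements imply (hypothetical data shapes and helper names, not the actual Graph model): export walks sub-agents recursively with a visited set to break reference loops, and import assigns fresh IDs and rewrites the Agent block references to match.

```python
import uuid

# Hypothetical shape: {graph_id: {"id": ..., "nodes": [{"block": "AgentExecutorBlock",
#                                                       "agent_id": <sub-graph id>}, ...]}}


def export_with_subgraphs(graph_id: str, graphs: dict, visited: set | None = None) -> list[dict]:
    """Recursively collect a graph and every sub-agent it references,
    using a visited set to guard against reference loops."""
    visited = visited if visited is not None else set()
    if graph_id in visited:
        return []
    visited.add(graph_id)
    graph = graphs[graph_id]
    collected = [graph]
    for node in graph["nodes"]:
        if node.get("block") == "AgentExecutorBlock":
            collected += export_with_subgraphs(node["agent_id"], graphs, visited)
    return collected


def import_graphs(exported: list[dict]) -> list[dict]:
    """Assign fresh IDs and rewrite Agent-block references so sub-agent
    links point at the newly imported graphs instead of the old IDs."""
    id_map = {graph["id"]: str(uuid.uuid4()) for graph in exported}
    for graph in exported:
        graph["id"] = id_map[graph["id"]]
        for node in graph["nodes"]:
            if node.get("block") == "AgentExecutorBlock":
                node["agent_id"] = id_map.get(node["agent_id"], node["agent_id"])
    return exported
```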

### Changes 🏗️

* Add `sub_graphs` field on the Graph model
* Improve graph creation query to support inserting graph + subgraphs in
batch
* Deprecate graph template & remove its column
* Update the marketplace agent download (unified the method used, with
more secure cleanup & a proper ownership check).
* Fix failing test cases

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Export graph with sub agents.
  - [x] Import the exported graph with sub agents.
2025-03-19 23:39:09 +07:00
Reinier van der Leer
596b29f53a feat(platform/library): Add "Export agent to file" action (#9627)
- Resolves #9609

### Changes 🏗️

- feat(frontend/library): Add "Export agent to file" button
- fix(frontend/library): Put "Open in builder" button behind access
check

- feat(backend): Improve & move graph export stripping logic (see the sketch after this list)
  - Add logic to strip `SecretField` values
  - Move node stripping logic to `NodeModel` from `GraphModel`
    - Add `NodeModel.stripped_for_export()` method
  - Add `NodeModel.block` property
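
A small sketch of the stripping idea (hypothetical field names and key list; the actual logic lives in `NodeModel.stripped_for_export()`): drop any input value that looks like a secret or credential before the graph is written to the export file.

```python
# Hypothetical sketch: remove secret/credential inputs from each node
# before the graph is serialized into the export file.
SENSITIVE_KEYS = {"credentials", "api_key", "password", "token"}


def stripped_for_export(node: dict) -> dict:
    """Return a copy of a node with secret-looking input values removed."""
    stripped = dict(node)
    stripped["input_default"] = {
        key: value
        for key, value in node.get("input_default", {}).items()
        if key not in SENSITIVE_KEYS
    }
    return stripped


def strip_graph_for_export(graph: dict) -> dict:
    return {**graph, "nodes": [stripped_for_export(node) for node in graph["nodes"]]}
```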

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- Create and configure an agent with the Publish To Medium block and a
block that uses credentials
  - Go to `/library/agents/[id]` for the agent you just created
    - [x] -> "Open in builder" button should show
    - [x] -> "Open in builder" button should work
    - [x] -> "Export agent to file" button should show
    - [x] -> "Export agent to file" button should work
      - [x] -> Exported file contains no credentials or secrets
      - [ ] -> ~~Exported file contains no user IDs~~
  - Go to `/library/agents/[id]` for an agent from the marketplace
    - [x] -> "Open in builder" button should not show
    - [x] -> "Export agent to file" button should not show
2025-03-19 23:39:09 +07:00
dependabot[bot]
e0300f3d13 chore(libs/deps-dev): bump ruff from 0.9.6 to 0.9.9 in /autogpt_platform/autogpt_libs in the development-dependencies group (#9559)
Bumps the development-dependencies group in
/autogpt_platform/autogpt_libs with 1 update:
[ruff](https://github.com/astral-sh/ruff).

Updates `ruff` from 0.9.6 to 0.9.9
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.9.9</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>Fix caching of unsupported-syntax errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16425">#16425</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Only show unsupported-syntax errors in editors when preview mode is
enabled (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16429">#16429</a>)</li>
</ul>
<h2>Contributors</h2>
<ul>
<li><a
href="https://github.com/InSyncWithFoo"><code>@​InSyncWithFoo</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a
href="https://github.com/dhruvmanila"><code>@​dhruvmanila</code></a></li>
<li><a href="https://github.com/ntBre"><code>@​ntBre</code></a></li>
</ul>
<h2>Install ruff 0.9.9</h2>
<h3>Install prebuilt binaries via shell script</h3>
<pre lang="sh"><code>curl --proto '=https' --tlsv1.2 -LsSf
https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-installer.sh
| sh
</code></pre>
<h3>Install prebuilt binaries via powershell script</h3>
<pre lang="sh"><code>powershell -ExecutionPolicy ByPass -c &quot;irm
https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-installer.ps1
| iex&quot;
</code></pre>
<h2>Download ruff 0.9.9</h2>
<table>
<thead>
<tr>
<th>File</th>
<th>Platform</th>
<th>Checksum</th>
</tr>
</thead>
<tbody>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-apple-darwin.tar.gz">ruff-aarch64-apple-darwin.tar.gz</a></td>
<td>Apple Silicon macOS</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-apple-darwin.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-apple-darwin.tar.gz">ruff-x86_64-apple-darwin.tar.gz</a></td>
<td>Intel macOS</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-apple-darwin.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-pc-windows-msvc.zip">ruff-aarch64-pc-windows-msvc.zip</a></td>
<td>ARM64 Windows</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-pc-windows-msvc.zip.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-i686-pc-windows-msvc.zip">ruff-i686-pc-windows-msvc.zip</a></td>
<td>x86 Windows</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-i686-pc-windows-msvc.zip.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-pc-windows-msvc.zip">ruff-x86_64-pc-windows-msvc.zip</a></td>
<td>x64 Windows</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-pc-windows-msvc.zip.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-unknown-linux-gnu.tar.gz">ruff-aarch64-unknown-linux-gnu.tar.gz</a></td>
<td>ARM64 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-i686-unknown-linux-gnu.tar.gz">ruff-i686-unknown-linux-gnu.tar.gz</a></td>
<td>x86 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-i686-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-powerpc64-unknown-linux-gnu.tar.gz">ruff-powerpc64-unknown-linux-gnu.tar.gz</a></td>
<td>PPC64 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-powerpc64-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-powerpc64le-unknown-linux-gnu.tar.gz">ruff-powerpc64le-unknown-linux-gnu.tar.gz</a></td>
<td>PPC64LE Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-powerpc64le-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-s390x-unknown-linux-gnu.tar.gz">ruff-s390x-unknown-linux-gnu.tar.gz</a></td>
<td>S390x Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-s390x-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-unknown-linux-gnu.tar.gz">ruff-x86_64-unknown-linux-gnu.tar.gz</a></td>
<td>x64 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-armv7-unknown-linux-gnueabihf.tar.gz">ruff-armv7-unknown-linux-gnueabihf.tar.gz</a></td>
<td>ARMv7 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-armv7-unknown-linux-gnueabihf.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-unknown-linux-musl.tar.gz">ruff-aarch64-unknown-linux-musl.tar.gz</a></td>
<td>ARM64 MUSL Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-unknown-linux-musl.tar.gz.sha256">checksum</a></td>
</tr>
</tbody>
</table>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.9.9</h2>
<h3>Preview features</h3>
<ul>
<li>Fix caching of unsupported-syntax errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16425">#16425</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Only show unsupported-syntax errors in editors when preview mode is
enabled (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16429">#16429</a>)</li>
</ul>
<h2>0.9.8</h2>
<h3>Preview features</h3>
<ul>
<li>Start detecting version-related syntax errors in the parser (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16090">#16090</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>pylint</code>] Mark fix unsafe (<code>PLW1507</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16343">#16343</a>)</li>
<li>[<code>pylint</code>] Catch <code>case np.nan</code>/<code>case
math.nan</code> in <code>match</code> statements (<code>PLW0177</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/16378">#16378</a>)</li>
<li>[<code>ruff</code>] Add more Pydantic models variants to the list of
default copy semantics (<code>RUF012</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16291">#16291</a>)</li>
</ul>
<h3>Server</h3>
<ul>
<li>Avoid indexing the project if <code>configurationPreference</code>
is <code>editorOnly</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16381">#16381</a>)</li>
<li>Avoid unnecessary info at non-trace server log level (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16389">#16389</a>)</li>
<li>Expand <code>ruff.configuration</code> to allow inline config (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16296">#16296</a>)</li>
<li>Notify users for invalid client settings (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16361">#16361</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>Add <code>per-file-target-version</code> option (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16257">#16257</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>[<code>refurb</code>] Do not consider docstring(s)
(<code>FURB156</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16391">#16391</a>)</li>
<li>[<code>flake8-self</code>] Ignore attribute accesses on
instance-like variables (<code>SLF001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16149">#16149</a>)</li>
<li>[<code>pylint</code>] Fix false positives, add missing methods, and
support positional-only parameters (<code>PLE0302</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16263">#16263</a>)</li>
<li>[<code>flake8-pyi</code>] Mark <code>PYI030</code> fix unsafe when
comments are deleted (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16322">#16322</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Fix example for <code>S611</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16316">#16316</a>)</li>
<li>Normalize inconsistent markdown headings in docstrings (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16364">#16364</a>)</li>
<li>Document MSRV policy (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16384">#16384</a>)</li>
</ul>
<h2>0.9.7</h2>
<h3>Preview features</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="091d0af2ab"><code>091d0af</code></a>
Bump version to Ruff 0.9.9 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16434">#16434</a>)</li>
<li><a
href="3d72138740"><code>3d72138</code></a>
Check <code>LinterSettings::preview</code> for version-related syntax
errors (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16429">#16429</a>)</li>
<li><a
href="4a23756024"><code>4a23756</code></a>
Avoid caching files with unsupported syntax errors (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16425">#16425</a>)</li>
<li><a
href="af62f7932b"><code>af62f79</code></a>
Prioritize &quot;bug&quot; label for changelog sections (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16433">#16433</a>)</li>
<li><a
href="0ced8d053c"><code>0ced8d0</code></a>
[<code>flake8-copyright</code>] Add links to applicable options
(<code>CPY001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16421">#16421</a>)</li>
<li><a
href="a8e171f82c"><code>a8e171f</code></a>
Fix string-length limit in documentation for PYI054 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16432">#16432</a>)</li>
<li><a
href="cf83584abb"><code>cf83584</code></a>
Show version-related syntax errors in the playground (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16419">#16419</a>)</li>
<li><a
href="764aa0e6a1"><code>764aa0e</code></a>
Allow passing <code>ParseOptions</code> to inline tests (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16357">#16357</a>)</li>
<li><a
href="568cf88c6c"><code>568cf88</code></a>
Bump version to 0.9.8 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16414">#16414</a>)</li>
<li><a
href="040071bbc5"><code>040071b</code></a>
[red-knot] Ignore surrounding whitespace when looking for `&lt;!--
snapshot-diag...</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.9.6...0.9.9">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=pip&previous-version=0.9.6&new-version=0.9.9)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.


---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-19 23:39:09 +07:00
Zamil Majdy
780fddc2a0 feat(platform)!: Lock Supabase docker-compose code (#9620)
We have been submoduling Supabase for provisioning local Supabase
instances using docker-compose. Aside from the huge size of unrelated
code being pulled, there is also the risk of pulling unintentional
breaking changes from upstream into the platform.

The latest Supabase changes hide port 5432 on the supabase-db container
and shift it to the supavisor, an instance that we are currently not
using. This causes an error in the existing setup.

## BREAKING CHANGES

This change introduces a different volume location for the database
content; pulling it will start you with fresh data. To keep your old
data after this change, execute this command:
```
cp -r supabase/docker/volumes/db/data db/docker/volumes/db/data
```


### Changes 🏗️

The scope of this PR is snapshotting the current docker-compose code
obtained from the Supabase repository and embedding it into our
repository. This eliminates the need for submodule / recursive cloning
and for bringing the entire Supabase repository into the platform.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Existing CI
2025-03-19 23:39:09 +07:00
Nicholas Tindle
52b4351961 fix: backend admin page logic was broken (#9616)
<!-- Clearly explain the need for these changes: -->

We're building out admin utilities, so we need to bring back the `/admin`
route with RBAC. This PR re-enables that route and makes it work with the
latest changes.

### Changes 🏗️
- Adds back the removed logic
- Refactors the role checks to fix a minor bug on the admin page and,
more importantly, to clarify the logic
- Updates routes to the latest
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Test with admin and authenticated user roles
  - [x] Test with logged out user role
- [x] For the above, check all the existing routes + the new ones in
`middleware.ts`
2025-03-19 23:39:09 +07:00
Zamil Majdy
8757439192 fix(platform): Fallback front-end-url to platform-url for billing page 2025-03-19 23:39:09 +07:00
dependabot[bot]
e124ee6a9e chore(frontend/deps): bump the production-dependencies group across 1 directory with 13 updates (#9611)
Bumps the production-dependencies group with 13 updates in the
/autogpt_platform/frontend directory:

| Package | From | To |
| --- | --- | --- |
| [@faker-js/faker](https://github.com/faker-js/faker) | `9.4.0` |
`9.6.0` |
|
[@next/third-parties](https://github.com/vercel/next.js/tree/HEAD/packages/third-parties)
| `15.1.6` | `15.2.1` |
| [@supabase/supabase-js](https://github.com/supabase/supabase-js) |
`2.48.1` | `2.49.1` |
|
[@tanstack/react-table](https://github.com/TanStack/table/tree/HEAD/packages/react-table)
| `8.20.6` | `8.21.2` |
|
[@xyflow/react](https://github.com/xyflow/xyflow/tree/HEAD/packages/react)
| `12.4.2` | `12.4.4` |
| [framer-motion](https://github.com/motiondivision/motion) | `12.3.1` |
`12.4.11` |
|
[lucide-react](https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react)
| `0.474.0` | `0.479.0` |
| [next-themes](https://github.com/pacocoursey/next-themes) | `0.4.4` |
`0.4.5` |
| [react-day-picker](https://github.com/gpbl/react-day-picker) | `9.5.1`
| `9.6.1` |
| [react-icons](https://github.com/react-icons/react-icons) | `5.4.0` |
`5.5.0` |
| [react-shepherd](https://github.com/shepherd-pro/shepherd) | `6.1.7` |
`6.1.8` |
| [uuid](https://github.com/uuidjs/uuid) | `11.0.5` | `11.1.0` |
| [zod](https://github.com/colinhacks/zod) | `3.24.1` | `3.24.2` |


Updates `@faker-js/faker` from 9.4.0 to 9.6.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/faker-js/faker/releases"><code>@​faker-js/faker</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v9.6.0</h2>
<h2>What's Changed</h2>
<ul>
<li>chore(deps): update dependency typescript to v5.8.2 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3424">faker-js/faker#3424</a></li>
<li>chore(deps): update dependency ts-morph to v25.0.1 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3418">faker-js/faker#3418</a></li>
<li>chore(deps): update devdependencies by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3419">faker-js/faker#3419</a></li>
<li>chore(deps): update eslint by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3420">faker-js/faker#3420</a></li>
<li>chore(deps): update vitest by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3421">faker-js/faker#3421</a></li>
<li>chore(deps): update all non-major dependencies by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3422">faker-js/faker#3422</a></li>
<li>chore(deps): remove obsolete dependency
<code>@​types/eslint</code>__js by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3425">faker-js/faker#3425</a></li>
<li>chore(deps): update dependency prettier to v3.5.2 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3423">faker-js/faker#3423</a></li>
<li>chore(deps): update pnpm to v10 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3427">faker-js/faker#3427</a></li>
<li>chore(deps): update eslint (major) by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3426">faker-js/faker#3426</a></li>
<li>chore(deps): update devdependencies by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3428">faker-js/faker#3428</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3431">faker-js/faker#3431</a></li>
<li>docs: revert npm download badge by <a
href="https://github.com/LitoMore"><code>@​LitoMore</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3433">faker-js/faker#3433</a></li>
<li>feat(finance): add ISO 4217 numerical codes to Currency object by <a
href="https://github.com/Nfloc"><code>@​Nfloc</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3404">faker-js/faker#3404</a></li>
<li>feat(number): bigint multipleOf by <a
href="https://github.com/soc221b"><code>@​soc221b</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3402">faker-js/faker#3402</a></li>
<li>refactor(internet): deprecate color method for removal by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3401">faker-js/faker#3401</a></li>
<li>test: add snapshot test for all locales' character sets by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3276">faker-js/faker#3276</a></li>
<li>chore(release): 9.6.0 by <a
href="https://github.com/fakerjs-bot"><code>@​fakerjs-bot</code></a> in
<a
href="https://redirect.github.com/faker-js/faker/pull/3435">faker-js/faker#3435</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/LitoMore"><code>@​LitoMore</code></a>
made their first contribution in <a
href="https://redirect.github.com/faker-js/faker/pull/3433">faker-js/faker#3433</a></li>
<li><a href="https://github.com/Nfloc"><code>@​Nfloc</code></a> made
their first contribution in <a
href="https://redirect.github.com/faker-js/faker/pull/3404">faker-js/faker#3404</a></li>
<li><a href="https://github.com/soc221b"><code>@​soc221b</code></a> made
their first contribution in <a
href="https://redirect.github.com/faker-js/faker/pull/3402">faker-js/faker#3402</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/faker-js/faker/compare/v9.5.1...v9.6.0">https://github.com/faker-js/faker/compare/v9.5.1...v9.6.0</a></p>
<h2>v9.5.1</h2>
<h2>What's Changed</h2>
<ul>
<li>test: retry verify tag 3 times by <a
href="https://github.com/Shinigami92"><code>@​Shinigami92</code></a> in
<a
href="https://redirect.github.com/faker-js/faker/pull/3395">faker-js/faker#3395</a></li>
<li>test: disable summary for local by <a
href="https://github.com/Shinigami92"><code>@​Shinigami92</code></a> in
<a
href="https://redirect.github.com/faker-js/faker/pull/3394">faker-js/faker#3394</a></li>
<li>chore: add usage trend by <a
href="https://github.com/cwtuan"><code>@​cwtuan</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3374">faker-js/faker#3374</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3403">faker-js/faker#3403</a></li>
<li>fix: test before using Buffers by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3400">faker-js/faker#3400</a></li>
<li>revert(chore): update LICENSE file (<a
href="https://redirect.github.com/faker-js/faker/issues/3350">#3350</a>)
by <a
href="https://github.com/Shinigami92"><code>@​Shinigami92</code></a> in
<a
href="https://redirect.github.com/faker-js/faker/pull/3410">faker-js/faker#3410</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3411">faker-js/faker#3411</a></li>
<li>docs: change ejcheng by <a
href="https://github.com/Shinigami92"><code>@​Shinigami92</code></a> in
<a
href="https://redirect.github.com/faker-js/faker/pull/3408">faker-js/faker#3408</a></li>
<li>docs: improve missing data error by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3406">faker-js/faker#3406</a></li>
<li>chore(release): 9.5.1 by <a
href="https://github.com/fakerjs-bot"><code>@​fakerjs-bot</code></a> in
<a
href="https://redirect.github.com/faker-js/faker/pull/3415">faker-js/faker#3415</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/cwtuan"><code>@​cwtuan</code></a> made
their first contribution in <a
href="https://redirect.github.com/faker-js/faker/pull/3374">faker-js/faker#3374</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/faker-js/faker/compare/v9.5.0...v9.5.1">https://github.com/faker-js/faker/compare/v9.5.0...v9.5.1</a></p>
<h2>v9.5.0</h2>
<h2>What's Changed</h2>
<ul>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3377">faker-js/faker#3377</a></li>
<li>docs: faker.seed examples are not consistent after refresh by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3378">faker-js/faker#3378</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/faker-js/faker/blob/next/CHANGELOG.md"><code>@​faker-js/faker</code>'s
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/faker-js/faker/compare/v9.5.1...v9.6.0">9.6.0</a>
(2025-03-06)</h2>
<h3>Features</h3>
<ul>
<li><strong>finance:</strong> add ISO 4217 numerical codes to Currency
(<a
href="https://redirect.github.com/faker-js/faker/issues/3404">#3404</a>)
(<a
href="ae9aec67b1">ae9aec6</a>)</li>
<li><strong>number:</strong> bigint multipleOf (<a
href="https://redirect.github.com/faker-js/faker/issues/3402">#3402</a>)
(<a
href="7b4f85a2c0">7b4f85a</a>)</li>
</ul>
<h2><a
href="https://github.com/faker-js/faker/compare/v9.5.0...v9.5.1">9.5.1</a>
(2025-02-28)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>test before using Buffers (<a
href="https://redirect.github.com/faker-js/faker/issues/3400">#3400</a>)
(<a
href="ec7c9a8e60">ec7c9a8</a>)</li>
</ul>
<h2><a
href="https://github.com/faker-js/faker/compare/v9.4.0...v9.5.0">9.5.0</a>
(2025-02-10)</h2>
<h3>Features</h3>
<ul>
<li><strong>image:</strong> add AI-generated avatars (<a
href="https://redirect.github.com/faker-js/faker/issues/3126">#3126</a>)
(<a
href="9e1395380c">9e13953</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="50b3241629"><code>50b3241</code></a>
chore(release): 9.6.0 (<a
href="https://redirect.github.com/faker-js/faker/issues/3435">#3435</a>)</li>
<li><a
href="62486af20c"><code>62486af</code></a>
test: add snapshot test for all locales' character sets (<a
href="https://redirect.github.com/faker-js/faker/issues/3276">#3276</a>)</li>
<li><a
href="1982431fd0"><code>1982431</code></a>
refactor(internet): deprecate color method for removal (<a
href="https://redirect.github.com/faker-js/faker/issues/3401">#3401</a>)</li>
<li><a
href="7b4f85a2c0"><code>7b4f85a</code></a>
feat(number): bigint multipleOf (<a
href="https://redirect.github.com/faker-js/faker/issues/3402">#3402</a>)</li>
<li><a
href="ae9aec67b1"><code>ae9aec6</code></a>
feat(finance): add ISO 4217 numerical codes to Currency (<a
href="https://redirect.github.com/faker-js/faker/issues/3404">#3404</a>)</li>
<li><a
href="57d39d7442"><code>57d39d7</code></a>
docs: revert npm download badge (<a
href="https://redirect.github.com/faker-js/faker/issues/3433">#3433</a>)</li>
<li><a
href="bf3aa8b064"><code>bf3aa8b</code></a>
chore(deps): lock file maintenance (<a
href="https://redirect.github.com/faker-js/faker/issues/3431">#3431</a>)</li>
<li><a
href="f591459aff"><code>f591459</code></a>
chore(deps): update devdependencies (<a
href="https://redirect.github.com/faker-js/faker/issues/3428">#3428</a>)</li>
<li><a
href="1db428ab97"><code>1db428a</code></a>
chore(deps): update eslint (major) (<a
href="https://redirect.github.com/faker-js/faker/issues/3426">#3426</a>)</li>
<li><a
href="b7e7714b8b"><code>b7e7714</code></a>
chore(deps): update pnpm to v10 (<a
href="https://redirect.github.com/faker-js/faker/issues/3427">#3427</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/faker-js/faker/compare/v9.4.0...v9.6.0">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~st-ddt">st-ddt</a>, a new releaser for
<code>@​faker-js/faker</code> since your current version.</p>
</details>
<br />

Updates `@next/third-parties` from 15.1.6 to 15.2.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/vercel/next.js/releases"><code>@​next/third-parties</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v15.2.1</h2>
<h3>Core Changes</h3>
<ul>
<li>Unify Link and Form prefetching: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76184">#76184</a></li>
<li>Turbopack: Ensure server actions sourcemaps tests pass: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76157">#76157</a></li>
<li>[dev-overlay] control dark theme in one place: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76528">#76528</a></li>
<li>[dev-overlay] change css var for terminal: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76590">#76590</a></li>
<li>[dev-overlay] Discriminate stack frame settled typed: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76517">#76517</a></li>
<li>Remove obsolete <code>sourcePackage</code> references: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76550">#76550</a></li>
<li>refactor: remove unused variable in externals handling: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76599">#76599</a></li>
<li>fix: Add popular embedding libraries to serverExternalPackages: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76574">#76574</a></li>
<li>[Segment Cache] Implement hash-only navigations: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76179">#76179</a></li>
<li>Webpack: abstract away getting compilation spans: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76579">#76579</a></li>
<li>report compiler duration for webpack and improve numbers: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76665">#76665</a></li>
<li>[dev-overlay] fix dark theme missing close bracket: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76672">#76672</a></li>
<li>Remove <code>revalidate</code> property from incremental cache
<code>ctx</code> for <code>FETCH</code> kind: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76500">#76500</a></li>
<li>[dev-overlay] fix: env name label style was out of sync with error
type label: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76668">#76668</a></li>
<li>Turbopack: avoid celling source maps before minify: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76626">#76626</a></li>
<li>refactor(CI): Merge all four bundler test manifest scripts into one:
<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76652">#76652</a></li>
<li>[metadata] fix duplicate metadata for parallel routes: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76669">#76669</a></li>
<li>[Segment Cache] Omit from bundle if flag disabled: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76622">#76622</a></li>
<li>[Segment Cache] Support output: &quot;export&quot; mode: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/75671">#75671</a></li>
<li>[Segment Cache] Refresh on same-page navigation: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76223">#76223</a></li>
<li>[metadata] re-enable streaming metadata with PPR: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76119">#76119</a></li>
<li>[Segment Cache] Search param fallback handling: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/75990">#75990</a></li>
<li>[Segment Cache] Fix: canonicalURL omits origin: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76444">#76444</a></li>
<li>fix metadata basePath for manifest: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76681">#76681</a></li>
<li>Propagate expire time to <code>cache-control</code> header and
prerender manifest: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76207">#76207</a></li>
<li>Show revalidate/expire columns in build output: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76343">#76343</a></li>
<li>Gate alternate bundler behind canary only: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76634">#76634</a></li>
<li>[dynamicIO] routes with dynamic segments should be able to be static
in dev: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76691">#76691</a></li>
<li>[repo] upgrade ts <code>5.8.2</code>: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76709">#76709</a></li>
<li>[metadata]: ensure metadata boundary is only rendered once on client
nav: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76692">#76692</a></li>
<li>[metadata] clean up redudant options: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76712">#76712</a></li>
<li>Fix uniqueness detection for <code>generateStaticParams</code>: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76713">#76713</a></li>
<li>Upgrade React from <code>22e39ea7-20250225</code> to
<code>d55cc79b-20250228</code>: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76680">#76680</a></li>
<li>[Turbopack] Compute module batches and use them for chunking: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76133">#76133</a></li>
<li>[Dev Tools] Improve keyboard interactions for menu &amp; overlays:
<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76754">#76754</a></li>
<li>Keep server code out of browser chunks: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76660">#76660</a></li>
<li>Turbopack: inline minify into code generation and make it a plain
function instead of a turbo tasks function: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76628">#76628</a></li>
<li>fix edge runtime asset fetch in pages api: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76750">#76750</a></li>
<li>Update use-cache-unknown-cache-kind.test.ts snapshot for alternate
bundler: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76682">#76682</a></li>
</ul>
<h3>Example Changes</h3>
<ul>
<li>docs: fix reading <code>params</code> code blocks: <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76705">#76705</a></li>
</ul>
<h3>Misc Changes</h3>
<ul>
<li>fix(rustdoc): Fix rustdoc warnings, block on rustdoc failures in CI:
<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76448">#76448</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="633878112e"><code>6338781</code></a>
v15.2.1</li>
<li><a
href="197b4bb709"><code>197b4bb</code></a>
v15.2.1-canary.6</li>
<li><a
href="0b699b1c8d"><code>0b699b1</code></a>
v15.2.1-canary.5</li>
<li><a
href="4bf1ee117d"><code>4bf1ee1</code></a>
[repo] upgrade ts <code>5.8.2</code> (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/76709">#76709</a>)</li>
<li><a
href="6f6001a87a"><code>6f6001a</code></a>
v15.2.1-canary.4</li>
<li><a
href="5f1df89fdc"><code>5f1df89</code></a>
v15.2.1-canary.3</li>
<li><a
href="f06a72e11e"><code>f06a72e</code></a>
v15.2.1-canary.2</li>
<li><a
href="2497f81d1b"><code>2497f81</code></a>
v15.2.1-canary.1</li>
<li><a
href="83610c6a84"><code>83610c6</code></a>
v15.2.1-canary.0</li>
<li><a
href="b0416fbb44"><code>b0416fb</code></a>
v15.2.0</li>
<li>Additional commits viewable in <a
href="https://github.com/vercel/next.js/commits/v15.2.1/packages/third-parties">compare
view</a></li>
</ul>
</details>
<br />

Updates `@supabase/supabase-js` from 2.48.1 to 2.49.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-js/releases"><code>@​supabase/supabase-js</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v2.49.1</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.49.0...v2.49.1">2.49.1</a>
(2025-02-24)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>deps:</strong> upgrade postgrest-js 1.19.2 (<a
href="3f01c3fbc4">3f01c3f</a>)</li>
</ul>
<h2>v2.49.0</h2>
<h1><a
href="https://github.com/supabase/supabase-js/compare/v2.48.1...v2.49.0">2.49.0</a>
(2025-02-24)</h1>
<h3>Features</h3>
<ul>
<li>bump <code>@supabase/auth-js</code> to 2.68.0 (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1359">#1359</a>)
(<a
href="a9ece9a4ae">a9ece9a</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="fceca48c37"><code>fceca48</code></a>
Merge pull request <a
href="https://redirect.github.com/supabase/supabase-js/issues/1369">#1369</a>
from supabase/avallete/chore-bump-postgrest-js-1-19-2</li>
<li><a
href="a9ece9a4ae"><code>a9ece9a</code></a>
feat: bump <code>@supabase/auth-js</code> to 2.68.0 (<a
href="https://redirect.github.com/supabase/supabase-js/issues/1359">#1359</a>)</li>
<li><a
href="3f01c3fbc4"><code>3f01c3f</code></a>
fix(deps): upgrade postgrest-js 1.19.2</li>
<li>See full diff in <a
href="https://github.com/supabase/supabase-js/compare/v2.48.1...v2.49.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `@tanstack/react-table` from 8.20.6 to 8.21.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/TanStack/table/releases"><code>@​tanstack/react-table</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.21.2</h2>
<p>Version 8.21.2 - 2/11/25, 8:59 PM</p>
<h2>Changes</h2>
<h3>Fix</h3>
<ul>
<li>arrIncludes autoremove filterFn (<a
href="https://github.com/TanStack/table/tree/HEAD/packages/react-table/issues/5623">#5623</a>)
(2efaf57) by lukebui</li>
<li>lit-table: spread table options in lit adapter (<a
href="https://github.com/TanStack/table/tree/HEAD/packages/react-table/issues/5904">#5904</a>)
(36dede1) by <a
href="https://github.com/kadoshms"><code>@​kadoshms</code></a></li>
</ul>
<h3>Docs</h3>
<ul>
<li>row accessor bug in example code block (<a
href="https://github.com/TanStack/table/tree/HEAD/packages/react-table/issues/5893">#5893</a>)
(b1506a7) by Valerii Petryniak</li>
<li>virtualizer tbody from onchange (827b098) by Kevin Van Cott</li>
<li>exp virtual - remeasure when table state changes (9e6987d) by Kevin
Van Cott</li>
<li>angular: add expanding and sub components examples (<a
href="https://github.com/TanStack/table/tree/HEAD/packages/react-table/issues/5898">#5898</a>)
(099e1a4) by <a
href="https://github.com/riccardoperra"><code>@​riccardoperra</code></a></li>
<li>example name (57703a4) by Kevin Van Cott</li>
</ul>
<h2>Packages</h2>
<ul>
<li><code>@​tanstack/table-core</code><a
href="https://github.com/8"><code>@​8</code></a>.21.2</li>
<li><code>@​tanstack/lit-table</code><a
href="https://github.com/8"><code>@​8</code></a>.21.2</li>
<li><code>@​tanstack/angular-table</code><a
href="https://github.com/8"><code>@​8</code></a>.21.2</li>
<li><code>@​tanstack/qwik-table</code><a
href="https://github.com/8"><code>@​8</code></a>.21.2</li>
<li><code>@​tanstack/react-table</code><a
href="https://github.com/8"><code>@​8</code></a>.21.2</li>
<li><code>@​tanstack/solid-table</code><a
href="https://github.com/8"><code>@​8</code></a>.21.2</li>
<li><code>@​tanstack/svelte-table</code><a
href="https://github.com/8"><code>@​8</code></a>.21.2</li>
<li><code>@​tanstack/vue-table</code><a
href="https://github.com/8"><code>@​8</code></a>.21.2</li>
<li><code>@​tanstack/react-table-devtools</code><a
href="https://github.com/8"><code>@​8</code></a>.21.2</li>
</ul>
<h2>v8.21.1</h2>
<p>Version 8.21.1 - 2/3/25, 5:37 AM</p>
<h2>Changes</h2>
<h3>Fix</h3>
<ul>
<li>lit-table: dynamic data updates in the Lit Table Adapter (<a
href="https://github.com/TanStack/table/tree/HEAD/packages/react-table/issues/5884">#5884</a>)
(9763877) by Luke Schierer</li>
</ul>
<h3>Docs</h3>
<ul>
<li>add experimental virtualization example (<a
href="https://github.com/TanStack/table/tree/HEAD/packages/react-table/issues/5895">#5895</a>)
(8d6e19f) by Kevin Van Cott</li>
<li>angular: add missing faker-js deps (<a
href="https://github.com/TanStack/table/tree/HEAD/packages/react-table/issues/5883">#5883</a>)
(190c669) by <a
href="https://github.com/riccardoperra"><code>@​riccardoperra</code></a></li>
<li>angular: add editable, row-dnd and performant column resizing
example (<a
href="https://github.com/TanStack/table/tree/HEAD/packages/react-table/issues/5881">#5881</a>)
(0baabdd) by <a
href="https://github.com/riccardoperra"><code>@​riccardoperra</code></a></li>
</ul>
<h2>Packages</h2>
<ul>
<li><code>@​tanstack/lit-table</code><a
href="https://github.com/8"><code>@​8</code></a>.21.1</li>
</ul>
<h2>v8.21.0</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="db745afdb8"><code>db745af</code></a>
release: v8.21.2</li>
<li>See full diff in <a
href="https://github.com/TanStack/table/commits/v8.21.2/packages/react-table">compare
view</a></li>
</ul>
</details>
<br />

Updates `@xyflow/react` from 12.4.2 to 12.4.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/xyflow/xyflow/releases"><code>@​xyflow/react</code>'s
releases</a>.</em></p>
<blockquote>
<h2><code>@​xyflow/react</code><a
href="https://github.com/12"><code>@​12</code></a>.4.4</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5052">#5052</a> <a
href="99dd7d3549"><code>99dd7d35</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Show an error if user drags uninitialized node</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5042">#5042</a> <a
href="2fe0e850a8"><code>2fe0e850</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Allow click connections when target sets
<code>isConnectableStart</code></p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5047">#5047</a> <a
href="b3bf5693c6"><code>b3bf5693</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Pass generics to OnSelectionChangeFunc so that users can type it
correctly</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5053">#5053</a> <a
href="25fb45b5e9"><code>25fb45b5</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Remove incorrect deprecation warning</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5033">#5033</a> <a
href="7b4a81fb6b"><code>7b4a81fb</code></a>
Thanks <a
href="https://github.com/dimaMachina"><code>@​dimaMachina</code></a>! -
lint: use <code>React.JSX</code> type instead of the deprecated global
<code>JSX</code> namespace</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5043">#5043</a> <a
href="0292ad2010"><code>0292ad20</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Use current expandParent value on drag to be able to update it while
dragging</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5032">#5032</a> <a
href="5867bba805"><code>5867bba8</code></a>
Thanks <a
href="https://github.com/dimaMachina"><code>@​dimaMachina</code></a>! -
lint: remove unnecessary type assertions</p>
</li>
<li>
<p>Updated dependencies [<a
href="99dd7d3549"><code>99dd7d35</code></a>]:</p>
<ul>
<li><code>@​xyflow/system</code><a
href="https://github.com/0"><code>@​0</code></a>.0.52</li>
</ul>
</li>
</ul>
<h2><code>@​xyflow/react</code><a
href="https://github.com/12"><code>@​12</code></a>.4.3</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5010">#5010</a> <a
href="6c121d427f"><code>6c121d42</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Add more TSDocs to components, hooks, utils funcs and types</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4991">#4991</a> <a
href="ea54d9bcb1"><code>ea54d9bc</code></a>
Thanks <a
href="https://github.com/waynetee"><code>@​waynetee</code></a>! - Fix
viewport shifting on node focus</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5013">#5013</a> <a
href="cde899c5be"><code>cde899c5</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Pass <code>NodeType</code> type argument from
<code>ReactFlowProps</code> to <code>connectionLineComponent</code>
property.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5008">#5008</a> <a
href="12d859fe29"><code>12d859fe</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Add package.json to exports</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5012">#5012</a> <a
href="4d3f19e88b"><code>4d3f19e8</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Add snapGrid option to screenToFlowPosition and set snapToGrid to
false</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5003">#5003</a> <a
href="e8e0d68495"><code>e8e0d684</code></a>
Thanks <a
href="https://github.com/dimaMachina"><code>@​dimaMachina</code></a>! -
repair lint command</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4991">#4991</a> <a
href="4c62f19b3a"><code>4c62f19b</code></a>
Thanks <a
href="https://github.com/waynetee"><code>@​waynetee</code></a>! -
Prevent viewport shift after using Tab</p>
</li>
<li>
<p>Updated dependencies [<a
href="6c121d427f"><code>6c121d42</code></a>,
<a
href="4947029cd6"><code>4947029c</code></a>,
<a
href="e8e0d68495"><code>e8e0d684</code></a>]:</p>
<ul>
<li><code>@​xyflow/system</code><a
href="https://github.com/0"><code>@​0</code></a>.0.51</li>
</ul>
</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/xyflow/xyflow/blob/main/packages/react/CHANGELOG.md"><code>@​xyflow/react</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>12.4.4</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5052">#5052</a> <a
href="99dd7d3549"><code>99dd7d35</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Show an error if user drags uninitialized node</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5042">#5042</a> <a
href="2fe0e850a8"><code>2fe0e850</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Allow click connections when target sets
<code>isConnectableStart</code></p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5047">#5047</a> <a
href="b3bf5693c6"><code>b3bf5693</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Pass generics to OnSelectionChangeFunc so that users can type it
correctly</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5053">#5053</a> <a
href="25fb45b5e9"><code>25fb45b5</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Remove incorrect deprecation warning</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5033">#5033</a> <a
href="7b4a81fb6b"><code>7b4a81fb</code></a>
Thanks <a
href="https://github.com/dimaMachina"><code>@​dimaMachina</code></a>! -
lint: use <code>React.JSX</code> type instead of the deprecated global
<code>JSX</code> namespace</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5043">#5043</a> <a
href="0292ad2010"><code>0292ad20</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Use current expandParent value on drag to be able to update it while
dragging</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5032">#5032</a> <a
href="5867bba805"><code>5867bba8</code></a>
Thanks <a
href="https://github.com/dimaMachina"><code>@​dimaMachina</code></a>! -
lint: remove unnecessary type assertions</p>
</li>
<li>
<p>Updated dependencies [<a
href="99dd7d3549"><code>99dd7d35</code></a>]:</p>
<ul>
<li><code>@​xyflow/system</code><a
href="https://github.com/0"><code>@​0</code></a>.0.52</li>
</ul>
</li>
</ul>
<h2>12.4.3</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5010">#5010</a> <a
href="6c121d427f"><code>6c121d42</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Add more TSDocs to components, hooks, utils funcs and types</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4991">#4991</a> <a
href="ea54d9bcb1"><code>ea54d9bc</code></a>
Thanks <a
href="https://github.com/waynetee"><code>@​waynetee</code></a>! - Fix
viewport shifting on node focus</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5013">#5013</a> <a
href="cde899c5be"><code>cde899c5</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Pass <code>NodeType</code> type argument from
<code>ReactFlowProps</code> to <code>connectionLineComponent</code>
property.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5008">#5008</a> <a
href="12d859fe29"><code>12d859fe</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Add package.json to exports</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5012">#5012</a> <a
href="4d3f19e88b"><code>4d3f19e8</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Add snapGrid option to screenToFlowPosition and set snapToGrid to
false</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/5003">#5003</a> <a
href="e8e0d68495"><code>e8e0d684</code></a>
Thanks <a
href="https://github.com/dimaMachina"><code>@​dimaMachina</code></a>! -
repair lint command</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4991">#4991</a> <a
href="4c62f19b3a"><code>4c62f19b</code></a>
Thanks <a
href="https://github.com/waynetee"><code>@​waynetee</code></a>! -
Prevent viewport shift after using Tab</p>
</li>
<li>
<p>Updated dependencies [<a
href="6c121d427f"><code>6c121d42</code></a>,
<a
href="4947029cd6"><code>4947029c</code></a>,
<a
href="e8e0d68495"><code>e8e0d684</code></a>]:</p>
<ul>
<li><code>@​xyflow/system</code><a
href="https://github.com/0"><code>@​0</code></a>.0.51</li>
</ul>
</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="d045503667"><code>d045503</code></a>
chore(packages): bump</li>
<li><a
href="7a00fe3520"><code>7a00fe3</code></a>
chore(getNodeConnections): remove deprecation <a
href="https://github.com/xyflow/xyflow/tree/HEAD/packages/react/issues/5051">#5051</a></li>
<li><a
href="08b99e8719"><code>08b99e8</code></a>
Merge pull request <a
href="https://github.com/xyflow/xyflow/tree/HEAD/packages/react/issues/5043">#5043</a>
from xyflow/refactor/dynamic-expand-parent</li>
<li><a
href="8dd2b4f9e2"><code>8dd2b4f</code></a>
chore(updateNodePositions): cleanup</li>
<li><a
href="0b67a6c303"><code>0b67a6c</code></a>
refactor(errors): show error when user drags uninitialized node <a
href="https://github.com/xyflow/xyflow/tree/HEAD/packages/react/issues/5014">#5014</a></li>
<li><a
href="27df80b6a6"><code>27df80b</code></a>
fix(selection-listener): pass generics</li>
<li><a
href="d094ef0581"><code>d094ef0</code></a>
fix(OnSelectionChangeFunc): pass node and edge type generics <a
href="https://github.com/xyflow/xyflow/tree/HEAD/packages/react/issues/5023">#5023</a></li>
<li><a
href="3969758af2"><code>3969758</code></a>
refactor(expandParent): use current value on drag <a
href="https://github.com/xyflow/xyflow/tree/HEAD/packages/react/issues/5039">#5039</a></li>
<li><a
href="43f188d3b1"><code>43f188d</code></a>
fix(click-connections): handle isConnectableStart correctly <a
href="https://github.com/xyflow/xyflow/tree/HEAD/packages/react/issues/5041">#5041</a></li>
<li><a
href="68591a8bf1"><code>68591a8</code></a>
Merge branch 'main' into no-deprecated</li>
<li>Additional commits viewable in <a
href="https://github.com/xyflow/xyflow/commits/@xyflow/react@12.4.4/packages/react">compare
view</a></li>
</ul>
</details>
<br />

Updates `framer-motion` from 12.3.1 to 12.4.11
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/motiondivision/motion/blob/main/CHANGELOG.md">framer-motion's
changelog</a>.</em></p>
<blockquote>
<h2>[12.4.11] 2025-03-10</h2>
<h3>Fixed</h3>
<ul>
<li>Preventing flattening of scroll animations when <code>type</code> or
<code>ease</code> are explicitly set.</li>
</ul>
<h2>[12.4.10] 2025-03-03</h2>
<h3>Fixed</h3>
<ul>
<li>Adding UMD bundles for <code>motion-dom</code> and
<code>motion-utils</code>.</li>
</ul>
<h2>[12.4.9] 2025-03-03</h2>
<h3>Fixed</h3>
<ul>
<li>Fixed <code>Reorder.Item</code> reordering causing
<code>lostpointercapture</code> event to fire.</li>
</ul>
<h2>[12.4.8] 2025-02-26</h2>
<h3>Fixed</h3>
<ul>
<li>Fixed exiting children with <code>layoutDependency</code> not
animating layout changes because of a stale layout dependency.</li>
</ul>
<h2>[12.4.7] 2025-02-20</h2>
<h3>Fixed</h3>
<ul>
<li>Fixed <code>AnimatePresence</code> not triggering exit animations
when a child with <code>layout</code> or <code>drag</code> is
removed.</li>
</ul>
<h2>[12.4.6] 2025-02-20</h2>
<h3>Fixed</h3>
<ul>
<li>Fixed drag gesture on child elements.</li>
</ul>
<h2>[12.4.5] 2025-02-19</h2>
<h3>Fixed</h3>
<ul>
<li>Fixed <code>onClick</code> handlers not working inside
<code>press</code> events.</li>
</ul>
<h2>[12.4.4] 2025-02-18</h2>
<h3>Fixed</h3>
<ul>
<li>Changed press, drag and pan gestures to use pointer capturing for
better usage within <code>iframe</code> embeds.</li>
</ul>
<h2>[12.4.3] 2025-02-12</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9316ca8202"><code>9316ca8</code></a>
v12.4.11</li>
<li><a
href="c291a09ff2"><code>c291a09</code></a>
Updating changelog</li>
<li><a
href="9e2923c5e4"><code>9e2923c</code></a>
Fix scroll easing (<a
href="https://redirect.github.com/motiondivision/motion/issues/3106">#3106</a>)</li>
<li><a
href="18b71d1395"><code>18b71d1</code></a>
v12.4.10</li>
<li><a
href="e89a0c20c9"><code>e89a0c2</code></a>
Latest</li>
<li><a
href="d84031e1cd"><code>d84031e</code></a>
Latest</li>
<li><a
href="63e7597969"><code>63e7597</code></a>
Updating rollup</li>
<li><a
href="3fcf837c1f"><code>3fcf837</code></a>
v12.4.9</li>
<li><a
href="c58920f6f2"><code>c58920f</code></a>
Updating changelog</li>
<li><a
href="c889431567"><code>c889431</code></a>
Fixing pointer capture and Reorder item (<a
href="https://redirect.github.com/motiondivision/motion/issues/3097">#3097</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/motiondivision/motion/compare/v12.3.1...v12.4.11">compare
view</a></li>
</ul>
</details>
<br />

Updates `lucide-react` from 0.474.0 to 0.479.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/lucide-icons/lucide/releases">lucide-react's
releases</a>.</em></p>
<blockquote>
<h2>Version 0.479.0</h2>
<h2>What's Changed</h2>
<ul>
<li>feat(<code>@​lucide/svelte</code>): Lucide svelte 5 package by <a
href="https://github.com/ericfennis"><code>@​ericfennis</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2753">lucide-icons/lucide#2753</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/lucide-icons/lucide/compare/0.478.0...0.479.0">https://github.com/lucide-icons/lucide/compare/0.478.0...0.479.0</a></p>
<h2>Version 0.478.0</h2>
<h2>What's Changed</h2>
<ul>
<li>ci(pr-comment): Fix icon preview comment on PRs by <a
href="https://github.com/ericfennis"><code>@​ericfennis</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2854">lucide-icons/lucide#2854</a></li>
<li>fix(ci): run lint pr title on title change by <a
href="https://github.com/jguddas"><code>@​jguddas</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2872">lucide-icons/lucide#2872</a></li>
<li>fix(metadata): name change reflected in contributions by <a
href="https://github.com/AnnaSasDev"><code>@​AnnaSasDev</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2866">lucide-icons/lucide#2866</a></li>
<li>fix(icons): changed <code>brackets</code> icon by <a
href="https://github.com/jguddas"><code>@​jguddas</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2863">lucide-icons/lucide#2863</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/lucide-icons/lucide/compare/0.477.0...0.478.0">https://github.com/lucide-icons/lucide/compare/0.477.0...0.478.0</a></p>
<h2>New icons 0.477.0</h2>
<h2>New icons 🎨</h2>
<ul>
<li><code>square-round-corner</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2323">#2323</a>)
by <a href="https://github.com/liamb13"><code>@​liamb13</code></a></li>
</ul>
<h2>Modified Icons 🔨</h2>
<ul>
<li><code>circle-slash-2</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2837">#2837</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
</ul>
<h2>Fixes and new icons 0.476.0</h2>
<h2>Fixes</h2>
<ul>
<li>fix(lucide-react): Revert exports property package.json, fixing edge
worker environments. by <a
href="https://github.com/ericfennis"><code>@​ericfennis</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2814">lucide-icons/lucide#2814</a></li>
<li>fix(lucide): Lucide create element function returning SVG Element by
<a href="https://github.com/ericfennis"><code>@​ericfennis</code></a> in
<a
href="https://redirect.github.com/lucide-icons/lucide/pull/2816">lucide-icons/lucide#2816</a></li>
</ul>
<h2>New icons 🎨</h2>
<ul>
<li><code>shield-user</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2608">#2608</a>)
by <a
href="https://github.com/sebinemeth"><code>@​sebinemeth</code></a></li>
</ul>
<h2>Modified Icons 🔨</h2>
<ul>
<li><code>beef</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2832">#2832</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
</ul>
<h2>New icons 0.475.0</h2>
<h2>New icons 🎨</h2>
<ul>
<li><code>circle-small</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2607">#2607</a>)
by <a
href="https://github.com/jamiemlaw"><code>@​jamiemlaw</code></a></li>
<li><code>mars-stroke</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2607">#2607</a>)
by <a
href="https://github.com/jamiemlaw"><code>@​jamiemlaw</code></a></li>
<li><code>mars</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2607">#2607</a>)
by <a
href="https://github.com/jamiemlaw"><code>@​jamiemlaw</code></a></li>
<li><code>non-binary</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2607">#2607</a>)
by <a
href="https://github.com/jamiemlaw"><code>@​jamiemlaw</code></a></li>
<li><code>transgender</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2607">#2607</a>)
by <a
href="https://github.com/jamiemlaw"><code>@​jamiemlaw</code></a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="1787b82cfe"><code>1787b82</code></a>
build(deps-dev): bump vite from 5.4.13 to 5.4.14 in /packages/lucide (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2804">#2804</a>)</li>
<li><a
href="b46927e510"><code>b46927e</code></a>
fix(lucide-react): Revert exports property package.json, fixing edge
worker e...</li>
<li><a
href="3ab6c373a0"><code>3ab6c37</code></a>
build(deps-dev): bump vite from 5.4.12 to 5.4.13 (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2798">#2798</a>)</li>
<li><a
href="ba2c4b526f"><code>ba2c4b5</code></a>
build(deps-dev): bump vite from 5.1.8 to 5.4.12 (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2786">#2786</a>)</li>
<li><a
href="50630b3aaf"><code>50630b3</code></a>
ci: Improve build speeds (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2778">#2778</a>)</li>
<li>See full diff in <a
href="https://github.com/lucide-icons/lucide/commits/0.479.0/packages/lucide-react">compare
view</a></li>
</ul>
</details>
<br />

Updates `next-themes` from 0.4.4 to 0.4.5
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pacocoursey/next-themes/releases">next-themes's
releases</a>.</em></p>
<blockquote>
<h2>v0.4.5</h2>
<h2>What's Changed</h2>
<ul>
<li>fix: map theme to class using ValueObject in injected script by <a
href="https://github.com/danielgavrilov"><code>@​danielgavrilov</code></a>
in <a
href="https://redirect.github.com/pacocoursey/next-themes/pull/330">pacocoursey/next-themes#330</a></li>
<li>Reduce number of renders by pre-setting resolvedTheme by <a
href="https://github.com/wahba-openai"><code>@​wahba-openai</code></a>
in <a
href="https://redirect.github.com/pacocoursey/next-themes/pull/338">pacocoursey/next-themes#338</a></li>
<li>Bump next from 14.2.10 to 14.2.15 in the npm_and_yarn group across 1
directory by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/pacocoursey/next-themes/pull/331">pacocoursey/next-themes#331</a></li>
<li>Bump the npm_and_yarn group across 1 directory with 7 updates by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/pacocoursey/next-themes/pull/341">pacocoursey/next-themes#341</a></li>
<li>chore: Fix corepack errors in CI by <a
href="https://github.com/pacocoursey"><code>@​pacocoursey</code></a> in
<a
href="https://redirect.github.com/pacocoursey/next-themes/pull/342">pacocoursey/next-themes#342</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/danielgavrilov"><code>@​danielgavrilov</code></a>
made their first contribution in <a
href="https://redirect.github.com/pacocoursey/next-themes/pull/330">pacocoursey/next-themes#330</a></li>
<li><a
href="https://github.com/wahba-openai"><code>@​wahba-openai</code></a>
made their first contribution in <a
href="https://redirect.github.com/pacocoursey/next-themes/pull/338">pacocoursey/next-themes#338</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pacocoursey/next-themes/compare/v0.4.4...v0.4.5">https://github.com/pacocoursey/next-themes/compare/v0.4.4...v0.4.5</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="d12996b4e8"><code>d12996b</code></a>
chore: Fix corepack errors in CI (<a
href="https://redirect.github.com/pacocoursey/next-themes/issues/342">#342</a>)</li>
<li><a
href="b77db23e9f"><code>b77db23</code></a>
Bump the npm_and_yarn group across 1 directory with 7 updates (<a
href="https://redirect.github.com/pacocoursey/next-themes/issues/341">#341</a>)</li>
<li><a
href="d3fa4ee9ad"><code>d3fa4ee</code></a>
Bump next from 14.2.10 to 14.2.15 in the npm_and_yarn group across 1
director...</li>
<li><a
href="ad83567b11"><code>ad83567</code></a>
Reduce number of renders by pre-setting resolvedTheme (<a
href="https://redirect.github.com/pacocoursey/next-themes/issues/338">#338</a>)</li>
<li><a
href="1b510445a3"><code>1b51044</code></a>
fix: map theme to class using ValueObject in injected script (<a
href="https://redirect.github.com/pacocoursey/next-themes/issues/330">#330</a>)</li>
<li>See full diff in <a
href="https://github.com/pacocoursey/next-themes/compare/v0.4.4...v0.4.5">compare
view</a></li>
</ul>
</details>
<br />

Updates `react-day-picker` from 9.5.1 to 9.6.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/gpbl/react-day-picker/releases">react-day-picker's
releases</a>.</em></p>
<blockquote>
<h2>v9.6.1</h2>
<p>This release addresses an accessibility issue, adds a new
<code>animate</code> prop and fixes other minor bugs.</p>
<h3>Possible Breaking Change in Custom Styles</h3>
<p>To address a <a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2630">focus
lost bug</a> affecting navigation buttons, we <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2685">updated</a>
the buttons to use <code>aria-disabled</code> instead of the
<code>disabled</code> attribute.</p>
<p>This change may cause custom styles for those disabled buttons to
break. To fix it in your code, update the CSS selector to target
<code>[aria-disabled=&quot;true&quot;]</code>:</p>
<pre lang="diff"><code>- .rdp-button_next:disabled,
+ .rdp-button_next[aria-disabled=&quot;true&quot;] {
  /* your custom CSS */
}
- .rdp-button_previous:disabled,
+ .rdp-button_previous[aria-disabled=&quot;true&quot;] {
  /* your custom CSS */
}
</code></pre>
<h3>Animating Month Transitions</h3>
<p>Thanks to the work by <a
href="https://github.com/rodgobbi"><code>@​rodgobbi</code></a>, we have
added animations to DayPicker. The new <a
href="http://daypicker.dev/docs/navigation#animate"><code>animate</code>
prop</a> enables CSS transitions for captions and weeks when navigating
between months:</p>
<!-- raw HTML omitted -->
<pre lang="tsx"><code>&lt;DayPicker animate /&gt;
</code></pre>
<p>Customizing the animation style can be challenging due to the HTML
table structure of the grid. We may address this in the future. Please
leave your feedback in <a
href="https://github.com/gpbl/react-day-picker/discussions">DayPicker
Discussions</a>.</p>
<h2>What's Changed</h2>
<ul>
<li>feat: new <code>animate</code> prop by <a
href="https://github.com/rodgobbi"><code>@​rodgobbi</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2684">gpbl/react-day-picker#2684</a></li>
<li>feat(performance): add <code>sideEffects</code> property to
package.json by <a
href="https://github.com/rodgobbi"><code>@​rodgobbi</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2673">gpbl/react-day-picker#2673</a></li>
<li>fix(accessibility): focus lost when navigation button is disabled by
<a href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2685">gpbl/react-day-picker#2685</a></li>
<li>fix: render selected days with <code>selected</code> modifier when
disabled by <a
href="https://github.com/rodgobbi"><code>@​rodgobbi</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2700">gpbl/react-day-picker#2700</a></li>
<li>fix(build): remove extra files from package.json by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2692">gpbl/react-day-picker#2692</a></li>
<li>chore(types): fix deprecation of select event handler types by <a
href="https://github.com/timothyis"><code>@​timothyis</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2680">gpbl/react-day-picker#2680</a></li>
</ul>
<h3>v9.6.1</h3>
<ul>
<li>fix(build): add missing .css entries in package.json files by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2703">gpbl/react-day-picker#2703</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/timothyis"><code>@​timothyis</code></a>
made their first contribution in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2680">gpbl/react-day-picker#2680</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/gpbl/react-day-picker/compare/v9.5.1...v9.6.1">https://github.com/gpbl/react-day-picker/compare/v9.5.1...v9.6.1</a></p>
<h2>v9.6.0</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="c37983b8e5"><code>c37983b</code></a>
build: bump v9.6.1</li>
<li><a
href="3d382fe18d"><code>3d382fe</code></a>
build: add missing .css entries in package.json files (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2703">#2703</a>)</li>
<li><a
href="ca71182076"><code>ca71182</code></a>
build: bump v9.6.0</li>
<li><a
href="37f759a718"><code>37f759a</code></a>
chore: animate prop, remove <code>@​experimental</code> tag</li>
<li><a
href="9aa3d35062"><code>9aa3d35</code></a>
docs: update for new <code>animate</code> prop (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2694">#2694</a>)</li>
<li><a
href="d7d0a8ad13"><code>d7d0a8a</code></a>
fix: render selected days with selected modifier when disabled (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2700">#2700</a>)</li>
<li><a
href="48f00dc20f"><code>48f00dc</code></a>
fix: focus lost when navigation button is disabled (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2685">#2685</a>)</li>
<li><a
href="6617c9bf81"><code>6617c9b</code></a>
chore: update keyframes names (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2695">#2695</a>)</li>
<li><a
href="13e0a4acc9"><code>13e0a4a</code></a>
test: fix test related to first of month (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2693">#2693</a>)</li>
<li><a
href="ebde52dd02"><code>ebde52d</code></a>
docs: fix no-wrap-table style for paragraphs</li>
<li>Additional commits viewable in <a
href="https://github.com/gpbl/react-day-picker/compare/v9.5.1...v9.6.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `react-icons` from 5.4.0 to 5.5.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/react-icons/react-icons/releases">react-icons's
releases</a>.</em></p>
<blockquote>
<h2>v5.5.0</h2>
<h2>What's Changed</h2>
<ul>
<li>[React 19] Update IconType type to return React.ReactNode by <a
href="https://github.com/diaz-hfc"><code>@​diaz-hfc</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/1004">react-icons/react-icons#1004</a></li>
<li>Bump vite from 5.2.10 to 5.4.11 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/996">react-icons/react-icons#996</a></li>
<li>Bump nanoid from 3.3.7 to 3.3.8 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/1005">react-icons/react-icons#1005</a></li>
<li>Bump vite from 5.4.11 to 5.4.14 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/1021">react-icons/react-icons#1021</a></li>
<li>Bump esbuild from 0.20.2 to 0.25.0 by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a
href="https://redirect.github.com/react-icons/react-icons/pull/1027">react-icons/react-icons#1027</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/diaz-hfc"><code>@​diaz-hfc</code></a>
made their first contribution in <a
href="https://redirect.github.com/react-icons/react-icons/pull/1004">react-icons/react-icons#1004</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/react-icons/react-icons/compare/v5.4.0...v5.5.0">https://github.com/react-icons/react-icons/compare/v5.4.0...v5.5.0</a></p>
<table>
<thead>
<tr>
<th>Icon Library</th>
<th>License</th>
<th>Version</th>
<th align="right">Count</th>
</tr>
</thead>
<tbody>
<tr>
<td><a href="https://circumicons.com/">Circum Icons</a></td>
<td><a
href="https://github.com/Klarr-Agency/Circum-Icons/blob/main/LICENSE">MPL-2.0
license</a></td>
<td>1.0.0</td>
<td align="right">288</td>
</tr>
<tr>
<td><a href="https://fontawesome.com/">Font Awesome 5</a></td>
<td><a href="https://creativecommons.org/licenses/by/4.0/">CC BY 4.0
License</a></td>
<td>5.15.4-3-gafecf2a</td>
<td align="right">1612</td>
</tr>
<tr>
<td><a href="https://fontawesome.com/">Font Awesome 6</a></td>
<td><a href="https://creativecommons.org/licenses/by/4.0/">CC BY 4.0
License</a></td>
<td>6.6.0</td>
<td align="right">2050</td...

_Description has been truncated_

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-03-19 23:39:09 +07:00
Reinier van der Leer
7854b999f3 fix(frontend/library): Truncate agent card title and description (#9658)
- Resolves #9631

### Changes 🏗️

- Truncate library agent card title (2 lines) and description (3 lines)
- Make "See runs" and "Open in builder" stick to bottom of card
regardless of other content
- Reduce number of grid columns (4 -> 3) in `lg` layout on `/library` to
give items more horizontal space

![screenshot of library agent grid with the applied
changes](https://github.com/user-attachments/assets/b27d5c97-33b8-4708-9f8c-fc67aad899c9)


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Visually test the changes made on different screen sizes
2025-03-19 15:56:57 +00:00
Reinier van der Leer
ca5fb1618e fix(frontend/library): Improve agent I/O rendering (#9656)
- Related to #8784

### Changes 🏗️

- feat(frontend/library): Improve agent output styling & fix content
overflow issue
- fix(frontend/library): Fix overlap between content and inset button of
expandable input fields (#9650)
- fix(backend): Unbreak loading graph executions with missing inputs

![screenshot of restyled Output
section](https://github.com/user-attachments/assets/97836158-5735-4d01-94dd-16e3fb6999c6)

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- Run an agent with at least one input *not* filled out; view this run
in the Library
    - [x] -> page should load normally
    - [x] -> agent inputs should load and show normally
- Run an agent that generates long output; view this run in the Library
- [x] -> output should not overflow its container or stretch the page
layout
    - [x] -> visually check that the output section looks slick
2025-03-19 15:19:52 +00:00
Toran Bruce Richards
6d36be6f86 fix(blocks): SendWebRequestBlock to properly handle HTTP error responses (#9655)
### Issue
The SendWebRequestBlock currently fails to properly route HTTP error
responses (4xx, 5xx) to their designated output pins (`client_error` and
`server_error`). Instead, these errors are being sent to the default
"Error" pin, breaking expected workflows that depend on proper error
handling.

### Root Cause
The underlying issue is that our custom `requests` module from
`backend.util.request` appears to automatically raise exceptions for
error status codes (similar to how `raise_for_status()` works in the
standard requests library). When these exceptions are thrown, the
block's conditional logic for handling different status codes is
bypassed entirely.

### Changes
This PR adds proper exception handling to catch HTTP errors raised by
the requests module and routes them to the appropriate output pins:
- Added a try-except block to capture `requests.exceptions.HTTPError`
- Extract status code and response data from the caught exception
- Yield to the proper pin based on the status code (4xx → client_error,
5xx → server_error)
- Maintain consistent behavior with the original design intent

### Additional Context
This change maintains backward compatibility while ensuring the block
behaves according to its documented functionality. Users can now
properly handle 4xx and 5xx errors in their workflows as originally
intended.
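
Below is a minimal sketch of that routing, assuming a generator-style run
function and the `client_error`/`server_error` pin names described above; the
function and payload shape are illustrative, not the block's actual
implementation:

```python
import requests


def run_web_request(url: str, method: str = "GET", **kwargs):
    """Hedged sketch: route HTTP error responses to status-specific pins."""
    try:
        response = requests.request(method, url, **kwargs)
        response.raise_for_status()
        yield "response", response.text
    except requests.exceptions.HTTPError as e:
        status = e.response.status_code
        body = e.response.text
        # Instead of letting the exception fall through to the generic "error"
        # pin, route it by status class as the block documents.
        if 400 <= status < 500:
            yield "client_error", {"status": status, "body": body}
        elif 500 <= status < 600:
            yield "server_error", {"status": status, "body": body}
        else:
            raise
```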

<!-- Clearly explain the need for these changes: -->
### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Test the block with new changes and old and ensure expected
behavior

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-03-19 13:05:07 +00:00
Nicholas Tindle
dc1d5faa5d fix(backend): drastically increase batching time for the agent run (#9654)
<!-- Clearly explain the need for these changes: -->

We accidentally sent several emails within 10 minutes, and we're going to get
blocked for spam if we keep it up

### Changes 🏗️
- Moves the notification batching timer from 1 minute to 60 minutes
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Test by batching a series of notifications over an hour
2025-03-18 19:39:16 +00:00
Reinier van der Leer
a187e87741 feat(frontend/library): Make agent input fields expandable (#9650)
- Resolves #9622

### Changes 🏗️

- Add pop-out button + modal to input fields in Agent Run Draft view on
`/library/agents/[id]`
- Fix `icon`-variant button styling

![the expand button on the input
fields](https://github.com/user-attachments/assets/00be33fe-44d1-490a-9cab-9696df8f6e6f)
![the expanded input modal that
appears](https://github.com/user-attachments/assets/787f33b9-d884-467b-b99b-dcbec8a1d059)

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Go to an agent's page -> click "+ New run"
    - [x] -> pop-out button should show on all input fields
- Enter a value in one of the inputs; click the pop-out button on that
input
    - [x] -> input modal with large text field should open
- [x] -> the value you just entered should be present in the modal's
text field
  - Edit the value & click "Save"
    - [x] -> the modal should close
- [x] -> the value in the corresponding input field should be updated
2025-03-18 14:52:30 +00:00
Zamil Majdy
067983eb80 fix(block): Revert custom get_missing_links method on AddToListBlock 2025-03-18 22:12:19 +07:00
Zamil Majdy
e6de8b98f7 fix(block): Avoid infinite loop execution on AddToListBlock self-loop (#9629)
### Changes 🏗️

<img width="757" alt="image"
src="https://github.com/user-attachments/assets/909aab58-24c7-42ec-9580-ac3e9f32057e"
/>

Since a self-loop is now allowed for AddToListBlock, feeding its entry
pin from a static output would cause infinite execution.
This PR prevents that scenario from being allowed.
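
A rough illustration of the guard, assuming links expose their source/sink
node ids and a static flag; these names are hypothetical, not the platform's
actual link schema:

```python
from dataclasses import dataclass


@dataclass
class Link:
    source_id: str   # node producing the output
    sink_id: str     # node receiving the input
    is_static: bool  # static outputs re-emit their value on every execution


def validate_links(links: list[Link]) -> None:
    for link in links:
        # A self-loop fed by a static output would re-trigger the node forever,
        # so reject that combination before execution starts.
        if link.source_id == link.sink_id and link.is_static:
            raise ValueError("A static output cannot feed a self-loop")
```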

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Described above
2025-03-18 14:44:34 +00:00
Zamil Majdy
607641c574 fix(platform): Fix possible db-config permission denied when running two different Supabase versions (#9652)
The change in https://github.com/Significant-Gravitas/AutoGPT/pull/9620
introduces a breaking change in the database volume content; however,
the database's volume location does not change, so switching between the
two versions causes a clash.

### Changes 🏗️

Renamed db-config named volume to supabase-config.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] CI
2025-03-18 13:06:03 +00:00
Reinier van der Leer
18dfbf191c feat(platform/library): Add icons to primary agent run action buttons (#9651)
- Resolves #9612

### Changes 🏗️

- Add icon to "Run" button in run draft view
- Add icons "Stop run" and "Run again" buttons in run view

!["Run"
button](https://github.com/user-attachments/assets/da863753-6cb2-4cea-aa00-c313b606d198)
!["Run again"
button](https://github.com/user-attachments/assets/79958187-05dd-494e-a3a1-e9745db0d2d4)
!["Stop run"
button](https://github.com/user-attachments/assets/ad37ec3a-3c0b-493b-b548-e6b902eb8bda)


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Purely visual changes, no functional test needed.
    Technical changes are covered by the type checker.
2025-03-18 10:31:33 +00:00
Nicholas Tindle
841679216c feat(frontend): break out the sidebar into a reusable component + use it for admin page (#9618)
<!-- Clearly explain the need for these changes: -->
We need a sidebar for the admin page, might as well reuse the reusable
component to do so!

### Changes 🏗️
- Extracts the agptui sidebar to a more reusable component
- Update the usage of that sidebar in the settings page
- Use that same sidebar for the admin page

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Test the old sidebar
  - [x] Test the new sidebar for admin
2025-03-17 21:19:19 +00:00
Zamil Majdy
50eac43e1a feat(backend): Support sub-agent on export/import agent feature (#9640)
Agents using Agent blocks should be seamlessly downloaded from the
marketplace to a file and imported from a file.

Requirements:
* A recursive export process that exports all the required agents to a
single file, no matter how many layers deep (taking care of potential
loops).
* An import process that expects and extracts several agents from a
single file into your library at once.

Considerations:
We need to ensure the reference IDs in the Agent Blocks match/are
updated to match the imported sub-agent ids to prevent broken
references.
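
A minimal sketch of the recursive export under these requirements, assuming
each graph is a dict with a `nodes` list and that Agent-block nodes reference
their sub-graph via an id field; the names are illustrative, not the
platform's actual models:

```python
def collect_graphs(graph_id: str, fetch_graph, collected: dict | None = None) -> dict:
    """Gather a graph and every sub-agent it references, each exactly once."""
    if collected is None:
        collected = {}
    if graph_id in collected:  # already collected -> guards against loops
        return collected
    graph = fetch_graph(graph_id)
    collected[graph_id] = graph
    for node in graph.get("nodes", []):
        sub_id = node.get("graph_id")  # hypothetical Agent-block reference field
        if sub_id:
            collect_graphs(sub_id, fetch_graph, collected)
    return collected
```

On import, the collected ids would be remapped to the newly created graph ids
so the Agent-block references stay valid.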

### Changes 🏗️

* Add sub_graphs field on Graph model 
* Improve graph creation query to support inserting graph + subgraphs in
batch
* Deprecate graph template & remove its column
* Update the marketplace agent download flow (unified the method used, with
more secure cleanup & a proper ownership check).
* Fix failing test cases

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Export graph with sub agents.
  - [x] Import the exported graph with sub agents.
2025-03-17 16:38:27 +00:00
Reinier van der Leer
91445e4760 feat(platform/library): Add "Export agent to file" action (#9627)
- Resolves #9609

### Changes 🏗️

- feat(frontend/library): Add "Export agent to file" button
- fix(frontend/library): Put "Open in builder" button behind access
check

- feat(backend): Improve & move graph export stripping logic
  - Add logic to strip `SecretField` values (see the sketch after this list)
  - Move node stripping logic to `NodeModel` from `GraphModel`
    - Add `NodeModel.stripped_for_export()` method
  - Add `NodeModel.block` property
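
A minimal sketch of the stripping idea, assuming node input values are a plain
dict and secret fields can be identified from the block's input schema; the
helper below is illustrative, not the real `stripped_for_export()`
implementation:

```python
def strip_secrets(input_values: dict, secret_field_names: set[str]) -> dict:
    """Drop secret/credential values before writing a node to an export file."""
    return {
        key: value
        for key, value in input_values.items()
        if key not in secret_field_names  # SecretField values never leave the DB
    }


# e.g. strip_secrets({"api_key": "sk-...", "title": "Post"}, {"api_key"})
# -> {"title": "Post"}
```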

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- Create and configure an agent with the Publish To Medium block and a
block that uses credentials
  - Go to `/library/agents/[id]` for the agent you just created
    - [x] -> "Open in builder" button should show
    - [x] -> "Open in builder" button should work
    - [x] -> "Export agent to file" button should show
    - [x] -> "Export agent to file" button should work
      - [x] -> Exported file contains no credentials or secrets
      - [ ] -> ~~Exported file contains no user IDs~~
  - Go to `/library/agents/[id]` for an agent from the marketplace
    - [x] -> "Open in builder" button should not show
    - [x] -> "Export agent to file" button should not show
2025-03-16 14:10:53 +00:00
dependabot[bot]
6127727aeb chore(libs/deps-dev): bump ruff from 0.9.6 to 0.9.9 in /autogpt_platform/autogpt_libs in the development-dependencies group (#9559)
Bumps the development-dependencies group in
/autogpt_platform/autogpt_libs with 1 update:
[ruff](https://github.com/astral-sh/ruff).

Updates `ruff` from 0.9.6 to 0.9.9
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.9.9</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>Fix caching of unsupported-syntax errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16425">#16425</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Only show unsupported-syntax errors in editors when preview mode is
enabled (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16429">#16429</a>)</li>
</ul>
<h2>Contributors</h2>
<ul>
<li><a
href="https://github.com/InSyncWithFoo"><code>@​InSyncWithFoo</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a
href="https://github.com/dhruvmanila"><code>@​dhruvmanila</code></a></li>
<li><a href="https://github.com/ntBre"><code>@​ntBre</code></a></li>
</ul>
<h2>Install ruff 0.9.9</h2>
<h3>Install prebuilt binaries via shell script</h3>
<pre lang="sh"><code>curl --proto '=https' --tlsv1.2 -LsSf
https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-installer.sh
| sh
</code></pre>
<h3>Install prebuilt binaries via powershell script</h3>
<pre lang="sh"><code>powershell -ExecutionPolicy ByPass -c &quot;irm
https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-installer.ps1
| iex&quot;
</code></pre>
<h2>Download ruff 0.9.9</h2>
<table>
<thead>
<tr>
<th>File</th>
<th>Platform</th>
<th>Checksum</th>
</tr>
</thead>
<tbody>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-apple-darwin.tar.gz">ruff-aarch64-apple-darwin.tar.gz</a></td>
<td>Apple Silicon macOS</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-apple-darwin.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-apple-darwin.tar.gz">ruff-x86_64-apple-darwin.tar.gz</a></td>
<td>Intel macOS</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-apple-darwin.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-pc-windows-msvc.zip">ruff-aarch64-pc-windows-msvc.zip</a></td>
<td>ARM64 Windows</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-pc-windows-msvc.zip.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-i686-pc-windows-msvc.zip">ruff-i686-pc-windows-msvc.zip</a></td>
<td>x86 Windows</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-i686-pc-windows-msvc.zip.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-pc-windows-msvc.zip">ruff-x86_64-pc-windows-msvc.zip</a></td>
<td>x64 Windows</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-pc-windows-msvc.zip.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-unknown-linux-gnu.tar.gz">ruff-aarch64-unknown-linux-gnu.tar.gz</a></td>
<td>ARM64 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-i686-unknown-linux-gnu.tar.gz">ruff-i686-unknown-linux-gnu.tar.gz</a></td>
<td>x86 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-i686-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-powerpc64-unknown-linux-gnu.tar.gz">ruff-powerpc64-unknown-linux-gnu.tar.gz</a></td>
<td>PPC64 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-powerpc64-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-powerpc64le-unknown-linux-gnu.tar.gz">ruff-powerpc64le-unknown-linux-gnu.tar.gz</a></td>
<td>PPC64LE Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-powerpc64le-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-s390x-unknown-linux-gnu.tar.gz">ruff-s390x-unknown-linux-gnu.tar.gz</a></td>
<td>S390x Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-s390x-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-unknown-linux-gnu.tar.gz">ruff-x86_64-unknown-linux-gnu.tar.gz</a></td>
<td>x64 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-x86_64-unknown-linux-gnu.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-armv7-unknown-linux-gnueabihf.tar.gz">ruff-armv7-unknown-linux-gnueabihf.tar.gz</a></td>
<td>ARMv7 Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-armv7-unknown-linux-gnueabihf.tar.gz.sha256">checksum</a></td>
</tr>
<tr>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-unknown-linux-musl.tar.gz">ruff-aarch64-unknown-linux-musl.tar.gz</a></td>
<td>ARM64 MUSL Linux</td>
<td><a
href="https://github.com/astral-sh/ruff/releases/download/0.9.9/ruff-aarch64-unknown-linux-musl.tar.gz.sha256">checksum</a></td>
</tr>
</tbody>
</table>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.9.9</h2>
<h3>Preview features</h3>
<ul>
<li>Fix caching of unsupported-syntax errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16425">#16425</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Only show unsupported-syntax errors in editors when preview mode is
enabled (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16429">#16429</a>)</li>
</ul>
<h2>0.9.8</h2>
<h3>Preview features</h3>
<ul>
<li>Start detecting version-related syntax errors in the parser (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16090">#16090</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>pylint</code>] Mark fix unsafe (<code>PLW1507</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16343">#16343</a>)</li>
<li>[<code>pylint</code>] Catch <code>case np.nan</code>/<code>case
math.nan</code> in <code>match</code> statements (<code>PLW0177</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/16378">#16378</a>)</li>
<li>[<code>ruff</code>] Add more Pydantic models variants to the list of
default copy semantics (<code>RUF012</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16291">#16291</a>)</li>
</ul>
<h3>Server</h3>
<ul>
<li>Avoid indexing the project if <code>configurationPreference</code>
is <code>editorOnly</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16381">#16381</a>)</li>
<li>Avoid unnecessary info at non-trace server log level (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16389">#16389</a>)</li>
<li>Expand <code>ruff.configuration</code> to allow inline config (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16296">#16296</a>)</li>
<li>Notify users for invalid client settings (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16361">#16361</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>Add <code>per-file-target-version</code> option (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16257">#16257</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>[<code>refurb</code>] Do not consider docstring(s)
(<code>FURB156</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16391">#16391</a>)</li>
<li>[<code>flake8-self</code>] Ignore attribute accesses on
instance-like variables (<code>SLF001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16149">#16149</a>)</li>
<li>[<code>pylint</code>] Fix false positives, add missing methods, and
support positional-only parameters (<code>PLE0302</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16263">#16263</a>)</li>
<li>[<code>flake8-pyi</code>] Mark <code>PYI030</code> fix unsafe when
comments are deleted (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16322">#16322</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Fix example for <code>S611</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16316">#16316</a>)</li>
<li>Normalize inconsistent markdown headings in docstrings (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16364">#16364</a>)</li>
<li>Document MSRV policy (<a
href="https://redirect.github.com/astral-sh/ruff/pull/16384">#16384</a>)</li>
</ul>
<h2>0.9.7</h2>
<h3>Preview features</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="091d0af2ab"><code>091d0af</code></a>
Bump version to Ruff 0.9.9 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16434">#16434</a>)</li>
<li><a
href="3d72138740"><code>3d72138</code></a>
Check <code>LinterSettings::preview</code> for version-related syntax
errors (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16429">#16429</a>)</li>
<li><a
href="4a23756024"><code>4a23756</code></a>
Avoid caching files with unsupported syntax errors (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16425">#16425</a>)</li>
<li><a
href="af62f7932b"><code>af62f79</code></a>
Prioritize &quot;bug&quot; label for changelog sections (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16433">#16433</a>)</li>
<li><a
href="0ced8d053c"><code>0ced8d0</code></a>
[<code>flake8-copyright</code>] Add links to applicable options
(<code>CPY001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16421">#16421</a>)</li>
<li><a
href="a8e171f82c"><code>a8e171f</code></a>
Fix string-length limit in documentation for PYI054 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16432">#16432</a>)</li>
<li><a
href="cf83584abb"><code>cf83584</code></a>
Show version-related syntax errors in the playground (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16419">#16419</a>)</li>
<li><a
href="764aa0e6a1"><code>764aa0e</code></a>
Allow passing <code>ParseOptions</code> to inline tests (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16357">#16357</a>)</li>
<li><a
href="568cf88c6c"><code>568cf88</code></a>
Bump version to 0.9.8 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/16414">#16414</a>)</li>
<li><a
href="040071bbc5"><code>040071b</code></a>
[red-knot] Ignore surrounding whitespace when looking for `&lt;!--
snapshot-diag...</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.9.6...0.9.9">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=pip&previous-version=0.9.6&new-version=0.9.9)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-14 18:13:31 +00:00
Zamil Majdy
1bc3041615 feat(platform)!: Lock Supabase docker-compose code (#9620)
We have been submoduling Supabase for provisioning local Supabase
instances using docker-compose. Aside from the huge size of unrelated
code being pulled, there is also the risk of pulling unintentional
breaking changes from upstream into the platform.

The latest Supabase changes hide the 5432 port from the supabase-db
container and shift it to the supavisor, the instance that we are
currently not using. This causes an error in the existing setup.

## BREAKING CHANGES

This change will introduce different volume locations for the database
content, pulling this change will make the data content fresh from the
start. To keep your old data with this change, execute this command:
```
cp -r supabase/docker/volumes/db/data db/docker/volumes/db/data
```


### Changes 🏗️

The scope of this PR is snapshotting the current docker-compose code
obtained from the Supabase repository and embedding it into our
repository. This will eliminate the need for submodule / recursive
cloning and bringing the entire Supabase repository into the platform.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Existing CI
2025-03-14 17:16:13 +00:00
Nicholas Tindle
9c84dbddca fix: backend admin page logic was broken (#9616)
<!-- Clearly explain the need for these changes: -->

We're building out admin utilities, so we need to bring back the `/admin`
route with RBAC. This PR re-enables it to work with the latest changes.

### Changes 🏗️
- Adds back removed logic
- Refactors the role checks to fix a minor bug on the admin page and, more
importantly, to clarify them
- Updates the routes to the latest changes
<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Test with admin and authenticated user roles
  - [x] Test with logged out user role
- [x] For the above, check all the existing routes + new ones in
`middleware.ts`
2025-03-14 16:37:15 +00:00
Zamil Majdy
b67c2e166b fix(platform): Fallback front-end-url to platform-url for billing page 2025-03-14 21:09:34 +07:00
Zamil Majdy
aa17872667 feat(backend): Fix failed RPC on Notification Service
(cherry picked from commit 801f3a3a24)
2025-03-14 14:37:04 +07:00
Zamil Majdy
801f3a3a24 feat(backend): Fix failed RPC on Notification Service 2025-03-14 14:36:44 +07:00
Zamil Majdy
b0fed43971 feat(backend): Fix failed RPC on Notification Service (#9630)
Although returning a Prisma object on an RPC is a bad practice, we have
instances where we do so and the type contains a `prisma.Json` field.
This Json field can't be seamlessly serialized and then converted back
into the Prisma object.

### Changes 🏗️

Replace the Prisma object used as the return type of the notification service
with a plain Pydantic object as a DTO.
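
A minimal sketch of the DTO approach, with hypothetical field names rather
than the actual notification schema:

```python
from datetime import datetime
from typing import Any

from pydantic import BaseModel


class NotificationEventDTO(BaseModel):
    """Plain Pydantic DTO returned over the RPC instead of a Prisma model."""
    id: str
    user_id: str
    type: str
    payload: dict[str, Any]  # plain dict instead of a prisma.Json field
    created_at: datetime
```

Because the DTO is plain Pydantic, it serializes to JSON and reconstructs on
the receiving side without the `prisma.Json` round-trip problem described
above.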

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Calling notification APIs through the RPC client.

(cherry picked from commit b9f31a9c44)
2025-03-14 13:26:34 +07:00
Zamil Majdy
b9f31a9c44 feat(backend): Fix failed RPC on Notification Service (#9630)
Although returning a Prisma object on an RPC is a bad practice, we have
instances where we do so and the type contains a `prisma.Json` field.
This Json field can't be seamlessly serialized and then converted back
into the Prisma object.

### Changes 🏗️

Replace the Prisma object used as the return type of the notification service
with a plain Pydantic object as a DTO.


### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Calling notification APIs through the RPC client.
2025-03-14 01:54:18 +00:00
Zamil Majdy
90f9e4e94a fix(backend): Move Notification service to DB manager (#9626)
DatabaseManager is already provisioned in RestApiService, and
NotificationService lives within the same instance as the Rest Server.

### Changes 🏗️

Moving the DB calls of NotificationService to DatabaseManager.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>

  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

(cherry picked from commit f4d4bb83b0)
2025-03-13 13:44:47 +07:00
Zamil Majdy
f4d4bb83b0 fix(backend): Move Notification service to DB manager (#9626)
DatabaseManager is already provisioned in RestApiService, and
NotificationService lives within the same instance as the Rest Server.

### Changes 🏗️

Moving the DB calls of NotificationService to DatabaseManager.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-03-13 03:14:46 +00:00
dependabot[bot]
02618e1a52 chore(frontend/deps): bump the production-dependencies group across 1 directory with 13 updates (#9611)
Bumps the production-dependencies group with 13 updates in the
/autogpt_platform/frontend directory:

| Package | From | To |
| --- | --- | --- |
| [@faker-js/faker](https://github.com/faker-js/faker) | `9.4.0` |
`9.6.0` |
|
[@next/third-parties](https://github.com/vercel/next.js/tree/HEAD/packages/third-parties)
| `15.1.6` | `15.2.1` |
| [@supabase/supabase-js](https://github.com/supabase/supabase-js) |
`2.48.1` | `2.49.1` |
|
[@tanstack/react-table](https://github.com/TanStack/table/tree/HEAD/packages/react-table)
| `8.20.6` | `8.21.2` |
|
[@xyflow/react](https://github.com/xyflow/xyflow/tree/HEAD/packages/react)
| `12.4.2` | `12.4.4` |
| [framer-motion](https://github.com/motiondivision/motion) | `12.3.1` |
`12.4.11` |
|
[lucide-react](https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react)
| `0.474.0` | `0.479.0` |
| [next-themes](https://github.com/pacocoursey/next-themes) | `0.4.4` |
`0.4.5` |
| [react-day-picker](https://github.com/gpbl/react-day-picker) | `9.5.1`
| `9.6.1` |
| [react-icons](https://github.com/react-icons/react-icons) | `5.4.0` |
`5.5.0` |
| [react-shepherd](https://github.com/shepherd-pro/shepherd) | `6.1.7` |
`6.1.8` |
| [uuid](https://github.com/uuidjs/uuid) | `11.0.5` | `11.1.0` |
| [zod](https://github.com/colinhacks/zod) | `3.24.1` | `3.24.2` |


Updates `@faker-js/faker` from 9.4.0 to 9.6.0.
Highlights: `faker.finance` currencies now carry ISO 4217 numerical codes (faker-js/faker#3404), `faker.number.bigInt` supports `multipleOf` (faker-js/faker#3402), `faker.image` adds AI-generated avatars (9.5.0, faker-js/faker#3126), and `internet.color` is deprecated for removal (faker-js/faker#3401). This version was pushed to npm by st-ddt, a new releaser for the package.
Changelog: https://github.com/faker-js/faker/compare/v9.4.0...v9.6.0
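
For reference, a minimal sketch of the two 9.6.0 additions called out above, e.g. in a frontend test helper; the variable names are illustrative, and the `multipleOf` option is taken from the release notes rather than verified against our usage:

```ts
import { faker } from "@faker-js/faker";

// bigint generation can now be constrained to a multiple (new in 9.6.0)
const creditAmount = faker.number.bigInt({ min: 0n, max: 1_000_000n, multipleOf: 10n });

// Currency objects now also carry ISO 4217 numerical codes alongside code/name/symbol
const currency = faker.finance.currency();

console.log(creditAmount, currency.code, currency.name);
```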

Updates `@next/third-parties` from 15.1.6 to 15.2.1.
The package is released in lockstep with Next.js, so this bump tracks the Next.js 15.2.0/15.2.1 core changes (Segment Cache work, dev-overlay fixes, streaming metadata with PPR, and related fixes).
Commits: https://github.com/vercel/next.js/commits/v15.2.1/packages/third-parties

Updates `@supabase/supabase-js` from 2.48.1 to 2.49.1.
Highlights: bumps `@supabase/auth-js` to 2.68.0 (supabase/supabase-js#1359) and upgrades `postgrest-js` to 1.19.2.
Full diff: https://github.com/supabase/supabase-js/compare/v2.48.1...v2.49.1

Updates `@tanstack/react-table` from 8.20.6 to 8.21.2.
Highlights: fixes the `arrIncludes` auto-remove filter function (TanStack/table#5623) and the Lit adapter's option spreading (TanStack/table#5904), plus documentation updates.
Full diff: https://github.com/TanStack/table/commits/v8.21.2/packages/react-table
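
A minimal sketch of where the fixed `arrIncludes` filter would be selected, using a hypothetical row shape; this is illustrative usage, not code from this PR:

```ts
import { createColumnHelper } from "@tanstack/react-table";

// Hypothetical row shape used only for this illustration
type AgentRow = { name: string; tags: string[] };

const columnHelper = createColumnHelper<AgentRow>();

export const columns = [
  columnHelper.accessor("name", { header: "Name" }),
  // "arrIncludes" is one of the built-in filter functions; its auto-remove
  // behaviour is what 8.21.2 fixes.
  columnHelper.accessor("tags", { header: "Tags", filterFn: "arrIncludes" }),
];
```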

Updates `@xyflow/react` from 12.4.2 to 12.4.4.
Highlights (12.4.3 and 12.4.4): shows an error when an uninitialized node is dragged, allows click connections when the target sets `isConnectableStart`, passes node/edge generics through `OnSelectionChangeFunc`, adds a `snapGrid` option to `screenToFlowPosition`, fixes viewport shifting on node focus, adds package.json to exports, and removes an incorrect deprecation warning.
Full diff: https://github.com/xyflow/xyflow/commits/@xyflow/react@12.4.4/packages/react
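
A minimal sketch of the now-generic selection-change handler mentioned above; the custom node/edge data types are placeholders, not the builder's real types:

```ts
import type { Edge, Node, OnSelectionChangeFunc } from "@xyflow/react";

// Placeholder node/edge types standing in for the app's real ones
type CustomNode = Node<{ label: string }>;
type CustomEdge = Edge<{ weight: number }>;

// Since 12.4.4 the handler accepts node/edge generics, so `nodes` and `edges`
// are typed as CustomNode[] / CustomEdge[] instead of the base types.
export const handleSelectionChange: OnSelectionChangeFunc<CustomNode, CustomEdge> = ({ nodes, edges }) => {
  console.log(nodes.map((n) => n.data.label), edges.length);
};
```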

Updates `framer-motion` from 12.3.1 to 12.4.11.
Highlights: scroll animations are no longer flattened when `type` or `ease` is set explicitly, press/drag/pan gestures use pointer capturing (better behaviour inside iframe embeds), `AnimatePresence` exit animations fire again for children using `layout`/`drag`, and `Reorder.Item` no longer fires a stray `lostpointercapture` event.
Full diff: https://github.com/motiondivision/motion/compare/v12.3.1...v12.4.11

Updates `lucide-react` from 0.474.0 to 0.479.0.
Highlights: new icons (`circle-small`, `mars`, `mars-stroke`, `non-binary`, `transgender`, `shield-user`, `square-round-corner`), several modified icons, a revert of the package.json `exports` property to fix edge-worker environments, and a new Svelte 5 package for `@lucide/svelte`.
Full diff: https://github.com/lucide-icons/lucide/commits/0.479.0/packages/lucide-react

Updates `next-themes` from 0.4.4 to 0.4.5.
Highlights: the injected script now maps themes to classes using the `value` object (pacocoursey/next-themes#330), and the number of renders is reduced by pre-setting `resolvedTheme` (pacocoursey/next-themes#338).
Full diff: https://github.com/pacocoursey/next-themes/compare/v0.4.4...v0.4.5
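
The `value` mapping that pacocoursey/next-themes#330 fixes is the one passed to `ThemeProvider`; a minimal sketch, with made-up class names:

```tsx
import type { ReactNode } from "react";
import { ThemeProvider } from "next-themes";

export function Providers({ children }: { children: ReactNode }) {
  return (
    // With a custom value map, the injected script must resolve "dark" to
    // "theme-dark" (not the raw theme name) before hydration; that is the 0.4.5 fix.
    <ThemeProvider attribute="class" value={{ light: "theme-light", dark: "theme-dark" }}>
      {children}
    </ThemeProvider>
  );
}
```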

Updates `react-day-picker` from 9.5.1 to 9.6.1.
Highlights: a new `animate` prop enables CSS transitions for captions and weeks when navigating between months, disabled navigation buttons now use `aria-disabled` instead of the `disabled` attribute to fix a focus-loss accessibility bug, selected days keep the `selected` modifier when disabled, a `sideEffects` property is added to package.json for better tree-shaking, and 9.6.1 adds missing `.css` entries to package.json.

The `aria-disabled` change can break custom styles for the disabled navigation buttons; selectors need updating:

```diff
- .rdp-button_next:disabled,
+ .rdp-button_next[aria-disabled="true"] {
    /* your custom CSS */
  }
- .rdp-button_previous:disabled,
+ .rdp-button_previous[aria-disabled="true"] {
    /* your custom CSS */
  }
```

Full changelog: https://github.com/gpbl/react-day-picker/compare/v9.5.1...v9.6.1
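
A small usage sketch of the new `animate` prop; the component name and its single-selection wiring are illustrative, not taken from this PR:

```tsx
import { useState } from "react";
import { DayPicker } from "react-day-picker";
import "react-day-picker/style.css";

export function RunDatePicker() {
  const [selected, setSelected] = useState<Date | undefined>();
  return (
    // `animate` (new in 9.6.0) enables CSS transitions for captions and
    // weeks when navigating between months.
    <DayPicker mode="single" animate selected={selected} onSelect={setSelected} />
  );
}
```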

Updates `react-icons` from 5.4.0 to 5.5.0.
Highlights: `IconType` now returns `React.ReactNode` for React 19 compatibility (react-icons/react-icons#1004), plus build-tooling bumps of vite, nanoid, and esbuild.
Full changelog: https://github.com/react-icons/react-icons/compare/v5.4.0...v5.5.0

_Description has been truncated_

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-03-12 14:33:06 +00:00
Nicholas Tindle
a5f448af98 Merge branch 'dev' into ci-chromatic 2025-03-06 11:24:49 -06:00
Nicholas Tindle
c766bd66e1 fix(frontend): typechecking 2025-02-05 17:15:24 -06:00
Nicholas Tindle
6d11ad8051 fix(frontend): format 2025-02-05 17:12:33 -06:00
Nicholas Tindle
d476983bd2 fix: doesn't crash 2025-02-05 17:10:21 -06:00
Nicholas Tindle
3ac1ce5a3f fix: format 2025-02-05 16:50:51 -06:00
Nicholas Tindle
3b89e6d2b7 Merge branch 'ci-chromatic' of https://github.com/Significant-Gravitas/AutoGPT into ci-chromatic 2025-02-05 16:49:32 -06:00
Nicholas Tindle
c7a7652b9f Merge branch 'dev' into ci-chromatic 2025-02-05 16:47:46 -06:00
Nicholas Tindle
b6b0d0b209 Merge branch 'dev' into ci-chromatic 2025-02-03 11:55:31 -06:00
Nicholas Tindle
a5b1495062 Merge branch 'dev' into ci-chromatic 2025-02-03 07:13:53 -06:00
Nicholas Tindle
026f16c10f Merge branch 'dev' into ci-chromatic 2025-01-31 04:50:48 -06:00
Nicholas Tindle
c468201c53 Update mock_client.ts 2025-01-29 07:10:03 -06:00
Nicholas Tindle
5beb581d1c feat(frontend): minimocking 2025-01-29 07:04:38 -06:00
Nicholas Tindle
df2339c1cf feat: add mock backend for rendering the storybook stuff 2025-01-29 06:46:50 -06:00
Nicholas Tindle
327db54321 Merge branch 'open-2047-add-type-checking-step-to-front-end-ci' into ci-chromatic 2025-01-29 12:26:19 +00:00
Nicholas Tindle
234d6f78ba Merge branch 'dev' into open-2047-add-type-checking-step-to-front-end-ci 2025-01-29 12:17:23 +00:00
Nicholas Tindle
43088ddff8 fix: incorrect meshing of types and test 2025-01-29 06:16:28 -06:00
Nicholas Tindle
fd955fba25 ref: add providers to the story previews 2025-01-29 05:35:56 -06:00
Nicholas Tindle
83943d9ddb Merge branch 'open-2047-add-type-checking-step-to-front-end-ci' into ci-chromatic 2025-01-29 10:57:00 +00:00
Nicholas Tindle
60c26e62f6 Merge branch 'dev' into open-2047-add-type-checking-step-to-front-end-ci 2025-01-29 10:53:51 +00:00
Nicholas Tindle
1fc8f9ba66 fix: handle conditions better for feature flagging 2025-01-28 18:04:41 +00:00
Nicholas Tindle
33d747f457 ref: remove unused code 2025-01-28 16:39:11 +00:00
Nicholas Tindle
06fa001a37 ref: use data structure for copy and paste data 2025-01-28 16:36:07 +00:00
Nicholas Tindle
4e7b56b814 ref: pr changes 2025-01-28 16:31:25 +00:00
Nicholas Tindle
d6b03a4f18 ref: pr change request 2025-01-28 16:30:13 +00:00
Nicholas Tindle
fae9aeb49a fix: linting 2025-01-28 16:30:05 +00:00
Nicholas Tindle
5e8c1e274e fix: linting 2025-01-28 16:29:58 +00:00
Nicholas Tindle
55f7dc4853 fix: drop classname unused 2025-01-28 16:25:23 +00:00
Nicholas Tindle
b317adb9cf ref: remove classname from navbar link 2025-01-28 16:23:14 +00:00
Nicholas Tindle
c873ba04b8 ref: split out type-check step + fix tsc error 2025-01-28 15:38:05 +00:00
Nicholas Tindle
00f0311dd0 ref: split out type-check step + fix tsc error 2025-01-28 15:31:52 +00:00
Nicholas Tindle
9b2bd756fa Update platform-frontend-ci.yml 2025-01-28 15:17:42 +00:00
Nicholas Tindle
bceb83ca30 fix: workingdir required 2025-01-28 15:17:42 +00:00
Nicholas Tindle
eadbfcd920 Update platform-frontend-ci.yml 2025-01-28 15:17:41 +00:00
SwiftyOS
9768540b60 Merge branch 'dev' into open-2047-add-type-checking-step-to-front-end-ci 2025-01-28 15:46:21 +01:00
Nicholas Tindle
697436be07 Merge branch 'dev' into open-2047-add-type-checking-step-to-front-end-ci 2025-01-28 07:53:27 +00:00
Nicholas Tindle
d725e105a0 Merge branch 'dev' into open-2047-add-type-checking-step-to-front-end-ci 2025-01-26 15:27:50 +01:00
Nicholas Tindle
927f43f52f fix: formatting 2025-01-26 12:18:11 +00:00
Nicholas Tindle
eedcc92d6f fix: add secret to all the subschemas 2025-01-26 12:15:37 +00:00
Nicholas Tindle
f0c378c70d fix: missing type addition 2025-01-26 12:12:12 +00:00
Nicholas Tindle
c6c2b852df fix: missing inputs based on changes 2025-01-26 12:11:37 +00:00
Nicholas Tindle
aaab8b1e0e fix: more formatting 2025-01-26 11:56:07 +00:00
Nicholas Tindle
a4eeb4535a fix: formatting 2025-01-26 11:55:56 +00:00
Nicholas Tindle
db068c598c fix: missing types 2025-01-26 11:46:35 +00:00
Nicholas Tindle
d4d9efc73e fix: missing attribute 2025-01-26 11:46:25 +00:00
Nicholas Tindle
ffaf77df4e fix: type the params 2025-01-26 11:46:10 +00:00
Nicholas Tindle
2daf08434e fix: type the params 2025-01-26 11:46:01 +00:00
Nicholas Tindle
745137f4c2 fix: pass correct subclass 2025-01-26 11:45:46 +00:00
Nicholas Tindle
3a2c3deb0e fix: remove import + impossible case 2025-01-26 11:45:34 +00:00
Nicholas Tindle
66a15a7b8c fix: use correct object when deleting 2025-01-26 11:44:25 +00:00
Nicholas Tindle
669c61de76 fix: take in classnames as used by the outer component
we probably shouldn't be doing this?
2025-01-26 11:44:04 +00:00
Nicholas Tindle
e860bde3d4 fix: coalesce types and use a default
@aarushik93 is this okay?
2025-01-26 11:43:40 +00:00
Nicholas Tindle
f5394f6d65 fix: expose interface for sub object so it can be used in other places to fix type errors 2025-01-26 11:43:00 +00:00
Nicholas Tindle
06e845abe7 feat: take in props for navbar
Is this desired?
2025-01-26 11:42:28 +00:00
Nicholas Tindle
c2c3c29018 fix: use proper state object 2025-01-26 11:42:08 +00:00
Nicholas Tindle
31fd0b557a fix: add missing import 2025-01-26 11:41:44 +00:00
Nicholas Tindle
9350fe1d2b fix: fully disable unused page 2025-01-26 11:41:12 +00:00
Nicholas Tindle
5ae92820b4 fix: remove unused classnames 2025-01-26 11:40:57 +00:00
Nicholas Tindle
66a87e5a14 ci: typechecker for frontend 2025-01-26 11:22:38 +00:00
Nicholas Tindle
e1f8882e2d fix: stories being broken 2025-01-26 11:18:12 +00:00
353 changed files with 16492 additions and 8551 deletions

View File

@@ -129,30 +129,6 @@ updates:
- "minor"
- "patch"
# Submodules
- package-ecosystem: "gitsubmodule"
directory: "autogpt_platform/supabase"
schedule:
interval: "weekly"
open-pull-requests-limit: 1
target-branch: "dev"
commit-message:
prefix: "chore(platform/deps)"
prefix-development: "chore(platform/deps-dev)"
groups:
production-dependencies:
dependency-type: "production"
update-types:
- "minor"
- "patch"
development-dependencies:
dependency-type: "development"
update-types:
- "minor"
- "patch"
# Docs
- package-ecosystem: 'pip'
directory: "docs/"

View File

@@ -34,6 +34,7 @@ jobs:
python -m prisma migrate deploy
env:
DATABASE_URL: ${{ secrets.BACKEND_DATABASE_URL }}
DIRECT_URL: ${{ secrets.BACKEND_DATABASE_URL }}
trigger:

View File

@@ -36,6 +36,7 @@ jobs:
python -m prisma migrate deploy
env:
DATABASE_URL: ${{ secrets.BACKEND_DATABASE_URL }}
DIRECT_URL: ${{ secrets.BACKEND_DATABASE_URL }}
trigger:
needs: migrate

View File

@@ -80,18 +80,35 @@ jobs:
- name: Install Poetry (Unix)
run: |
curl -sSL https://install.python-poetry.org | python3 -
# Extract Poetry version from backend/poetry.lock
HEAD_POETRY_VERSION=$(head -n 1 poetry.lock | grep -oP '(?<=Poetry )[0-9]+\.[0-9]+\.[0-9]+')
echo "Found Poetry version ${HEAD_POETRY_VERSION} in backend/poetry.lock"
if [ -n "$BASE_REF" ]; then
BASE_BRANCH=${BASE_REF/refs\/heads\//}
BASE_POETRY_VERSION=$((git show "origin/$BASE_BRANCH":./poetry.lock; true) | head -n 1 | grep -oP '(?<=Poetry )[0-9]+\.[0-9]+\.[0-9]+')
echo "Found Poetry version ${BASE_POETRY_VERSION} in backend/poetry.lock on ${BASE_REF}"
POETRY_VERSION=$(printf '%s\n' "$HEAD_POETRY_VERSION" "$BASE_POETRY_VERSION" | sort -V | tail -n1)
else
POETRY_VERSION=$HEAD_POETRY_VERSION
fi
echo "Using Poetry version ${POETRY_VERSION}"
# Install Poetry
curl -sSL https://install.python-poetry.org | POETRY_VERSION=$POETRY_VERSION python3 -
if [ "${{ runner.os }}" = "macOS" ]; then
PATH="$HOME/.local/bin:$PATH"
echo "$HOME/.local/bin" >> $GITHUB_PATH
fi
env:
BASE_REF: ${{ github.base_ref || github.event.merge_group.base_ref }}
- name: Check poetry.lock
run: |
poetry lock
if ! git diff --quiet poetry.lock; then
if ! git diff --quiet --ignore-matching-lines="^# " poetry.lock; then
echo "Error: poetry.lock not up to date."
echo
git diff poetry.lock
@@ -118,6 +135,7 @@ jobs:
run: poetry run prisma migrate dev --name updates
env:
DATABASE_URL: ${{ steps.supabase.outputs.DB_URL }}
DIRECT_URL: ${{ steps.supabase.outputs.DB_URL }}
- id: lint
name: Run Linter
@@ -134,12 +152,13 @@ jobs:
env:
LOG_LEVEL: ${{ runner.debug && 'DEBUG' || 'INFO' }}
DATABASE_URL: ${{ steps.supabase.outputs.DB_URL }}
DIRECT_URL: ${{ steps.supabase.outputs.DB_URL }}
SUPABASE_URL: ${{ steps.supabase.outputs.API_URL }}
SUPABASE_SERVICE_ROLE_KEY: ${{ steps.supabase.outputs.SERVICE_ROLE_KEY }}
SUPABASE_JWT_SECRET: ${{ steps.supabase.outputs.JWT_SECRET }}
REDIS_HOST: 'localhost'
REDIS_PORT: '6379'
REDIS_PASSWORD: 'testpassword'
REDIS_HOST: "localhost"
REDIS_PORT: "6379"
REDIS_PASSWORD: "testpassword"
env:
CI: true
@@ -152,8 +171,8 @@ jobs:
# If you want to replace this, you can do so by making our entire system generate
# new credentials for each local user and update the environment variables in
# the backend service, docker composes, and examples
RABBITMQ_DEFAULT_USER: 'rabbitmq_user_default'
RABBITMQ_DEFAULT_PASS: 'k0VMxyIJF9S35f3x2uaw5IWAl6Y536O7'
RABBITMQ_DEFAULT_USER: "rabbitmq_user_default"
RABBITMQ_DEFAULT_PASS: "k0VMxyIJF9S35f3x2uaw5IWAl6Y536O7"
# - name: Upload coverage reports to Codecov
# uses: codecov/codecov-action@v4
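
The Poetry install step above parses the Poetry version recorded at the top of `poetry.lock` on both the PR head and the base branch, then installs whichever is newer (`sort -V | tail -n1`). Below is a minimal sketch of that same comparison; the version strings are hypothetical stand-ins for what the workflow extracts from the lock files.
```
# Sketch of the `sort -V | tail -n1` comparison, using packaging instead of coreutils.
# head_version / base_version are hypothetical; the workflow parses them from poetry.lock.
from packaging.version import Version

head_version = "2.1.1"   # from the PR head's poetry.lock header
base_version = "2.1.2"   # from the base branch's poetry.lock header

poetry_version = max([head_version, base_version], key=Version)
print(poetry_version)    # -> 2.1.2, the newer of the two is installed
```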

View File

@@ -56,6 +56,30 @@ jobs:
run: |
yarn type-check
design:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: "21"
- name: Install dependencies
run: |
yarn install --frozen-lockfile
- name: Run Chromatic
uses: chromaui/action@latest
with:
# ⚠️ Make sure to configure a `CHROMATIC_PROJECT_TOKEN` repository secret
projectToken: ${{ secrets.CHROMATIC_PROJECT_TOKEN }}
workingDir: autogpt_platform/frontend
test:
runs-on: ubuntu-latest
strategy:
@@ -82,7 +106,7 @@ jobs:
- name: Copy default supabase .env
run: |
cp ../supabase/docker/.env.example ../.env
cp ../.env.example ../.env
- name: Copy backend .env
run: |

.gitmodules (vendored), 3 lines changed
View File

@@ -1,6 +1,3 @@
[submodule "classic/forge/tests/vcr_cassettes"]
path = classic/forge/tests/vcr_cassettes
url = https://github.com/Significant-Gravitas/Auto-GPT-test-cassettes
[submodule "autogpt_platform/supabase"]
path = autogpt_platform/supabase
url = https://github.com/supabase/supabase.git

View File

@@ -140,7 +140,7 @@ repos:
language: system
- repo: https://github.com/psf/black
rev: 23.12.1
rev: 24.10.0
# Black has sensible defaults, doesn't need package context, and ignores
# everything in .gitignore, so it works fine without any config or arguments.
hooks:

View File

@@ -2,9 +2,6 @@
If you are reading this, you are probably looking for the full **[contribution guide]**,
which is part of our [wiki].
Also check out our [🚀 Roadmap][roadmap] for information about our priorities and associated tasks.
<!-- You can find our immediate priorities and their progress on our public [kanban board]. -->
[contribution guide]: https://github.com/Significant-Gravitas/AutoGPT/wiki/Contributing
[wiki]: https://github.com/Significant-Gravitas/AutoGPT/wiki
[roadmap]: https://github.com/Significant-Gravitas/AutoGPT/discussions/6971

View File

@@ -15,7 +15,11 @@
> Setting up and hosting the AutoGPT Platform yourself is a technical process.
> If you'd rather something that just works, we recommend [joining the waitlist](https://bit.ly/3ZDijAI) for the cloud-hosted beta.
https://github.com/user-attachments/assets/d04273a5-b36a-4a37-818e-f631ce72d603
### Updated Setup Instructions:
We've moved to a fully maintained and regularly updated documentation site.
👉 [Follow the official self-hosting guide here](https://docs.agpt.co/platform/getting-started/)
This tutorial assumes you have Docker, VSCode, git and npm installed.

View File

@@ -0,0 +1,123 @@
############
# Secrets
# YOU MUST CHANGE THESE BEFORE GOING INTO PRODUCTION
############
POSTGRES_PASSWORD=your-super-secret-and-long-postgres-password
JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
ANON_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJhbm9uIiwKICAgICJpc3MiOiAic3VwYWJhc2UtZGVtbyIsCiAgICAiaWF0IjogMTY0MTc2OTIwMCwKICAgICJleHAiOiAxNzk5NTM1NjAwCn0.dc_X5iR_VP_qT0zsiyj_I_OZ2T9FtRU2BBNWN8Bu4GE
SERVICE_ROLE_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJzZXJ2aWNlX3JvbGUiLAogICAgImlzcyI6ICJzdXBhYmFzZS1kZW1vIiwKICAgICJpYXQiOiAxNjQxNzY5MjAwLAogICAgImV4cCI6IDE3OTk1MzU2MDAKfQ.DaYlNEoUrrEn2Ig7tqibS-PHK5vgusbcbo7X36XVt4Q
DASHBOARD_USERNAME=supabase
DASHBOARD_PASSWORD=this_password_is_insecure_and_should_be_updated
SECRET_KEY_BASE=UpNVntn3cDxHJpq99YMc1T1AQgQpc8kfYTuRgBiYa15BLrx8etQoXz3gZv1/u2oq
VAULT_ENC_KEY=your-encryption-key-32-chars-min
############
# Database - You can change these to any PostgreSQL database that has logical replication enabled.
############
POSTGRES_HOST=db
POSTGRES_DB=postgres
POSTGRES_PORT=5432
# default user is postgres
############
# Supavisor -- Database pooler
############
POOLER_PROXY_PORT_TRANSACTION=6543
POOLER_DEFAULT_POOL_SIZE=20
POOLER_MAX_CLIENT_CONN=100
POOLER_TENANT_ID=your-tenant-id
############
# API Proxy - Configuration for the Kong Reverse proxy.
############
KONG_HTTP_PORT=8000
KONG_HTTPS_PORT=8443
############
# API - Configuration for PostgREST.
############
PGRST_DB_SCHEMAS=public,storage,graphql_public
############
# Auth - Configuration for the GoTrue authentication server.
############
## General
SITE_URL=http://localhost:3000
ADDITIONAL_REDIRECT_URLS=
JWT_EXPIRY=3600
DISABLE_SIGNUP=false
API_EXTERNAL_URL=http://localhost:8000
## Mailer Config
MAILER_URLPATHS_CONFIRMATION="/auth/v1/verify"
MAILER_URLPATHS_INVITE="/auth/v1/verify"
MAILER_URLPATHS_RECOVERY="/auth/v1/verify"
MAILER_URLPATHS_EMAIL_CHANGE="/auth/v1/verify"
## Email auth
ENABLE_EMAIL_SIGNUP=true
ENABLE_EMAIL_AUTOCONFIRM=false
SMTP_ADMIN_EMAIL=admin@example.com
SMTP_HOST=supabase-mail
SMTP_PORT=2500
SMTP_USER=fake_mail_user
SMTP_PASS=fake_mail_password
SMTP_SENDER_NAME=fake_sender
ENABLE_ANONYMOUS_USERS=false
## Phone auth
ENABLE_PHONE_SIGNUP=true
ENABLE_PHONE_AUTOCONFIRM=true
############
# Studio - Configuration for the Dashboard
############
STUDIO_DEFAULT_ORGANIZATION=Default Organization
STUDIO_DEFAULT_PROJECT=Default Project
STUDIO_PORT=3000
# replace if you intend to use Studio outside of localhost
SUPABASE_PUBLIC_URL=http://localhost:8000
# Enable webp support
IMGPROXY_ENABLE_WEBP_DETECTION=true
# Add your OpenAI API key to enable SQL Editor Assistant
OPENAI_API_KEY=
############
# Functions - Configuration for Functions
############
# NOTE: VERIFY_JWT applies to all functions. Per-function VERIFY_JWT is not supported yet.
FUNCTIONS_VERIFY_JWT=false
############
# Logs - Configuration for Logflare
# Please refer to https://supabase.com/docs/reference/self-hosting-analytics/introduction
############
LOGFLARE_LOGGER_BACKEND_API_KEY=your-super-secret-and-long-logflare-key
# Change vector.toml sinks to reflect this change
LOGFLARE_API_KEY=your-super-secret-and-long-logflare-key
# Docker socket location - this value will differ depending on your OS
DOCKER_SOCKET_LOCATION=/var/run/docker.sock
# Google Cloud Project details
GOOGLE_PROJECT_ID=GOOGLE_PROJECT_ID
GOOGLE_PROJECT_NUMBER=GOOGLE_PROJECT_NUMBER

View File

@@ -22,35 +22,29 @@ To run the AutoGPT Platform, follow these steps:
2. Run the following command:
```
git submodule update --init --recursive --progress
cp .env.example .env
```
This command will initialize and update the submodules in the repository. The `supabase` folder will be cloned to the root directory.
This command will copy the `.env.example` file to `.env`. You can modify the `.env` file to add your own environment variables.
3. Run the following command:
```
cp supabase/docker/.env.example .env
```
This command will copy the `.env.example` file to `.env` in the `supabase/docker` directory. You can modify the `.env` file to add your own environment variables.
4. Run the following command:
```
docker compose up -d
```
This command will start all the necessary backend services defined in the `docker-compose.yml` file in detached mode.
5. Navigate to `frontend` within the `autogpt_platform` directory:
4. Navigate to `frontend` within the `autogpt_platform` directory:
```
cd frontend
```
You will need to run your frontend application separately on your local machine.
6. Run the following command:
5. Run the following command:
```
cp .env.example .env.local
```
This command will copy the `.env.example` file to `.env.local` in the `frontend` directory. You can modify the `.env.local` within this folder to add your own environment variables for the frontend application.
7. Run the following command:
6. Run the following command:
```
npm install
npm run dev
@@ -61,7 +55,7 @@ To run the AutoGPT Platform, follow these steps:
yarn install && yarn dev
```
8. Open your browser and navigate to `http://localhost:3000` to access the AutoGPT Platform frontend.
7. Open your browser and navigate to `http://localhost:3000` to access the AutoGPT Platform frontend.
### Docker Compose Commands

View File

@@ -1,11 +1,9 @@
from .config import Settings
from .depends import requires_admin_user, requires_user
from .jwt_utils import parse_jwt_token
from .middleware import APIKeyValidator, auth_middleware
from .models import User
__all__ = [
"Settings",
"parse_jwt_token",
"requires_user",
"requires_admin_user",

View File

@@ -1,14 +1,11 @@
import os
from dotenv import load_dotenv
load_dotenv()
class Settings:
JWT_SECRET_KEY: str = os.getenv("SUPABASE_JWT_SECRET", "")
ENABLE_AUTH: bool = os.getenv("ENABLE_AUTH", "false").lower() == "true"
JWT_ALGORITHM: str = "HS256"
def __init__(self):
self.JWT_SECRET_KEY: str = os.getenv("SUPABASE_JWT_SECRET", "")
self.ENABLE_AUTH: bool = os.getenv("ENABLE_AUTH", "false").lower() == "true"
self.JWT_ALGORITHM: str = "HS256"
@property
def is_configured(self) -> bool:
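
Moving the environment reads from class attributes into `__init__` means they are evaluated when a `Settings` instance is created (after `load_dotenv()` has run) rather than once at class-definition time. A minimal standalone sketch of the difference, not taken from the project's code:
```
# Illustration only: class-level attributes are read once at definition time,
# while instance attributes set in __init__ are read on every instantiation.
import os

class ClassLevel:
    FLAG = os.getenv("DEMO_FLAG", "unset")           # evaluated when the class is defined

class InstanceLevel:
    def __init__(self):
        self.FLAG = os.getenv("DEMO_FLAG", "unset")  # evaluated when an instance is made

os.environ["DEMO_FLAG"] = "set-later"
print(ClassLevel.FLAG)         # "unset" unless DEMO_FLAG existed before the class was defined
print(InstanceLevel().FLAG)    # "set-later"
```
This pairs with the `depends.py` change below, which stops reading `Settings.ENABLE_AUTH` off the class and uses a module-level `settings` instance instead.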

View File

@@ -1,6 +1,6 @@
import fastapi
from .config import Settings
from .config import settings
from .middleware import auth_middleware
from .models import DEFAULT_USER_ID, User
@@ -17,7 +17,7 @@ def requires_admin_user(
def verify_user(payload: dict | None, admin_only: bool) -> User:
if not payload:
if Settings.ENABLE_AUTH:
if settings.ENABLE_AUTH:
raise fastapi.HTTPException(
status_code=401, detail="Authorization header is missing"
)

View File

@@ -8,7 +8,7 @@ from pydantic import Field, field_validator
from pydantic_settings import BaseSettings, SettingsConfigDict
from .filters import BelowLevelFilter
from .formatters import AGPTFormatter, StructuredLoggingFormatter
from .formatters import AGPTFormatter
LOG_DIR = Path(__file__).parent.parent.parent.parent / "logs"
LOG_FILE = "activity.log"
@@ -81,9 +81,26 @@ def configure_logging(force_cloud_logging: bool = False) -> None:
"""
config = LoggingConfig()
log_handlers: list[logging.Handler] = []
# Console output handlers
stdout = logging.StreamHandler(stream=sys.stdout)
stdout.setLevel(config.level)
stdout.addFilter(BelowLevelFilter(logging.WARNING))
if config.level == logging.DEBUG:
stdout.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT))
else:
stdout.setFormatter(AGPTFormatter(SIMPLE_LOG_FORMAT))
stderr = logging.StreamHandler()
stderr.setLevel(logging.WARNING)
if config.level == logging.DEBUG:
stderr.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT))
else:
stderr.setFormatter(AGPTFormatter(SIMPLE_LOG_FORMAT))
log_handlers += [stdout, stderr]
# Cloud logging setup
if config.enable_cloud_logging or force_cloud_logging:
import google.cloud.logging
@@ -97,28 +114,7 @@ def configure_logging(force_cloud_logging: bool = False) -> None:
transport=SyncTransport,
)
cloud_handler.setLevel(config.level)
cloud_handler.setFormatter(StructuredLoggingFormatter())
log_handlers.append(cloud_handler)
print("Cloud logging enabled")
else:
# Console output handlers
stdout = logging.StreamHandler(stream=sys.stdout)
stdout.setLevel(config.level)
stdout.addFilter(BelowLevelFilter(logging.WARNING))
if config.level == logging.DEBUG:
stdout.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT))
else:
stdout.setFormatter(AGPTFormatter(SIMPLE_LOG_FORMAT))
stderr = logging.StreamHandler()
stderr.setLevel(logging.WARNING)
if config.level == logging.DEBUG:
stderr.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT))
else:
stderr.setFormatter(AGPTFormatter(SIMPLE_LOG_FORMAT))
log_handlers += [stdout, stderr]
print("Console logging enabled")
# File logging setup
if config.enable_file_logging:
@@ -156,7 +152,6 @@ def configure_logging(force_cloud_logging: bool = False) -> None:
error_log_handler.setLevel(logging.ERROR)
error_log_handler.setFormatter(AGPTFormatter(DEBUG_LOG_FORMAT, no_color=True))
log_handlers.append(error_log_handler)
print("File logging enabled")
# Configure the root logger
logging.basicConfig(

View File

@@ -1,7 +1,6 @@
import logging
from colorama import Fore, Style
from google.cloud.logging_v2.handlers import CloudLoggingFilter, StructuredLogHandler
from .utils import remove_color_codes
@@ -80,16 +79,3 @@ class AGPTFormatter(FancyConsoleFormatter):
return remove_color_codes(super().format(record))
else:
return super().format(record)
class StructuredLoggingFormatter(StructuredLogHandler, logging.Formatter):
def __init__(self):
# Set up CloudLoggingFilter to add diagnostic info to the log records
self.cloud_logging_filter = CloudLoggingFilter()
# Init StructuredLogHandler
super().__init__()
def format(self, record: logging.LogRecord) -> str:
self.cloud_logging_filter.filter(record)
return super().format(record)

View File

@@ -2,6 +2,7 @@ import logging
import re
from typing import Any
import uvicorn.config
from colorama import Fore
@@ -25,3 +26,14 @@ def print_attribute(
"color": value_color,
},
)
def generate_uvicorn_config():
"""
Generates a uvicorn logging config that silences uvicorn's default logging and tells it to use the native logging module.
"""
log_config = dict(uvicorn.config.LOGGING_CONFIG)
log_config["loggers"]["uvicorn"] = {"handlers": []}
log_config["loggers"]["uvicorn.error"] = {"handlers": []}
log_config["loggers"]["uvicorn.access"] = {"handlers": []}
return log_config
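
`generate_uvicorn_config()` copies uvicorn's default logging config and strips the handlers from the `uvicorn`, `uvicorn.error`, and `uvicorn.access` loggers, so their records propagate to the root logger set up by `configure_logging()`. A hedged usage sketch; the app reference and import path are placeholders, not taken from the repository:
```
# Sketch only: run uvicorn with the generated config so its loggers have no
# handlers of their own and everything flows through the root logger instead.
import uvicorn

from autogpt_libs.logging import configure_logging, generate_uvicorn_config  # hypothetical path

configure_logging()
uvicorn.run(
    "my_app:app",                          # placeholder ASGI app reference
    host="0.0.0.0",
    port=8000,
    log_config=generate_uvicorn_config(),
)
```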

View File

@@ -1,20 +1,59 @@
import inspect
import threading
from typing import Callable, ParamSpec, TypeVar
from typing import Awaitable, Callable, ParamSpec, TypeVar, cast, overload
P = ParamSpec("P")
R = TypeVar("R")
def thread_cached(func: Callable[P, R]) -> Callable[P, R]:
@overload
def thread_cached(func: Callable[P, Awaitable[R]]) -> Callable[P, Awaitable[R]]: ...
@overload
def thread_cached(func: Callable[P, R]) -> Callable[P, R]: ...
def thread_cached(
func: Callable[P, R] | Callable[P, Awaitable[R]],
) -> Callable[P, R] | Callable[P, Awaitable[R]]:
thread_local = threading.local()
def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
cache = getattr(thread_local, "cache", None)
if cache is None:
cache = thread_local.cache = {}
key = (args, tuple(sorted(kwargs.items())))
if key not in cache:
cache[key] = func(*args, **kwargs)
return cache[key]
def _clear():
if hasattr(thread_local, "cache"):
del thread_local.cache
return wrapper
if inspect.iscoroutinefunction(func):
async def async_wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
cache = getattr(thread_local, "cache", None)
if cache is None:
cache = thread_local.cache = {}
key = (args, tuple(sorted(kwargs.items())))
if key not in cache:
cache[key] = await cast(Callable[P, Awaitable[R]], func)(
*args, **kwargs
)
return cache[key]
setattr(async_wrapper, "clear_cache", _clear)
return async_wrapper
else:
def sync_wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
cache = getattr(thread_local, "cache", None)
if cache is None:
cache = thread_local.cache = {}
key = (args, tuple(sorted(kwargs.items())))
if key not in cache:
cache[key] = func(*args, **kwargs)
return cache[key]
setattr(sync_wrapper, "clear_cache", _clear)
return sync_wrapper
def clear_thread_cache(func: Callable) -> None:
if clear := getattr(func, "clear_cache", None):
clear()
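
The decorator now handles both sync and async callables: results are memoised per thread, keyed on the positional and keyword arguments, and `clear_thread_cache` drops the calling thread's cache. A small usage sketch, assuming the helpers above are importable (the import path is a guess):
```
# Hedged example of the per-thread cache; the import path is hypothetical.
import asyncio

from backend.util.cache import thread_cached, clear_thread_cache  # adjust to the real module

calls = 0

@thread_cached
def square(x: int) -> int:
    global calls
    calls += 1
    return x * x

@thread_cached
async def fetch(key: str) -> str:
    return f"value-for-{key}"          # the awaited result is what gets cached

square(3); square(3)                   # second call is served from this thread's cache
assert calls == 1

clear_thread_cache(square)             # drops the cache, so the next call recomputes
square(3)
assert calls == 2

print(asyncio.run(fetch("a")))         # async callables are cached the same way
```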

View File

@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 2.1.1 and should not be changed by hand.
# This file is automatically @generated by Poetry 2.1.2 and should not be changed by hand.
[[package]]
name = "aiohappyeyeballs"
@@ -14,113 +14,104 @@ files = [
[[package]]
name = "aiohttp"
version = "3.10.5"
version = "3.11.15"
description = "Async http client/server framework (asyncio)"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "aiohttp-3.10.5-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:18a01eba2574fb9edd5f6e5fb25f66e6ce061da5dab5db75e13fe1558142e0a3"},
{file = "aiohttp-3.10.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:94fac7c6e77ccb1ca91e9eb4cb0ac0270b9fb9b289738654120ba8cebb1189c6"},
{file = "aiohttp-3.10.5-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:2f1f1c75c395991ce9c94d3e4aa96e5c59c8356a15b1c9231e783865e2772699"},
{file = "aiohttp-3.10.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4f7acae3cf1a2a2361ec4c8e787eaaa86a94171d2417aae53c0cca6ca3118ff6"},
{file = "aiohttp-3.10.5-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:94c4381ffba9cc508b37d2e536b418d5ea9cfdc2848b9a7fea6aebad4ec6aac1"},
{file = "aiohttp-3.10.5-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c31ad0c0c507894e3eaa843415841995bf8de4d6b2d24c6e33099f4bc9fc0d4f"},
{file = "aiohttp-3.10.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0912b8a8fadeb32ff67a3ed44249448c20148397c1ed905d5dac185b4ca547bb"},
{file = "aiohttp-3.10.5-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0d93400c18596b7dc4794d48a63fb361b01a0d8eb39f28800dc900c8fbdaca91"},
{file = "aiohttp-3.10.5-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:d00f3c5e0d764a5c9aa5a62d99728c56d455310bcc288a79cab10157b3af426f"},
{file = "aiohttp-3.10.5-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:d742c36ed44f2798c8d3f4bc511f479b9ceef2b93f348671184139e7d708042c"},
{file = "aiohttp-3.10.5-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:814375093edae5f1cb31e3407997cf3eacefb9010f96df10d64829362ae2df69"},
{file = "aiohttp-3.10.5-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:8224f98be68a84b19f48e0bdc14224b5a71339aff3a27df69989fa47d01296f3"},
{file = "aiohttp-3.10.5-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:d9a487ef090aea982d748b1b0d74fe7c3950b109df967630a20584f9a99c0683"},
{file = "aiohttp-3.10.5-cp310-cp310-win32.whl", hash = "sha256:d9ef084e3dc690ad50137cc05831c52b6ca428096e6deb3c43e95827f531d5ef"},
{file = "aiohttp-3.10.5-cp310-cp310-win_amd64.whl", hash = "sha256:66bf9234e08fe561dccd62083bf67400bdbf1c67ba9efdc3dac03650e97c6088"},
{file = "aiohttp-3.10.5-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:8c6a4e5e40156d72a40241a25cc226051c0a8d816610097a8e8f517aeacd59a2"},
{file = "aiohttp-3.10.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:2c634a3207a5445be65536d38c13791904fda0748b9eabf908d3fe86a52941cf"},
{file = "aiohttp-3.10.5-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4aff049b5e629ef9b3e9e617fa6e2dfeda1bf87e01bcfecaf3949af9e210105e"},
{file = "aiohttp-3.10.5-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1942244f00baaacaa8155eca94dbd9e8cc7017deb69b75ef67c78e89fdad3c77"},
{file = "aiohttp-3.10.5-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e04a1f2a65ad2f93aa20f9ff9f1b672bf912413e5547f60749fa2ef8a644e061"},
{file = "aiohttp-3.10.5-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7f2bfc0032a00405d4af2ba27f3c429e851d04fad1e5ceee4080a1c570476697"},
{file = "aiohttp-3.10.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:424ae21498790e12eb759040bbb504e5e280cab64693d14775c54269fd1d2bb7"},
{file = "aiohttp-3.10.5-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:975218eee0e6d24eb336d0328c768ebc5d617609affaca5dbbd6dd1984f16ed0"},
{file = "aiohttp-3.10.5-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:4120d7fefa1e2d8fb6f650b11489710091788de554e2b6f8347c7a20ceb003f5"},
{file = "aiohttp-3.10.5-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:b90078989ef3fc45cf9221d3859acd1108af7560c52397ff4ace8ad7052a132e"},
{file = "aiohttp-3.10.5-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:ba5a8b74c2a8af7d862399cdedce1533642fa727def0b8c3e3e02fcb52dca1b1"},
{file = "aiohttp-3.10.5-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:02594361128f780eecc2a29939d9dfc870e17b45178a867bf61a11b2a4367277"},
{file = "aiohttp-3.10.5-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:8fb4fc029e135859f533025bc82047334e24b0d489e75513144f25408ecaf058"},
{file = "aiohttp-3.10.5-cp311-cp311-win32.whl", hash = "sha256:e1ca1ef5ba129718a8fc827b0867f6aa4e893c56eb00003b7367f8a733a9b072"},
{file = "aiohttp-3.10.5-cp311-cp311-win_amd64.whl", hash = "sha256:349ef8a73a7c5665cca65c88ab24abe75447e28aa3bc4c93ea5093474dfdf0ff"},
{file = "aiohttp-3.10.5-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:305be5ff2081fa1d283a76113b8df7a14c10d75602a38d9f012935df20731487"},
{file = "aiohttp-3.10.5-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:3a1c32a19ee6bbde02f1cb189e13a71b321256cc1d431196a9f824050b160d5a"},
{file = "aiohttp-3.10.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:61645818edd40cc6f455b851277a21bf420ce347baa0b86eaa41d51ef58ba23d"},
{file = "aiohttp-3.10.5-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c225286f2b13bab5987425558baa5cbdb2bc925b2998038fa028245ef421e75"},
{file = "aiohttp-3.10.5-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8ba01ebc6175e1e6b7275c907a3a36be48a2d487549b656aa90c8a910d9f3178"},
{file = "aiohttp-3.10.5-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8eaf44ccbc4e35762683078b72bf293f476561d8b68ec8a64f98cf32811c323e"},
{file = "aiohttp-3.10.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b1c43eb1ab7cbf411b8e387dc169acb31f0ca0d8c09ba63f9eac67829585b44f"},
{file = "aiohttp-3.10.5-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:de7a5299827253023c55ea549444e058c0eb496931fa05d693b95140a947cb73"},
{file = "aiohttp-3.10.5-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:4790f0e15f00058f7599dab2b206d3049d7ac464dc2e5eae0e93fa18aee9e7bf"},
{file = "aiohttp-3.10.5-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:44b324a6b8376a23e6ba25d368726ee3bc281e6ab306db80b5819999c737d820"},
{file = "aiohttp-3.10.5-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:0d277cfb304118079e7044aad0b76685d30ecb86f83a0711fc5fb257ffe832ca"},
{file = "aiohttp-3.10.5-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:54d9ddea424cd19d3ff6128601a4a4d23d54a421f9b4c0fff740505813739a91"},
{file = "aiohttp-3.10.5-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:4f1c9866ccf48a6df2b06823e6ae80573529f2af3a0992ec4fe75b1a510df8a6"},
{file = "aiohttp-3.10.5-cp312-cp312-win32.whl", hash = "sha256:dc4826823121783dccc0871e3f405417ac116055bf184ac04c36f98b75aacd12"},
{file = "aiohttp-3.10.5-cp312-cp312-win_amd64.whl", hash = "sha256:22c0a23a3b3138a6bf76fc553789cb1a703836da86b0f306b6f0dc1617398abc"},
{file = "aiohttp-3.10.5-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:7f6b639c36734eaa80a6c152a238242bedcee9b953f23bb887e9102976343092"},
{file = "aiohttp-3.10.5-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f29930bc2921cef955ba39a3ff87d2c4398a0394ae217f41cb02d5c26c8b1b77"},
{file = "aiohttp-3.10.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f489a2c9e6455d87eabf907ac0b7d230a9786be43fbe884ad184ddf9e9c1e385"},
{file = "aiohttp-3.10.5-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:123dd5b16b75b2962d0fff566effb7a065e33cd4538c1692fb31c3bda2bfb972"},
{file = "aiohttp-3.10.5-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b98e698dc34966e5976e10bbca6d26d6724e6bdea853c7c10162a3235aba6e16"},
{file = "aiohttp-3.10.5-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c3b9162bab7e42f21243effc822652dc5bb5e8ff42a4eb62fe7782bcbcdfacf6"},
{file = "aiohttp-3.10.5-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1923a5c44061bffd5eebeef58cecf68096e35003907d8201a4d0d6f6e387ccaa"},
{file = "aiohttp-3.10.5-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d55f011da0a843c3d3df2c2cf4e537b8070a419f891c930245f05d329c4b0689"},
{file = "aiohttp-3.10.5-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:afe16a84498441d05e9189a15900640a2d2b5e76cf4efe8cbb088ab4f112ee57"},
{file = "aiohttp-3.10.5-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:f8112fb501b1e0567a1251a2fd0747baae60a4ab325a871e975b7bb67e59221f"},
{file = "aiohttp-3.10.5-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:1e72589da4c90337837fdfe2026ae1952c0f4a6e793adbbfbdd40efed7c63599"},
{file = "aiohttp-3.10.5-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:4d46c7b4173415d8e583045fbc4daa48b40e31b19ce595b8d92cf639396c15d5"},
{file = "aiohttp-3.10.5-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:33e6bc4bab477c772a541f76cd91e11ccb6d2efa2b8d7d7883591dfb523e5987"},
{file = "aiohttp-3.10.5-cp313-cp313-win32.whl", hash = "sha256:c58c6837a2c2a7cf3133983e64173aec11f9c2cd8e87ec2fdc16ce727bcf1a04"},
{file = "aiohttp-3.10.5-cp313-cp313-win_amd64.whl", hash = "sha256:38172a70005252b6893088c0f5e8a47d173df7cc2b2bd88650957eb84fcf5022"},
{file = "aiohttp-3.10.5-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:f6f18898ace4bcd2d41a122916475344a87f1dfdec626ecde9ee802a711bc569"},
{file = "aiohttp-3.10.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:5ede29d91a40ba22ac1b922ef510aab871652f6c88ef60b9dcdf773c6d32ad7a"},
{file = "aiohttp-3.10.5-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:673f988370f5954df96cc31fd99c7312a3af0a97f09e407399f61583f30da9bc"},
{file = "aiohttp-3.10.5-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:58718e181c56a3c02d25b09d4115eb02aafe1a732ce5714ab70326d9776457c3"},
{file = "aiohttp-3.10.5-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4b38b1570242fbab8d86a84128fb5b5234a2f70c2e32f3070143a6d94bc854cf"},
{file = "aiohttp-3.10.5-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:074d1bff0163e107e97bd48cad9f928fa5a3eb4b9d33366137ffce08a63e37fe"},
{file = "aiohttp-3.10.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd31f176429cecbc1ba499d4aba31aaccfea488f418d60376b911269d3b883c5"},
{file = "aiohttp-3.10.5-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7384d0b87d4635ec38db9263e6a3f1eb609e2e06087f0aa7f63b76833737b471"},
{file = "aiohttp-3.10.5-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:8989f46f3d7ef79585e98fa991e6ded55d2f48ae56d2c9fa5e491a6e4effb589"},
{file = "aiohttp-3.10.5-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:c83f7a107abb89a227d6c454c613e7606c12a42b9a4ca9c5d7dad25d47c776ae"},
{file = "aiohttp-3.10.5-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:cde98f323d6bf161041e7627a5fd763f9fd829bcfcd089804a5fdce7bb6e1b7d"},
{file = "aiohttp-3.10.5-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:676f94c5480d8eefd97c0c7e3953315e4d8c2b71f3b49539beb2aa676c58272f"},
{file = "aiohttp-3.10.5-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:2d21ac12dc943c68135ff858c3a989f2194a709e6e10b4c8977d7fcd67dfd511"},
{file = "aiohttp-3.10.5-cp38-cp38-win32.whl", hash = "sha256:17e997105bd1a260850272bfb50e2a328e029c941c2708170d9d978d5a30ad9a"},
{file = "aiohttp-3.10.5-cp38-cp38-win_amd64.whl", hash = "sha256:1c19de68896747a2aa6257ae4cf6ef59d73917a36a35ee9d0a6f48cff0f94db8"},
{file = "aiohttp-3.10.5-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:7e2fe37ac654032db1f3499fe56e77190282534810e2a8e833141a021faaab0e"},
{file = "aiohttp-3.10.5-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f5bf3ead3cb66ab990ee2561373b009db5bc0e857549b6c9ba84b20bc462e172"},
{file = "aiohttp-3.10.5-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:1b2c16a919d936ca87a3c5f0e43af12a89a3ce7ccbce59a2d6784caba945b68b"},
{file = "aiohttp-3.10.5-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ad146dae5977c4dd435eb31373b3fe9b0b1bf26858c6fc452bf6af394067e10b"},
{file = "aiohttp-3.10.5-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8c5c6fa16412b35999320f5c9690c0f554392dc222c04e559217e0f9ae244b92"},
{file = "aiohttp-3.10.5-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:95c4dc6f61d610bc0ee1edc6f29d993f10febfe5b76bb470b486d90bbece6b22"},
{file = "aiohttp-3.10.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:da452c2c322e9ce0cfef392e469a26d63d42860f829026a63374fde6b5c5876f"},
{file = "aiohttp-3.10.5-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:898715cf566ec2869d5cb4d5fb4be408964704c46c96b4be267442d265390f32"},
{file = "aiohttp-3.10.5-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:391cc3a9c1527e424c6865e087897e766a917f15dddb360174a70467572ac6ce"},
{file = "aiohttp-3.10.5-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:380f926b51b92d02a34119d072f178d80bbda334d1a7e10fa22d467a66e494db"},
{file = "aiohttp-3.10.5-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:ce91db90dbf37bb6fa0997f26574107e1b9d5ff939315247b7e615baa8ec313b"},
{file = "aiohttp-3.10.5-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:9093a81e18c45227eebe4c16124ebf3e0d893830c6aca7cc310bfca8fe59d857"},
{file = "aiohttp-3.10.5-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:ee40b40aa753d844162dcc80d0fe256b87cba48ca0054f64e68000453caead11"},
{file = "aiohttp-3.10.5-cp39-cp39-win32.whl", hash = "sha256:03f2645adbe17f274444953bdea69f8327e9d278d961d85657cb0d06864814c1"},
{file = "aiohttp-3.10.5-cp39-cp39-win_amd64.whl", hash = "sha256:d17920f18e6ee090bdd3d0bfffd769d9f2cb4c8ffde3eb203777a3895c128862"},
{file = "aiohttp-3.10.5.tar.gz", hash = "sha256:f071854b47d39591ce9a17981c46790acb30518e2f83dfca8db2dfa091178691"},
{file = "aiohttp-3.11.15-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:078b1ce274f967951b42a65d5b7ec115b7886343a5271f2eed330458ea87bb48"},
{file = "aiohttp-3.11.15-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:48d790d05c76b6cb93a1259b44d3f0019b61d507f8ebf0d49da3ef5ac858b05d"},
{file = "aiohttp-3.11.15-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e13767424425bb607931a0b9e703b95d2693650011ef8f0166b4cd80066b66b9"},
{file = "aiohttp-3.11.15-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7ac49c32901489343b4dab064ab520f6b879a03fb4f9667c84620b66f07bed69"},
{file = "aiohttp-3.11.15-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:576b56a34d26ea8a8f0e1a30b8a069ba4ab121fb8eb796d1b89fedda7c74c553"},
{file = "aiohttp-3.11.15-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:71ed71d9431e770550aab27a77ef9d30d33ce6f558ab7818be7205ae6aca635d"},
{file = "aiohttp-3.11.15-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c982b2cbd7b8f4b31e9faf2de09e22b060a6cd0a693f20892dda41a8fb0f46ef"},
{file = "aiohttp-3.11.15-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f00e7540a60460fbeaffff4a7dcf72fe8b710f8280a542a4d798273787c64e72"},
{file = "aiohttp-3.11.15-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:9b97fae7f75a0b666ce4281627856d1b1a85477f26c2e8b266292e770c17df36"},
{file = "aiohttp-3.11.15-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:a449c48f5b02c0c14f5310881558ca861bff8e00b1f79be4cf6a53f638464c30"},
{file = "aiohttp-3.11.15-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:8c6e39f0bb2418f839841f92b3cd64077ff5166d724c984ab450ca08d5e51d92"},
{file = "aiohttp-3.11.15-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:e792131352418dde7b0c598e217e89ecf6a26889f46f35f910e5544ffdebf3ae"},
{file = "aiohttp-3.11.15-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:c28cbae1ce76dc48d0fcccb045ac21d00dcc1b306bb745910cf35585ce89404e"},
{file = "aiohttp-3.11.15-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:d9c8417a24063f35b526f8bf14f6f4bdea6f3f49850457337b6ea928901adbbc"},
{file = "aiohttp-3.11.15-cp310-cp310-win32.whl", hash = "sha256:a50b86eb9cf74fa5b6f1386e08e1520dcbe83d7dfd4c8bf6f2ca72b03d42e79f"},
{file = "aiohttp-3.11.15-cp310-cp310-win_amd64.whl", hash = "sha256:a0361cafb50b185356a5f346c169aea1d14783df99e6da714d626b104586e0b5"},
{file = "aiohttp-3.11.15-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5bd37d615cd26d09321bd0168305f8508778712cf38aeffeed550274fb48a2ee"},
{file = "aiohttp-3.11.15-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a3d706afcc808f6add4208dfa13f911fd93c2a3dab6be484fee4fd0602a0867e"},
{file = "aiohttp-3.11.15-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:43625253e3dc018d34867b70909149f15f29eac0382802afe027f2fbf17bcb9c"},
{file = "aiohttp-3.11.15-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:437eee9e057a7907b11e4af2b18df56b6c795b28e0a3ac250691936cf6bf40eb"},
{file = "aiohttp-3.11.15-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ec3dd04138bd30e6a3403dbd3ab5a5ccfb501597c5a95196cd816936ed55b777"},
{file = "aiohttp-3.11.15-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:85d73479b79172e7d667b466c170ca6097a334c09ecd83c95c210546031251b5"},
{file = "aiohttp-3.11.15-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ae3a5d9f2fbe736fec7d24be25c57aa78c2d78d96540439ea68a8abbed9906fc"},
{file = "aiohttp-3.11.15-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:269d145c593a65f78fb9a64dece90341561ddb2e91a96d42681132b2f706c42a"},
{file = "aiohttp-3.11.15-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:0987dcf32e4c47f22634d32e4b0ffbc368bbcf2b33b408cd1a3d2dc0a6a5cd34"},
{file = "aiohttp-3.11.15-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:7cf4b2b5a0f7a738ecd759eaeaef800fc7c57683b7be9d8a43fcb86ca62701b4"},
{file = "aiohttp-3.11.15-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:f1e0369f0dc8c895e718ce37147f56d46142d37596be183ab7a34192c5e6e4c5"},
{file = "aiohttp-3.11.15-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:82ddf7f642b9c0b08063f3cf4e2818b22901bce8ebad05c232d9e295e77436a0"},
{file = "aiohttp-3.11.15-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:0c7eba0f90e27ec4af64db051f35387fa17128e6eeb58ee0f2318f2627168cc2"},
{file = "aiohttp-3.11.15-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5a61df20fa77792e83307e266f76790f7cb67980dd476948948de212ee7ec64c"},
{file = "aiohttp-3.11.15-cp311-cp311-win32.whl", hash = "sha256:be11989cbc0728f81c0d022cef140ef8adb20d3012ad8f0ac61853bef571eb52"},
{file = "aiohttp-3.11.15-cp311-cp311-win_amd64.whl", hash = "sha256:357355c9d51c8b12bbc7de43b27ce4b51f14cce050e00b5a87d0d5527d779395"},
{file = "aiohttp-3.11.15-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:433e7388b3063bba462b3362641988270b087a9ccae22390364f86b37a480c17"},
{file = "aiohttp-3.11.15-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d99981304065f4ea407dd7495f74f8b8c10f0e26733f8a47dc174ece73744d14"},
{file = "aiohttp-3.11.15-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a3739daa52c0cff42f1c40f63b2fe818fcf41019d84c80a7add3224207a7060f"},
{file = "aiohttp-3.11.15-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6fd82d0b3f73c59c80dade0ca8e0342de1ee261e147140ade65a465be670e83c"},
{file = "aiohttp-3.11.15-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c930064b79cc0eb63678e376b819d546b0e2360264cd7544c32119496f646f35"},
{file = "aiohttp-3.11.15-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:291f324f49ecede693dfb4820a412d1388cb10a2214ab60028252b505e105d6f"},
{file = "aiohttp-3.11.15-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65eb40e74e3126cba185da7a78815cf3a30140932193831b3bfd73c79965c723"},
{file = "aiohttp-3.11.15-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a6d6d86443580f846ec9cf60f899b7cace34411f2ff5c95db5970047195e5bfa"},
{file = "aiohttp-3.11.15-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8b4d8d78fbd5290204dcf43957a2184acd5cee358f203f24a3a97f7d7984eeb7"},
{file = "aiohttp-3.11.15-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:94c552a4864ed292dadf1d341213374284a3af72e49bea02e70ce6f07cb37004"},
{file = "aiohttp-3.11.15-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:23857adc919b64bba8a4db0eccb24e53fcaf85633e690ef1581c5562ed58cae7"},
{file = "aiohttp-3.11.15-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:eba325409a0f990f9b43ed18916cbf5b9779bc4c979b8887c444e7be9c38ccca"},
{file = "aiohttp-3.11.15-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:f1b6c639750bf2a228957e25fcab7a7ff11987c543d70bf73223369f0d7bdb27"},
{file = "aiohttp-3.11.15-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2241d862dc6a3c0c2662a6934e79078d3a1e51a76c0dca5d65b30f3debee6c9b"},
{file = "aiohttp-3.11.15-cp312-cp312-win32.whl", hash = "sha256:18733fa6bbe44698ff96138c1f1d682bbdf0846250a42c25c108eed328fef0d4"},
{file = "aiohttp-3.11.15-cp312-cp312-win_amd64.whl", hash = "sha256:0ec98c22030ea2a430cb11afddda7d4737b7e5c236c704f0d7d15504978598f7"},
{file = "aiohttp-3.11.15-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:c37aa3eb8eb92f3793f0c1e73f212a76cbc8d363e9990df54b5b7099f75ce740"},
{file = "aiohttp-3.11.15-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5b5edd482ff0a8585b3f4e8b3681819447324166a43a5588800a5bca340dbf27"},
{file = "aiohttp-3.11.15-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d8c22c91bdb7417bd4f5119242dbd2e2140c0e9de47342c765f127f897eb57"},
{file = "aiohttp-3.11.15-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4b03093d4140d926965d23497a059ec59c8c07e602d2489ce5fb990f3a897db4"},
{file = "aiohttp-3.11.15-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:05eea49d598c4ece6f285e00464de206f838b48ff21938d5aa9c394e115945b9"},
{file = "aiohttp-3.11.15-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:63f8d6106566f98fcfde7a643c9da52d90679b8592dea76c4adfc3cd5d06d22c"},
{file = "aiohttp-3.11.15-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:36a490f1ebe0b982366314c77f02258f87bd5d9bd362839dc6a24188447f37eb"},
{file = "aiohttp-3.11.15-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:73a7f6283634dd30f93b9a67c414c00517869478b50361c503535e075fa07eaf"},
{file = "aiohttp-3.11.15-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0e97c1e55f6931f07ecaf53aef8352def8386adfd0cd3caa6429cc40e886d6a9"},
{file = "aiohttp-3.11.15-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:d8370d31e6d8ecccd97cd791c466d0acb56527df51b0c105d7ea54c7fcc0f75b"},
{file = "aiohttp-3.11.15-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:c2de66177e087999568638c02639cf0248011b5c7fca69b006947153c534fcab"},
{file = "aiohttp-3.11.15-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:edcddb97574402ff7481bc6f70819ba863e77b0be58a840ed5f59d52d2f20a71"},
{file = "aiohttp-3.11.15-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:29cce2a7e009494e675018c0b1819a133befbab8629c797276c5d793bbdf1138"},
{file = "aiohttp-3.11.15-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:825ec92391e3e4ddda74de79ed0f8b443e9b412a0c9c82618ca2163abd875df5"},
{file = "aiohttp-3.11.15-cp313-cp313-win32.whl", hash = "sha256:430f9707f0c1239a92bff7769b0db185ef400278dc63c89f88ed1bd5153aab7a"},
{file = "aiohttp-3.11.15-cp313-cp313-win_amd64.whl", hash = "sha256:f30e6980ec5d6ad815a233e19e39fe27ea94b1081c31c8aa1df1b629da3737b8"},
{file = "aiohttp-3.11.15-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:03ce9d2f01aef26cd6aaba2f330273c2364237db2f499b93c3e9f2e249f83cd2"},
{file = "aiohttp-3.11.15-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:1bee0f9e2d4088b57243d63afcb06256bd2d9ff683080f51e74fa790c8cfedfd"},
{file = "aiohttp-3.11.15-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:7e20bd6aa51a5209c9131395e20ce126e8e317c0cf78a8180f026b4d73f018f6"},
{file = "aiohttp-3.11.15-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23397670f3739b6f3c4019da8226190f6cce5ab2008b664ed96a6d1f0fe7f069"},
{file = "aiohttp-3.11.15-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fc270fe480e7e425c45054543f58510fe649f70b77f514171909bbfe585105c0"},
{file = "aiohttp-3.11.15-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:84370ff70c1677ee0c4db40fe2baee6ffc72e9d32def3ff18739c1390c62329f"},
{file = "aiohttp-3.11.15-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:379882ab6a40e6e0879ad8e84dca74ddbadff94af4f314b59b7da89c8463a669"},
{file = "aiohttp-3.11.15-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:74afb637cd06760afe0aa55a3ce82178ef4c950be65935add8f3809f701f36ca"},
{file = "aiohttp-3.11.15-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:558de86eef9a886e43c6ae5b75cecdce81203da5832d19d11da8de417967d478"},
{file = "aiohttp-3.11.15-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:41f82df6f0f895f0f843bc86762bea45b4d0fe876de49239ffc89d2365426399"},
{file = "aiohttp-3.11.15-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:38368a32530dcdeccfa47544cf66724118d9cc8a889c057e116723ab62994380"},
{file = "aiohttp-3.11.15-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:204f6695b47a7d130ddf6680158920825d0d32202a870e0bc56a2ec637935b1a"},
{file = "aiohttp-3.11.15-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:4b85486e8914d4e778343f5322834aada678eaf4c5315e50d41d9b74817ff97b"},
{file = "aiohttp-3.11.15-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:79a08d675167c50f41d106d67bbcbc9e86e1b43d305c4b9f982d5c656a94a9ba"},
{file = "aiohttp-3.11.15-cp39-cp39-win32.whl", hash = "sha256:20dda85988a4f506bf5a457b416b238e32ab944a2deb878ddf0af92df9254a9b"},
{file = "aiohttp-3.11.15-cp39-cp39-win_amd64.whl", hash = "sha256:3cfd9f4aeaec4a75a0b4986a9977ac0a09b3d87ae83415e4b461e86715c80897"},
{file = "aiohttp-3.11.15.tar.gz", hash = "sha256:b9b9a1e592ac8fcc4584baea240e41f77415e0de98932fdf19565aa3b6a02d0b"},
]
[package.dependencies]
aiohappyeyeballs = ">=2.3.0"
aiosignal = ">=1.1.2"
async-timeout = {version = ">=4.0,<5.0", markers = "python_version < \"3.11\""}
async-timeout = {version = ">=4.0,<6.0", markers = "python_version < \"3.11\""}
attrs = ">=17.3.0"
frozenlist = ">=1.1.1"
multidict = ">=4.5,<7.0"
yarl = ">=1.0,<2.0"
propcache = ">=0.2.0"
yarl = ">=1.17.0,<2.0"
[package.extras]
speedups = ["Brotli ; platform_python_implementation == \"CPython\"", "aiodns (>=3.2.0) ; sys_platform == \"linux\" or sys_platform == \"darwin\"", "brotlicffi ; platform_python_implementation != \"CPython\""]
@@ -186,7 +177,7 @@ files = [
{file = "async-timeout-4.0.3.tar.gz", hash = "sha256:4640d96be84d82d02ed59ea2b7105a0f7b33abe8703703cd0ab0bf87c427522f"},
{file = "async_timeout-4.0.3-py3-none-any.whl", hash = "sha256:7405140ff1230c310e51dc27b3145b9092d659ce68ff733fb0cefe3ee42be028"},
]
markers = {main = "python_version < \"3.11\"", dev = "python_full_version < \"3.11.3\""}
markers = {main = "python_version == \"3.10\"", dev = "python_full_version < \"3.11.3\""}
[[package]]
name = "attrs"
@@ -384,7 +375,7 @@ description = "Backport of PEP 654 (exception groups)"
optional = false
python-versions = ">=3.7"
groups = ["main"]
markers = "python_version < \"3.11\""
markers = "python_version == \"3.10\""
files = [
{file = "exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b"},
{file = "exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc"},
@@ -1071,6 +1062,114 @@ httpx = {version = ">=0.26,<0.29", extras = ["http2"]}
pydantic = ">=1.9,<3.0"
strenum = {version = ">=0.4.9,<0.5.0", markers = "python_version < \"3.11\""}
[[package]]
name = "propcache"
version = "0.3.1"
description = "Accelerated property cache"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "propcache-0.3.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f27785888d2fdd918bc36de8b8739f2d6c791399552333721b58193f68ea3e98"},
{file = "propcache-0.3.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d4e89cde74154c7b5957f87a355bb9c8ec929c167b59c83d90654ea36aeb6180"},
{file = "propcache-0.3.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:730178f476ef03d3d4d255f0c9fa186cb1d13fd33ffe89d39f2cda4da90ceb71"},
{file = "propcache-0.3.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:967a8eec513dbe08330f10137eacb427b2ca52118769e82ebcfcab0fba92a649"},
{file = "propcache-0.3.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5b9145c35cc87313b5fd480144f8078716007656093d23059e8993d3a8fa730f"},
{file = "propcache-0.3.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9e64e948ab41411958670f1093c0a57acfdc3bee5cf5b935671bbd5313bcf229"},
{file = "propcache-0.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:319fa8765bfd6a265e5fa661547556da381e53274bc05094fc9ea50da51bfd46"},
{file = "propcache-0.3.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c66d8ccbc902ad548312b96ed8d5d266d0d2c6d006fd0f66323e9d8f2dd49be7"},
{file = "propcache-0.3.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:2d219b0dbabe75e15e581fc1ae796109b07c8ba7d25b9ae8d650da582bed01b0"},
{file = "propcache-0.3.1-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:cd6a55f65241c551eb53f8cf4d2f4af33512c39da5d9777694e9d9c60872f519"},
{file = "propcache-0.3.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:9979643ffc69b799d50d3a7b72b5164a2e97e117009d7af6dfdd2ab906cb72cd"},
{file = "propcache-0.3.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:4cf9e93a81979f1424f1a3d155213dc928f1069d697e4353edb8a5eba67c6259"},
{file = "propcache-0.3.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2fce1df66915909ff6c824bbb5eb403d2d15f98f1518e583074671a30fe0c21e"},
{file = "propcache-0.3.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:4d0dfdd9a2ebc77b869a0b04423591ea8823f791293b527dc1bb896c1d6f1136"},
{file = "propcache-0.3.1-cp310-cp310-win32.whl", hash = "sha256:1f6cc0ad7b4560e5637eb2c994e97b4fa41ba8226069c9277eb5ea7101845b42"},
{file = "propcache-0.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:47ef24aa6511e388e9894ec16f0fbf3313a53ee68402bc428744a367ec55b833"},
{file = "propcache-0.3.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7f30241577d2fef2602113b70ef7231bf4c69a97e04693bde08ddab913ba0ce5"},
{file = "propcache-0.3.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:43593c6772aa12abc3af7784bff4a41ffa921608dd38b77cf1dfd7f5c4e71371"},
{file = "propcache-0.3.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a75801768bbe65499495660b777e018cbe90c7980f07f8aa57d6be79ea6f71da"},
{file = "propcache-0.3.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f6f1324db48f001c2ca26a25fa25af60711e09b9aaf4b28488602776f4f9a744"},
{file = "propcache-0.3.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5cdb0f3e1eb6dfc9965d19734d8f9c481b294b5274337a8cb5cb01b462dcb7e0"},
{file = "propcache-0.3.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1eb34d90aac9bfbced9a58b266f8946cb5935869ff01b164573a7634d39fbcb5"},
{file = "propcache-0.3.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f35c7070eeec2cdaac6fd3fe245226ed2a6292d3ee8c938e5bb645b434c5f256"},
{file = "propcache-0.3.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b23c11c2c9e6d4e7300c92e022046ad09b91fd00e36e83c44483df4afa990073"},
{file = "propcache-0.3.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:3e19ea4ea0bf46179f8a3652ac1426e6dcbaf577ce4b4f65be581e237340420d"},
{file = "propcache-0.3.1-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:bd39c92e4c8f6cbf5f08257d6360123af72af9f4da75a690bef50da77362d25f"},
{file = "propcache-0.3.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:b0313e8b923b3814d1c4a524c93dfecea5f39fa95601f6a9b1ac96cd66f89ea0"},
{file = "propcache-0.3.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e861ad82892408487be144906a368ddbe2dc6297074ade2d892341b35c59844a"},
{file = "propcache-0.3.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:61014615c1274df8da5991a1e5da85a3ccb00c2d4701ac6f3383afd3ca47ab0a"},
{file = "propcache-0.3.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:71ebe3fe42656a2328ab08933d420df5f3ab121772eef78f2dc63624157f0ed9"},
{file = "propcache-0.3.1-cp311-cp311-win32.whl", hash = "sha256:58aa11f4ca8b60113d4b8e32d37e7e78bd8af4d1a5b5cb4979ed856a45e62005"},
{file = "propcache-0.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:9532ea0b26a401264b1365146c440a6d78269ed41f83f23818d4b79497aeabe7"},
{file = "propcache-0.3.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:f78eb8422acc93d7b69964012ad7048764bb45a54ba7a39bb9e146c72ea29723"},
{file = "propcache-0.3.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:89498dd49c2f9a026ee057965cdf8192e5ae070ce7d7a7bd4b66a8e257d0c976"},
{file = "propcache-0.3.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:09400e98545c998d57d10035ff623266927cb784d13dd2b31fd33b8a5316b85b"},
{file = "propcache-0.3.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa8efd8c5adc5a2c9d3b952815ff8f7710cefdcaf5f2c36d26aff51aeca2f12f"},
{file = "propcache-0.3.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c2fe5c910f6007e716a06d269608d307b4f36e7babee5f36533722660e8c4a70"},
{file = "propcache-0.3.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a0ab8cf8cdd2194f8ff979a43ab43049b1df0b37aa64ab7eca04ac14429baeb7"},
{file = "propcache-0.3.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:563f9d8c03ad645597b8d010ef4e9eab359faeb11a0a2ac9f7b4bc8c28ebef25"},
{file = "propcache-0.3.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fb6e0faf8cb6b4beea5d6ed7b5a578254c6d7df54c36ccd3d8b3eb00d6770277"},
{file = "propcache-0.3.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1c5c7ab7f2bb3f573d1cb921993006ba2d39e8621019dffb1c5bc94cdbae81e8"},
{file = "propcache-0.3.1-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:050b571b2e96ec942898f8eb46ea4bfbb19bd5502424747e83badc2d4a99a44e"},
{file = "propcache-0.3.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:e1c4d24b804b3a87e9350f79e2371a705a188d292fd310e663483af6ee6718ee"},
{file = "propcache-0.3.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:e4fe2a6d5ce975c117a6bb1e8ccda772d1e7029c1cca1acd209f91d30fa72815"},
{file = "propcache-0.3.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:feccd282de1f6322f56f6845bf1207a537227812f0a9bf5571df52bb418d79d5"},
{file = "propcache-0.3.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ec314cde7314d2dd0510c6787326bbffcbdc317ecee6b7401ce218b3099075a7"},
{file = "propcache-0.3.1-cp312-cp312-win32.whl", hash = "sha256:7d2d5a0028d920738372630870e7d9644ce437142197f8c827194fca404bf03b"},
{file = "propcache-0.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:88c423efef9d7a59dae0614eaed718449c09a5ac79a5f224a8b9664d603f04a3"},
{file = "propcache-0.3.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:f1528ec4374617a7a753f90f20e2f551121bb558fcb35926f99e3c42367164b8"},
{file = "propcache-0.3.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:dc1915ec523b3b494933b5424980831b636fe483d7d543f7afb7b3bf00f0c10f"},
{file = "propcache-0.3.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a110205022d077da24e60b3df8bcee73971be9575dec5573dd17ae5d81751111"},
{file = "propcache-0.3.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d249609e547c04d190e820d0d4c8ca03ed4582bcf8e4e160a6969ddfb57b62e5"},
{file = "propcache-0.3.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5ced33d827625d0a589e831126ccb4f5c29dfdf6766cac441d23995a65825dcb"},
{file = "propcache-0.3.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4114c4ada8f3181af20808bedb250da6bae56660e4b8dfd9cd95d4549c0962f7"},
{file = "propcache-0.3.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:975af16f406ce48f1333ec5e912fe11064605d5c5b3f6746969077cc3adeb120"},
{file = "propcache-0.3.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a34aa3a1abc50740be6ac0ab9d594e274f59960d3ad253cd318af76b996dd654"},
{file = "propcache-0.3.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9cec3239c85ed15bfaded997773fdad9fb5662b0a7cbc854a43f291eb183179e"},
{file = "propcache-0.3.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:05543250deac8e61084234d5fc54f8ebd254e8f2b39a16b1dce48904f45b744b"},
{file = "propcache-0.3.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:5cb5918253912e088edbf023788de539219718d3b10aef334476b62d2b53de53"},
{file = "propcache-0.3.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f3bbecd2f34d0e6d3c543fdb3b15d6b60dd69970c2b4c822379e5ec8f6f621d5"},
{file = "propcache-0.3.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:aca63103895c7d960a5b9b044a83f544b233c95e0dcff114389d64d762017af7"},
{file = "propcache-0.3.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:5a0a9898fdb99bf11786265468571e628ba60af80dc3f6eb89a3545540c6b0ef"},
{file = "propcache-0.3.1-cp313-cp313-win32.whl", hash = "sha256:3a02a28095b5e63128bcae98eb59025924f121f048a62393db682f049bf4ac24"},
{file = "propcache-0.3.1-cp313-cp313-win_amd64.whl", hash = "sha256:813fbb8b6aea2fc9659815e585e548fe706d6f663fa73dff59a1677d4595a037"},
{file = "propcache-0.3.1-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:a444192f20f5ce8a5e52761a031b90f5ea6288b1eef42ad4c7e64fef33540b8f"},
{file = "propcache-0.3.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0fbe94666e62ebe36cd652f5fc012abfbc2342de99b523f8267a678e4dfdee3c"},
{file = "propcache-0.3.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:f011f104db880f4e2166bcdcf7f58250f7a465bc6b068dc84c824a3d4a5c94dc"},
{file = "propcache-0.3.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e584b6d388aeb0001d6d5c2bd86b26304adde6d9bb9bfa9c4889805021b96de"},
{file = "propcache-0.3.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8a17583515a04358b034e241f952f1715243482fc2c2945fd99a1b03a0bd77d6"},
{file = "propcache-0.3.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5aed8d8308215089c0734a2af4f2e95eeb360660184ad3912686c181e500b2e7"},
{file = "propcache-0.3.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6d8e309ff9a0503ef70dc9a0ebd3e69cf7b3894c9ae2ae81fc10943c37762458"},
{file = "propcache-0.3.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b655032b202028a582d27aeedc2e813299f82cb232f969f87a4fde491a233f11"},
{file = "propcache-0.3.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9f64d91b751df77931336b5ff7bafbe8845c5770b06630e27acd5dbb71e1931c"},
{file = "propcache-0.3.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:19a06db789a4bd896ee91ebc50d059e23b3639c25d58eb35be3ca1cbe967c3bf"},
{file = "propcache-0.3.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:bef100c88d8692864651b5f98e871fb090bd65c8a41a1cb0ff2322db39c96c27"},
{file = "propcache-0.3.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:87380fb1f3089d2a0b8b00f006ed12bd41bd858fabfa7330c954c70f50ed8757"},
{file = "propcache-0.3.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:e474fc718e73ba5ec5180358aa07f6aded0ff5f2abe700e3115c37d75c947e18"},
{file = "propcache-0.3.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:17d1c688a443355234f3c031349da69444be052613483f3e4158eef751abcd8a"},
{file = "propcache-0.3.1-cp313-cp313t-win32.whl", hash = "sha256:359e81a949a7619802eb601d66d37072b79b79c2505e6d3fd8b945538411400d"},
{file = "propcache-0.3.1-cp313-cp313t-win_amd64.whl", hash = "sha256:e7fb9a84c9abbf2b2683fa3e7b0d7da4d8ecf139a1c635732a8bda29c5214b0e"},
{file = "propcache-0.3.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:ed5f6d2edbf349bd8d630e81f474d33d6ae5d07760c44d33cd808e2f5c8f4ae6"},
{file = "propcache-0.3.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:668ddddc9f3075af019f784456267eb504cb77c2c4bd46cc8402d723b4d200bf"},
{file = "propcache-0.3.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0c86e7ceea56376216eba345aa1fc6a8a6b27ac236181f840d1d7e6a1ea9ba5c"},
{file = "propcache-0.3.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83be47aa4e35b87c106fc0c84c0fc069d3f9b9b06d3c494cd404ec6747544894"},
{file = "propcache-0.3.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:27c6ac6aa9fc7bc662f594ef380707494cb42c22786a558d95fcdedb9aa5d035"},
{file = "propcache-0.3.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:64a956dff37080b352c1c40b2966b09defb014347043e740d420ca1eb7c9b908"},
{file = "propcache-0.3.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:82de5da8c8893056603ac2d6a89eb8b4df49abf1a7c19d536984c8dd63f481d5"},
{file = "propcache-0.3.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0c3c3a203c375b08fd06a20da3cf7aac293b834b6f4f4db71190e8422750cca5"},
{file = "propcache-0.3.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:b303b194c2e6f171cfddf8b8ba30baefccf03d36a4d9cab7fd0bb68ba476a3d7"},
{file = "propcache-0.3.1-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:916cd229b0150129d645ec51614d38129ee74c03293a9f3f17537be0029a9641"},
{file = "propcache-0.3.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:a461959ead5b38e2581998700b26346b78cd98540b5524796c175722f18b0294"},
{file = "propcache-0.3.1-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:069e7212890b0bcf9b2be0a03afb0c2d5161d91e1bf51569a64f629acc7defbf"},
{file = "propcache-0.3.1-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:ef2e4e91fb3945769e14ce82ed53007195e616a63aa43b40fb7ebaaf907c8d4c"},
{file = "propcache-0.3.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:8638f99dca15b9dff328fb6273e09f03d1c50d9b6512f3b65a4154588a7595fe"},
{file = "propcache-0.3.1-cp39-cp39-win32.whl", hash = "sha256:6f173bbfe976105aaa890b712d1759de339d8a7cef2fc0a1714cc1a1e1c47f64"},
{file = "propcache-0.3.1-cp39-cp39-win_amd64.whl", hash = "sha256:603f1fe4144420374f1a69b907494c3acbc867a581c2d49d4175b0de7cc64566"},
{file = "propcache-0.3.1-py3-none-any.whl", hash = "sha256:9a8ecf38de50a7f518c21568c80f985e776397b902f1ce0b01f799aba1608b40"},
{file = "propcache-0.3.1.tar.gz", hash = "sha256:40d980c33765359098837527e18eddefc9a24cea5b45e078a7f3bb5b032c6ecf"},
]
[[package]]
name = "proto-plus"
version = "1.26.0"
@@ -1139,20 +1238,21 @@ pyasn1 = ">=0.4.6,<0.7.0"
[[package]]
name = "pydantic"
version = "2.10.6"
version = "2.11.1"
description = "Data validation using Python type hints"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pydantic-2.10.6-py3-none-any.whl", hash = "sha256:427d664bf0b8a2b34ff5dd0f5a18df00591adcee7198fbd71981054cef37b584"},
{file = "pydantic-2.10.6.tar.gz", hash = "sha256:ca5daa827cce33de7a42be142548b0096bf05a7e7b365aebfa5f8eeec7128236"},
{file = "pydantic-2.11.1-py3-none-any.whl", hash = "sha256:5b6c415eee9f8123a14d859be0c84363fec6b1feb6b688d6435801230b56e0b8"},
{file = "pydantic-2.11.1.tar.gz", hash = "sha256:442557d2910e75c991c39f4b4ab18963d57b9b55122c8b2a9cd176d8c29ce968"},
]
[package.dependencies]
annotated-types = ">=0.6.0"
pydantic-core = "2.27.2"
pydantic-core = "2.33.0"
typing-extensions = ">=4.12.2"
typing-inspection = ">=0.4.0"
[package.extras]
email = ["email-validator (>=2.0.0)"]
@@ -1160,112 +1260,111 @@ timezone = ["tzdata ; python_version >= \"3.9\" and platform_system == \"Windows
[[package]]
name = "pydantic-core"
version = "2.27.2"
version = "2.33.0"
description = "Core functionality for Pydantic validation and serialization"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pydantic_core-2.27.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2d367ca20b2f14095a8f4fa1210f5a7b78b8a20009ecced6b12818f455b1e9fa"},
{file = "pydantic_core-2.27.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:491a2b73db93fab69731eaee494f320faa4e093dbed776be1a829c2eb222c34c"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7969e133a6f183be60e9f6f56bfae753585680f3b7307a8e555a948d443cc05a"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3de9961f2a346257caf0aa508a4da705467f53778e9ef6fe744c038119737ef5"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e2bb4d3e5873c37bb3dd58714d4cd0b0e6238cebc4177ac8fe878f8b3aa8e74c"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:280d219beebb0752699480fe8f1dc61ab6615c2046d76b7ab7ee38858de0a4e7"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47956ae78b6422cbd46f772f1746799cbb862de838fd8d1fbd34a82e05b0983a"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:14d4a5c49d2f009d62a2a7140d3064f686d17a5d1a268bc641954ba181880236"},
{file = "pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:337b443af21d488716f8d0b6164de833e788aa6bd7e3a39c005febc1284f4962"},
{file = "pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:03d0f86ea3184a12f41a2d23f7ccb79cdb5a18e06993f8a45baa8dfec746f0e9"},
{file = "pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7041c36f5680c6e0f08d922aed302e98b3745d97fe1589db0a3eebf6624523af"},
{file = "pydantic_core-2.27.2-cp310-cp310-win32.whl", hash = "sha256:50a68f3e3819077be2c98110c1f9dcb3817e93f267ba80a2c05bb4f8799e2ff4"},
{file = "pydantic_core-2.27.2-cp310-cp310-win_amd64.whl", hash = "sha256:e0fd26b16394ead34a424eecf8a31a1f5137094cabe84a1bcb10fa6ba39d3d31"},
{file = "pydantic_core-2.27.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:8e10c99ef58cfdf2a66fc15d66b16c4a04f62bca39db589ae8cba08bc55331bc"},
{file = "pydantic_core-2.27.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:26f32e0adf166a84d0cb63be85c562ca8a6fa8de28e5f0d92250c6b7e9e2aff7"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c19d1ea0673cd13cc2f872f6c9ab42acc4e4f492a7ca9d3795ce2b112dd7e15"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5e68c4446fe0810e959cdff46ab0a41ce2f2c86d227d96dc3847af0ba7def306"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d9640b0059ff4f14d1f37321b94061c6db164fbe49b334b31643e0528d100d99"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:40d02e7d45c9f8af700f3452f329ead92da4c5f4317ca9b896de7ce7199ea459"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1c1fd185014191700554795c99b347d64f2bb637966c4cfc16998a0ca700d048"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d81d2068e1c1228a565af076598f9e7451712700b673de8f502f0334f281387d"},
{file = "pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:1a4207639fb02ec2dbb76227d7c751a20b1a6b4bc52850568e52260cae64ca3b"},
{file = "pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:3de3ce3c9ddc8bbd88f6e0e304dea0e66d843ec9de1b0042b0911c1663ffd474"},
{file = "pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:30c5f68ded0c36466acede341551106821043e9afaad516adfb6e8fa80a4e6a6"},
{file = "pydantic_core-2.27.2-cp311-cp311-win32.whl", hash = "sha256:c70c26d2c99f78b125a3459f8afe1aed4d9687c24fd677c6a4436bc042e50d6c"},
{file = "pydantic_core-2.27.2-cp311-cp311-win_amd64.whl", hash = "sha256:08e125dbdc505fa69ca7d9c499639ab6407cfa909214d500897d02afb816e7cc"},
{file = "pydantic_core-2.27.2-cp311-cp311-win_arm64.whl", hash = "sha256:26f0d68d4b235a2bae0c3fc585c585b4ecc51382db0e3ba402a22cbc440915e4"},
{file = "pydantic_core-2.27.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9e0c8cfefa0ef83b4da9588448b6d8d2a2bf1a53c3f1ae5fca39eb3061e2f0b0"},
{file = "pydantic_core-2.27.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:83097677b8e3bd7eaa6775720ec8e0405f1575015a463285a92bfdfe254529ef"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:172fce187655fece0c90d90a678424b013f8fbb0ca8b036ac266749c09438cb7"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:519f29f5213271eeeeb3093f662ba2fd512b91c5f188f3bb7b27bc5973816934"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:05e3a55d124407fffba0dd6b0c0cd056d10e983ceb4e5dbd10dda135c31071d6"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9c3ed807c7b91de05e63930188f19e921d1fe90de6b4f5cd43ee7fcc3525cb8c"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6fb4aadc0b9a0c063206846d603b92030eb6f03069151a625667f982887153e2"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:28ccb213807e037460326424ceb8b5245acb88f32f3d2777427476e1b32c48c4"},
{file = "pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:de3cd1899e2c279b140adde9357c4495ed9d47131b4a4eaff9052f23398076b3"},
{file = "pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:220f892729375e2d736b97d0e51466252ad84c51857d4d15f5e9692f9ef12be4"},
{file = "pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a0fcd29cd6b4e74fe8ddd2c90330fd8edf2e30cb52acda47f06dd615ae72da57"},
{file = "pydantic_core-2.27.2-cp312-cp312-win32.whl", hash = "sha256:1e2cb691ed9834cd6a8be61228471d0a503731abfb42f82458ff27be7b2186fc"},
{file = "pydantic_core-2.27.2-cp312-cp312-win_amd64.whl", hash = "sha256:cc3f1a99a4f4f9dd1de4fe0312c114e740b5ddead65bb4102884b384c15d8bc9"},
{file = "pydantic_core-2.27.2-cp312-cp312-win_arm64.whl", hash = "sha256:3911ac9284cd8a1792d3cb26a2da18f3ca26c6908cc434a18f730dc0db7bfa3b"},
{file = "pydantic_core-2.27.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7d14bd329640e63852364c306f4d23eb744e0f8193148d4044dd3dacdaacbd8b"},
{file = "pydantic_core-2.27.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:82f91663004eb8ed30ff478d77c4d1179b3563df6cdb15c0817cd1cdaf34d154"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71b24c7d61131bb83df10cc7e687433609963a944ccf45190cfc21e0887b08c9"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fa8e459d4954f608fa26116118bb67f56b93b209c39b008277ace29937453dc9"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ce8918cbebc8da707ba805b7fd0b382816858728ae7fe19a942080c24e5b7cd1"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:eda3f5c2a021bbc5d976107bb302e0131351c2ba54343f8a496dc8783d3d3a6a"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd8086fa684c4775c27f03f062cbb9eaa6e17f064307e86b21b9e0abc9c0f02e"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8d9b3388db186ba0c099a6d20f0604a44eabdeef1777ddd94786cdae158729e4"},
{file = "pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7a66efda2387de898c8f38c0cf7f14fca0b51a8ef0b24bfea5849f1b3c95af27"},
{file = "pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:18a101c168e4e092ab40dbc2503bdc0f62010e95d292b27827871dc85450d7ee"},
{file = "pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ba5dd002f88b78a4215ed2f8ddbdf85e8513382820ba15ad5ad8955ce0ca19a1"},
{file = "pydantic_core-2.27.2-cp313-cp313-win32.whl", hash = "sha256:1ebaf1d0481914d004a573394f4be3a7616334be70261007e47c2a6fe7e50130"},
{file = "pydantic_core-2.27.2-cp313-cp313-win_amd64.whl", hash = "sha256:953101387ecf2f5652883208769a79e48db18c6df442568a0b5ccd8c2723abee"},
{file = "pydantic_core-2.27.2-cp313-cp313-win_arm64.whl", hash = "sha256:ac4dbfd1691affb8f48c2c13241a2e3b60ff23247cbcf981759c768b6633cf8b"},
{file = "pydantic_core-2.27.2-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d3e8d504bdd3f10835468f29008d72fc8359d95c9c415ce6e767203db6127506"},
{file = "pydantic_core-2.27.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:521eb9b7f036c9b6187f0b47318ab0d7ca14bd87f776240b90b21c1f4f149320"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:85210c4d99a0114f5a9481b44560d7d1e35e32cc5634c656bc48e590b669b145"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d716e2e30c6f140d7560ef1538953a5cd1a87264c737643d481f2779fc247fe1"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f66d89ba397d92f840f8654756196d93804278457b5fbede59598a1f9f90b228"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:669e193c1c576a58f132e3158f9dfa9662969edb1a250c54d8fa52590045f046"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdbe7629b996647b99c01b37f11170a57ae675375b14b8c13b8518b8320ced5"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d262606bf386a5ba0b0af3b97f37c83d7011439e3dc1a9298f21efb292e42f1a"},
{file = "pydantic_core-2.27.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:cabb9bcb7e0d97f74df8646f34fc76fbf793b7f6dc2438517d7a9e50eee4f14d"},
{file = "pydantic_core-2.27.2-cp38-cp38-musllinux_1_1_armv7l.whl", hash = "sha256:d2d63f1215638d28221f664596b1ccb3944f6e25dd18cd3b86b0a4c408d5ebb9"},
{file = "pydantic_core-2.27.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:bca101c00bff0adb45a833f8451b9105d9df18accb8743b08107d7ada14bd7da"},
{file = "pydantic_core-2.27.2-cp38-cp38-win32.whl", hash = "sha256:f6f8e111843bbb0dee4cb6594cdc73e79b3329b526037ec242a3e49012495b3b"},
{file = "pydantic_core-2.27.2-cp38-cp38-win_amd64.whl", hash = "sha256:fd1aea04935a508f62e0d0ef1f5ae968774a32afc306fb8545e06f5ff5cdf3ad"},
{file = "pydantic_core-2.27.2-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:c10eb4f1659290b523af58fa7cffb452a61ad6ae5613404519aee4bfbf1df993"},
{file = "pydantic_core-2.27.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:ef592d4bad47296fb11f96cd7dc898b92e795032b4894dfb4076cfccd43a9308"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c61709a844acc6bf0b7dce7daae75195a10aac96a596ea1b776996414791ede4"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c5f762659e47fdb7b16956c71598292f60a03aa92f8b6351504359dbdba6cf"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4c9775e339e42e79ec99c441d9730fccf07414af63eac2f0e48e08fd38a64d76"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:57762139821c31847cfb2df63c12f725788bd9f04bc2fb392790959b8f70f118"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0d1e85068e818c73e048fe28cfc769040bb1f475524f4745a5dc621f75ac7630"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:097830ed52fd9e427942ff3b9bc17fab52913b2f50f2880dc4a5611446606a54"},
{file = "pydantic_core-2.27.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:044a50963a614ecfae59bb1eaf7ea7efc4bc62f49ed594e18fa1e5d953c40e9f"},
{file = "pydantic_core-2.27.2-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:4e0b4220ba5b40d727c7f879eac379b822eee5d8fff418e9d3381ee45b3b0362"},
{file = "pydantic_core-2.27.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5e4f4bb20d75e9325cc9696c6802657b58bc1dbbe3022f32cc2b2b632c3fbb96"},
{file = "pydantic_core-2.27.2-cp39-cp39-win32.whl", hash = "sha256:cca63613e90d001b9f2f9a9ceb276c308bfa2a43fafb75c8031c4f66039e8c6e"},
{file = "pydantic_core-2.27.2-cp39-cp39-win_amd64.whl", hash = "sha256:77d1bca19b0f7021b3a982e6f903dcd5b2b06076def36a652e3907f596e29f67"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:2bf14caea37e91198329b828eae1618c068dfb8ef17bb33287a7ad4b61ac314e"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:b0cb791f5b45307caae8810c2023a184c74605ec3bcbb67d13846c28ff731ff8"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:688d3fd9fcb71f41c4c015c023d12a79d1c4c0732ec9eb35d96e3388a120dcf3"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d591580c34f4d731592f0e9fe40f9cc1b430d297eecc70b962e93c5c668f15f"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:82f986faf4e644ffc189a7f1aafc86e46ef70372bb153e7001e8afccc6e54133"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:bec317a27290e2537f922639cafd54990551725fc844249e64c523301d0822fc"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:0296abcb83a797db256b773f45773da397da75a08f5fcaef41f2044adec05f50"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:0d75070718e369e452075a6017fbf187f788e17ed67a3abd47fa934d001863d9"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:7e17b560be3c98a8e3aa66ce828bdebb9e9ac6ad5466fba92eb74c4c95cb1151"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:c33939a82924da9ed65dab5a65d427205a73181d8098e79b6b426bdf8ad4e656"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:00bad2484fa6bda1e216e7345a798bd37c68fb2d97558edd584942aa41b7d278"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c817e2b40aba42bac6f457498dacabc568c3b7a986fc9ba7c8d9d260b71485fb"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:251136cdad0cb722e93732cb45ca5299fb56e1344a833640bf93b2803f8d1bfd"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d2088237af596f0a524d3afc39ab3b036e8adb054ee57cbb1dcf8e09da5b29cc"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d4041c0b966a84b4ae7a09832eb691a35aec90910cd2dbe7a208de59be77965b"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:8083d4e875ebe0b864ffef72a4304827015cff328a1be6e22cc850753bfb122b"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f141ee28a0ad2123b6611b6ceff018039df17f32ada8b534e6aa039545a3efb2"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:7d0c8399fcc1848491f00e0314bd59fb34a9c008761bcb422a057670c3f65e35"},
{file = "pydantic_core-2.27.2.tar.gz", hash = "sha256:eb026e5a4c1fee05726072337ff51d1efb6f59090b7da90d30ea58625b1ffb39"},
{file = "pydantic_core-2.33.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:71dffba8fe9ddff628c68f3abd845e91b028361d43c5f8e7b3f8b91d7d85413e"},
{file = "pydantic_core-2.33.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:abaeec1be6ed535a5d7ffc2e6c390083c425832b20efd621562fbb5bff6dc518"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:759871f00e26ad3709efc773ac37b4d571de065f9dfb1778012908bcc36b3a73"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dcfebee69cd5e1c0b76a17e17e347c84b00acebb8dd8edb22d4a03e88e82a207"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1b1262b912435a501fa04cd213720609e2cefa723a07c92017d18693e69bf00b"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4726f1f3f42d6a25678c67da3f0b10f148f5655813c5aca54b0d1742ba821b8f"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e790954b5093dff1e3a9a2523fddc4e79722d6f07993b4cd5547825c3cbf97b5"},
{file = "pydantic_core-2.33.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:34e7fb3abe375b5c4e64fab75733d605dda0f59827752debc99c17cb2d5f3276"},
{file = "pydantic_core-2.33.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:ecb158fb9b9091b515213bed3061eb7deb1d3b4e02327c27a0ea714ff46b0760"},
{file = "pydantic_core-2.33.0-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:4d9149e7528af8bbd76cc055967e6e04617dcb2a2afdaa3dea899406c5521faa"},
{file = "pydantic_core-2.33.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:e81a295adccf73477220e15ff79235ca9dcbcee4be459eb9d4ce9a2763b8386c"},
{file = "pydantic_core-2.33.0-cp310-cp310-win32.whl", hash = "sha256:f22dab23cdbce2005f26a8f0c71698457861f97fc6318c75814a50c75e87d025"},
{file = "pydantic_core-2.33.0-cp310-cp310-win_amd64.whl", hash = "sha256:9cb2390355ba084c1ad49485d18449b4242da344dea3e0fe10babd1f0db7dcfc"},
{file = "pydantic_core-2.33.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:a608a75846804271cf9c83e40bbb4dab2ac614d33c6fd5b0c6187f53f5c593ef"},
{file = "pydantic_core-2.33.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e1c69aa459f5609dec2fa0652d495353accf3eda5bdb18782bc5a2ae45c9273a"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b9ec80eb5a5f45a2211793f1c4aeddff0c3761d1c70d684965c1807e923a588b"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e925819a98318d17251776bd3d6aa9f3ff77b965762155bdad15d1a9265c4cfd"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5bf68bb859799e9cec3d9dd8323c40c00a254aabb56fe08f907e437005932f2b"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1b2ea72dea0825949a045fa4071f6d5b3d7620d2a208335207793cf29c5a182d"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1583539533160186ac546b49f5cde9ffc928062c96920f58bd95de32ffd7bffd"},
{file = "pydantic_core-2.33.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:23c3e77bf8a7317612e5c26a3b084c7edeb9552d645742a54a5867635b4f2453"},
{file = "pydantic_core-2.33.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a7a7f2a3f628d2f7ef11cb6188bcf0b9e1558151d511b974dfea10a49afe192b"},
{file = "pydantic_core-2.33.0-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:f1fb026c575e16f673c61c7b86144517705865173f3d0907040ac30c4f9f5915"},
{file = "pydantic_core-2.33.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:635702b2fed997e0ac256b2cfbdb4dd0bf7c56b5d8fba8ef03489c03b3eb40e2"},
{file = "pydantic_core-2.33.0-cp311-cp311-win32.whl", hash = "sha256:07b4ced28fccae3f00626eaa0c4001aa9ec140a29501770a88dbbb0966019a86"},
{file = "pydantic_core-2.33.0-cp311-cp311-win_amd64.whl", hash = "sha256:4927564be53239a87770a5f86bdc272b8d1fbb87ab7783ad70255b4ab01aa25b"},
{file = "pydantic_core-2.33.0-cp311-cp311-win_arm64.whl", hash = "sha256:69297418ad644d521ea3e1aa2e14a2a422726167e9ad22b89e8f1130d68e1e9a"},
{file = "pydantic_core-2.33.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:6c32a40712e3662bebe524abe8abb757f2fa2000028d64cc5a1006016c06af43"},
{file = "pydantic_core-2.33.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8ec86b5baa36f0a0bfb37db86c7d52652f8e8aa076ab745ef7725784183c3fdd"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4deac83a8cc1d09e40683be0bc6d1fa4cde8df0a9bf0cda5693f9b0569ac01b6"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:175ab598fb457a9aee63206a1993874badf3ed9a456e0654273e56f00747bbd6"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5f36afd0d56a6c42cf4e8465b6441cf546ed69d3a4ec92724cc9c8c61bd6ecf4"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0a98257451164666afafc7cbf5fb00d613e33f7e7ebb322fbcd99345695a9a61"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ecc6d02d69b54a2eb83ebcc6f29df04957f734bcf309d346b4f83354d8376862"},
{file = "pydantic_core-2.33.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:1a69b7596c6603afd049ce7f3835bcf57dd3892fc7279f0ddf987bebed8caa5a"},
{file = "pydantic_core-2.33.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ea30239c148b6ef41364c6f51d103c2988965b643d62e10b233b5efdca8c0099"},
{file = "pydantic_core-2.33.0-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:abfa44cf2f7f7d7a199be6c6ec141c9024063205545aa09304349781b9a125e6"},
{file = "pydantic_core-2.33.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:20d4275f3c4659d92048c70797e5fdc396c6e4446caf517ba5cad2db60cd39d3"},
{file = "pydantic_core-2.33.0-cp312-cp312-win32.whl", hash = "sha256:918f2013d7eadea1d88d1a35fd4a1e16aaf90343eb446f91cb091ce7f9b431a2"},
{file = "pydantic_core-2.33.0-cp312-cp312-win_amd64.whl", hash = "sha256:aec79acc183865bad120b0190afac467c20b15289050648b876b07777e67ea48"},
{file = "pydantic_core-2.33.0-cp312-cp312-win_arm64.whl", hash = "sha256:5461934e895968655225dfa8b3be79e7e927e95d4bd6c2d40edd2fa7052e71b6"},
{file = "pydantic_core-2.33.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f00e8b59e1fc8f09d05594aa7d2b726f1b277ca6155fc84c0396db1b373c4555"},
{file = "pydantic_core-2.33.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1a73be93ecef45786d7d95b0c5e9b294faf35629d03d5b145b09b81258c7cd6d"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ff48a55be9da6930254565ff5238d71d5e9cd8c5487a191cb85df3bdb8c77365"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:26a4ea04195638dcd8c53dadb545d70badba51735b1594810e9768c2c0b4a5da"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:41d698dcbe12b60661f0632b543dbb119e6ba088103b364ff65e951610cb7ce0"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ae62032ef513fe6281ef0009e30838a01057b832dc265da32c10469622613885"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f225f3a3995dbbc26affc191d0443c6c4aa71b83358fd4c2b7d63e2f6f0336f9"},
{file = "pydantic_core-2.33.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5bdd36b362f419c78d09630cbaebc64913f66f62bda6d42d5fbb08da8cc4f181"},
{file = "pydantic_core-2.33.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:2a0147c0bef783fd9abc9f016d66edb6cac466dc54a17ec5f5ada08ff65caf5d"},
{file = "pydantic_core-2.33.0-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:c860773a0f205926172c6644c394e02c25421dc9a456deff16f64c0e299487d3"},
{file = "pydantic_core-2.33.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:138d31e3f90087f42aa6286fb640f3c7a8eb7bdae829418265e7e7474bd2574b"},
{file = "pydantic_core-2.33.0-cp313-cp313-win32.whl", hash = "sha256:d20cbb9d3e95114325780f3cfe990f3ecae24de7a2d75f978783878cce2ad585"},
{file = "pydantic_core-2.33.0-cp313-cp313-win_amd64.whl", hash = "sha256:ca1103d70306489e3d006b0f79db8ca5dd3c977f6f13b2c59ff745249431a606"},
{file = "pydantic_core-2.33.0-cp313-cp313-win_arm64.whl", hash = "sha256:6291797cad239285275558e0a27872da735b05c75d5237bbade8736f80e4c225"},
{file = "pydantic_core-2.33.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:7b79af799630af263eca9ec87db519426d8c9b3be35016eddad1832bac812d87"},
{file = "pydantic_core-2.33.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eabf946a4739b5237f4f56d77fa6668263bc466d06a8036c055587c130a46f7b"},
{file = "pydantic_core-2.33.0-cp313-cp313t-win_amd64.whl", hash = "sha256:8a1d581e8cdbb857b0e0e81df98603376c1a5c34dc5e54039dcc00f043df81e7"},
{file = "pydantic_core-2.33.0-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:7c9c84749f5787781c1c45bb99f433402e484e515b40675a5d121ea14711cf61"},
{file = "pydantic_core-2.33.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:64672fa888595a959cfeff957a654e947e65bbe1d7d82f550417cbd6898a1d6b"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:26bc7367c0961dec292244ef2549afa396e72e28cc24706210bd44d947582c59"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ce72d46eb201ca43994303025bd54d8a35a3fc2a3495fac653d6eb7205ce04f4"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:14229c1504287533dbf6b1fc56f752ce2b4e9694022ae7509631ce346158de11"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:085d8985b1c1e48ef271e98a658f562f29d89bda98bf120502283efbc87313eb"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:31860fbda80d8f6828e84b4a4d129fd9c4535996b8249cfb8c720dc2a1a00bb8"},
{file = "pydantic_core-2.33.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f200b2f20856b5a6c3a35f0d4e344019f805e363416e609e9b47c552d35fd5ea"},
{file = "pydantic_core-2.33.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:5f72914cfd1d0176e58ddc05c7a47674ef4222c8253bf70322923e73e14a4ac3"},
{file = "pydantic_core-2.33.0-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:91301a0980a1d4530d4ba7e6a739ca1a6b31341252cb709948e0aca0860ce0ae"},
{file = "pydantic_core-2.33.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:7419241e17c7fbe5074ba79143d5523270e04f86f1b3a0dff8df490f84c8273a"},
{file = "pydantic_core-2.33.0-cp39-cp39-win32.whl", hash = "sha256:7a25493320203005d2a4dac76d1b7d953cb49bce6d459d9ae38e30dd9f29bc9c"},
{file = "pydantic_core-2.33.0-cp39-cp39-win_amd64.whl", hash = "sha256:82a4eba92b7ca8af1b7d5ef5f3d9647eee94d1f74d21ca7c21e3a2b92e008358"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:e2762c568596332fdab56b07060c8ab8362c56cf2a339ee54e491cd503612c50"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:5bf637300ff35d4f59c006fff201c510b2b5e745b07125458a5389af3c0dff8c"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:62c151ce3d59ed56ebd7ce9ce5986a409a85db697d25fc232f8e81f195aa39a1"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9ee65f0cc652261744fd07f2c6e6901c914aa6c5ff4dcfaf1136bc394d0dd26b"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:024d136ae44d233e6322027bbf356712b3940bee816e6c948ce4b90f18471b3d"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:e37f10f6d4bc67c58fbd727108ae1d8b92b397355e68519f1e4a7babb1473442"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:502ed542e0d958bd12e7c3e9a015bce57deaf50eaa8c2e1c439b512cb9db1e3a"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:715c62af74c236bf386825c0fdfa08d092ab0f191eb5b4580d11c3189af9d330"},
{file = "pydantic_core-2.33.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:bccc06fa0372151f37f6b69834181aa9eb57cf8665ed36405fb45fbf6cac3bae"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5d8dc9f63a26f7259b57f46a7aab5af86b2ad6fbe48487500bb1f4b27e051e4c"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:30369e54d6d0113d2aa5aee7a90d17f225c13d87902ace8fcd7bbf99b19124db"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f3eb479354c62067afa62f53bb387827bee2f75c9c79ef25eef6ab84d4b1ae3b"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0310524c833d91403c960b8a3cf9f46c282eadd6afd276c8c5edc617bd705dc9"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:eddb18a00bbb855325db27b4c2a89a4ba491cd6a0bd6d852b225172a1f54b36c"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:ade5dbcf8d9ef8f4b28e682d0b29f3008df9842bb5ac48ac2c17bc55771cc976"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:2c0afd34f928383e3fd25740f2050dbac9d077e7ba5adbaa2227f4d4f3c8da5c"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:7da333f21cd9df51d5731513a6d39319892947604924ddf2e24a4612975fb936"},
{file = "pydantic_core-2.33.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:4b6d77c75a57f041c5ee915ff0b0bb58eabb78728b69ed967bc5b780e8f701b8"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:ba95691cf25f63df53c1d342413b41bd7762d9acb425df8858d7efa616c0870e"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:4f1ab031feb8676f6bd7c85abec86e2935850bf19b84432c64e3e239bffeb1ec"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:58c1151827eef98b83d49b6ca6065575876a02d2211f259fb1a6b7757bd24dd8"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a66d931ea2c1464b738ace44b7334ab32a2fd50be023d863935eb00f42be1778"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0bcf0bab28995d483f6c8d7db25e0d05c3efa5cebfd7f56474359e7137f39856"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:89670d7a0045acb52be0566df5bc8b114ac967c662c06cf5e0c606e4aadc964b"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:b716294e721d8060908dbebe32639b01bfe61b15f9f57bcc18ca9a0e00d9520b"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fc53e05c16697ff0c1c7c2b98e45e131d4bfb78068fffff92a82d169cbb4c7b7"},
{file = "pydantic_core-2.33.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:68504959253303d3ae9406b634997a2123a0b0c1da86459abbd0ffc921695eac"},
{file = "pydantic_core-2.33.0.tar.gz", hash = "sha256:40eb8af662ba409c3cbf4a8150ad32ae73514cd7cb1f1a2113af39763dd616b3"},
]
[package.dependencies]
@@ -1273,14 +1372,14 @@ typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
[[package]]
name = "pydantic-settings"
version = "2.7.1"
version = "2.8.1"
description = "Settings management using Pydantic"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pydantic_settings-2.7.1-py3-none-any.whl", hash = "sha256:590be9e6e24d06db33a4262829edef682500ef008565a969c73d39d5f8bfb3fd"},
{file = "pydantic_settings-2.7.1.tar.gz", hash = "sha256:10c9caad35e64bfb3c2fbf70a078c0e25cc92499782e5200747f942a065dec93"},
{file = "pydantic_settings-2.8.1-py3-none-any.whl", hash = "sha256:81942d5ac3d905f7f3ee1a70df5dfb62d5569c12f51a5a647defc1c3d9ee2e9c"},
{file = "pydantic_settings-2.8.1.tar.gz", hash = "sha256:d5c663dfbe9db9d5e1c646b2e161da12f0d734d422ee56f567d0ea2cee4e8585"},
]
[package.dependencies]
@@ -1335,14 +1434,14 @@ dev = ["argcomplete", "attrs (>=19.2)", "hypothesis (>=3.56)", "mock", "pygments
[[package]]
name = "pytest-asyncio"
version = "0.25.3"
version = "0.26.0"
description = "Pytest support for asyncio"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pytest_asyncio-0.25.3-py3-none-any.whl", hash = "sha256:9e89518e0f9bd08928f97a3482fdc4e244df17529460bc038291ccaf8f85c7c3"},
{file = "pytest_asyncio-0.25.3.tar.gz", hash = "sha256:fc1da2cf9f125ada7e710b4ddad05518d4cee187ae9412e9ac9271003497f07a"},
{file = "pytest_asyncio-0.26.0-py3-none-any.whl", hash = "sha256:7b51ed894f4fbea1340262bdae5135797ebbe21d8638978e35d31c6d19f72fb0"},
{file = "pytest_asyncio-0.26.0.tar.gz", hash = "sha256:c4df2a697648241ff39e7f0e4a73050b03f123f760673956cf0d72a4990e312f"},
]
[package.dependencies]
@@ -1402,21 +1501,21 @@ cli = ["click (>=5.0)"]
[[package]]
name = "realtime"
version = "2.0.2"
version = "2.4.2"
description = ""
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "realtime-2.0.2-py3-none-any.whl", hash = "sha256:2634c915bc38807f2013f21e8bcc4d2f79870dfd81460ddb9393883d0489928a"},
{file = "realtime-2.0.2.tar.gz", hash = "sha256:519da9325b3b8102139d51785013d592f6b2403d81fa21d838a0b0234723ed7d"},
{file = "realtime-2.4.2-py3-none-any.whl", hash = "sha256:0cc1b4a097acf9c0bd3a2f1998170de47744574c606617285113ddb3021e54ca"},
{file = "realtime-2.4.2.tar.gz", hash = "sha256:760308d5310533f65a9098e0b482a518f6ad2f3c0f2723e83cf5856865bafc5d"},
]
[package.dependencies]
aiohttp = ">=3.10.2,<4.0.0"
aiohttp = ">=3.11.14,<4.0.0"
python-dateutil = ">=2.8.1,<3.0.0"
typing-extensions = ">=4.12.2,<5.0.0"
websockets = ">=11,<13"
websockets = ">=11,<15"
[[package]]
name = "redis"
@@ -1476,30 +1575,30 @@ pyasn1 = ">=0.1.3"
[[package]]
name = "ruff"
version = "0.9.6"
version = "0.11.2"
description = "An extremely fast Python linter and code formatter, written in Rust."
optional = false
python-versions = ">=3.7"
groups = ["dev"]
files = [
{file = "ruff-0.9.6-py3-none-linux_armv6l.whl", hash = "sha256:2f218f356dd2d995839f1941322ff021c72a492c470f0b26a34f844c29cdf5ba"},
{file = "ruff-0.9.6-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:b908ff4df65dad7b251c9968a2e4560836d8f5487c2f0cc238321ed951ea0504"},
{file = "ruff-0.9.6-py3-none-macosx_11_0_arm64.whl", hash = "sha256:b109c0ad2ececf42e75fa99dc4043ff72a357436bb171900714a9ea581ddef83"},
{file = "ruff-0.9.6-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1de4367cca3dac99bcbd15c161404e849bb0bfd543664db39232648dc00112dc"},
{file = "ruff-0.9.6-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ac3ee4d7c2c92ddfdaedf0bf31b2b176fa7aa8950efc454628d477394d35638b"},
{file = "ruff-0.9.6-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5dc1edd1775270e6aa2386119aea692039781429f0be1e0949ea5884e011aa8e"},
{file = "ruff-0.9.6-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:4a091729086dffa4bd070aa5dab7e39cc6b9d62eb2bef8f3d91172d30d599666"},
{file = "ruff-0.9.6-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d1bbc6808bf7b15796cef0815e1dfb796fbd383e7dbd4334709642649625e7c5"},
{file = "ruff-0.9.6-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:589d1d9f25b5754ff230dce914a174a7c951a85a4e9270613a2b74231fdac2f5"},
{file = "ruff-0.9.6-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dc61dd5131742e21103fbbdcad683a8813be0e3c204472d520d9a5021ca8b217"},
{file = "ruff-0.9.6-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:5e2d9126161d0357e5c8f30b0bd6168d2c3872372f14481136d13de9937f79b6"},
{file = "ruff-0.9.6-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:68660eab1a8e65babb5229a1f97b46e3120923757a68b5413d8561f8a85d4897"},
{file = "ruff-0.9.6-py3-none-musllinux_1_2_i686.whl", hash = "sha256:c4cae6c4cc7b9b4017c71114115db0445b00a16de3bcde0946273e8392856f08"},
{file = "ruff-0.9.6-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:19f505b643228b417c1111a2a536424ddde0db4ef9023b9e04a46ed8a1cb4656"},
{file = "ruff-0.9.6-py3-none-win32.whl", hash = "sha256:194d8402bceef1b31164909540a597e0d913c0e4952015a5b40e28c146121b5d"},
{file = "ruff-0.9.6-py3-none-win_amd64.whl", hash = "sha256:03482d5c09d90d4ee3f40d97578423698ad895c87314c4de39ed2af945633caa"},
{file = "ruff-0.9.6-py3-none-win_arm64.whl", hash = "sha256:0e2bb706a2be7ddfea4a4af918562fdc1bcb16df255e5fa595bbd800ce322a5a"},
{file = "ruff-0.9.6.tar.gz", hash = "sha256:81761592f72b620ec8fa1068a6fd00e98a5ebee342a3642efd84454f3031dca9"},
{file = "ruff-0.11.2-py3-none-linux_armv6l.whl", hash = "sha256:c69e20ea49e973f3afec2c06376eb56045709f0212615c1adb0eda35e8a4e477"},
{file = "ruff-0.11.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:2c5424cc1c4eb1d8ecabe6d4f1b70470b4f24a0c0171356290b1953ad8f0e272"},
{file = "ruff-0.11.2-py3-none-macosx_11_0_arm64.whl", hash = "sha256:ecf20854cc73f42171eedb66f006a43d0a21bfb98a2523a809931cda569552d9"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0c543bf65d5d27240321604cee0633a70c6c25c9a2f2492efa9f6d4b8e4199bb"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:20967168cc21195db5830b9224be0e964cc9c8ecf3b5a9e3ce19876e8d3a96e3"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:955a9ce63483999d9f0b8f0b4a3ad669e53484232853054cc8b9d51ab4c5de74"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:86b3a27c38b8fce73bcd262b0de32e9a6801b76d52cdb3ae4c914515f0cef608"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a3b66a03b248c9fcd9d64d445bafdf1589326bee6fc5c8e92d7562e58883e30f"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0397c2672db015be5aa3d4dac54c69aa012429097ff219392c018e21f5085147"},
{file = "ruff-0.11.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:869bcf3f9abf6457fbe39b5a37333aa4eecc52a3b99c98827ccc371a8e5b6f1b"},
{file = "ruff-0.11.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:2a2b50ca35457ba785cd8c93ebbe529467594087b527a08d487cf0ee7b3087e9"},
{file = "ruff-0.11.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:7c69c74bf53ddcfbc22e6eb2f31211df7f65054bfc1f72288fc71e5f82db3eab"},
{file = "ruff-0.11.2-py3-none-musllinux_1_2_i686.whl", hash = "sha256:6e8fb75e14560f7cf53b15bbc55baf5ecbe373dd5f3aab96ff7aa7777edd7630"},
{file = "ruff-0.11.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:842a472d7b4d6f5924e9297aa38149e5dcb1e628773b70e6387ae2c97a63c58f"},
{file = "ruff-0.11.2-py3-none-win32.whl", hash = "sha256:aca01ccd0eb5eb7156b324cfaa088586f06a86d9e5314b0eb330cb48415097cc"},
{file = "ruff-0.11.2-py3-none-win_amd64.whl", hash = "sha256:3170150172a8f994136c0c66f494edf199a0bbea7a409f649e4bc8f4d7084080"},
{file = "ruff-0.11.2-py3-none-win_arm64.whl", hash = "sha256:52933095158ff328f4c77af3d74f0379e34fd52f175144cefc1b192e7ccd32b4"},
{file = "ruff-0.11.2.tar.gz", hash = "sha256:ec47591497d5a1050175bdf4e1a4e6272cddff7da88a2ad595e1e326041d8d94"},
]
[[package]]
@@ -1561,21 +1660,21 @@ test = ["pylint", "pytest", "pytest-black", "pytest-cov", "pytest-pylint"]
[[package]]
name = "supabase"
version = "2.13.0"
version = "2.15.0"
description = "Supabase client for Python."
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "supabase-2.13.0-py3-none-any.whl", hash = "sha256:6cfccc055be21dab311afc5e9d5b37f3a4966f8394703763fbc8f8e86f36eaa6"},
{file = "supabase-2.13.0.tar.gz", hash = "sha256:452574d34bd978c8d11b5f02b0182b48e8854e511c969483c83875ec01495f11"},
{file = "supabase-2.15.0-py3-none-any.whl", hash = "sha256:a665c7ab6c8ad1d80609ab62ad657f66fdaf38070ec9e0db5c7887fd72b109c0"},
{file = "supabase-2.15.0.tar.gz", hash = "sha256:2e66289ad74ae9c4cb04a69f9de00cd2ce880cd890de23269a40ac5b69151d26"},
]
[package.dependencies]
gotrue = ">=2.11.0,<3.0.0"
httpx = ">=0.26,<0.29"
postgrest = ">=0.19,<0.20"
realtime = ">=2.0.0,<3.0.0"
postgrest = ">0.19,<1.1"
realtime = ">=2.4.0,<2.5.0"
storage3 = ">=0.10,<0.12"
supafunc = ">=0.9,<0.10"
@@ -1602,7 +1701,7 @@ description = "A lil' TOML parser"
optional = false
python-versions = ">=3.8"
groups = ["main"]
markers = "python_version < \"3.11\""
markers = "python_version == \"3.10\""
files = [
{file = "tomli-2.1.0-py3-none-any.whl", hash = "sha256:a5c57c3d1c56f5ccdf89f6523458f60ef716e210fc47c4cfb188c5ba473e0391"},
{file = "tomli-2.1.0.tar.gz", hash = "sha256:3f646cae2aec94e17d04973e4249548320197cfabdf130015d023de4b74d8ab8"},
@@ -1620,6 +1719,21 @@ files = [
{file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
]
[[package]]
name = "typing-inspection"
version = "0.4.0"
description = "Runtime typing introspection tools"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "typing_inspection-0.4.0-py3-none-any.whl", hash = "sha256:50e72559fcd2a6367a19f7a7e610e6afcb9fac940c650290eed893d61386832f"},
{file = "typing_inspection-0.4.0.tar.gz", hash = "sha256:9765c87de36671694a67904bf2c96e395be9c6439bb6c87b5142569dcdd65122"},
]
[package.dependencies]
typing-extensions = ">=4.12.0"
[[package]]
name = "urllib3"
version = "2.2.2"
@@ -1802,109 +1916,100 @@ files = [
[[package]]
name = "yarl"
version = "1.11.1"
version = "1.18.3"
description = "Yet another URL library"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "yarl-1.11.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:400cd42185f92de559d29eeb529e71d80dfbd2f45c36844914a4a34297ca6f00"},
{file = "yarl-1.11.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:8258c86f47e080a258993eed877d579c71da7bda26af86ce6c2d2d072c11320d"},
{file = "yarl-1.11.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:2164cd9725092761fed26f299e3f276bb4b537ca58e6ff6b252eae9631b5c96e"},
{file = "yarl-1.11.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08ea567c16f140af8ddc7cb58e27e9138a1386e3e6e53982abaa6f2377b38cc"},
{file = "yarl-1.11.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:768ecc550096b028754ea28bf90fde071c379c62c43afa574edc6f33ee5daaec"},
{file = "yarl-1.11.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2909fa3a7d249ef64eeb2faa04b7957e34fefb6ec9966506312349ed8a7e77bf"},
{file = "yarl-1.11.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01a8697ec24f17c349c4f655763c4db70eebc56a5f82995e5e26e837c6eb0e49"},
{file = "yarl-1.11.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e286580b6511aac7c3268a78cdb861ec739d3e5a2a53b4809faef6b49778eaff"},
{file = "yarl-1.11.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:4179522dc0305c3fc9782549175c8e8849252fefeb077c92a73889ccbcd508ad"},
{file = "yarl-1.11.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:27fcb271a41b746bd0e2a92182df507e1c204759f460ff784ca614e12dd85145"},
{file = "yarl-1.11.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:f61db3b7e870914dbd9434b560075e0366771eecbe6d2b5561f5bc7485f39efd"},
{file = "yarl-1.11.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:c92261eb2ad367629dc437536463dc934030c9e7caca861cc51990fe6c565f26"},
{file = "yarl-1.11.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:d95b52fbef190ca87d8c42f49e314eace4fc52070f3dfa5f87a6594b0c1c6e46"},
{file = "yarl-1.11.1-cp310-cp310-win32.whl", hash = "sha256:489fa8bde4f1244ad6c5f6d11bb33e09cf0d1d0367edb197619c3e3fc06f3d91"},
{file = "yarl-1.11.1-cp310-cp310-win_amd64.whl", hash = "sha256:476e20c433b356e16e9a141449f25161e6b69984fb4cdbd7cd4bd54c17844998"},
{file = "yarl-1.11.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:946eedc12895873891aaceb39bceb484b4977f70373e0122da483f6c38faaa68"},
{file = "yarl-1.11.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:21a7c12321436b066c11ec19c7e3cb9aec18884fe0d5b25d03d756a9e654edfe"},
{file = "yarl-1.11.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:c35f493b867912f6fda721a59cc7c4766d382040bdf1ddaeeaa7fa4d072f4675"},
{file = "yarl-1.11.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:25861303e0be76b60fddc1250ec5986c42f0a5c0c50ff57cc30b1be199c00e63"},
{file = "yarl-1.11.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e4b53f73077e839b3f89c992223f15b1d2ab314bdbdf502afdc7bb18e95eae27"},
{file = "yarl-1.11.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:327c724b01b8641a1bf1ab3b232fb638706e50f76c0b5bf16051ab65c868fac5"},
{file = "yarl-1.11.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4307d9a3417eea87715c9736d050c83e8c1904e9b7aada6ce61b46361b733d92"},
{file = "yarl-1.11.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:48a28bed68ab8fb7e380775f0029a079f08a17799cb3387a65d14ace16c12e2b"},
{file = "yarl-1.11.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:067b961853c8e62725ff2893226fef3d0da060656a9827f3f520fb1d19b2b68a"},
{file = "yarl-1.11.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:8215f6f21394d1f46e222abeb06316e77ef328d628f593502d8fc2a9117bde83"},
{file = "yarl-1.11.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:498442e3af2a860a663baa14fbf23fb04b0dd758039c0e7c8f91cb9279799bff"},
{file = "yarl-1.11.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:69721b8effdb588cb055cc22f7c5105ca6fdaa5aeb3ea09021d517882c4a904c"},
{file = "yarl-1.11.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:1e969fa4c1e0b1a391f3fcbcb9ec31e84440253325b534519be0d28f4b6b533e"},
{file = "yarl-1.11.1-cp311-cp311-win32.whl", hash = "sha256:7d51324a04fc4b0e097ff8a153e9276c2593106a811704025bbc1d6916f45ca6"},
{file = "yarl-1.11.1-cp311-cp311-win_amd64.whl", hash = "sha256:15061ce6584ece023457fb8b7a7a69ec40bf7114d781a8c4f5dcd68e28b5c53b"},
{file = "yarl-1.11.1-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:a4264515f9117be204935cd230fb2a052dd3792789cc94c101c535d349b3dab0"},
{file = "yarl-1.11.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:f41fa79114a1d2eddb5eea7b912d6160508f57440bd302ce96eaa384914cd265"},
{file = "yarl-1.11.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:02da8759b47d964f9173c8675710720b468aa1c1693be0c9c64abb9d8d9a4867"},
{file = "yarl-1.11.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9361628f28f48dcf8b2f528420d4d68102f593f9c2e592bfc842f5fb337e44fd"},
{file = "yarl-1.11.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b91044952da03b6f95fdba398d7993dd983b64d3c31c358a4c89e3c19b6f7aef"},
{file = "yarl-1.11.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:74db2ef03b442276d25951749a803ddb6e270d02dda1d1c556f6ae595a0d76a8"},
{file = "yarl-1.11.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e975a2211952a8a083d1b9d9ba26472981ae338e720b419eb50535de3c02870"},
{file = "yarl-1.11.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8aef97ba1dd2138112890ef848e17d8526fe80b21f743b4ee65947ea184f07a2"},
{file = "yarl-1.11.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a7915ea49b0c113641dc4d9338efa9bd66b6a9a485ffe75b9907e8573ca94b84"},
{file = "yarl-1.11.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:504cf0d4c5e4579a51261d6091267f9fd997ef58558c4ffa7a3e1460bd2336fa"},
{file = "yarl-1.11.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:3de5292f9f0ee285e6bd168b2a77b2a00d74cbcfa420ed078456d3023d2f6dff"},
{file = "yarl-1.11.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:a34e1e30f1774fa35d37202bbeae62423e9a79d78d0874e5556a593479fdf239"},
{file = "yarl-1.11.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:66b63c504d2ca43bf7221a1f72fbe981ff56ecb39004c70a94485d13e37ebf45"},
{file = "yarl-1.11.1-cp312-cp312-win32.whl", hash = "sha256:a28b70c9e2213de425d9cba5ab2e7f7a1c8ca23a99c4b5159bf77b9c31251447"},
{file = "yarl-1.11.1-cp312-cp312-win_amd64.whl", hash = "sha256:17b5a386d0d36fb828e2fb3ef08c8829c1ebf977eef88e5367d1c8c94b454639"},
{file = "yarl-1.11.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:1fa2e7a406fbd45b61b4433e3aa254a2c3e14c4b3186f6e952d08a730807fa0c"},
{file = "yarl-1.11.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:750f656832d7d3cb0c76be137ee79405cc17e792f31e0a01eee390e383b2936e"},
{file = "yarl-1.11.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0b8486f322d8f6a38539136a22c55f94d269addb24db5cb6f61adc61eabc9d93"},
{file = "yarl-1.11.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3fce4da3703ee6048ad4138fe74619c50874afe98b1ad87b2698ef95bf92c96d"},
{file = "yarl-1.11.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8ed653638ef669e0efc6fe2acb792275cb419bf9cb5c5049399f3556995f23c7"},
{file = "yarl-1.11.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:18ac56c9dd70941ecad42b5a906820824ca72ff84ad6fa18db33c2537ae2e089"},
{file = "yarl-1.11.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:688654f8507464745ab563b041d1fb7dab5d9912ca6b06e61d1c4708366832f5"},
{file = "yarl-1.11.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4973eac1e2ff63cf187073cd4e1f1148dcd119314ab79b88e1b3fad74a18c9d5"},
{file = "yarl-1.11.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:964a428132227edff96d6f3cf261573cb0f1a60c9a764ce28cda9525f18f7786"},
{file = "yarl-1.11.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:6d23754b9939cbab02c63434776df1170e43b09c6a517585c7ce2b3d449b7318"},
{file = "yarl-1.11.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c2dc4250fe94d8cd864d66018f8344d4af50e3758e9d725e94fecfa27588ff82"},
{file = "yarl-1.11.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:09696438cb43ea6f9492ef237761b043f9179f455f405279e609f2bc9100212a"},
{file = "yarl-1.11.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:999bfee0a5b7385a0af5ffb606393509cfde70ecca4f01c36985be6d33e336da"},
{file = "yarl-1.11.1-cp313-cp313-win32.whl", hash = "sha256:ce928c9c6409c79e10f39604a7e214b3cb69552952fbda8d836c052832e6a979"},
{file = "yarl-1.11.1-cp313-cp313-win_amd64.whl", hash = "sha256:501c503eed2bb306638ccb60c174f856cc3246c861829ff40eaa80e2f0330367"},
{file = "yarl-1.11.1-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:dae7bd0daeb33aa3e79e72877d3d51052e8b19c9025ecf0374f542ea8ec120e4"},
{file = "yarl-1.11.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3ff6b1617aa39279fe18a76c8d165469c48b159931d9b48239065767ee455b2b"},
{file = "yarl-1.11.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:3257978c870728a52dcce8c2902bf01f6c53b65094b457bf87b2644ee6238ddc"},
{file = "yarl-1.11.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0f351fa31234699d6084ff98283cb1e852270fe9e250a3b3bf7804eb493bd937"},
{file = "yarl-1.11.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8aef1b64da41d18026632d99a06b3fefe1d08e85dd81d849fa7c96301ed22f1b"},
{file = "yarl-1.11.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7175a87ab8f7fbde37160a15e58e138ba3b2b0e05492d7351314a250d61b1591"},
{file = "yarl-1.11.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba444bdd4caa2a94456ef67a2f383710928820dd0117aae6650a4d17029fa25e"},
{file = "yarl-1.11.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0ea9682124fc062e3d931c6911934a678cb28453f957ddccf51f568c2f2b5e05"},
{file = "yarl-1.11.1-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:8418c053aeb236b20b0ab8fa6bacfc2feaaf7d4683dd96528610989c99723d5f"},
{file = "yarl-1.11.1-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:61a5f2c14d0a1adfdd82258f756b23a550c13ba4c86c84106be4c111a3a4e413"},
{file = "yarl-1.11.1-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:f3a6d90cab0bdf07df8f176eae3a07127daafcf7457b997b2bf46776da2c7eb7"},
{file = "yarl-1.11.1-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:077da604852be488c9a05a524068cdae1e972b7dc02438161c32420fb4ec5e14"},
{file = "yarl-1.11.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:15439f3c5c72686b6c3ff235279630d08936ace67d0fe5c8d5bbc3ef06f5a420"},
{file = "yarl-1.11.1-cp38-cp38-win32.whl", hash = "sha256:238a21849dd7554cb4d25a14ffbfa0ef380bb7ba201f45b144a14454a72ffa5a"},
{file = "yarl-1.11.1-cp38-cp38-win_amd64.whl", hash = "sha256:67459cf8cf31da0e2cbdb4b040507e535d25cfbb1604ca76396a3a66b8ba37a6"},
{file = "yarl-1.11.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:884eab2ce97cbaf89f264372eae58388862c33c4f551c15680dd80f53c89a269"},
{file = "yarl-1.11.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:8a336eaa7ee7e87cdece3cedb395c9657d227bfceb6781295cf56abcd3386a26"},
{file = "yarl-1.11.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:87f020d010ba80a247c4abc335fc13421037800ca20b42af5ae40e5fd75e7909"},
{file = "yarl-1.11.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:637c7ddb585a62d4469f843dac221f23eec3cbad31693b23abbc2c366ad41ff4"},
{file = "yarl-1.11.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:48dfd117ab93f0129084577a07287376cc69c08138694396f305636e229caa1a"},
{file = "yarl-1.11.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:75e0ae31fb5ccab6eda09ba1494e87eb226dcbd2372dae96b87800e1dcc98804"},
{file = "yarl-1.11.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f46f81501160c28d0c0b7333b4f7be8983dbbc161983b6fb814024d1b4952f79"},
{file = "yarl-1.11.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:04293941646647b3bfb1719d1d11ff1028e9c30199509a844da3c0f5919dc520"},
{file = "yarl-1.11.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:250e888fa62d73e721f3041e3a9abf427788a1934b426b45e1b92f62c1f68366"},
{file = "yarl-1.11.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:e8f63904df26d1a66aabc141bfd258bf738b9bc7bc6bdef22713b4f5ef789a4c"},
{file = "yarl-1.11.1-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:aac44097d838dda26526cffb63bdd8737a2dbdf5f2c68efb72ad83aec6673c7e"},
{file = "yarl-1.11.1-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:267b24f891e74eccbdff42241c5fb4f974de2d6271dcc7d7e0c9ae1079a560d9"},
{file = "yarl-1.11.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:6907daa4b9d7a688063ed098c472f96e8181733c525e03e866fb5db480a424df"},
{file = "yarl-1.11.1-cp39-cp39-win32.whl", hash = "sha256:14438dfc5015661f75f85bc5adad0743678eefee266ff0c9a8e32969d5d69f74"},
{file = "yarl-1.11.1-cp39-cp39-win_amd64.whl", hash = "sha256:94d0caaa912bfcdc702a4204cd5e2bb01eb917fc4f5ea2315aa23962549561b0"},
{file = "yarl-1.11.1-py3-none-any.whl", hash = "sha256:72bf26f66456baa0584eff63e44545c9f0eaed9b73cb6601b647c91f14c11f38"},
{file = "yarl-1.11.1.tar.gz", hash = "sha256:1bb2d9e212fb7449b8fb73bc461b51eaa17cc8430b4a87d87be7b25052d92f53"},
{file = "yarl-1.18.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7df647e8edd71f000a5208fe6ff8c382a1de8edfbccdbbfe649d263de07d8c34"},
{file = "yarl-1.18.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:c69697d3adff5aa4f874b19c0e4ed65180ceed6318ec856ebc423aa5850d84f7"},
{file = "yarl-1.18.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:602d98f2c2d929f8e697ed274fbadc09902c4025c5a9963bf4e9edfc3ab6f7ed"},
{file = "yarl-1.18.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c654d5207c78e0bd6d749f6dae1dcbbfde3403ad3a4b11f3c5544d9906969dde"},
{file = "yarl-1.18.3-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5094d9206c64181d0f6e76ebd8fb2f8fe274950a63890ee9e0ebfd58bf9d787b"},
{file = "yarl-1.18.3-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:35098b24e0327fc4ebdc8ffe336cee0a87a700c24ffed13161af80124b7dc8e5"},
{file = "yarl-1.18.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3236da9272872443f81fedc389bace88408f64f89f75d1bdb2256069a8730ccc"},
{file = "yarl-1.18.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e2c08cc9b16f4f4bc522771d96734c7901e7ebef70c6c5c35dd0f10845270bcd"},
{file = "yarl-1.18.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:80316a8bd5109320d38eef8833ccf5f89608c9107d02d2a7f985f98ed6876990"},
{file = "yarl-1.18.3-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:c1e1cc06da1491e6734f0ea1e6294ce00792193c463350626571c287c9a704db"},
{file = "yarl-1.18.3-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:fea09ca13323376a2fdfb353a5fa2e59f90cd18d7ca4eaa1fd31f0a8b4f91e62"},
{file = "yarl-1.18.3-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:e3b9fd71836999aad54084906f8663dffcd2a7fb5cdafd6c37713b2e72be1760"},
{file = "yarl-1.18.3-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:757e81cae69244257d125ff31663249b3013b5dc0a8520d73694aed497fb195b"},
{file = "yarl-1.18.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b1771de9944d875f1b98a745bc547e684b863abf8f8287da8466cf470ef52690"},
{file = "yarl-1.18.3-cp310-cp310-win32.whl", hash = "sha256:8874027a53e3aea659a6d62751800cf6e63314c160fd607489ba5c2edd753cf6"},
{file = "yarl-1.18.3-cp310-cp310-win_amd64.whl", hash = "sha256:93b2e109287f93db79210f86deb6b9bbb81ac32fc97236b16f7433db7fc437d8"},
{file = "yarl-1.18.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:8503ad47387b8ebd39cbbbdf0bf113e17330ffd339ba1144074da24c545f0069"},
{file = "yarl-1.18.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:02ddb6756f8f4517a2d5e99d8b2f272488e18dd0bfbc802f31c16c6c20f22193"},
{file = "yarl-1.18.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:67a283dd2882ac98cc6318384f565bffc751ab564605959df4752d42483ad889"},
{file = "yarl-1.18.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d980e0325b6eddc81331d3f4551e2a333999fb176fd153e075c6d1c2530aa8a8"},
{file = "yarl-1.18.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b643562c12680b01e17239be267bc306bbc6aac1f34f6444d1bded0c5ce438ca"},
{file = "yarl-1.18.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c017a3b6df3a1bd45b9fa49a0f54005e53fbcad16633870104b66fa1a30a29d8"},
{file = "yarl-1.18.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75674776d96d7b851b6498f17824ba17849d790a44d282929c42dbb77d4f17ae"},
{file = "yarl-1.18.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ccaa3a4b521b780a7e771cc336a2dba389a0861592bbce09a476190bb0c8b4b3"},
{file = "yarl-1.18.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:2d06d3005e668744e11ed80812e61efd77d70bb7f03e33c1598c301eea20efbb"},
{file = "yarl-1.18.3-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:9d41beda9dc97ca9ab0b9888cb71f7539124bc05df02c0cff6e5acc5a19dcc6e"},
{file = "yarl-1.18.3-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:ba23302c0c61a9999784e73809427c9dbedd79f66a13d84ad1b1943802eaaf59"},
{file = "yarl-1.18.3-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:6748dbf9bfa5ba1afcc7556b71cda0d7ce5f24768043a02a58846e4a443d808d"},
{file = "yarl-1.18.3-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:0b0cad37311123211dc91eadcb322ef4d4a66008d3e1bdc404808992260e1a0e"},
{file = "yarl-1.18.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0fb2171a4486bb075316ee754c6d8382ea6eb8b399d4ec62fde2b591f879778a"},
{file = "yarl-1.18.3-cp311-cp311-win32.whl", hash = "sha256:61b1a825a13bef4a5f10b1885245377d3cd0bf87cba068e1d9a88c2ae36880e1"},
{file = "yarl-1.18.3-cp311-cp311-win_amd64.whl", hash = "sha256:b9d60031cf568c627d028239693fd718025719c02c9f55df0a53e587aab951b5"},
{file = "yarl-1.18.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:1dd4bdd05407ced96fed3d7f25dbbf88d2ffb045a0db60dbc247f5b3c5c25d50"},
{file = "yarl-1.18.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7c33dd1931a95e5d9a772d0ac5e44cac8957eaf58e3c8da8c1414de7dd27c576"},
{file = "yarl-1.18.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:25b411eddcfd56a2f0cd6a384e9f4f7aa3efee14b188de13048c25b5e91f1640"},
{file = "yarl-1.18.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:436c4fc0a4d66b2badc6c5fc5ef4e47bb10e4fd9bf0c79524ac719a01f3607c2"},
{file = "yarl-1.18.3-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e35ef8683211db69ffe129a25d5634319a677570ab6b2eba4afa860f54eeaf75"},
{file = "yarl-1.18.3-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:84b2deecba4a3f1a398df819151eb72d29bfeb3b69abb145a00ddc8d30094512"},
{file = "yarl-1.18.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:00e5a1fea0fd4f5bfa7440a47eff01d9822a65b4488f7cff83155a0f31a2ecba"},
{file = "yarl-1.18.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d0e883008013c0e4aef84dcfe2a0b172c4d23c2669412cf5b3371003941f72bb"},
{file = "yarl-1.18.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:5a3f356548e34a70b0172d8890006c37be92995f62d95a07b4a42e90fba54272"},
{file = "yarl-1.18.3-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:ccd17349166b1bee6e529b4add61727d3f55edb7babbe4069b5764c9587a8cc6"},
{file = "yarl-1.18.3-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b958ddd075ddba5b09bb0be8a6d9906d2ce933aee81100db289badbeb966f54e"},
{file = "yarl-1.18.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:c7d79f7d9aabd6011004e33b22bc13056a3e3fb54794d138af57f5ee9d9032cb"},
{file = "yarl-1.18.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:4891ed92157e5430874dad17b15eb1fda57627710756c27422200c52d8a4e393"},
{file = "yarl-1.18.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ce1af883b94304f493698b00d0f006d56aea98aeb49d75ec7d98cd4a777e9285"},
{file = "yarl-1.18.3-cp312-cp312-win32.whl", hash = "sha256:f91c4803173928a25e1a55b943c81f55b8872f0018be83e3ad4938adffb77dd2"},
{file = "yarl-1.18.3-cp312-cp312-win_amd64.whl", hash = "sha256:7e2ee16578af3b52ac2f334c3b1f92262f47e02cc6193c598502bd46f5cd1477"},
{file = "yarl-1.18.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:90adb47ad432332d4f0bc28f83a5963f426ce9a1a8809f5e584e704b82685dcb"},
{file = "yarl-1.18.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:913829534200eb0f789d45349e55203a091f45c37a2674678744ae52fae23efa"},
{file = "yarl-1.18.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:ef9f7768395923c3039055c14334ba4d926f3baf7b776c923c93d80195624782"},
{file = "yarl-1.18.3-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88a19f62ff30117e706ebc9090b8ecc79aeb77d0b1f5ec10d2d27a12bc9f66d0"},
{file = "yarl-1.18.3-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e17c9361d46a4d5addf777c6dd5eab0715a7684c2f11b88c67ac37edfba6c482"},
{file = "yarl-1.18.3-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1a74a13a4c857a84a845505fd2d68e54826a2cd01935a96efb1e9d86c728e186"},
{file = "yarl-1.18.3-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:41f7ce59d6ee7741af71d82020346af364949314ed3d87553763a2df1829cc58"},
{file = "yarl-1.18.3-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f52a265001d830bc425f82ca9eabda94a64a4d753b07d623a9f2863fde532b53"},
{file = "yarl-1.18.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:82123d0c954dc58db301f5021a01854a85bf1f3bb7d12ae0c01afc414a882ca2"},
{file = "yarl-1.18.3-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:2ec9bbba33b2d00999af4631a3397d1fd78290c48e2a3e52d8dd72db3a067ac8"},
{file = "yarl-1.18.3-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:fbd6748e8ab9b41171bb95c6142faf068f5ef1511935a0aa07025438dd9a9bc1"},
{file = "yarl-1.18.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:877d209b6aebeb5b16c42cbb377f5f94d9e556626b1bfff66d7b0d115be88d0a"},
{file = "yarl-1.18.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:b464c4ab4bfcb41e3bfd3f1c26600d038376c2de3297760dfe064d2cb7ea8e10"},
{file = "yarl-1.18.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8d39d351e7faf01483cc7ff7c0213c412e38e5a340238826be7e0e4da450fdc8"},
{file = "yarl-1.18.3-cp313-cp313-win32.whl", hash = "sha256:61ee62ead9b68b9123ec24bc866cbef297dd266175d53296e2db5e7f797f902d"},
{file = "yarl-1.18.3-cp313-cp313-win_amd64.whl", hash = "sha256:578e281c393af575879990861823ef19d66e2b1d0098414855dd367e234f5b3c"},
{file = "yarl-1.18.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:61e5e68cb65ac8f547f6b5ef933f510134a6bf31bb178be428994b0cb46c2a04"},
{file = "yarl-1.18.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:fe57328fbc1bfd0bd0514470ac692630f3901c0ee39052ae47acd1d90a436719"},
{file = "yarl-1.18.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a440a2a624683108a1b454705ecd7afc1c3438a08e890a1513d468671d90a04e"},
{file = "yarl-1.18.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:09c7907c8548bcd6ab860e5f513e727c53b4a714f459b084f6580b49fa1b9cee"},
{file = "yarl-1.18.3-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b4f6450109834af88cb4cc5ecddfc5380ebb9c228695afc11915a0bf82116789"},
{file = "yarl-1.18.3-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a9ca04806f3be0ac6d558fffc2fdf8fcef767e0489d2684a21912cc4ed0cd1b8"},
{file = "yarl-1.18.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:77a6e85b90a7641d2e07184df5557132a337f136250caafc9ccaa4a2a998ca2c"},
{file = "yarl-1.18.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6333c5a377c8e2f5fae35e7b8f145c617b02c939d04110c76f29ee3676b5f9a5"},
{file = "yarl-1.18.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:0b3c92fa08759dbf12b3a59579a4096ba9af8dd344d9a813fc7f5070d86bbab1"},
{file = "yarl-1.18.3-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:4ac515b860c36becb81bb84b667466885096b5fc85596948548b667da3bf9f24"},
{file = "yarl-1.18.3-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:045b8482ce9483ada4f3f23b3774f4e1bf4f23a2d5c912ed5170f68efb053318"},
{file = "yarl-1.18.3-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:a4bb030cf46a434ec0225bddbebd4b89e6471814ca851abb8696170adb163985"},
{file = "yarl-1.18.3-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:54d6921f07555713b9300bee9c50fb46e57e2e639027089b1d795ecd9f7fa910"},
{file = "yarl-1.18.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:1d407181cfa6e70077df3377938c08012d18893f9f20e92f7d2f314a437c30b1"},
{file = "yarl-1.18.3-cp39-cp39-win32.whl", hash = "sha256:ac36703a585e0929b032fbaab0707b75dc12703766d0b53486eabd5139ebadd5"},
{file = "yarl-1.18.3-cp39-cp39-win_amd64.whl", hash = "sha256:ba87babd629f8af77f557b61e49e7c7cac36f22f871156b91e10a6e9d4f829e9"},
{file = "yarl-1.18.3-py3-none-any.whl", hash = "sha256:b57f4f58099328dfb26c6a771d09fb20dbbae81d20cfb66141251ea063bd101b"},
{file = "yarl-1.18.3.tar.gz", hash = "sha256:ac1801c45cbf77b6c99242eeff4fffb5e4e73a800b5c4ad4fc0be5def634d2e1"},
]
[package.dependencies]
idna = ">=2.0"
multidict = ">=4.0"
propcache = ">=0.2.0"
[[package]]
name = "zipp"
@@ -1929,4 +2034,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.1"
python-versions = ">=3.10,<4.0"
content-hash = "f5cd0d1dafeb2b5c97d0ef27bef8a2235d4a1f54e3c60583d05ef582ac49c0e6"
content-hash = "c8e23c0609cae0717447f575849b658bee9203b784ec7270b62629cddbbbd9ca"


@@ -10,18 +10,17 @@ packages = [{ include = "autogpt_libs" }]
colorama = "^0.4.6"
expiringdict = "^1.2.2"
google-cloud-logging = "^3.11.4"
pydantic = "^2.10.6"
pydantic-settings = "^2.7.1"
pydantic = "^2.11.1"
pydantic-settings = "^2.8.1"
pyjwt = "^2.10.1"
pytest-asyncio = "^0.25.3"
pytest-asyncio = "^0.26.0"
pytest-mock = "^3.14.0"
python = ">=3.10,<4.0"
python-dotenv = "^1.0.1"
supabase = "^2.13.0"
supabase = "^2.15.0"
[tool.poetry.group.dev.dependencies]
redis = "^5.2.1"
ruff = "^0.9.6"
ruff = "^0.11.0"
[build-system]
requires = ["poetry-core"]


@@ -8,6 +8,7 @@ DB_CONNECT_TIMEOUT=60
DB_POOL_TIMEOUT=300
DB_SCHEMA=platform
DATABASE_URL="postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:${DB_PORT}/${DB_NAME}?schema=${DB_SCHEMA}&connect_timeout=${DB_CONNECT_TIMEOUT}"
DIRECT_URL="postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:${DB_PORT}/${DB_NAME}?schema=${DB_SCHEMA}&connect_timeout=${DB_CONNECT_TIMEOUT}"
PRISMA_SCHEMA="postgres/schema.prisma"
# EXECUTOR
@@ -188,6 +189,8 @@ SMARTLEAD_API_KEY=
# ZeroBounce
ZEROBOUNCE_API_KEY=
## ===== OPTIONAL API KEYS END ===== ##
# Logging Configuration
LOG_LEVEL=INFO
ENABLE_CLOUD_LOGGING=false


@@ -73,7 +73,6 @@ FROM server_dependencies AS server
COPY autogpt_platform/backend /app/autogpt_platform/backend
RUN poetry install --no-ansi --only-root
ENV DATABASE_URL=""
ENV PORT=8000
CMD ["poetry", "run", "rest"]


@@ -2,88 +2,103 @@ import importlib
import os
import re
from pathlib import Path
from typing import Type, TypeVar
from backend.data.block import Block
# Dynamically load all modules under backend.blocks
AVAILABLE_MODULES = []
current_dir = Path(__file__).parent
modules = [
str(f.relative_to(current_dir))[:-3].replace(os.path.sep, ".")
for f in current_dir.rglob("*.py")
if f.is_file() and f.name != "__init__.py"
]
for module in modules:
if not re.match("^[a-z0-9_.]+$", module):
raise ValueError(
f"Block module {module} error: module name must be lowercase, "
"and contain only alphanumeric characters and underscores."
)
importlib.import_module(f".{module}", package=__name__)
AVAILABLE_MODULES.append(module)
# Load all Block instances from the available modules
AVAILABLE_BLOCKS: dict[str, Type[Block]] = {}
from typing import TYPE_CHECKING, TypeVar
if TYPE_CHECKING:
from backend.data.block import Block
T = TypeVar("T")
def all_subclasses(cls: Type[T]) -> list[Type[T]]:
_AVAILABLE_BLOCKS: dict[str, type["Block"]] = {}
def load_all_blocks() -> dict[str, type["Block"]]:
from backend.data.block import Block
if _AVAILABLE_BLOCKS:
return _AVAILABLE_BLOCKS
# Dynamically load all modules under backend.blocks
AVAILABLE_MODULES = []
current_dir = Path(__file__).parent
modules = [
str(f.relative_to(current_dir))[:-3].replace(os.path.sep, ".")
for f in current_dir.rglob("*.py")
if f.is_file() and f.name != "__init__.py"
]
for module in modules:
if not re.match("^[a-z0-9_.]+$", module):
raise ValueError(
f"Block module {module} error: module name must be lowercase, "
"and contain only alphanumeric characters and underscores."
)
importlib.import_module(f".{module}", package=__name__)
AVAILABLE_MODULES.append(module)
# Load all Block instances from the available modules
for block_cls in all_subclasses(Block):
class_name = block_cls.__name__
if class_name.endswith("Base"):
continue
if not class_name.endswith("Block"):
raise ValueError(
f"Block class {class_name} does not end with 'Block'. "
"If you are creating an abstract class, "
"please name the class with 'Base' at the end"
)
block = block_cls.create()
if not isinstance(block.id, str) or len(block.id) != 36:
raise ValueError(
f"Block ID {block.name} error: {block.id} is not a valid UUID"
)
if block.id in _AVAILABLE_BLOCKS:
raise ValueError(
f"Block ID {block.name} error: {block.id} is already in use"
)
input_schema = block.input_schema.model_fields
output_schema = block.output_schema.model_fields
# Make sure `error` field is a string in the output schema
if "error" in output_schema and output_schema["error"].annotation is not str:
raise ValueError(
f"{block.name} `error` field in output_schema must be a string"
)
# Ensure all fields in input_schema and output_schema are annotated SchemaFields
for field_name, field in [*input_schema.items(), *output_schema.items()]:
if field.annotation is None:
raise ValueError(
f"{block.name} has a field {field_name} that is not annotated"
)
if field.json_schema_extra is None:
raise ValueError(
f"{block.name} has a field {field_name} not defined as SchemaField"
)
for field in block.input_schema.model_fields.values():
if field.annotation is bool and field.default not in (True, False):
raise ValueError(
f"{block.name} has a boolean field with no default value"
)
_AVAILABLE_BLOCKS[block.id] = block_cls
return _AVAILABLE_BLOCKS
__all__ = ["load_all_blocks"]
def all_subclasses(cls: type[T]) -> list[type[T]]:
subclasses = cls.__subclasses__()
for subclass in subclasses:
subclasses += all_subclasses(subclass)
return subclasses
for block_cls in all_subclasses(Block):
name = block_cls.__name__
if block_cls.__name__.endswith("Base"):
continue
if not block_cls.__name__.endswith("Block"):
raise ValueError(
f"Block class {block_cls.__name__} does not end with 'Block', If you are creating an abstract class, please name the class with 'Base' at the end"
)
block = block_cls.create()
if not isinstance(block.id, str) or len(block.id) != 36:
raise ValueError(f"Block ID {block.name} error: {block.id} is not a valid UUID")
if block.id in AVAILABLE_BLOCKS:
raise ValueError(f"Block ID {block.name} error: {block.id} is already in use")
input_schema = block.input_schema.model_fields
output_schema = block.output_schema.model_fields
# Make sure `error` field is a string in the output schema
if "error" in output_schema and output_schema["error"].annotation is not str:
raise ValueError(
f"{block.name} `error` field in output_schema must be a string"
)
# Make sure all fields in input_schema and output_schema are annotated and have a value
for field_name, field in [*input_schema.items(), *output_schema.items()]:
if field.annotation is None:
raise ValueError(
f"{block.name} has a field {field_name} that is not annotated"
)
if field.json_schema_extra is None:
raise ValueError(
f"{block.name} has a field {field_name} not defined as SchemaField"
)
for field in block.input_schema.model_fields.values():
if field.annotation is bool and field.default not in (True, False):
raise ValueError(f"{block.name} has a boolean field with no default value")
if block.disabled:
continue
AVAILABLE_BLOCKS[block.id] = block_cls
__all__ = ["AVAILABLE_MODULES", "AVAILABLE_BLOCKS"]


@@ -1,8 +1,6 @@
import logging
from typing import Any
from autogpt_libs.utils.cache import thread_cached
from backend.data.block import (
Block,
BlockCategory,
@@ -19,21 +17,6 @@ from backend.util import json
logger = logging.getLogger(__name__)
@thread_cached
def get_executor_manager_client():
from backend.executor import ExecutionManager
from backend.util.service import get_service_client
return get_service_client(ExecutionManager)
@thread_cached
def get_event_bus():
from backend.data.execution import RedisExecutionEventBus
return RedisExecutionEventBus()
class AgentExecutorBlock(Block):
class Input(BlockSchema):
user_id: str = SchemaField(description="User ID")
@@ -75,26 +58,26 @@ class AgentExecutorBlock(Block):
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
executor_manager = get_executor_manager_client()
event_bus = get_event_bus()
from backend.data.execution import ExecutionEventType
from backend.executor import utils as execution_utils
graph_exec = executor_manager.add_execution(
event_bus = execution_utils.get_execution_event_bus()
graph_exec = execution_utils.add_graph_execution(
graph_id=input_data.graph_id,
graph_version=input_data.graph_version,
user_id=input_data.user_id,
data=input_data.data,
inputs=input_data.data,
)
log_id = f"Graph #{input_data.graph_id}-V{input_data.graph_version}, exec-id: {graph_exec.graph_exec_id}"
log_id = f"Graph #{input_data.graph_id}-V{input_data.graph_version}, exec-id: {graph_exec.id}"
logger.info(f"Starting execution of {log_id}")
for event in event_bus.listen(
graph_id=graph_exec.graph_id, graph_exec_id=graph_exec.graph_exec_id
user_id=graph_exec.user_id,
graph_id=graph_exec.graph_id,
graph_exec_id=graph_exec.id,
):
logger.info(
f"Execution {log_id} produced input {event.input_data} output {event.output_data}"
)
if not event.node_id:
if event.event_type == ExecutionEventType.GRAPH_EXEC_UPDATE:
if event.status in [
ExecutionStatus.COMPLETED,
ExecutionStatus.TERMINATED,
@@ -105,6 +88,10 @@ class AgentExecutorBlock(Block):
else:
continue
logger.debug(
f"Execution {log_id} produced input {event.input_data} output {event.output_data}"
)
if not event.block_id:
logger.warning(f"{log_id} received event without block_id {event}")
continue
@@ -119,5 +106,7 @@ class AgentExecutorBlock(Block):
continue
for output_data in event.output_data.get("output", []):
logger.info(f"Execution {log_id} produced {output_name}: {output_data}")
logger.debug(
f"Execution {log_id} produced {output_name}: {output_data}"
)
yield output_name, output_data
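Across these hunks the terminal-state check moves from `not event.node_id` to an explicit `event.event_type == ExecutionEventType.GRAPH_EXEC_UPDATE`, with node-level events now identified by `event.block_id`. A condensed sketch of the resulting event classification, using only names visible in the hunks; the exact set of terminal statuses is partly elided by the diff, so treat it as an assumption:

```python
def classify(event) -> str:
    """Illustrative classification of execution events under the new scheme."""
    if event.event_type == ExecutionEventType.GRAPH_EXEC_UPDATE:
        # Graph-level update: stop listening once the run reaches a terminal state.
        terminal = (ExecutionStatus.COMPLETED, ExecutionStatus.TERMINATED)  # possibly more
        return "terminal" if event.status in terminal else "in-progress"
    if not event.block_id:
        return "skip"  # node event without a block attached; logged as a warning above
    return "node-output"  # candidate for yielding output_name / output_data
```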


@@ -1,7 +1,7 @@
from enum import Enum
from typing import Any, Optional
from pydantic import BaseModel
from pydantic import BaseModel, ConfigDict
from backend.data.model import SchemaField
@@ -143,11 +143,12 @@ class ContactEmail(BaseModel):
class EmploymentHistory(BaseModel):
"""An employment history in Apollo"""
class Config:
extra = "allow"
arbitrary_types_allowed = True
from_attributes = True
populate_by_name = True
model_config = ConfigDict(
extra="allow",
arbitrary_types_allowed=True,
from_attributes=True,
populate_by_name=True,
)
_id: Optional[str] = None
created_at: Optional[str] = None
@@ -188,11 +189,12 @@ class TypedCustomField(BaseModel):
class Pagination(BaseModel):
"""Pagination in Apollo"""
class Config:
extra = "allow" # Allow extra fields
arbitrary_types_allowed = True # Allow any type
from_attributes = True # Allow from_orm
populate_by_name = True # Allow field aliases to work both ways
model_config = ConfigDict(
extra="allow",
arbitrary_types_allowed=True,
from_attributes=True,
populate_by_name=True,
)
page: int = 0
per_page: int = 0
@@ -230,11 +232,12 @@ class PhoneNumber(BaseModel):
class Organization(BaseModel):
"""An organization in Apollo"""
class Config:
extra = "allow"
arbitrary_types_allowed = True
from_attributes = True
populate_by_name = True
model_config = ConfigDict(
extra="allow",
arbitrary_types_allowed=True,
from_attributes=True,
populate_by_name=True,
)
id: Optional[str] = "N/A"
name: Optional[str] = "N/A"
@@ -268,11 +271,12 @@ class Organization(BaseModel):
class Contact(BaseModel):
"""A contact in Apollo"""
class Config:
extra = "allow"
arbitrary_types_allowed = True
from_attributes = True
populate_by_name = True
model_config = ConfigDict(
extra="allow",
arbitrary_types_allowed=True,
from_attributes=True,
populate_by_name=True,
)
contact_roles: list[Any] = []
id: Optional[str] = None
@@ -369,14 +373,14 @@ If a company has several office locations, results are still based on the headqu
To exclude companies based on location, use the organization_not_locations parameter.
""",
default=[],
default_factory=list,
)
organizations_not_locations: list[str] = SchemaField(
description="""Exclude companies from search results based on the location of the company headquarters. You can use cities, US states, and countries as locations to exclude.
This parameter is useful for ensuring you do not prospect in an undesirable territory. For example, if you use ireland as a value, no Ireland-based companies will appear in your search results.
""",
default=[],
default_factory=list,
)
q_organization_keyword_tags: list[str] = SchemaField(
description="""Filter search results based on keywords associated with companies. For example, you can enter mining as a value to return only companies that have an association with the mining industry."""
@@ -390,7 +394,7 @@ If the value you enter for this parameter does not match with a company's name,
description="""The Apollo IDs for the companies you want to include in your search results. Each company in the Apollo database is assigned a unique ID.
To find IDs, identify the values for organization_id when you call this endpoint.""",
default=[],
default_factory=list,
)
max_results: int = SchemaField(
description="""The maximum number of results to return. If you don't specify this parameter, the default is 100.""",
@@ -443,14 +447,14 @@ Results also include job titles with the same terms, even if they are not exact
Use this parameter in combination with the person_seniorities[] parameter to find people based on specific job functions and seniority levels.
""",
default=[],
default_factory=list,
placeholder="marketing manager",
)
person_locations: list[str] = SchemaField(
description="""The location where people live. You can search across cities, US states, and countries.
To find people based on the headquarters locations of their current employer, use the organization_locations parameter.""",
default=[],
default_factory=list,
)
person_seniorities: list[SenorityLevels] = SchemaField(
description="""The job seniority that people hold within their current employer. This enables you to find people that currently hold positions at certain reporting levels, such as Director level or senior IC level.
@@ -460,7 +464,7 @@ For a person to be included in search results, they only need to match 1 of the
Searches only return results based on their current job title, so searching for Director-level employees only returns people that currently hold a Director-level title. If someone was previously a Director, but is currently a VP, they would not be included in your search results.
Use this parameter in combination with the person_titles[] parameter to find people based on specific job functions and seniority levels.""",
default=[],
default_factory=list,
)
organization_locations: list[str] = SchemaField(
description="""The location of the company headquarters for a person's current employer. You can search across cities, US states, and countries.
@@ -468,7 +472,7 @@ Use this parameter in combination with the person_titles[] parameter to find peo
If a company has several office locations, results are still based on the headquarters location. For example, if you search chicago but a company's HQ location is in boston, people that work for the Boston-based company will not appear in your results, even if they match other parameters.
To find people based on their personal location, use the person_locations parameter.""",
default=[],
default_factory=list,
)
q_organization_domains: list[str] = SchemaField(
description="""The domain name for the person's employer. This can be the current employer or a previous employer. Do not include www., the @ symbol, or similar.
@@ -476,23 +480,23 @@ To find people based on their personal location, use the person_locations parame
You can add multiple domains to search across companies.
Examples: apollo.io and microsoft.com""",
default=[],
default_factory=list,
)
contact_email_statuses: list[ContactEmailStatuses] = SchemaField(
description="""The email statuses for the people you want to find. You can add multiple statuses to expand your search.""",
default=[],
default_factory=list,
)
organization_ids: list[str] = SchemaField(
description="""The Apollo IDs for the companies (employers) you want to include in your search results. Each company in the Apollo database is assigned a unique ID.
To find IDs, call the Organization Search endpoint and identify the values for organization_id.""",
default=[],
default_factory=list,
)
organization_num_empoloyees_range: list[int] = SchemaField(
description="""The number range of employees working for the company. This enables you to find companies based on headcount. You can add multiple ranges to expand your search results.
Each range you add needs to be a string, with the upper and lower numbers of the range separated only by a comma.""",
default=[],
default_factory=list,
)
q_keywords: str = SchemaField(
description="""A string of words over which we want to filter the results""",
@@ -522,11 +526,12 @@ Use the page parameter to search the different pages of data.""",
class SearchPeopleResponse(BaseModel):
"""Response from Apollo's search people API"""
class Config:
extra = "allow" # Allow extra fields
arbitrary_types_allowed = True # Allow any type
from_attributes = True # Allow from_orm
populate_by_name = True # Allow field aliases to work both ways
model_config = ConfigDict(
extra="allow",
arbitrary_types_allowed=True,
from_attributes=True,
populate_by_name=True,
)
breadcrumbs: list[Breadcrumb] = []
partial_results_only: bool = True
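All of the `class Config` → `model_config = ConfigDict(...)` edits in this file are the standard Pydantic v2 migration: the nested config class is deprecated in 2.x in favour of a `model_config` attribute. A minimal before/after sketch of the pattern, detached from the Apollo models themselves:

```python
from pydantic import BaseModel, ConfigDict


class LegacyStyle(BaseModel):
    # Pydantic v1-style nested config; still works in v2 but emits a deprecation warning.
    class Config:
        extra = "allow"
        populate_by_name = True


class CurrentStyle(BaseModel):
    # Pydantic v2 spelling, as now used throughout this file.
    model_config = ConfigDict(extra="allow", populate_by_name=True)
```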


@@ -32,18 +32,18 @@ If a company has several office locations, results are still based on the headqu
To exclude companies based on location, use the organization_not_locations parameter.
""",
default=[],
default_factory=list,
)
organizations_not_locations: list[str] = SchemaField(
description="""Exclude companies from search results based on the location of the company headquarters. You can use cities, US states, and countries as locations to exclude.
This parameter is useful for ensuring you do not prospect in an undesirable territory. For example, if you use ireland as a value, no Ireland-based companies will appear in your search results.
""",
default=[],
default_factory=list,
)
q_organization_keyword_tags: list[str] = SchemaField(
description="""Filter search results based on keywords associated with companies. For example, you can enter mining as a value to return only companies that have an association with the mining industry.""",
default=[],
default_factory=list,
)
q_organization_name: str = SchemaField(
description="""Filter search results to include a specific company name.
@@ -56,7 +56,7 @@ If the value you enter for this parameter does not match with a company's name,
description="""The Apollo IDs for the companies you want to include in your search results. Each company in the Apollo database is assigned a unique ID.
To find IDs, identify the values for organization_id when you call this endpoint.""",
default=[],
default_factory=list,
)
max_results: int = SchemaField(
description="""The maximum number of results to return. If you don't specify this parameter, the default is 100.""",
@@ -72,7 +72,7 @@ To find IDs, identify the values for organization_id when you call this endpoint
class Output(BlockSchema):
organizations: list[Organization] = SchemaField(
description="List of organizations found",
default=[],
default_factory=list,
)
organization: Organization = SchemaField(
description="Each found organization, one at a time",


@@ -26,14 +26,14 @@ class SearchPeopleBlock(Block):
Use this parameter in combination with the person_seniorities[] parameter to find people based on specific job functions and seniority levels.
""",
default=[],
default_factory=list,
advanced=False,
)
person_locations: list[str] = SchemaField(
description="""The location where people live. You can search across cities, US states, and countries.
To find people based on the headquarters locations of their current employer, use the organization_locations parameter.""",
default=[],
default_factory=list,
advanced=False,
)
person_seniorities: list[SenorityLevels] = SchemaField(
@@ -44,7 +44,7 @@ class SearchPeopleBlock(Block):
Searches only return results based on their current job title, so searching for Director-level employees only returns people that currently hold a Director-level title. If someone was previously a Director, but is currently a VP, they would not be included in your search results.
Use this parameter in combination with the person_titles[] parameter to find people based on specific job functions and seniority levels.""",
default=[],
default_factory=list,
advanced=False,
)
organization_locations: list[str] = SchemaField(
@@ -53,7 +53,7 @@ class SearchPeopleBlock(Block):
If a company has several office locations, results are still based on the headquarters location. For example, if you search chicago but a company's HQ location is in boston, people that work for the Boston-based company will not appear in your results, even if they match other parameters.
To find people based on their personal location, use the person_locations parameter.""",
default=[],
default_factory=list,
advanced=False,
)
q_organization_domains: list[str] = SchemaField(
@@ -62,26 +62,26 @@ class SearchPeopleBlock(Block):
You can add multiple domains to search across companies.
Examples: apollo.io and microsoft.com""",
default=[],
default_factory=list,
advanced=False,
)
contact_email_statuses: list[ContactEmailStatuses] = SchemaField(
description="""The email statuses for the people you want to find. You can add multiple statuses to expand your search.""",
default=[],
default_factory=list,
advanced=False,
)
organization_ids: list[str] = SchemaField(
description="""The Apollo IDs for the companies (employers) you want to include in your search results. Each company in the Apollo database is assigned a unique ID.
To find IDs, call the Organization Search endpoint and identify the values for organization_id.""",
default=[],
default_factory=list,
advanced=False,
)
organization_num_empoloyees_range: list[int] = SchemaField(
description="""The number range of employees working for the company. This enables you to find companies based on headcount. You can add multiple ranges to expand your search results.
Each range you add needs to be a string, with the upper and lower numbers of the range separated only by a comma.""",
default=[],
default_factory=list,
advanced=False,
)
q_keywords: str = SchemaField(
@@ -104,7 +104,7 @@ class SearchPeopleBlock(Block):
class Output(BlockSchema):
people: list[Contact] = SchemaField(
description="List of people found",
default=[],
default_factory=list,
)
person: Contact = SchemaField(
description="Each found person, one at a time",


@@ -1,35 +1,22 @@
import enum
from typing import TYPE_CHECKING, Any, List
from typing import Any, List
from backend.data.block import (
Block,
BlockCategory,
BlockInput,
BlockOutput,
BlockSchema,
BlockType,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema, BlockType
from backend.data.model import SchemaField
from backend.util import json
from backend.util.file import MediaFile, store_media_file
from backend.util.file import store_media_file
from backend.util.mock import MockObject
from backend.util.text import TextFormatter
from backend.util.type import convert
if TYPE_CHECKING:
from backend.data.graph import Link
formatter = TextFormatter()
from backend.util.type import MediaFileType, convert
class FileStoreBlock(Block):
class Input(BlockSchema):
file_in: MediaFile = SchemaField(
file_in: MediaFileType = SchemaField(
description="The file to store in the temporary directory, it can be a URL, data URI, or local path."
)
class Output(BlockSchema):
file_out: MediaFile = SchemaField(
file_out: MediaFileType = SchemaField(
description="The relative path to the stored file in the temporary directory."
)
@@ -101,29 +88,6 @@ class StoreValueBlock(Block):
yield "output", input_data.data or input_data.input
class PrintToConsoleBlock(Block):
class Input(BlockSchema):
text: str = SchemaField(description="The text to print to the console.")
class Output(BlockSchema):
status: str = SchemaField(description="The status of the print operation.")
def __init__(self):
super().__init__(
id="f3b1c1b2-4c4f-4f0d-8d2f-4c4f0d8d2f4c",
description="Print the given text to the console, this is used for a debugging purpose.",
categories={BlockCategory.BASIC},
input_schema=PrintToConsoleBlock.Input,
output_schema=PrintToConsoleBlock.Output,
test_input={"text": "Hello, World!"},
test_output=("status", "printed"),
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
print(">>>>> Print: ", input_data.text)
yield "status", "printed"
class FindInDictionaryBlock(Block):
class Input(BlockSchema):
input: Any = SchemaField(description="Dictionary to lookup from")
@@ -184,192 +148,10 @@ class FindInDictionaryBlock(Block):
yield "missing", input_data.input
class AgentInputBlock(Block):
"""
This block is used to provide input to the graph.
It takes in a value, name, description, a list of default values, and a flag to limit selection to those defaults.
It outputs the value passed as input.
"""
class Input(BlockSchema):
name: str = SchemaField(description="The name of the input.")
value: Any = SchemaField(
description="The value to be passed as input.",
default=None,
)
title: str | None = SchemaField(
description="The title of the input.", default=None, advanced=True
)
description: str | None = SchemaField(
description="The description of the input.",
default=None,
advanced=True,
)
placeholder_values: List[Any] = SchemaField(
description="The placeholder values to be passed as input.",
default=[],
advanced=True,
)
limit_to_placeholder_values: bool = SchemaField(
description="Whether to limit the selection to placeholder values.",
default=False,
advanced=True,
)
advanced: bool = SchemaField(
description="Whether to show the input in the advanced section, if the field is not required.",
default=False,
advanced=True,
)
secret: bool = SchemaField(
description="Whether the input should be treated as a secret.",
default=False,
advanced=True,
)
class Output(BlockSchema):
result: Any = SchemaField(description="The value passed as input.")
def __init__(self):
super().__init__(
id="c0a8e994-ebf1-4a9c-a4d8-89d09c86741b",
description="This block is used to provide input to the graph.",
input_schema=AgentInputBlock.Input,
output_schema=AgentInputBlock.Output,
test_input=[
{
"value": "Hello, World!",
"name": "input_1",
"description": "This is a test input.",
"placeholder_values": [],
"limit_to_placeholder_values": False,
},
{
"value": "Hello, World!",
"name": "input_2",
"description": "This is a test input.",
"placeholder_values": ["Hello, World!"],
"limit_to_placeholder_values": True,
},
],
test_output=[
("result", "Hello, World!"),
("result", "Hello, World!"),
],
categories={BlockCategory.INPUT, BlockCategory.BASIC},
block_type=BlockType.INPUT,
static_output=True,
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
yield "result", input_data.value
class AgentOutputBlock(Block):
"""
Records the output of the graph for users to see.
Behavior:
If `format` is provided and the `value` is of a type that can be formatted,
the block attempts to format the recorded_value using the `format`.
If formatting fails or no `format` is provided, the raw `value` is output.
"""
class Input(BlockSchema):
value: Any = SchemaField(
description="The value to be recorded as output.",
default=None,
advanced=False,
)
name: str = SchemaField(description="The name of the output.")
title: str | None = SchemaField(
description="The title of the output.",
default=None,
advanced=True,
)
description: str | None = SchemaField(
description="The description of the output.",
default=None,
advanced=True,
)
format: str = SchemaField(
description="The format string to be used to format the recorded_value. Use Jinja2 syntax.",
default="",
advanced=True,
)
advanced: bool = SchemaField(
description="Whether to treat the output as advanced.",
default=False,
advanced=True,
)
secret: bool = SchemaField(
description="Whether the output should be treated as a secret.",
default=False,
advanced=True,
)
class Output(BlockSchema):
output: Any = SchemaField(description="The value recorded as output.")
name: Any = SchemaField(description="The name of the value recorded as output.")
def __init__(self):
super().__init__(
id="363ae599-353e-4804-937e-b2ee3cef3da4",
description="Stores the output of the graph for users to see.",
input_schema=AgentOutputBlock.Input,
output_schema=AgentOutputBlock.Output,
test_input=[
{
"value": "Hello, World!",
"name": "output_1",
"description": "This is a test output.",
"format": "{{ output_1 }}!!",
},
{
"value": "42",
"name": "output_2",
"description": "This is another test output.",
"format": "{{ output_2 }}",
},
{
"value": MockObject(value="!!", key="key"),
"name": "output_3",
"description": "This is a test output with a mock object.",
"format": "{{ output_3 }}",
},
],
test_output=[
("output", "Hello, World!!!"),
("output", "42"),
("output", MockObject(value="!!", key="key")),
],
categories={BlockCategory.OUTPUT, BlockCategory.BASIC},
block_type=BlockType.OUTPUT,
static_output=True,
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
"""
Attempts to format the recorded_value using the fmt_string if provided.
If formatting fails or no fmt_string is given, returns the original recorded_value.
"""
if input_data.format:
try:
yield "output", formatter.format_string(
input_data.format, {input_data.name: input_data.value}
)
except Exception as e:
yield "output", f"Error: {e}, {input_data.value}"
else:
yield "output", input_data.value
yield "name", input_data.name
class AddToDictionaryBlock(Block):
class Input(BlockSchema):
dictionary: dict[Any, Any] = SchemaField(
default={},
default_factory=dict,
description="The dictionary to add the entry to. If not provided, a new dictionary will be created.",
)
key: str = SchemaField(
@@ -385,7 +167,7 @@ class AddToDictionaryBlock(Block):
advanced=False,
)
entries: dict[Any, Any] = SchemaField(
default={},
default_factory=dict,
description="The entries to add to the dictionary. This is the batch version of the `key` and `value` fields.",
advanced=True,
)
@@ -447,7 +229,7 @@ class AddToDictionaryBlock(Block):
class AddToListBlock(Block):
class Input(BlockSchema):
list: List[Any] = SchemaField(
default=[],
default_factory=list,
advanced=False,
description="The list to add the entry to. If not provided, a new list will be created.",
)
@@ -457,7 +239,7 @@ class AddToListBlock(Block):
default=None,
)
entries: List[Any] = SchemaField(
default=[],
default_factory=lambda: list(),
description="The entries to add to the list. This is the batch version of the `entry` field.",
advanced=True,
)
@@ -466,17 +248,6 @@ class AddToListBlock(Block):
description="The position to insert the new entry. If not provided, the entry will be appended to the end of the list.",
)
@classmethod
def get_missing_links(cls, data: BlockInput, links: List["Link"]) -> set[str]:
return super().get_missing_links(
data,
[
link
for link in links
if link.sink_name != "list" or link.sink_id != link.source_id
],
)
class Output(BlockSchema):
updated_list: List[Any] = SchemaField(
description="The list with the new entry added."


@@ -55,7 +55,7 @@ class CodeExecutionBlock(Block):
"These commands are executed with `sh`, in the foreground."
),
placeholder="pip install cowsay",
default=[],
default_factory=list,
advanced=False,
)
@@ -207,7 +207,7 @@ class InstantiationBlock(Block):
"These commands are executed with `sh`, in the foreground."
),
placeholder="pip install cowsay",
default=[],
default_factory=list,
advanced=False,
)


@@ -8,6 +8,7 @@ from backend.data.block import (
BlockSchema,
)
from backend.data.model import SchemaField
from backend.integrations.providers import ProviderName
from backend.integrations.webhooks.compass import CompassWebhookType
@@ -42,7 +43,7 @@ class CompassAITriggerBlock(Block):
input_schema=CompassAITriggerBlock.Input,
output_schema=CompassAITriggerBlock.Output,
webhook_config=BlockManualWebhookConfig(
provider="compass",
provider=ProviderName.COMPASS,
webhook_type=CompassWebhookType.TRANSCRIPTION,
),
test_input=[


@@ -34,7 +34,7 @@ class ReadCsvBlock(Block):
)
skip_columns: list[str] = SchemaField(
description="The columns to skip from the start of the row",
default=[],
default_factory=list,
)
class Output(BlockSchema):


@@ -49,7 +49,7 @@ class ExaContentsBlock(Block):
class Output(BlockSchema):
results: list = SchemaField(
description="List of document contents",
default=[],
default_factory=list,
)
error: str = SchemaField(description="Error message if the request failed")


@@ -38,11 +38,11 @@ class ExaSearchBlock(Block):
)
include_domains: List[str] = SchemaField(
description="Domains to include in search",
default=[],
default_factory=list,
)
exclude_domains: List[str] = SchemaField(
description="Domains to exclude from search",
default=[],
default_factory=list,
advanced=True,
)
start_crawl_date: datetime = SchemaField(
@@ -59,12 +59,12 @@ class ExaSearchBlock(Block):
)
include_text: List[str] = SchemaField(
description="Text patterns to include",
default=[],
default_factory=list,
advanced=True,
)
exclude_text: List[str] = SchemaField(
description="Text patterns to exclude",
default=[],
default_factory=list,
advanced=True,
)
contents: ContentSettings = SchemaField(
@@ -76,7 +76,7 @@ class ExaSearchBlock(Block):
class Output(BlockSchema):
results: list = SchemaField(
description="List of search results",
default=[],
default_factory=list,
)
def __init__(self):


@@ -26,12 +26,12 @@ class ExaFindSimilarBlock(Block):
)
include_domains: List[str] = SchemaField(
description="Domains to include in search",
default=[],
default_factory=list,
advanced=True,
)
exclude_domains: List[str] = SchemaField(
description="Domains to exclude from search",
default=[],
default_factory=list,
advanced=True,
)
start_crawl_date: datetime = SchemaField(
@@ -48,12 +48,12 @@ class ExaFindSimilarBlock(Block):
)
include_text: List[str] = SchemaField(
description="Text patterns to include (max 1 string, up to 5 words)",
default=[],
default_factory=list,
advanced=True,
)
exclude_text: List[str] = SchemaField(
description="Text patterns to exclude (max 1 string, up to 5 words)",
default=[],
default_factory=list,
advanced=True,
)
contents: ContentSettings = SchemaField(
@@ -65,7 +65,7 @@ class ExaFindSimilarBlock(Block):
class Output(BlockSchema):
results: List[Any] = SchemaField(
description="List of similar documents with title, URL, published date, author, and score",
default=[],
default_factory=list,
)
def __init__(self):


@@ -42,7 +42,7 @@ class AIVideoGeneratorBlock(Block):
description="Error message if video generation failed."
)
logs: list[str] = SchemaField(
description="Generation progress logs.", optional=True
description="Generation progress logs.",
)
def __init__(self):


@@ -0,0 +1,51 @@
from backend.data.block import (
Block,
BlockCategory,
BlockManualWebhookConfig,
BlockOutput,
BlockSchema,
)
from backend.data.model import SchemaField
from backend.integrations.providers import ProviderName
from backend.integrations.webhooks.generic import GenericWebhookType
class GenericWebhookTriggerBlock(Block):
class Input(BlockSchema):
payload: dict = SchemaField(hidden=True, default_factory=dict)
constants: dict = SchemaField(
description="The constants to be set when the block is put on the graph",
default_factory=dict,
)
class Output(BlockSchema):
payload: dict = SchemaField(
description="The complete webhook payload that was received from the generic webhook."
)
constants: dict = SchemaField(
description="The constants to be set when the block is put on the graph"
)
example_payload = {"message": "Hello, World!"}
def __init__(self):
super().__init__(
id="8fa8c167-2002-47ce-aba8-97572fc5d387",
description="This block will output the contents of the generic input for the webhook.",
categories={BlockCategory.INPUT},
input_schema=GenericWebhookTriggerBlock.Input,
output_schema=GenericWebhookTriggerBlock.Output,
webhook_config=BlockManualWebhookConfig(
provider=ProviderName.GENERIC_WEBHOOK,
webhook_type=GenericWebhookType.PLAIN,
),
test_input={"constants": {"key": "value"}, "payload": self.example_payload},
test_output=[
("constants", {"key": "value"}),
("payload", self.example_payload),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
yield "constants", input_data.constants
yield "payload", input_data.payload
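
The new trigger block simply re-emits whatever the webhook delivered, plus the constants configured on the node. A rough driver for the test input shown above — hypothetical, since the real tests go through the block testing framework rather than calling `run` directly:

```python
# Hypothetical direct invocation; block.input_schema is the Input model class.
block = GenericWebhookTriggerBlock()
input_data = block.input_schema(
    constants={"key": "value"},
    payload={"message": "Hello, World!"},
)
assert list(block.run(input_data)) == [
    ("constants", {"key": "value"}),
    ("payload", {"message": "Hello, World!"}),
]
```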

View File

@@ -12,6 +12,7 @@ from backend.data.block import (
BlockWebhookConfig,
)
from backend.data.model import SchemaField
from backend.integrations.providers import ProviderName
from ._auth import (
TEST_CREDENTIALS,
@@ -36,7 +37,7 @@ class GitHubTriggerBase:
placeholder="{owner}/{repo}",
)
# --8<-- [start:example-payload-field]
payload: dict = SchemaField(hidden=True, default={})
payload: dict = SchemaField(hidden=True, default_factory=dict)
# --8<-- [end:example-payload-field]
class Output(BlockSchema):
@@ -123,7 +124,7 @@ class GithubPullRequestTriggerBlock(GitHubTriggerBase, Block):
output_schema=GithubPullRequestTriggerBlock.Output,
# --8<-- [start:example-webhook_config]
webhook_config=BlockWebhookConfig(
provider="github",
provider=ProviderName.GITHUB,
webhook_type=GithubWebhookType.REPO,
resource_format="{repo}",
event_filter_input="events",

View File

@@ -1,11 +1,16 @@
import json
import logging
from enum import Enum
from typing import Any
from requests.exceptions import HTTPError, RequestException
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import requests
logger = logging.getLogger(name=__name__)
class HttpMethod(Enum):
GET = "GET"
@@ -29,7 +34,7 @@ class SendWebRequestBlock(Block):
)
headers: dict[str, str] = SchemaField(
description="The headers to include in the request",
default={},
default_factory=dict,
)
json_format: bool = SchemaField(
title="JSON format",
@@ -43,8 +48,9 @@ class SendWebRequestBlock(Block):
class Output(BlockSchema):
response: object = SchemaField(description="The response from the server")
client_error: object = SchemaField(description="The error on 4xx status codes")
server_error: object = SchemaField(description="The error on 5xx status codes")
client_error: object = SchemaField(description="Errors on 4xx status codes")
server_error: object = SchemaField(description="Errors on 5xx status codes")
error: str = SchemaField(description="Errors for all other exceptions")
def __init__(self):
super().__init__(
@@ -68,20 +74,40 @@ class SendWebRequestBlock(Block):
# we should send it as plain text instead
input_data.json_format = False
response = requests.request(
input_data.method.value,
input_data.url,
headers=input_data.headers,
json=body if input_data.json_format else None,
data=body if not input_data.json_format else None,
)
result = response.json() if input_data.json_format else response.text
if response.status_code // 100 == 2:
try:
response = requests.request(
input_data.method.value,
input_data.url,
headers=input_data.headers,
json=body if input_data.json_format else None,
data=body if not input_data.json_format else None,
)
result = response.json() if input_data.json_format else response.text
yield "response", result
elif response.status_code // 100 == 4:
yield "client_error", result
elif response.status_code // 100 == 5:
yield "server_error", result
else:
raise ValueError(f"Unexpected status code: {response.status_code}")
except HTTPError as e:
# Handle error responses
try:
result = e.response.json() if input_data.json_format else str(e)
except json.JSONDecodeError:
result = str(e)
if 400 <= e.response.status_code < 500:
yield "client_error", result
elif 500 <= e.response.status_code < 600:
yield "server_error", result
else:
error_msg = (
"Unexpected status code "
f"{e.response.status_code} '{e.response.reason}'"
)
logger.warning(error_msg)
yield "error", error_msg
except RequestException as e:
# Handle other request-related exceptions
yield "error", str(e)
except Exception as e:
# Catch any other unexpected exceptions
yield "error", str(e)
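
The rewritten `run` now routes results by status class instead of branching on `status_code // 100`: 2xx responses go to `response`, 4xx errors raised as `HTTPError` go to `client_error`, 5xx to `server_error`, and anything else — unexpected status codes, other request failures, or unrelated exceptions — ends up on the new `error` output. A condensed sketch of that routing, with a hypothetical `make_request` callable standing in for the block's wrapped `requests` client:

```python
from requests.exceptions import HTTPError, RequestException


def route_response(make_request):
    """Yield (output_name, value) pairs the way SendWebRequestBlock does."""
    try:
        yield "response", make_request().json()
    except HTTPError as e:
        status = e.response.status_code
        if 400 <= status < 500:
            yield "client_error", str(e)
        elif 500 <= status < 600:
            yield "server_error", str(e)
        else:
            yield "error", f"Unexpected status code {status}"
    except RequestException as e:
        yield "error", str(e)
    except Exception as e:
        yield "error", str(e)
```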

View File

@@ -15,7 +15,8 @@ class HubSpotCompanyBlock(Block):
description="Operation to perform (create, update, get)", default="get"
)
company_data: dict = SchemaField(
description="Company data for create/update operations", default={}
description="Company data for create/update operations",
default_factory=dict,
)
domain: str = SchemaField(
description="Company domain for get/update operations", default=""

View File

@@ -15,7 +15,8 @@ class HubSpotContactBlock(Block):
description="Operation to perform (create, update, get)", default="get"
)
contact_data: dict = SchemaField(
description="Contact data for create/update operations", default={}
description="Contact data for create/update operations",
default_factory=dict,
)
email: str = SchemaField(
description="Email address for get/update operations", default=""

View File

@@ -19,7 +19,7 @@ class HubSpotEngagementBlock(Block):
)
email_data: dict = SchemaField(
description="Email data including recipient, subject, content",
default={},
default_factory=dict,
)
contact_id: str = SchemaField(
description="Contact ID for engagement tracking", default=""
@@ -27,7 +27,6 @@ class HubSpotEngagementBlock(Block):
timeframe_days: int = SchemaField(
description="Number of days to look back for engagement",
default=30,
optional=True,
)
class Output(BlockSchema):

View File

@@ -0,0 +1,556 @@
import copy
from datetime import date, time
from typing import Any, Optional
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema, BlockType
from backend.data.model import SchemaField
from backend.util.file import store_media_file
from backend.util.mock import MockObject
from backend.util.settings import Config
from backend.util.text import TextFormatter
from backend.util.type import LongTextType, MediaFileType, ShortTextType
formatter = TextFormatter()
config = Config()
class AgentInputBlock(Block):
"""
This block is used to provide input to the graph.
It takes in a value, name, description, a list of default values, and a bool to limit selection to those defaults.
It outputs the value passed as input.
"""
class Input(BlockSchema):
name: str = SchemaField(description="The name of the input.")
value: Any = SchemaField(
description="The value to be passed as input.",
default=None,
)
title: str | None = SchemaField(
description="The title of the input.", default=None, advanced=True
)
description: str | None = SchemaField(
description="The description of the input.",
default=None,
advanced=True,
)
placeholder_values: list = SchemaField(
description="The placeholder values to be passed as input.",
default_factory=list,
advanced=True,
hidden=True,
)
advanced: bool = SchemaField(
description="Whether to show the input in the advanced section, if the field is not required.",
default=False,
advanced=True,
)
secret: bool = SchemaField(
description="Whether the input should be treated as a secret.",
default=False,
advanced=True,
)
def generate_schema(self):
schema = copy.deepcopy(self.get_field_schema("value"))
if possible_values := self.placeholder_values:
schema["enum"] = possible_values
return schema
class Output(BlockSchema):
result: Any = SchemaField(description="The value passed as input.")
def __init__(self, **kwargs):
super().__init__(
**{
"id": "c0a8e994-ebf1-4a9c-a4d8-89d09c86741b",
"description": "Base block for user inputs.",
"input_schema": AgentInputBlock.Input,
"output_schema": AgentInputBlock.Output,
"test_input": [
{
"value": "Hello, World!",
"name": "input_1",
"description": "Example test input.",
"placeholder_values": [],
},
{
"value": "Hello, World!",
"name": "input_2",
"description": "Example test input with placeholders.",
"placeholder_values": ["Hello, World!"],
},
],
"test_output": [
("result", "Hello, World!"),
("result", "Hello, World!"),
],
"categories": {BlockCategory.INPUT, BlockCategory.BASIC},
"block_type": BlockType.INPUT,
"static_output": True,
**kwargs,
}
)
def run(self, input_data: Input, *args, **kwargs) -> BlockOutput:
if input_data.value is not None:
yield "result", input_data.value
class AgentOutputBlock(Block):
"""
Records the output of the graph for users to see.
Behavior:
If `format` is provided and the `value` is of a type that can be formatted,
the block attempts to format the `value` using the `format` template.
If formatting fails or no `format` is provided, the raw `value` is output.
"""
class Input(BlockSchema):
value: Any = SchemaField(
description="The value to be recorded as output.",
default=None,
advanced=False,
)
name: str = SchemaField(description="The name of the output.")
title: str | None = SchemaField(
description="The title of the output.",
default=None,
advanced=True,
)
description: str | None = SchemaField(
description="The description of the output.",
default=None,
advanced=True,
)
format: str = SchemaField(
description="The format string to be used to format the recorded_value. Use Jinja2 syntax.",
default="",
advanced=True,
)
advanced: bool = SchemaField(
description="Whether to treat the output as advanced.",
default=False,
advanced=True,
)
secret: bool = SchemaField(
description="Whether the output should be treated as a secret.",
default=False,
advanced=True,
)
def generate_schema(self):
return self.get_field_schema("value")
class Output(BlockSchema):
output: Any = SchemaField(description="The value recorded as output.")
name: Any = SchemaField(description="The name of the value recorded as output.")
def __init__(self):
super().__init__(
id="363ae599-353e-4804-937e-b2ee3cef3da4",
description="Stores the output of the graph for users to see.",
input_schema=AgentOutputBlock.Input,
output_schema=AgentOutputBlock.Output,
test_input=[
{
"value": "Hello, World!",
"name": "output_1",
"description": "This is a test output.",
"format": "{{ output_1 }}!!",
},
{
"value": "42",
"name": "output_2",
"description": "This is another test output.",
"format": "{{ output_2 }}",
},
{
"value": MockObject(value="!!", key="key"),
"name": "output_3",
"description": "This is a test output with a mock object.",
"format": "{{ output_3 }}",
},
],
test_output=[
("output", "Hello, World!!!"),
("output", "42"),
("output", MockObject(value="!!", key="key")),
],
categories={BlockCategory.OUTPUT, BlockCategory.BASIC},
block_type=BlockType.OUTPUT,
static_output=True,
)
def run(self, input_data: Input, *args, **kwargs) -> BlockOutput:
"""
Attempts to format the `value` using the `format` template if provided.
If formatting fails or no `format` is given, outputs the original `value`.
"""
if input_data.format:
try:
yield "output", formatter.format_string(
input_data.format, {input_data.name: input_data.value}
)
except Exception as e:
yield "output", f"Error: {e}, {input_data.value}"
else:
yield "output", input_data.value
yield "name", input_data.name
class AgentShortTextInputBlock(AgentInputBlock):
class Input(AgentInputBlock.Input):
value: Optional[ShortTextType] = SchemaField(
description="Short text input.",
default=None,
advanced=False,
title="Default Value",
)
class Output(AgentInputBlock.Output):
result: str = SchemaField(description="Short text result.")
def __init__(self):
super().__init__(
id="7fcd3bcb-8e1b-4e69-903d-32d3d4a92158",
description="Block for short text input (single-line).",
disabled=not config.enable_agent_input_subtype_blocks,
input_schema=AgentShortTextInputBlock.Input,
output_schema=AgentShortTextInputBlock.Output,
test_input=[
{
"value": "Hello",
"name": "short_text_1",
"description": "Short text example 1",
"placeholder_values": [],
},
{
"value": "Quick test",
"name": "short_text_2",
"description": "Short text example 2",
"placeholder_values": ["Quick test", "Another option"],
},
],
test_output=[
("result", "Hello"),
("result", "Quick test"),
],
)
class AgentLongTextInputBlock(AgentInputBlock):
class Input(AgentInputBlock.Input):
value: Optional[LongTextType] = SchemaField(
description="Long text input (potentially multi-line).",
default=None,
advanced=False,
title="Default Value",
)
class Output(AgentInputBlock.Output):
result: str = SchemaField(description="Long text result.")
def __init__(self):
super().__init__(
id="90a56ffb-7024-4b2b-ab50-e26c5e5ab8ba",
description="Block for long text input (multi-line).",
disabled=not config.enable_agent_input_subtype_blocks,
input_schema=AgentLongTextInputBlock.Input,
output_schema=AgentLongTextInputBlock.Output,
test_input=[
{
"value": "Lorem ipsum dolor sit amet...",
"name": "long_text_1",
"description": "Long text example 1",
"placeholder_values": [],
},
{
"value": "Another multiline text input.",
"name": "long_text_2",
"description": "Long text example 2",
"placeholder_values": ["Another multiline text input."],
},
],
test_output=[
("result", "Lorem ipsum dolor sit amet..."),
("result", "Another multiline text input."),
],
)
class AgentNumberInputBlock(AgentInputBlock):
class Input(AgentInputBlock.Input):
value: Optional[int] = SchemaField(
description="Number input.",
default=None,
advanced=False,
title="Default Value",
)
class Output(AgentInputBlock.Output):
result: int = SchemaField(description="Number result.")
def __init__(self):
super().__init__(
id="96dae2bb-97a2-41c2-bd2f-13a3b5a8ea98",
description="Block for number input.",
disabled=not config.enable_agent_input_subtype_blocks,
input_schema=AgentNumberInputBlock.Input,
output_schema=AgentNumberInputBlock.Output,
test_input=[
{
"value": 42,
"name": "number_input_1",
"description": "Number example 1",
"placeholder_values": [],
},
{
"value": 314,
"name": "number_input_2",
"description": "Number example 2",
"placeholder_values": [314, 2718],
},
],
test_output=[
("result", 42),
("result", 314),
],
)
class AgentDateInputBlock(AgentInputBlock):
class Input(AgentInputBlock.Input):
value: Optional[date] = SchemaField(
description="Date input (YYYY-MM-DD).",
default=None,
advanced=False,
title="Default Value",
)
class Output(AgentInputBlock.Output):
result: date = SchemaField(description="Date result.")
def __init__(self):
super().__init__(
id="7e198b09-4994-47db-8b4d-952d98241817",
description="Block for date input.",
disabled=not config.enable_agent_input_subtype_blocks,
input_schema=AgentDateInputBlock.Input,
output_schema=AgentDateInputBlock.Output,
test_input=[
{
# If your system can parse JSON date strings to date objects
"value": str(date(2025, 3, 19)),
"name": "date_input_1",
"description": "Example date input 1",
},
{
"value": str(date(2023, 12, 31)),
"name": "date_input_2",
"description": "Example date input 2",
},
],
test_output=[
("result", date(2025, 3, 19)),
("result", date(2023, 12, 31)),
],
)
class AgentTimeInputBlock(AgentInputBlock):
class Input(AgentInputBlock.Input):
value: Optional[time] = SchemaField(
description="Time input (HH:MM:SS).",
default=None,
advanced=False,
title="Default Value",
)
class Output(AgentInputBlock.Output):
result: time = SchemaField(description="Time result.")
def __init__(self):
super().__init__(
id="2a1c757e-86cf-4c7e-aacf-060dc382e434",
description="Block for time input.",
disabled=not config.enable_agent_input_subtype_blocks,
input_schema=AgentTimeInputBlock.Input,
output_schema=AgentTimeInputBlock.Output,
test_input=[
{
"value": str(time(9, 30, 0)),
"name": "time_input_1",
"description": "Time example 1",
},
{
"value": str(time(23, 59, 59)),
"name": "time_input_2",
"description": "Time example 2",
},
],
test_output=[
("result", time(9, 30, 0)),
("result", time(23, 59, 59)),
],
)
class AgentFileInputBlock(AgentInputBlock):
"""
A simplified file-upload block. In real usage, you might have a custom
file type or handle binary data. Here, we'll store a string path as the example.
"""
class Input(AgentInputBlock.Input):
value: Optional[MediaFileType] = SchemaField(
description="Path or reference to an uploaded file.",
default=None,
advanced=False,
title="Default Value",
)
class Output(AgentInputBlock.Output):
result: str = SchemaField(description="File reference/path result.")
def __init__(self):
super().__init__(
id="95ead23f-8283-4654-aef3-10c053b74a31",
description="Block for file upload input (string path for example).",
disabled=not config.enable_agent_input_subtype_blocks,
input_schema=AgentFileInputBlock.Input,
output_schema=AgentFileInputBlock.Output,
test_input=[
{
"value": "data:image/png;base64,MQ==",
"name": "file_upload_1",
"description": "Example file upload 1",
},
],
test_output=[
("result", str),
],
)
def run(
self,
input_data: Input,
*,
graph_exec_id: str,
**kwargs,
) -> BlockOutput:
if not input_data.value:
return
file_path = store_media_file(
graph_exec_id=graph_exec_id,
file=input_data.value,
return_content=False,
)
yield "result", file_path
class AgentDropdownInputBlock(AgentInputBlock):
"""
A specialized text input block that relies on placeholder_values to present a dropdown.
"""
class Input(AgentInputBlock.Input):
value: Optional[str] = SchemaField(
description="Text selected from a dropdown.",
default=None,
advanced=False,
title="Default Value",
)
placeholder_values: list = SchemaField(
description="Possible values for the dropdown.",
default_factory=list,
advanced=False,
title="Dropdown Options",
)
class Output(AgentInputBlock.Output):
result: str = SchemaField(description="Selected dropdown value.")
def __init__(self):
super().__init__(
id="655d6fdf-a334-421c-b733-520549c07cd1",
description="Block for dropdown text selection.",
disabled=not config.enable_agent_input_subtype_blocks,
input_schema=AgentDropdownInputBlock.Input,
output_schema=AgentDropdownInputBlock.Output,
test_input=[
{
"value": "Option A",
"name": "dropdown_1",
"placeholder_values": ["Option A", "Option B", "Option C"],
"description": "Dropdown example 1",
},
{
"value": "Option C",
"name": "dropdown_2",
"placeholder_values": ["Option A", "Option B", "Option C"],
"description": "Dropdown example 2",
},
],
test_output=[
("result", "Option A"),
("result", "Option C"),
],
)
class AgentToggleInputBlock(AgentInputBlock):
class Input(AgentInputBlock.Input):
value: bool = SchemaField(
description="Boolean toggle input.",
default=False,
advanced=False,
title="Default Value",
)
class Output(AgentInputBlock.Output):
result: bool = SchemaField(description="Boolean toggle result.")
def __init__(self):
super().__init__(
id="cbf36ab5-df4a-43b6-8a7f-f7ed8652116e",
description="Block for boolean toggle input.",
disabled=not config.enable_agent_input_subtype_blocks,
input_schema=AgentToggleInputBlock.Input,
output_schema=AgentToggleInputBlock.Output,
test_input=[
{
"value": True,
"name": "toggle_1",
"description": "Toggle example 1",
},
{
"value": False,
"name": "toggle_2",
"description": "Toggle example 2",
},
],
test_output=[
("result", True),
("result", False),
],
)
IO_BLOCK_IDs = [
AgentInputBlock().id,
AgentOutputBlock().id,
AgentShortTextInputBlock().id,
AgentLongTextInputBlock().id,
AgentNumberInputBlock().id,
AgentDateInputBlock().id,
AgentTimeInputBlock().id,
AgentFileInputBlock().id,
AgentDropdownInputBlock().id,
AgentToggleInputBlock().id,
]
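
The only non-trivial logic in this new file is the output block's formatting path: when a `format` template is set, the value is rendered through `TextFormatter` with the output's `name` as the template variable, and any failure falls back to an error string that still carries the raw value. A rough equivalent, assuming a Jinja2-style formatter in place of `backend.util.text.TextFormatter`:

```python
from jinja2 import Template  # stand-in for the project's TextFormatter


def render_output(name: str, value, fmt: str | None):
    # Mirrors AgentOutputBlock.run: try to format, fall back to the raw value.
    if not fmt:
        return value
    try:
        return Template(fmt).render({name: value})
    except Exception as e:
        return f"Error: {e}, {value}"


print(render_output("output_1", "Hello, World!", "{{ output_1 }}!!"))
# -> "Hello, World!!!" (matches the block's first test case)
```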

View File

@@ -11,13 +11,13 @@ class StepThroughItemsBlock(Block):
advanced=False,
description="The list or dictionary of items to iterate over",
placeholder="[1, 2, 3, 4, 5] or {'key1': 'value1', 'key2': 'value2'}",
default=[],
default_factory=list,
)
items_object: dict = SchemaField(
advanced=False,
description="The list or dictionary of items to iterate over",
placeholder="[1, 2, 3, 4, 5] or {'key1': 'value1', 'key2': 'value2'}",
default={},
default_factory=dict,
)
items_str: str = SchemaField(
advanced=False,

View File

@@ -23,7 +23,7 @@ class JinaChunkingBlock(Block):
class Output(BlockSchema):
chunks: list = SchemaField(description="List of chunked texts")
tokens: list = SchemaField(
description="List of token information for each chunk", optional=True
description="List of token information for each chunk",
)
def __init__(self):

View File

@@ -1,4 +1,4 @@
from groq._utils._utils import quote
from urllib.parse import quote
from backend.blocks.jina._auth import (
TEST_CREDENTIALS,

View File

@@ -28,8 +28,8 @@ class LinearCreateIssueBlock(Block):
priority: int | None = SchemaField(
description="Priority of the issue",
default=None,
minimum=0,
maximum=4,
ge=0,
le=4,
)
project_name: str | None = SchemaField(
description="Name of the project to create the issue on",

View File

@@ -4,30 +4,24 @@ from abc import ABC
from enum import Enum, EnumMeta
from json import JSONDecodeError
from types import MappingProxyType
from typing import TYPE_CHECKING, Any, Iterable, List, Literal, NamedTuple, Optional
from pydantic import BaseModel, SecretStr
from backend.data.model import NodeExecutionStats
from backend.integrations.providers import ProviderName
if TYPE_CHECKING:
from enum import _EnumMemberT
from typing import Any, Iterable, List, Literal, NamedTuple, Optional
import anthropic
import ollama
import openai
from anthropic._types import NotGiven
from anthropic.types import ToolParam
from groq import Groq
from pydantic import BaseModel, SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
NodeExecutionStats,
SchemaField,
)
from backend.integrations.providers import ProviderName
from backend.util import json
from backend.util.settings import BehaveAs, Settings
from backend.util.text import TextFormatter
@@ -77,12 +71,10 @@ class ModelMetadata(NamedTuple):
class LlmModelMeta(EnumMeta):
@property
def __members__(
self: type["_EnumMemberT"],
) -> MappingProxyType[str, "_EnumMemberT"]:
def __members__(self) -> MappingProxyType:
if Settings().config.behave_as == BehaveAs.LOCAL:
members = super().__members__
return members
return MappingProxyType(members)
else:
removed_providers = ["ollama"]
existing_members = super().__members__
@@ -97,14 +89,17 @@ class LlmModelMeta(EnumMeta):
class LlmModel(str, Enum, metaclass=LlmModelMeta):
# OpenAI models
O3_MINI = "o3-mini"
O3 = "o3-2025-04-16"
O1 = "o1"
O1_PREVIEW = "o1-preview"
O1_MINI = "o1-mini"
GPT41 = "gpt-4.1-2025-04-14"
GPT4O_MINI = "gpt-4o-mini"
GPT4O = "gpt-4o"
GPT4_TURBO = "gpt-4-turbo"
GPT3_5_TURBO = "gpt-3.5-turbo"
# Anthropic models
CLAUDE_3_7_SONNET = "claude-3-7-sonnet-20250219"
CLAUDE_3_5_SONNET = "claude-3-5-sonnet-latest"
CLAUDE_3_5_HAIKU = "claude-3-5-haiku-latest"
CLAUDE_3_HAIKU = "claude-3-haiku-20240307"
@@ -125,6 +120,7 @@ class LlmModel(str, Enum, metaclass=LlmModelMeta):
OLLAMA_DOLPHIN = "dolphin-mistral:latest"
# OpenRouter models
GEMINI_FLASH_1_5 = "google/gemini-flash-1.5"
GEMINI_2_5_PRO = "google/gemini-2.5-pro-preview-03-25"
GROK_BETA = "x-ai/grok-beta"
MISTRAL_NEMO = "mistralai/mistral-nemo"
COHERE_COMMAND_R_08_2024 = "cohere/command-r-08-2024"
@@ -142,6 +138,8 @@ class LlmModel(str, Enum, metaclass=LlmModelMeta):
AMAZON_NOVA_PRO_V1 = "amazon/nova-pro-v1"
MICROSOFT_WIZARDLM_2_8X22B = "microsoft/wizardlm-2-8x22b"
GRYPHE_MYTHOMAX_L2_13B = "gryphe/mythomax-l2-13b"
META_LLAMA_4_SCOUT = "meta-llama/llama-4-scout"
META_LLAMA_4_MAVERICK = "meta-llama/llama-4-maverick"
@property
def metadata(self) -> ModelMetadata:
@@ -162,12 +160,14 @@ class LlmModel(str, Enum, metaclass=LlmModelMeta):
MODEL_METADATA = {
# https://platform.openai.com/docs/models
LlmModel.O3: ModelMetadata("openai", 200000, 100000),
LlmModel.O3_MINI: ModelMetadata("openai", 200000, 100000), # o3-mini-2025-01-31
LlmModel.O1: ModelMetadata("openai", 200000, 100000), # o1-2024-12-17
LlmModel.O1_PREVIEW: ModelMetadata(
"openai", 128000, 32768
), # o1-preview-2024-09-12
LlmModel.O1_MINI: ModelMetadata("openai", 128000, 65536), # o1-mini-2024-09-12
LlmModel.GPT41: ModelMetadata("openai", 1047576, 32768),
LlmModel.GPT4O_MINI: ModelMetadata(
"openai", 128000, 16384
), # gpt-4o-mini-2024-07-18
@@ -177,6 +177,9 @@ MODEL_METADATA = {
), # gpt-4-turbo-2024-04-09
LlmModel.GPT3_5_TURBO: ModelMetadata("openai", 16385, 4096), # gpt-3.5-turbo-0125
# https://docs.anthropic.com/en/docs/about-claude/models
LlmModel.CLAUDE_3_7_SONNET: ModelMetadata(
"anthropic", 200000, 8192
), # claude-3-7-sonnet-20250219
LlmModel.CLAUDE_3_5_SONNET: ModelMetadata(
"anthropic", 200000, 8192
), # claude-3-5-sonnet-20241022
@@ -202,6 +205,7 @@ MODEL_METADATA = {
LlmModel.OLLAMA_DOLPHIN: ModelMetadata("ollama", 32768, None),
# https://openrouter.ai/models
LlmModel.GEMINI_FLASH_1_5: ModelMetadata("open_router", 1000000, 8192),
LlmModel.GEMINI_2_5_PRO: ModelMetadata("open_router", 1050000, 8192),
LlmModel.GROK_BETA: ModelMetadata("open_router", 131072, 131072),
LlmModel.MISTRAL_NEMO: ModelMetadata("open_router", 128000, 4096),
LlmModel.COHERE_COMMAND_R_08_2024: ModelMetadata("open_router", 128000, 4096),
@@ -223,6 +227,8 @@ MODEL_METADATA = {
LlmModel.AMAZON_NOVA_PRO_V1: ModelMetadata("open_router", 300000, 5120),
LlmModel.MICROSOFT_WIZARDLM_2_8X22B: ModelMetadata("open_router", 65536, 4096),
LlmModel.GRYPHE_MYTHOMAX_L2_13B: ModelMetadata("open_router", 4096, 4096),
LlmModel.META_LLAMA_4_SCOUT: ModelMetadata("open_router", 131072, 131072),
LlmModel.META_LLAMA_4_MAVERICK: ModelMetadata("open_router", 1048576, 1000000),
}
for model in LlmModel:
@@ -252,7 +258,7 @@ class LLMResponse(BaseModel):
def convert_openai_tool_fmt_to_anthropic(
openai_tools: list[dict] | None = None,
) -> Iterable[ToolParam] | NotGiven:
) -> Iterable[ToolParam] | anthropic.NotGiven:
"""
Convert OpenAI tool format to Anthropic tool format.
"""
@@ -290,6 +296,7 @@ def llm_call(
max_tokens: int | None,
tools: list[dict] | None = None,
ollama_host: str = "localhost:11434",
parallel_tool_calls: bool | None = None,
) -> LLMResponse:
"""
Make a call to a language model.
@@ -335,6 +342,9 @@ def llm_call(
response_format=response_format, # type: ignore
max_completion_tokens=max_tokens,
tools=tools_param, # type: ignore
parallel_tool_calls=(
openai.NOT_GIVEN if parallel_tool_calls is None else parallel_tool_calls
),
)
if response.choices[0].message.tool_calls:
@@ -424,7 +434,7 @@ def llm_call(
response=(
resp.content[0].name
if isinstance(resp.content[0], anthropic.types.ToolUseBlock)
else resp.content[0].text
else getattr(resp.content[0], "text", "")
),
tool_calls=tool_calls,
prompt_tokens=resp.usage.input_tokens,
@@ -490,6 +500,9 @@ def llm_call(
messages=prompt, # type: ignore
max_tokens=max_tokens,
tools=tools_param, # type: ignore
parallel_tool_calls=(
openai.NOT_GIVEN if parallel_tool_calls is None else parallel_tool_calls
),
)
# If there's no response, raise an error
@@ -528,7 +541,7 @@ def llm_call(
class AIBlockBase(Block, ABC):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.prompt = ""
self.prompt = []
def merge_llm_stats(self, block: "AIBlockBase"):
self.merge_stats(block.execution_stats)
@@ -558,7 +571,7 @@ class AIStructuredResponseGeneratorBlock(AIBlockBase):
description="The system prompt to provide additional context to the model.",
)
conversation_history: list[dict] = SchemaField(
default=[],
default_factory=list,
description="The conversation history to provide context for the prompt.",
)
retry: int = SchemaField(
@@ -568,7 +581,7 @@ class AIStructuredResponseGeneratorBlock(AIBlockBase):
)
prompt_values: dict[str, str] = SchemaField(
advanced=False,
default={},
default_factory=dict,
description="Values used to fill in the prompt. The values can be used in the prompt by putting them in a double curly braces, e.g. {{variable_name}}.",
)
max_tokens: int | None = SchemaField(
@@ -587,7 +600,7 @@ class AIStructuredResponseGeneratorBlock(AIBlockBase):
response: dict[str, Any] = SchemaField(
description="The response object generated by the language model."
)
prompt: str = SchemaField(description="The prompt sent to the language model.")
prompt: list = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
@@ -609,7 +622,7 @@ class AIStructuredResponseGeneratorBlock(AIBlockBase):
test_credentials=TEST_CREDENTIALS,
test_output=[
("response", {"key1": "key1Value", "key2": "key2Value"}),
("prompt", str),
("prompt", list),
],
test_mock={
"llm_call": lambda *args, **kwargs: LLMResponse(
@@ -642,6 +655,7 @@ class AIStructuredResponseGeneratorBlock(AIBlockBase):
Test mocks work only on class functions; this wraps the llm_call function
so that it can be mocked within the block testing framework.
"""
self.prompt = prompt
return llm_call(
credentials=credentials,
llm_model=llm_model,
@@ -796,7 +810,7 @@ class AITextGeneratorBlock(AIBlockBase):
)
prompt_values: dict[str, str] = SchemaField(
advanced=False,
default={},
default_factory=dict,
description="Values used to fill in the prompt. The values can be used in the prompt by putting them in a double curly braces, e.g. {{variable_name}}.",
)
ollama_host: str = SchemaField(
@@ -814,7 +828,7 @@ class AITextGeneratorBlock(AIBlockBase):
response: str = SchemaField(
description="The response generated by the language model."
)
prompt: str = SchemaField(description="The prompt sent to the language model.")
prompt: list = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
@@ -831,7 +845,7 @@ class AITextGeneratorBlock(AIBlockBase):
test_credentials=TEST_CREDENTIALS,
test_output=[
("response", "Response text"),
("prompt", str),
("prompt", list),
],
test_mock={"llm_call": lambda *args, **kwargs: "Response text"},
)
@@ -850,7 +864,10 @@ class AITextGeneratorBlock(AIBlockBase):
self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
) -> BlockOutput:
object_input_data = AIStructuredResponseGeneratorBlock.Input(
**{attr: getattr(input_data, attr) for attr in input_data.model_fields},
**{
attr: getattr(input_data, attr)
for attr in AITextGeneratorBlock.Input.model_fields
},
expected_format={},
)
yield "response", self.llm_call(object_input_data, credentials)
@@ -907,7 +924,7 @@ class AITextSummarizerBlock(AIBlockBase):
class Output(BlockSchema):
summary: str = SchemaField(description="The final summary of the text.")
prompt: str = SchemaField(description="The prompt sent to the language model.")
prompt: list = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
@@ -924,7 +941,7 @@ class AITextSummarizerBlock(AIBlockBase):
test_credentials=TEST_CREDENTIALS,
test_output=[
("summary", "Final summary of a long text"),
("prompt", str),
("prompt", list),
],
test_mock={
"llm_call": lambda input_data, credentials: (
@@ -1033,8 +1050,14 @@ class AITextSummarizerBlock(AIBlockBase):
class AIConversationBlock(AIBlockBase):
class Input(BlockSchema):
prompt: str = SchemaField(
description="The prompt to send to the language model.",
placeholder="Enter your prompt here...",
default="",
advanced=False,
)
messages: List[Any] = SchemaField(
description="List of messages in the conversation.", min_length=1
description="List of messages in the conversation.",
)
model: LlmModel = SchemaField(
title="LLM Model",
@@ -1057,7 +1080,7 @@ class AIConversationBlock(AIBlockBase):
response: str = SchemaField(
description="The model's response to the conversation."
)
prompt: str = SchemaField(description="The prompt sent to the language model.")
prompt: list = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
@@ -1086,7 +1109,7 @@ class AIConversationBlock(AIBlockBase):
"response",
"The 2020 World Series was played at Globe Life Field in Arlington, Texas.",
),
("prompt", str),
("prompt", list),
],
test_mock={
"llm_call": lambda *args, **kwargs: "The 2020 World Series was played at Globe Life Field in Arlington, Texas."
@@ -1108,7 +1131,7 @@ class AIConversationBlock(AIBlockBase):
) -> BlockOutput:
response = self.llm_call(
AIStructuredResponseGeneratorBlock.Input(
prompt="",
prompt=input_data.prompt,
credentials=input_data.credentials,
model=input_data.model,
conversation_history=input_data.messages,
@@ -1166,7 +1189,7 @@ class AIListGeneratorBlock(AIBlockBase):
list_item: str = SchemaField(
description="Each individual item in the list.",
)
prompt: str = SchemaField(description="The prompt sent to the language model.")
prompt: list = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(
description="Error message if the list generation failed."
)
@@ -1198,7 +1221,7 @@ class AIListGeneratorBlock(AIBlockBase):
"generated_list",
["Zylora Prime", "Kharon-9", "Vortexia", "Oceara", "Draknos"],
),
("prompt", str),
("prompt", list),
("list_item", "Zylora Prime"),
("list_item", "Kharon-9"),
("list_item", "Vortexia"),

View File

@@ -8,13 +8,13 @@ from moviepy.video.io.VideoFileClip import VideoFileClip
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.file import MediaFile, get_exec_file_path, store_media_file
from backend.util.file import MediaFileType, get_exec_file_path, store_media_file
class MediaDurationBlock(Block):
class Input(BlockSchema):
media_in: MediaFile = SchemaField(
media_in: MediaFileType = SchemaField(
description="Media input (URL, data URI, or local path)."
)
is_video: bool = SchemaField(
@@ -69,7 +69,7 @@ class LoopVideoBlock(Block):
"""
class Input(BlockSchema):
video_in: MediaFile = SchemaField(
video_in: MediaFileType = SchemaField(
description="The input video (can be a URL, data URI, or local path)."
)
# Provide EITHER a `duration` or `n_loops` or both. We'll demonstrate `duration`.
@@ -137,7 +137,7 @@ class LoopVideoBlock(Block):
assert isinstance(looped_clip, VideoFileClip)
# 4) Save the looped output
output_filename = MediaFile(
output_filename = MediaFileType(
f"{node_exec_id}_looped_{os.path.basename(local_video_path)}"
)
output_abspath = get_exec_file_path(graph_exec_id, output_filename)
@@ -162,10 +162,10 @@ class AddAudioToVideoBlock(Block):
"""
class Input(BlockSchema):
video_in: MediaFile = SchemaField(
video_in: MediaFileType = SchemaField(
description="Video input (URL, data URI, or local path)."
)
audio_in: MediaFile = SchemaField(
audio_in: MediaFileType = SchemaField(
description="Audio input (URL, data URI, or local path)."
)
volume: float = SchemaField(
@@ -178,7 +178,7 @@ class AddAudioToVideoBlock(Block):
)
class Output(BlockSchema):
video_out: MediaFile = SchemaField(
video_out: MediaFileType = SchemaField(
description="Final video (with attached audio), as a path or data URI."
)
error: str = SchemaField(
@@ -229,7 +229,7 @@ class AddAudioToVideoBlock(Block):
final_clip = video_clip.with_audio(audio_clip)
# 4) Write to output file
output_filename = MediaFile(
output_filename = MediaFileType(
f"{node_exec_id}_audio_attached_{os.path.basename(local_video_path)}"
)
output_abspath = os.path.join(abs_temp_dir, output_filename)

View File

@@ -65,7 +65,7 @@ class AddMemoryBlock(Block, Mem0Base):
default=Content(discriminator="content", content="I'm a vegetarian"),
)
metadata: dict[str, Any] = SchemaField(
description="Optional metadata for the memory", default={}
description="Optional metadata for the memory", default_factory=dict
)
limit_memory_to_run: bool = SchemaField(
@@ -173,7 +173,7 @@ class SearchMemoryBlock(Block, Mem0Base):
)
categories_filter: list[str] = SchemaField(
description="Categories to filter by",
default=[],
default_factory=list,
advanced=True,
)
limit_memory_to_run: bool = SchemaField(

View File

@@ -6,13 +6,14 @@ from backend.blocks.nvidia._auth import (
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.request import requests
from backend.util.type import MediaFileType
class NvidiaDeepfakeDetectBlock(Block):
class Input(BlockSchema):
credentials: NvidiaCredentialsInput = NvidiaCredentialsField()
image_base64: str = SchemaField(
description="Image to analyze for deepfakes", image_upload=True
image_base64: MediaFileType = SchemaField(
description="Image to analyze for deepfakes",
)
return_image: bool = SchemaField(
description="Whether to return the processed image with markings",
@@ -22,16 +23,12 @@ class NvidiaDeepfakeDetectBlock(Block):
class Output(BlockSchema):
status: str = SchemaField(
description="Detection status (SUCCESS, ERROR, CONTENT_FILTERED)",
default="",
)
image: str = SchemaField(
image: MediaFileType = SchemaField(
description="Processed image with detection markings (if return_image=True)",
default="",
image_output=True,
)
is_deepfake: float = SchemaField(
description="Probability that the image is a deepfake (0-1)",
default=0.0,
)
def __init__(self):

View File

@@ -177,7 +177,8 @@ class PineconeInsertBlock(Block):
description="Namespace to use in Pinecone", default=""
)
metadata: dict = SchemaField(
description="Additional metadata to store with each vector", default={}
description="Additional metadata to store with each vector",
default_factory=dict,
)
class Output(BlockSchema):

View File

@@ -12,7 +12,7 @@ from backend.data.model import (
SchemaField,
)
from backend.integrations.providers import ProviderName
from backend.util.file import MediaFile, store_media_file
from backend.util.file import MediaFileType, store_media_file
from backend.util.request import Requests
@@ -57,7 +57,7 @@ class ScreenshotWebPageBlock(Block):
)
class Output(BlockSchema):
image: MediaFile = SchemaField(description="The screenshot image data")
image: MediaFileType = SchemaField(description="The screenshot image data")
error: str = SchemaField(description="Error message if the screenshot failed")
def __init__(self):
@@ -142,7 +142,9 @@ class ScreenshotWebPageBlock(Block):
return {
"image": store_media_file(
graph_exec_id=graph_exec_id,
file=f"data:image/{format.value};base64,{b64encode(response.content).decode('utf-8')}",
file=MediaFileType(
f"data:image/{format.value};base64,{b64encode(response.content).decode('utf-8')}"
),
return_content=True,
)
}
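
The screenshot block now wraps the raw response bytes in an explicit `MediaFileType` data URI before handing them to `store_media_file`. The URI construction itself is plain base64 embedding; a standalone sketch:

```python
from base64 import b64encode


def to_data_uri(content: bytes, image_format: str = "png") -> str:
    # Same shape as the string built above: data:image/<fmt>;base64,<payload>
    return f"data:image/{image_format};base64,{b64encode(content).decode('utf-8')}"


print(to_data_uri(b"fake-png-bytes")[:40])
```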

View File

@@ -8,6 +8,7 @@ from backend.data.block import (
BlockWebhookConfig,
)
from backend.data.model import SchemaField
from backend.integrations.providers import ProviderName
from backend.util import settings
from backend.util.settings import AppEnvironment, BehaveAs
@@ -25,7 +26,7 @@ class Slant3DTriggerBase:
class Input(BlockSchema):
credentials: Slant3DCredentialsInput = Slant3DCredentialsField()
# Webhook URL is handled by the webhook system
payload: dict = SchemaField(hidden=True, default={})
payload: dict = SchemaField(hidden=True, default_factory=dict)
class Output(BlockSchema):
payload: dict = SchemaField(
@@ -82,7 +83,7 @@ class Slant3DOrderWebhookBlock(Slant3DTriggerBase, Block):
input_schema=self.Input,
output_schema=self.Output,
webhook_config=BlockWebhookConfig(
provider="slant3d",
provider=ProviderName.SLANT3D,
webhook_type="orders", # Only one type for now
resource_format="", # No resource format needed
event_filter_input="events",

View File

@@ -14,7 +14,6 @@ from backend.data.block import (
BlockOutput,
BlockSchema,
BlockType,
get_block,
)
from backend.data.model import SchemaField
from backend.util import json
@@ -155,7 +154,7 @@ class SmartDecisionMakerBlock(Block):
description="The system prompt to provide additional context to the model.",
)
conversation_history: list[dict] = SchemaField(
default=[],
default_factory=list,
description="The conversation history to provide context for the prompt.",
)
last_tool_output: Any = SchemaField(
@@ -169,7 +168,7 @@ class SmartDecisionMakerBlock(Block):
)
prompt_values: dict[str, str] = SchemaField(
advanced=False,
default={},
default_factory=dict,
description="Values used to fill in the prompt. The values can be used in the prompt by putting them in a double curly braces, e.g. {{variable_name}}.",
)
max_tokens: int | None = SchemaField(
@@ -264,9 +263,7 @@ class SmartDecisionMakerBlock(Block):
Raises:
ValueError: If the block specified by sink_node.block_id is not found.
"""
block = get_block(sink_node.block_id)
if not block:
raise ValueError(f"Block not found: {sink_node.block_id}")
block = sink_node.block
tool_function: dict[str, Any] = {
"name": re.sub(r"[^a-zA-Z0-9_-]", "_", block.name).lower(),
@@ -494,6 +491,7 @@ class SmartDecisionMakerBlock(Block):
max_tokens=input_data.max_tokens,
tools=tool_functions,
ollama_host=input_data.ollama_host,
parallel_tool_calls=False,
)
if not response.tool_calls:

View File

@@ -112,7 +112,7 @@ class AddLeadToCampaignBlock(Block):
lead_list: list[LeadInput] = SchemaField(
description="An array of JSON objects, each representing a lead's details. Can hold max 100 leads.",
max_length=100,
default=[],
default_factory=list,
advanced=False,
)
settings: LeadUploadSettings = SchemaField(
@@ -248,7 +248,7 @@ class SaveCampaignSequencesBlock(Block):
)
sequences: list[Sequence] = SchemaField(
description="The sequences to save",
default=[],
default_factory=list,
advanced=False,
)
credentials: SmartLeadCredentialsInput = SchemaField(

View File

@@ -39,7 +39,7 @@ class LeadCustomFields(BaseModel):
fields: dict[str, str] = SchemaField(
description="Custom fields for a lead (max 20 fields)",
max_length=20,
default={},
default_factory=dict,
)
@@ -85,7 +85,7 @@ class AddLeadsRequest(BaseModel):
lead_list: list[LeadInput] = SchemaField(
description="List of leads to add to the campaign",
max_length=100,
default=[],
default_factory=list,
)
settings: LeadUploadSettings
campaign_id: int

View File

@@ -156,7 +156,7 @@
# participant_ids: list[str] = SchemaField(
# description="Array of User IDs to create conversation with (max 50)",
# placeholder="Enter participant user IDs",
# default=[],
# default_factory=list,
# advanced=False
# )

View File

@@ -39,7 +39,6 @@ class TwitterGetListBlock(Block):
list_id: str = SchemaField(
description="The ID of the List to lookup",
placeholder="Enter list ID",
required=True,
)
class Output(BlockSchema):
@@ -184,7 +183,6 @@ class TwitterGetOwnedListsBlock(Block):
user_id: str = SchemaField(
description="The user ID whose owned Lists to retrieve",
placeholder="Enter user ID",
required=True,
)
max_results: int | None = SchemaField(

View File

@@ -45,13 +45,11 @@ class TwitterRemoveListMemberBlock(Block):
list_id: str = SchemaField(
description="The ID of the List to remove the member from",
placeholder="Enter list ID",
required=True,
)
user_id: str = SchemaField(
description="The ID of the user to remove from the List",
placeholder="Enter user ID to remove",
required=True,
)
class Output(BlockSchema):
@@ -120,13 +118,11 @@ class TwitterAddListMemberBlock(Block):
list_id: str = SchemaField(
description="The ID of the List to add the member to",
placeholder="Enter list ID",
required=True,
)
user_id: str = SchemaField(
description="The ID of the user to add to the List",
placeholder="Enter user ID to add",
required=True,
)
class Output(BlockSchema):
@@ -195,7 +191,6 @@ class TwitterGetListMembersBlock(Block):
list_id: str = SchemaField(
description="The ID of the List to get members from",
placeholder="Enter list ID",
required=True,
)
max_results: int | None = SchemaField(
@@ -376,7 +371,6 @@ class TwitterGetListMembershipsBlock(Block):
user_id: str = SchemaField(
description="The ID of the user whose List memberships to retrieve",
placeholder="Enter user ID",
required=True,
)
max_results: int | None = SchemaField(

View File

@@ -42,7 +42,6 @@ class TwitterGetListTweetsBlock(Block):
list_id: str = SchemaField(
description="The ID of the List whose Tweets you would like to retrieve",
placeholder="Enter list ID",
required=True,
)
max_results: int | None = SchemaField(

View File

@@ -28,7 +28,6 @@ class TwitterDeleteListBlock(Block):
list_id: str = SchemaField(
description="The ID of the List to be deleted",
placeholder="Enter list ID",
required=True,
)
class Output(BlockSchema):

View File

@@ -39,7 +39,6 @@ class TwitterUnpinListBlock(Block):
list_id: str = SchemaField(
description="The ID of the List to unpin",
placeholder="Enter list ID",
required=True,
)
class Output(BlockSchema):
@@ -103,7 +102,6 @@ class TwitterPinListBlock(Block):
list_id: str = SchemaField(
description="The ID of the List to pin",
placeholder="Enter list ID",
required=True,
)
class Output(BlockSchema):

View File

@@ -44,7 +44,7 @@ class SpaceList(BaseModel):
space_ids: list[str] = SchemaField(
description="List of Space IDs to lookup (up to 100)",
placeholder="Enter Space IDs",
default=[],
default_factory=list,
advanced=False,
)
@@ -54,7 +54,7 @@ class UserList(BaseModel):
user_ids: list[str] = SchemaField(
description="List of user IDs to lookup their Spaces (up to 100)",
placeholder="Enter user IDs",
default=[],
default_factory=list,
advanced=False,
)
@@ -227,7 +227,6 @@ class TwitterGetSpaceByIdBlock(Block):
space_id: str = SchemaField(
description="Space ID to lookup",
placeholder="Enter Space ID",
required=True,
)
class Output(BlockSchema):
@@ -389,7 +388,6 @@ class TwitterGetSpaceBuyersBlock(Block):
space_id: str = SchemaField(
description="Space ID to lookup buyers for",
placeholder="Enter Space ID",
required=True,
)
class Output(BlockSchema):
@@ -517,7 +515,6 @@ class TwitterGetSpaceTweetsBlock(Block):
space_id: str = SchemaField(
description="Space ID to lookup tweets for",
placeholder="Enter Space ID",
required=True,
)
class Output(BlockSchema):

View File

@@ -200,7 +200,7 @@ class UserIdList(BaseModel):
user_ids: list[str] = SchemaField(
description="List of user IDs to lookup (max 100)",
placeholder="Enter user IDs",
default=[],
default_factory=list,
advanced=False,
)
@@ -210,7 +210,7 @@ class UsernameList(BaseModel):
usernames: list[str] = SchemaField(
description="List of Twitter usernames/handles to lookup (max 100)",
placeholder="Enter usernames",
default=[],
default_factory=list,
advanced=False,
)

View File

@@ -8,7 +8,6 @@ import pathlib
import click
import psutil
from backend import app
from backend.util.process import AppProcess
@@ -42,8 +41,13 @@ def write_pid(pid: int):
class MainApp(AppProcess):
def run(self):
from backend import app
app.main(silent=True)
def cleanup(self):
pass
@click.group()
def main():
@@ -220,9 +224,8 @@ def event():
@test.command()
@click.argument("server_address")
@click.argument("graph_id")
@click.argument("graph_version")
def websocket(server_address: str, graph_id: str, graph_version: int):
@click.argument("graph_exec_id")
def websocket(server_address: str, graph_exec_id: str):
"""
Tests the websocket connection.
"""
@@ -230,16 +233,20 @@ def websocket(server_address: str, graph_id: str, graph_version: int):
import websockets.asyncio.client
from backend.server.ws_api import ExecutionSubscription, Methods, WsMessage
from backend.server.ws_api import (
WSMessage,
WSMethod,
WSSubscribeGraphExecutionRequest,
)
async def send_message(server_address: str):
uri = f"ws://{server_address}"
async with websockets.asyncio.client.connect(uri) as websocket:
try:
msg = WsMessage(
method=Methods.SUBSCRIBE,
data=ExecutionSubscription(
graph_id=graph_id, graph_version=graph_version
msg = WSMessage(
method=WSMethod.SUBSCRIBE_GRAPH_EXEC,
data=WSSubscribeGraphExecutionRequest(
graph_exec_id=graph_exec_id,
).model_dump(),
).model_dump_json()
await websocket.send(msg)

View File

@@ -12,12 +12,12 @@ async def log_raw_analytics(
data_index: str,
):
details = await prisma.models.AnalyticsDetails.prisma().create(
data={
"userId": user_id,
"type": type,
"data": prisma.Json(data),
"dataIndex": data_index,
}
data=prisma.types.AnalyticsDetailsCreateInput(
userId=user_id,
type=type,
data=prisma.Json(data),
dataIndex=data_index,
)
)
return details
@@ -32,12 +32,12 @@ async def log_raw_metric(
raise ValueError("metric_value must be non-negative")
result = await prisma.models.AnalyticsMetrics.prisma().create(
data={
"value": metric_value,
"analyticMetric": metric_name,
"userId": user_id,
"dataString": data_string,
},
data=prisma.types.AnalyticsMetricsCreateInput(
value=metric_value,
analyticMetric=metric_name,
userId=user_id,
dataString=data_string,
)
)
return result

View File

@@ -17,15 +17,18 @@ from typing import (
import jsonref
import jsonschema
from prisma.models import AgentBlock
from prisma.types import AgentBlockCreateInput
from pydantic import BaseModel
from backend.data.model import NodeExecutionStats
from backend.integrations.providers import ProviderName
from backend.util import json
from backend.util.settings import Config
from .model import (
ContributorDetails,
Credentials,
CredentialsFieldInfo,
CredentialsMetaInput,
is_credentials_field_name,
)
@@ -119,21 +122,26 @@ class BlockSchema(BaseModel):
def get_mismatch_error(cls, data: BlockInput) -> str | None:
return cls.validate_data(data)
@classmethod
def get_field_schema(cls, field_name: str) -> dict[str, Any]:
model_schema = cls.jsonschema().get("properties", {})
if not model_schema:
raise ValueError(f"Invalid model schema {cls}")
property_schema = model_schema.get(field_name)
if not property_schema:
raise ValueError(f"Invalid property name {field_name}")
return property_schema
@classmethod
def validate_field(cls, field_name: str, data: BlockInput) -> str | None:
"""
Validate the data against a specific property (one of the input/output names).
Returns the validation error message if the data does not match the schema.
"""
model_schema = cls.jsonschema().get("properties", {})
if not model_schema:
return f"Invalid model schema {cls}"
property_schema = model_schema.get(field_name)
if not property_schema:
return f"Invalid property name {field_name}"
try:
property_schema = cls.get_field_schema(field_name)
jsonschema.validate(json.to_dict(data), property_schema)
return None
except jsonschema.ValidationError as e:
@@ -196,6 +204,15 @@ class BlockSchema(BaseModel):
)
}
@classmethod
def get_credentials_fields_info(cls) -> dict[str, CredentialsFieldInfo]:
return {
field_name: CredentialsFieldInfo.model_validate(
cls.get_field_schema(field_name), by_alias=True
)
for field_name in cls.get_credentials_fields().keys()
}
@classmethod
def get_input_defaults(cls, data: BlockInput) -> BlockInput:
return data # Return as is, by default.
@@ -225,7 +242,7 @@ class BlockManualWebhookConfig(BaseModel):
the user has to manually set up the webhook at the provider.
"""
provider: str
provider: ProviderName
"""The service provider that the webhook connects to"""
webhook_type: str
@@ -461,9 +478,9 @@ class Block(ABC, Generic[BlockSchemaInputType, BlockSchemaOutputType]):
def get_blocks() -> dict[str, Type[Block]]:
from backend.blocks import AVAILABLE_BLOCKS # noqa: E402
from backend.blocks import load_all_blocks
return AVAILABLE_BLOCKS
return load_all_blocks()
async def initialize_blocks() -> None:
@@ -474,12 +491,12 @@ async def initialize_blocks() -> None:
)
if not existing_block:
await AgentBlock.prisma().create(
data={
"id": block.id,
"name": block.name,
"inputSchema": json.dumps(block.input_schema.jsonschema()),
"outputSchema": json.dumps(block.output_schema.jsonschema()),
}
data=AgentBlockCreateInput(
id=block.id,
name=block.name,
inputSchema=json.dumps(block.input_schema.jsonschema()),
outputSchema=json.dumps(block.output_schema.jsonschema()),
)
)
continue
@@ -502,6 +519,7 @@ async def initialize_blocks() -> None:
)
def get_block(block_id: str) -> Block | None:
# Note on the return type annotation: https://github.com/microsoft/pyright/issues/10281
def get_block(block_id: str) -> Block[BlockSchema, BlockSchema] | None:
cls = get_blocks().get(block_id)
return cls() if cls else None
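
The `validate_field` refactor extracts the per-field lookup into `get_field_schema`, which now raises `ValueError` for unknown fields, and then validates the supplied data against just that one property's JSON schema. A reduced sketch of single-field validation with `jsonschema`, using a made-up field schema rather than a real block's:

```python
import jsonschema

# Hypothetical per-field schema, as get_field_schema("skip_columns") might return.
field_schema = {"type": "array", "items": {"type": "string"}}


def validate_field(data) -> str | None:
    """Return an error message if the data does not match the field schema."""
    try:
        jsonschema.validate(data, field_schema)
        return None
    except jsonschema.ValidationError as e:
        return str(e)


print(validate_field(["a", "b"]))    # None - valid
print(validate_field("not-a-list"))  # error message
```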

View File

@@ -36,14 +36,17 @@ from backend.integrations.credentials_store import (
# =============== Configure the cost for each LLM Model call =============== #
MODEL_COST: dict[LlmModel, int] = {
LlmModel.O3: 7,
LlmModel.O3_MINI: 2, # $1.10 / $4.40
LlmModel.O1: 16, # $15 / $60
LlmModel.O1_PREVIEW: 16,
LlmModel.O1_MINI: 4,
LlmModel.GPT41: 2,
LlmModel.GPT4O_MINI: 1,
LlmModel.GPT4O: 3,
LlmModel.GPT4_TURBO: 10,
LlmModel.GPT3_5_TURBO: 1,
LlmModel.CLAUDE_3_7_SONNET: 5,
LlmModel.CLAUDE_3_5_SONNET: 4,
LlmModel.CLAUDE_3_5_HAIKU: 1, # $0.80 / $4.00
LlmModel.CLAUDE_3_HAIKU: 1,
@@ -60,6 +63,7 @@ MODEL_COST: dict[LlmModel, int] = {
LlmModel.DEEPSEEK_LLAMA_70B: 1, # ? / ?
LlmModel.OLLAMA_DOLPHIN: 1,
LlmModel.GEMINI_FLASH_1_5: 1,
LlmModel.GEMINI_2_5_PRO: 4,
LlmModel.GROK_BETA: 5,
LlmModel.MISTRAL_NEMO: 1,
LlmModel.COHERE_COMMAND_R_08_2024: 1,
@@ -75,6 +79,8 @@ MODEL_COST: dict[LlmModel, int] = {
LlmModel.AMAZON_NOVA_PRO_V1: 1,
LlmModel.MICROSOFT_WIZARDLM_2_8X22B: 1,
LlmModel.GRYPHE_MYTHOMAX_L2_13B: 1,
LlmModel.META_LLAMA_4_SCOUT: 1,
LlmModel.META_LLAMA_4_MAVERICK: 1,
}
for model in LlmModel:

View File

@@ -11,18 +11,20 @@ from prisma.enums import (
CreditRefundRequestStatus,
CreditTransactionType,
NotificationType,
OnboardingStep,
)
from prisma.errors import UniqueViolationError
from prisma.models import CreditRefundRequest, CreditTransaction, User
from prisma.types import CreditTransactionCreateInput, CreditTransactionWhereInput
from pydantic import BaseModel
from prisma.types import (
CreditRefundRequestCreateInput,
CreditTransactionCreateInput,
CreditTransactionWhereInput,
)
from tenacity import retry, stop_after_attempt, wait_exponential
from backend.data import db
from backend.data.block import Block, BlockInput, get_block
from backend.data.block_cost_config import BLOCK_COSTS
from backend.data.cost import BlockCost, BlockCostType
from backend.data.execution import NodeExecutionEntry
from backend.data.cost import BlockCost
from backend.data.model import (
AutoTopUpConfig,
RefundRequest,
@@ -31,6 +33,7 @@ from backend.data.model import (
)
from backend.data.notifications import NotificationEventDTO, RefundRequestData
from backend.data.user import get_user_by_id
from backend.executor.utils import UsageTransactionMetadata
from backend.notifications import NotificationManager
from backend.util.exceptions import InsufficientBalanceError
from backend.util.service import get_service_client
@@ -39,6 +42,7 @@ from backend.util.settings import Settings
settings = Settings()
stripe.api_key = settings.secrets.stripe_api_key
logger = logging.getLogger(__name__)
base_url = settings.config.frontend_base_url or settings.config.platform_base_url
class UserCreditBase(ABC):
@@ -90,20 +94,20 @@ class UserCreditBase(ABC):
@abstractmethod
async def spend_credits(
self,
entry: NodeExecutionEntry,
data_size: float,
run_time: float,
user_id: str,
cost: int,
metadata: UsageTransactionMetadata,
) -> int:
"""
Spend the credits for the user based on the block usage.
Spend the credits for the user based on the cost.
Args:
entry (NodeExecutionEntry): The node execution identifiers & data.
data_size (float): The size of the data being processed.
run_time (float): The time taken to run the block.
user_id (str): The user ID.
cost (int): The cost to spend.
metadata (UsageTransactionMetadata): The metadata of the transaction.
Returns:
int: amount of credit spent
int: The remaining balance.
"""
pass
@@ -118,6 +122,18 @@ class UserCreditBase(ABC):
"""
pass
@abstractmethod
async def onboarding_reward(self, user_id: str, credits: int, step: OnboardingStep):
"""
Reward the user with credits for completing an onboarding step.
Won't reward if the user has already received credits for the step.
Args:
user_id (str): The user ID.
step (OnboardingStep): The onboarding step.
"""
pass
@abstractmethod
async def top_up_intent(self, user_id: str, amount: int) -> str:
"""
@@ -185,6 +201,14 @@ class UserCreditBase(ABC):
"""
pass
@staticmethod
async def create_billing_portal_session(user_id: str) -> str:
session = stripe.billing_portal.Session.create(
customer=await get_stripe_customer_id(user_id),
return_url=base_url + "/profile/credits",
)
return session.url
@staticmethod
def time_now() -> datetime:
return datetime.now(timezone.utc)
@@ -202,7 +226,7 @@ class UserCreditBase(ABC):
"userId": user_id,
"createdAt": {"lte": top_time},
"isActive": True,
"runningBalance": {"not": None}, # type: ignore
"NOT": [{"runningBalance": None}],
},
order={"createdAt": "desc"},
)
@@ -324,31 +348,21 @@ class UserCreditBase(ABC):
amount = min(-user_balance, 0)
# Create the transaction
transaction_data: CreditTransactionCreateInput = {
"userId": user_id,
"amount": amount,
"runningBalance": user_balance + amount,
"type": transaction_type,
"metadata": metadata,
"isActive": is_active,
"createdAt": self.time_now(),
}
transaction_data = CreditTransactionCreateInput(
userId=user_id,
amount=amount,
runningBalance=user_balance + amount,
type=transaction_type,
metadata=metadata,
isActive=is_active,
createdAt=self.time_now(),
)
if transaction_key:
transaction_data["transactionKey"] = transaction_key
tx = await CreditTransaction.prisma().create(data=transaction_data)
return user_balance + amount, tx.transactionKey
class UsageTransactionMetadata(BaseModel):
graph_exec_id: str | None = None
graph_id: str | None = None
node_id: str | None = None
node_exec_id: str | None = None
block_id: str | None = None
block: str | None = None
input: BlockInput | None = None
class UserCredit(UserCreditBase):
@thread_cached
def notification_client(self) -> NotificationManager:
@@ -369,89 +383,21 @@ class UserCredit(UserCreditBase):
)
)
def _block_usage_cost(
self,
block: Block,
input_data: BlockInput,
data_size: float,
run_time: float,
) -> tuple[int, BlockInput]:
block_costs = BLOCK_COSTS.get(type(block))
if not block_costs:
return 0, {}
for block_cost in block_costs:
if not self._is_cost_filter_match(block_cost.cost_filter, input_data):
continue
if block_cost.cost_type == BlockCostType.RUN:
return block_cost.cost_amount, block_cost.cost_filter
if block_cost.cost_type == BlockCostType.SECOND:
return (
int(run_time * block_cost.cost_amount),
block_cost.cost_filter,
)
if block_cost.cost_type == BlockCostType.BYTE:
return (
int(data_size * block_cost.cost_amount),
block_cost.cost_filter,
)
return 0, {}
def _is_cost_filter_match(
self, cost_filter: BlockInput, input_data: BlockInput
) -> bool:
"""
Filter rules:
- If cost_filter is an object, then check if cost_filter is the subset of input_data
- Otherwise, check if cost_filter is equal to input_data.
- Undefined, null, and empty string are considered equal.
"""
if not isinstance(cost_filter, dict) or not isinstance(input_data, dict):
return cost_filter == input_data
return all(
(not input_data.get(k) and not v)
or (input_data.get(k) and self._is_cost_filter_match(v, input_data[k]))
for k, v in cost_filter.items()
)
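For clarity, a standalone illustration of the filter rules described in the docstring above; the function below mirrors that logic and the example values are made up.

```python
def is_cost_filter_match(cost_filter, input_data) -> bool:
    # A filter matches when it is a recursive "subset" of the input;
    # None/"" on the filter side matches a missing or empty input value.
    if not isinstance(cost_filter, dict) or not isinstance(input_data, dict):
        return cost_filter == input_data
    return all(
        (not input_data.get(k) and not v)
        or (input_data.get(k) and is_cost_filter_match(v, input_data[k]))
        for k, v in cost_filter.items()
    )

assert is_cost_filter_match({"model": "gpt-4"}, {"model": "gpt-4", "prompt": "hi"})
assert is_cost_filter_match({"api_key": None}, {"model": "gpt-4"})  # empty == missing
assert not is_cost_filter_match({"model": "gpt-4"}, {"model": "gpt-3.5"})
```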
async def spend_credits(
self,
entry: NodeExecutionEntry,
data_size: float,
run_time: float,
user_id: str,
cost: int,
metadata: UsageTransactionMetadata,
) -> int:
block = get_block(entry.block_id)
if not block:
raise ValueError(f"Block not found: {entry.block_id}")
cost, matching_filter = self._block_usage_cost(
block=block, input_data=entry.data, data_size=data_size, run_time=run_time
)
if cost == 0:
return 0
balance, _ = await self._add_transaction(
user_id=entry.user_id,
user_id=user_id,
amount=-cost,
transaction_type=CreditTransactionType.USAGE,
metadata=Json(
UsageTransactionMetadata(
graph_exec_id=entry.graph_exec_id,
graph_id=entry.graph_id,
node_id=entry.node_id,
node_exec_id=entry.node_exec_id,
block_id=entry.block_id,
block=block.name,
input=matching_filter,
).model_dump()
),
metadata=Json(metadata.model_dump()),
)
user_id = entry.user_id
# Auto top-up if balance is below threshold.
auto_top_up = await get_auto_top_up(user_id)
@@ -461,7 +407,7 @@ class UserCredit(UserCreditBase):
user_id=user_id,
amount=auto_top_up.amount,
# Avoid multiple auto top-ups within the same graph execution.
key=f"AUTO-TOP-UP-{user_id}-{entry.graph_exec_id}",
key=f"AUTO-TOP-UP-{user_id}-{metadata.graph_exec_id}",
ceiling_balance=auto_top_up.threshold,
)
except Exception as e:
@@ -470,11 +416,29 @@ class UserCredit(UserCreditBase):
f"Auto top-up failed for user {user_id}, balance: {balance}, amount: {auto_top_up.amount}, error: {e}"
)
return cost
return balance
async def top_up_credits(self, user_id: str, amount: int):
await self._top_up_credits(user_id, amount)
async def onboarding_reward(self, user_id: str, credits: int, step: OnboardingStep):
key = f"REWARD-{user_id}-{step.value}"
if not await CreditTransaction.prisma().find_first(
where={
"userId": user_id,
"transactionKey": key,
}
):
await self._add_transaction(
user_id=user_id,
amount=credits,
transaction_type=CreditTransactionType.GRANT,
transaction_key=key,
metadata=Json(
{"reason": f"Reward for completing {step.value} onboarding step."}
),
)
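A small usage sketch (IDs and amounts hypothetical): because the grant is keyed by `REWARD-<user_id>-<step>`, calling `onboarding_reward` twice for the same step only credits the user once.

```python
from prisma.enums import OnboardingStep
from backend.data.credit import get_user_credit_model

async def demo_reward() -> None:
    # Requires a configured database connection.
    credit = get_user_credit_model()
    await credit.onboarding_reward("user-123", 300, OnboardingStep.AGENT_NEW_RUN)
    # The second call finds an existing transaction with the same key and does nothing.
    await credit.onboarding_reward("user-123", 300, OnboardingStep.AGENT_NEW_RUN)
```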
async def top_up_refund(
self, user_id: str, transaction_key: str, metadata: dict[str, str]
) -> int:
@@ -493,15 +457,15 @@ class UserCredit(UserCreditBase):
try:
refund_request = await CreditRefundRequest.prisma().create(
data={
"id": refund_key,
"transactionKey": transaction_key,
"userId": user_id,
"amount": amount,
"reason": metadata.get("reason", ""),
"status": CreditRefundRequestStatus.PENDING,
"result": "The refund request is under review.",
}
data=CreditRefundRequestCreateInput(
id=refund_key,
transactionKey=transaction_key,
userId=user_id,
amount=amount,
reason=metadata.get("reason", ""),
status=CreditRefundRequestStatus.PENDING,
result="The refund request is under review.",
)
)
except UniqueViolationError:
raise ValueError(
@@ -765,10 +729,8 @@ class UserCredit(UserCreditBase):
ui_mode="hosted",
payment_intent_data={"setup_future_usage": "off_session"},
saved_payment_method_options={"payment_method_save": "enabled"},
success_url=settings.config.frontend_base_url
+ "/profile/credits?topup=success",
cancel_url=settings.config.frontend_base_url
+ "/profile/credits?topup=cancel",
success_url=base_url + "/profile/credits?topup=success",
cancel_url=base_url + "/profile/credits?topup=cancel",
allow_promotion_codes=True,
)
@@ -964,6 +926,9 @@ class DisabledUserCredit(UserCreditBase):
async def top_up_credits(self, *args, **kwargs):
pass
async def onboarding_reward(self, *args, **kwargs):
pass
async def top_up_intent(self, *args, **kwargs) -> str:
return ""

View File

@@ -62,10 +62,10 @@ async def connect():
# A connection acquired from a pool (e.g. Supabase) can still let the db client
# obtain a connection while rejecting queries on it afterwards.
try:
await prisma.execute_raw("SELECT 1")
except Exception as e:
raise ConnectionError("Failed to connect to Prisma.") from e
# try:
# await prisma.execute_raw("SELECT 1")
# except Exception as e:
# raise ConnectionError("Failed to connect to Prisma.") from e
@conn_retry("Prisma", "Releasing connection")
@@ -89,7 +89,7 @@ async def transaction():
async def locked_transaction(key: str):
lock_key = zlib.crc32(key.encode("utf-8"))
async with transaction() as tx:
await tx.execute_raw(f"SELECT pg_advisory_xact_lock({lock_key})")
await tx.execute_raw("SELECT pg_advisory_xact_lock($1)", lock_key)
yield tx
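A sketch of how callers use this lock (the key value is hypothetical): the key string is hashed to a 32-bit integer with `zlib.crc32`, and that integer is now passed to `pg_advisory_xact_lock` as a bound parameter instead of being interpolated into the SQL.

```python
import zlib
from backend.data import db

async def grant_reward_exclusively(user_id: str) -> None:
    key = f"usr_trx_{user_id}-reward"
    # zlib.crc32(key.encode("utf-8")) is the 32-bit integer bound as $1 above.
    async with db.locked_transaction(key):
        ...  # only one coroutine per key reaches this block at a time
```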

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -1,4 +1,9 @@
import prisma
from typing import cast
import prisma.enums
import prisma.types
from backend.blocks.io import IO_BLOCK_IDs
AGENT_NODE_INCLUDE: prisma.types.AgentNodeInclude = {
"Input": True,
@@ -8,38 +13,60 @@ AGENT_NODE_INCLUDE: prisma.types.AgentNodeInclude = {
}
AGENT_GRAPH_INCLUDE: prisma.types.AgentGraphInclude = {
"AgentNodes": {"include": AGENT_NODE_INCLUDE} # type: ignore
"Nodes": {"include": AGENT_NODE_INCLUDE}
}
EXECUTION_RESULT_INCLUDE: prisma.types.AgentNodeExecutionInclude = {
"Input": True,
"Output": True,
"AgentNode": True,
"AgentGraphExecution": True,
"Node": True,
"GraphExecution": True,
}
GRAPH_EXECUTION_INCLUDE: prisma.types.AgentGraphExecutionInclude = {
"AgentNodeExecutions": {
MAX_NODE_EXECUTIONS_FETCH = 1000
GRAPH_EXECUTION_INCLUDE_WITH_NODES: prisma.types.AgentGraphExecutionInclude = {
"NodeExecutions": {
"include": {
"Input": True,
"Output": True,
"AgentNode": True,
"AgentGraphExecution": True,
}
"Node": True,
"GraphExecution": True,
},
"order_by": [
{"queuedTime": "desc"},
# Fallback: Incomplete execs have no queuedTime.
{"addedTime": "desc"},
],
"take": MAX_NODE_EXECUTIONS_FETCH, # Avoid loading excessive node executions.
}
}
GRAPH_EXECUTION_INCLUDE: prisma.types.AgentGraphExecutionInclude = {
"NodeExecutions": {
**cast(
prisma.types.FindManyAgentNodeExecutionArgsFromAgentGraphExecution,
GRAPH_EXECUTION_INCLUDE_WITH_NODES["NodeExecutions"],
),
"where": {
"Node": {"is": {"AgentBlock": {"is": {"id": {"in": IO_BLOCK_IDs}}}}},
"NOT": [{"executionStatus": prisma.enums.AgentExecutionStatus.INCOMPLETE}],
},
}
}
INTEGRATION_WEBHOOK_INCLUDE: prisma.types.IntegrationWebhookInclude = {
"AgentNodes": {"include": AGENT_NODE_INCLUDE} # type: ignore
"AgentNodes": {"include": AGENT_NODE_INCLUDE}
}
def library_agent_include(user_id: str) -> prisma.types.LibraryAgentInclude:
return {
"Agent": {
"AgentGraph": {
"include": {
**AGENT_GRAPH_INCLUDE,
"AgentGraphExecution": {"where": {"userId": user_id}},
"Executions": {"where": {"userId": user_id}},
}
},
"Creator": True,

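Not part of the diff: a sketch showing how the two include presets above are meant to be used. `GRAPH_EXECUTION_INCLUDE_WITH_NODES` loads up to `MAX_NODE_EXECUTIONS_FETCH` node executions, while `GRAPH_EXECUTION_INCLUDE` only pulls the non-incomplete IO-block executions; the function and IDs below are hypothetical.

```python
import prisma.models
from backend.data.includes import (
    GRAPH_EXECUTION_INCLUDE,
    GRAPH_EXECUTION_INCLUDE_WITH_NODES,
)

async def load_graph_execution(graph_exec_id: str, with_all_nodes: bool):
    include = (
        GRAPH_EXECUTION_INCLUDE_WITH_NODES if with_all_nodes else GRAPH_EXECUTION_INCLUDE
    )
    return await prisma.models.AgentGraphExecution.prisma().find_unique(
        where={"id": graph_exec_id},
        include=include,
    )
```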
View File

@@ -3,12 +3,14 @@ from typing import TYPE_CHECKING, AsyncGenerator, Optional
from prisma import Json
from prisma.models import IntegrationWebhook
from prisma.types import IntegrationWebhookCreateInput
from pydantic import Field, computed_field
from backend.data.includes import INTEGRATION_WEBHOOK_INCLUDE
from backend.data.queue import AsyncRedisEventBus
from backend.integrations.providers import ProviderName
from backend.integrations.webhooks.utils import webhook_ingress_url
from backend.util.exceptions import NotFoundError
from .db import BaseDbModel
@@ -65,28 +67,35 @@ class Webhook(BaseDbModel):
async def create_webhook(webhook: Webhook) -> Webhook:
created_webhook = await IntegrationWebhook.prisma().create(
data={
"id": webhook.id,
"userId": webhook.user_id,
"provider": webhook.provider.value,
"credentialsId": webhook.credentials_id,
"webhookType": webhook.webhook_type,
"resource": webhook.resource,
"events": webhook.events,
"config": Json(webhook.config),
"secret": webhook.secret,
"providerWebhookId": webhook.provider_webhook_id,
}
data=IntegrationWebhookCreateInput(
id=webhook.id,
userId=webhook.user_id,
provider=webhook.provider.value,
credentialsId=webhook.credentials_id,
webhookType=webhook.webhook_type,
resource=webhook.resource,
events=webhook.events,
config=Json(webhook.config),
secret=webhook.secret,
providerWebhookId=webhook.provider_webhook_id,
)
)
return Webhook.from_db(created_webhook)
async def get_webhook(webhook_id: str) -> Webhook:
"""⚠️ No `user_id` check: DO NOT USE without check in user-facing endpoints."""
webhook = await IntegrationWebhook.prisma().find_unique_or_raise(
"""
⚠️ No `user_id` check: DO NOT USE without check in user-facing endpoints.
Raises:
NotFoundError: if no record with the given ID exists
"""
webhook = await IntegrationWebhook.prisma().find_unique(
where={"id": webhook_id},
include=INTEGRATION_WEBHOOK_INCLUDE,
)
if not webhook:
raise NotFoundError(f"Webhook #{webhook_id} not found")
return Webhook.from_db(webhook)
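A sketch of the new error contract (assuming this module is importable as `backend.data.integrations`): callers now catch `NotFoundError` rather than relying on Prisma's `find_unique_or_raise` exception.

```python
from backend.data.integrations import get_webhook  # assumed module path
from backend.util.exceptions import NotFoundError

async def webhook_exists(webhook_id: str) -> bool:
    try:
        await get_webhook(webhook_id)
        return True
    except NotFoundError:
        return False
```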

View File

@@ -2,6 +2,7 @@ from __future__ import annotations
import base64
import logging
from collections import defaultdict
from datetime import datetime, timezone
from typing import (
TYPE_CHECKING,
@@ -12,6 +13,7 @@ from typing import (
Generic,
Literal,
Optional,
Sequence,
TypedDict,
TypeVar,
get_args,
@@ -141,17 +143,20 @@ def SchemaField(
secret: bool = False,
exclude: bool = False,
hidden: Optional[bool] = None,
depends_on: list[str] | None = None,
image_upload: Optional[bool] = None,
image_output: Optional[bool] = None,
**kwargs,
depends_on: Optional[list[str]] = None,
ge: Optional[float] = None,
le: Optional[float] = None,
min_length: Optional[int] = None,
max_length: Optional[int] = None,
discriminator: Optional[str] = None,
json_schema_extra: Optional[dict[str, Any]] = None,
) -> T:
if default is PydanticUndefined and default_factory is None:
advanced = False
elif advanced is None:
advanced = True
json_extra = {
json_schema_extra = {
k: v
for k, v in {
"placeholder": placeholder,
@@ -159,8 +164,7 @@ def SchemaField(
"advanced": advanced,
"hidden": hidden,
"depends_on": depends_on,
"image_upload": image_upload,
"image_output": image_output,
**(json_schema_extra or {}),
}.items()
if v is not None
}
@@ -172,8 +176,12 @@ def SchemaField(
title=title,
description=description,
exclude=exclude,
json_schema_extra=json_extra,
**kwargs,
ge=ge,
le=le,
min_length=min_length,
max_length=max_length,
discriminator=discriminator,
json_schema_extra=json_schema_extra,
) # type: ignore
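An illustrative use of the explicit constraint parameters added above (field names and bounds are made up); previously these would have been forwarded through `**kwargs`.

```python
from pydantic import BaseModel
from backend.data.model import SchemaField

class ExampleInput(BaseModel):
    temperature: float = SchemaField(
        default=0.7, description="Sampling temperature", ge=0.0, le=2.0
    )
    prompt: str = SchemaField(description="Prompt text", min_length=1, max_length=4000)
```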
@@ -294,9 +302,7 @@ class CredentialsMetaInput(BaseModel, Generic[CP, CT]):
)
field_schema = model.jsonschema()["properties"][field_name]
try:
schema_extra = _CredentialsFieldSchemaExtra[CP, CT].model_validate(
field_schema
)
schema_extra = CredentialsFieldInfo[CP, CT].model_validate(field_schema)
except ValidationError as e:
if "Field required [type=missing" not in str(e):
raise
@@ -322,14 +328,90 @@ class CredentialsMetaInput(BaseModel, Generic[CP, CT]):
)
class _CredentialsFieldSchemaExtra(BaseModel, Generic[CP, CT]):
class CredentialsFieldInfo(BaseModel, Generic[CP, CT]):
# TODO: move discrimination mechanism out of CredentialsField (frontend + backend)
credentials_provider: list[CP]
credentials_scopes: Optional[list[str]] = None
credentials_types: list[CT]
provider: frozenset[CP] = Field(..., alias="credentials_provider")
supported_types: frozenset[CT] = Field(..., alias="credentials_types")
required_scopes: Optional[frozenset[str]] = Field(None, alias="credentials_scopes")
discriminator: Optional[str] = None
discriminator_mapping: Optional[dict[str, CP]] = None
@classmethod
def combine(
cls, *fields: tuple[CredentialsFieldInfo[CP, CT], T]
) -> Sequence[tuple[CredentialsFieldInfo[CP, CT], set[T]]]:
"""
Combines multiple CredentialsFieldInfo objects into as few as possible.
Rules:
- Items can only be combined if they have the same supported credentials types
and the same supported providers.
- When combining items, the `required_scopes` of the result is the union
of the `required_scopes` of the original items.
Params:
*fields: (CredentialsFieldInfo, key) objects to group and combine
Returns:
A sequence of tuples containing combined CredentialsFieldInfo objects and
the set of keys of the respective original items that were grouped together.
"""
if not fields:
return []
# Group fields by their provider and supported_types
grouped_fields: defaultdict[
tuple[frozenset[CP], frozenset[CT]],
list[tuple[T, CredentialsFieldInfo[CP, CT]]],
] = defaultdict(list)
for field, key in fields:
group_key = (frozenset(field.provider), frozenset(field.supported_types))
grouped_fields[group_key].append((key, field))
# Combine fields within each group
result: list[tuple[CredentialsFieldInfo[CP, CT], set[T]]] = []
for group in grouped_fields.values():
# Start with the first field in the group
_, combined = group[0]
# Track the keys that were combined
combined_keys = {key for key, _ in group}
# Combine required_scopes from all fields in the group
all_scopes = set()
for _, field in group:
if field.required_scopes:
all_scopes.update(field.required_scopes)
# Create a new combined field
result.append(
(
CredentialsFieldInfo[CP, CT](
credentials_provider=combined.provider,
credentials_types=combined.supported_types,
credentials_scopes=frozenset(all_scopes) or None,
discriminator=combined.discriminator,
discriminator_mapping=combined.discriminator_mapping,
),
combined_keys,
)
)
return result
def discriminate(self, discriminator_value: Any) -> CredentialsFieldInfo:
if not (self.discriminator and self.discriminator_mapping):
return self
discriminator_value = self.discriminator_mapping[discriminator_value]
return CredentialsFieldInfo(
credentials_provider=frozenset([discriminator_value]),
credentials_types=self.supported_types,
credentials_scopes=self.required_scopes,
)
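A worked sketch of the `combine()` rules (the provider, credential-type, and scope values here are assumptions for illustration): two fields with the same provider and credential types collapse into one entry whose `required_scopes` is the union of both.

```python
from backend.data.model import CredentialsFieldInfo

a = CredentialsFieldInfo(
    credentials_provider=frozenset({"github"}),
    credentials_types=frozenset({"api_key"}),
    credentials_scopes=frozenset({"repo"}),
)
b = CredentialsFieldInfo(
    credentials_provider=frozenset({"github"}),
    credentials_types=frozenset({"api_key"}),
    credentials_scopes=frozenset({"workflow"}),
)
[(merged, keys)] = CredentialsFieldInfo.combine((a, "node-1"), (b, "node-2"))
assert merged.required_scopes == {"repo", "workflow"}
assert keys == {"node-1", "node-2"}
```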
def CredentialsField(
required_scopes: set[str] = set(),
@@ -407,13 +489,14 @@ class RefundRequest(BaseModel):
class NodeExecutionStats(BaseModel):
"""Execution statistics for a node execution."""
class Config:
arbitrary_types_allowed = True
model_config = ConfigDict(
extra="allow",
arbitrary_types_allowed=True,
)
error: Optional[Exception | str] = None
walltime: float = 0
cputime: float = 0
cost: float = 0
input_size: int = 0
output_size: int = 0
llm_call_count: int = 0
@@ -425,14 +508,22 @@ class NodeExecutionStats(BaseModel):
class GraphExecutionStats(BaseModel):
"""Execution statistics for a graph execution."""
class Config:
arbitrary_types_allowed = True
model_config = ConfigDict(
extra="allow",
arbitrary_types_allowed=True,
)
error: Optional[Exception | str] = None
walltime: float = 0
walltime: float = Field(
default=0, description="Time between start and end of run (seconds)"
)
cputime: float = 0
nodes_walltime: float = 0
nodes_walltime: float = Field(
default=0, description="Total node execution time (seconds)"
)
nodes_cputime: float = 0
node_count: int = 0
node_error_count: int = 0
cost: float = 0
node_count: int = Field(default=0, description="Total number of node executions")
node_error_count: int = Field(
default=0, description="Total number of errors generated"
)
cost: int = Field(default=0, description="Total execution cost (cents)")

View File

@@ -6,10 +6,14 @@ from typing import Annotated, Any, Generic, Optional, TypeVar, Union
from prisma import Json
from prisma.enums import NotificationType
from prisma.models import NotificationEvent, UserNotificationBatch
from prisma.types import UserNotificationBatchWhereInput
from prisma.types import (
NotificationEventCreateInput,
UserNotificationBatchCreateInput,
UserNotificationBatchWhereInput,
)
# from backend.notifications.models import NotificationEvent
from pydantic import BaseModel, EmailStr, Field, field_validator
from pydantic import BaseModel, ConfigDict, EmailStr, Field, field_validator
from backend.server.v2.store.exceptions import DatabaseError
@@ -35,7 +39,7 @@ class QueueType(Enum):
class BaseNotificationData(BaseModel):
pass
model_config = ConfigDict(extra="allow")
class AgentRunData(BaseNotificationData):
@@ -341,9 +345,46 @@ class NotificationPreference(BaseModel):
)
class UserNotificationEventDTO(BaseModel):
type: NotificationType
data: dict
created_at: datetime
updated_at: datetime
@staticmethod
def from_db(model: NotificationEvent) -> "UserNotificationEventDTO":
return UserNotificationEventDTO(
type=model.type,
data=dict(model.data),
created_at=model.createdAt,
updated_at=model.updatedAt,
)
class UserNotificationBatchDTO(BaseModel):
user_id: str
type: NotificationType
notifications: list[UserNotificationEventDTO]
created_at: datetime
updated_at: datetime
@staticmethod
def from_db(model: UserNotificationBatch) -> "UserNotificationBatchDTO":
return UserNotificationBatchDTO(
user_id=model.userId,
type=model.type,
notifications=[
UserNotificationEventDTO.from_db(notification)
for notification in model.Notifications or []
],
created_at=model.createdAt,
updated_at=model.updatedAt,
)
def get_batch_delay(notification_type: NotificationType) -> timedelta:
return {
NotificationType.AGENT_RUN: timedelta(minutes=1),
NotificationType.AGENT_RUN: timedelta(minutes=60),
NotificationType.ZERO_BALANCE: timedelta(minutes=60),
NotificationType.LOW_BALANCE: timedelta(minutes=60),
NotificationType.BLOCK_EXECUTION_FAILED: timedelta(minutes=60),
@@ -355,11 +396,13 @@ async def create_or_add_to_user_notification_batch(
user_id: str,
notification_type: NotificationType,
notification_data: NotificationEventModel,
) -> UserNotificationBatch:
) -> UserNotificationBatchDTO:
try:
logger.info(
f"Creating or adding to notification batch for {user_id} with type {notification_type} and data {notification_data}"
)
if not notification_data.data:
raise ValueError("Notification data must be provided")
# Serialize the data
json_data: Json = Json(notification_data.data.model_dump())
@@ -372,50 +415,50 @@ async def create_or_add_to_user_notification_batch(
"type": notification_type,
}
},
include={"notifications": True},
include={"Notifications": True},
)
if not existing_batch:
async with transaction() as tx:
notification_event = await tx.notificationevent.create(
data={
"type": notification_type,
"data": json_data,
}
data=NotificationEventCreateInput(
type=notification_type,
data=json_data,
)
)
# Create new batch
resp = await tx.usernotificationbatch.create(
data={
"userId": user_id,
"type": notification_type,
"notifications": {"connect": [{"id": notification_event.id}]},
},
include={"notifications": True},
data=UserNotificationBatchCreateInput(
userId=user_id,
type=notification_type,
Notifications={"connect": [{"id": notification_event.id}]},
),
include={"Notifications": True},
)
return resp
return UserNotificationBatchDTO.from_db(resp)
else:
async with transaction() as tx:
notification_event = await tx.notificationevent.create(
data={
"type": notification_type,
"data": json_data,
"UserNotificationBatch": {"connect": {"id": existing_batch.id}},
}
data=NotificationEventCreateInput(
type=notification_type,
data=json_data,
UserNotificationBatch={"connect": {"id": existing_batch.id}},
)
)
# Add to existing batch
resp = await tx.usernotificationbatch.update(
where={"id": existing_batch.id},
data={
"notifications": {"connect": [{"id": notification_event.id}]}
"Notifications": {"connect": [{"id": notification_event.id}]}
},
include={"notifications": True},
include={"Notifications": True},
)
if not resp:
raise DatabaseError(
f"Failed to add notification event {notification_event.id} to existing batch {existing_batch.id}"
)
return resp
return UserNotificationBatchDTO.from_db(resp)
except Exception as e:
raise DatabaseError(
f"Failed to create or add to notification batch for user {user_id} and type {notification_type}: {e}"
@@ -425,18 +468,23 @@ async def create_or_add_to_user_notification_batch(
async def get_user_notification_oldest_message_in_batch(
user_id: str,
notification_type: NotificationType,
) -> NotificationEvent | None:
) -> UserNotificationEventDTO | None:
try:
batch = await UserNotificationBatch.prisma().find_first(
where={"userId": user_id, "type": notification_type},
include={"notifications": True},
include={"Notifications": True},
)
if not batch:
return None
if not batch.notifications:
if not batch.Notifications:
return None
sorted_notifications = sorted(batch.notifications, key=lambda x: x.createdAt)
return sorted_notifications[0]
sorted_notifications = sorted(batch.Notifications, key=lambda x: x.createdAt)
return (
UserNotificationEventDTO.from_db(sorted_notifications[0])
if sorted_notifications
else None
)
except Exception as e:
raise DatabaseError(
f"Failed to get user notification last message in batch for user {user_id} and type {notification_type}: {e}"
@@ -471,12 +519,13 @@ async def empty_user_notification_batch(
async def get_user_notification_batch(
user_id: str,
notification_type: NotificationType,
) -> UserNotificationBatch | None:
) -> UserNotificationBatchDTO | None:
try:
return await UserNotificationBatch.prisma().find_first(
batch = await UserNotificationBatch.prisma().find_first(
where={"userId": user_id, "type": notification_type},
include={"notifications": True},
include={"Notifications": True},
)
return UserNotificationBatchDTO.from_db(batch) if batch else None
except Exception as e:
raise DatabaseError(
f"Failed to get user notification batch for user {user_id} and type {notification_type}: {e}"
@@ -485,17 +534,18 @@ async def get_user_notification_batch(
async def get_all_batches_by_type(
notification_type: NotificationType,
) -> list[UserNotificationBatch]:
) -> list[UserNotificationBatchDTO]:
try:
return await UserNotificationBatch.prisma().find_many(
batches = await UserNotificationBatch.prisma().find_many(
where={
"type": notification_type,
"notifications": {
"Notifications": {
"some": {} # Only return batches with at least one notification
},
},
include={"notifications": True},
include={"Notifications": True},
)
return [UserNotificationBatchDTO.from_db(batch) for batch in batches]
except Exception as e:
raise DatabaseError(
f"Failed to get all batches by type {notification_type}: {e}"

View File

@@ -4,16 +4,15 @@ from typing import Any, Optional
import prisma
import pydantic
from prisma import Json
from prisma.models import (
AgentGraph,
AgentGraphExecution,
StoreListingVersion,
UserOnboarding,
)
from prisma.types import UserOnboardingUpdateInput
from prisma.enums import OnboardingStep
from prisma.models import UserOnboarding
from prisma.types import UserOnboardingCreateInput, UserOnboardingUpdateInput
from backend.server.v2.library.db import set_is_deleted_for_library_agent
from backend.server.v2.store.db import get_store_agent_details
from backend.data import db
from backend.data.block import get_blocks
from backend.data.credit import get_user_credit_model
from backend.data.graph import GraphModel
from backend.data.model import CredentialsMetaInput
from backend.server.v2.store.model import StoreAgentDetails
# Mapping from user reason id to categories to search for when choosing agent to show
@@ -24,85 +23,113 @@ REASON_MAPPING: dict[str, list[str]] = {
"ai_innovation": ["development", "research"],
"personal_productivity": ["personal", "productivity"],
}
POINTS_AGENT_COUNT = 50 # Number of agents to calculate points for
MIN_AGENT_COUNT = 2 # Minimum number of marketplace agents to enable onboarding
user_credit = get_user_credit_model()
class UserOnboardingUpdate(pydantic.BaseModel):
step: int
completedSteps: Optional[list[OnboardingStep]] = None
notificationDot: Optional[bool] = None
notified: Optional[list[OnboardingStep]] = None
usageReason: Optional[str] = None
integrations: list[str] = pydantic.Field(default_factory=list)
integrations: Optional[list[str]] = None
otherIntegrations: Optional[str] = None
selectedAgentCreator: Optional[str] = None
selectedAgentSlug: Optional[str] = None
selectedStoreListingVersionId: Optional[str] = None
agentInput: Optional[dict[str, Any]] = None
isCompleted: bool = False
onboardingAgentExecutionId: Optional[str] = None
async def get_user_onboarding(user_id: str):
return await UserOnboarding.prisma().upsert(
where={"userId": user_id},
data={
"create": {"userId": user_id}, # type: ignore
"create": UserOnboardingCreateInput(userId=user_id),
"update": {},
},
)
async def update_user_onboarding(user_id: str, data: UserOnboardingUpdate):
# Get the user onboarding data
user_onboarding = await get_user_onboarding(user_id)
update: UserOnboardingUpdateInput = {
"step": data.step,
"isCompleted": data.isCompleted,
}
if data.usageReason:
update["usageReason"] = data.usageReason
if data.integrations:
update["integrations"] = data.integrations
if data.otherIntegrations:
update["otherIntegrations"] = data.otherIntegrations
if data.selectedAgentSlug and data.selectedAgentCreator:
update["selectedAgentSlug"] = data.selectedAgentSlug
update["selectedAgentCreator"] = data.selectedAgentCreator
# Check if slug changes
if (
user_onboarding.selectedAgentCreator
and user_onboarding.selectedAgentSlug
and user_onboarding.selectedAgentSlug != data.selectedAgentSlug
update: UserOnboardingUpdateInput = {}
if data.completedSteps is not None:
update["completedSteps"] = list(set(data.completedSteps))
for step in (
OnboardingStep.AGENT_NEW_RUN,
OnboardingStep.GET_RESULTS,
OnboardingStep.MARKETPLACE_ADD_AGENT,
OnboardingStep.MARKETPLACE_RUN_AGENT,
OnboardingStep.BUILDER_SAVE_AGENT,
OnboardingStep.BUILDER_RUN_AGENT,
):
store_agent = await get_store_agent_details(
user_onboarding.selectedAgentCreator, user_onboarding.selectedAgentSlug
)
store_listing = await StoreListingVersion.prisma().find_unique_or_raise(
where={"id": store_agent.store_listing_version_id}
)
agent_graph = await AgentGraph.prisma().find_first(
where={"id": store_listing.agentId, "version": store_listing.version}
)
execution_count = await AgentGraphExecution.prisma().count(
where={
"userId": user_id,
"agentGraphId": store_listing.agentId,
"agentGraphVersion": store_listing.version,
}
)
# If there was no execution and graph doesn't belong to the user,
# mark the agent as deleted
if execution_count == 0 and agent_graph and agent_graph.userId != user_id:
await set_is_deleted_for_library_agent(
user_id, store_listing.agentId, store_listing.agentVersion, True
)
if data.agentInput:
if step in data.completedSteps:
await reward_user(user_id, step)
if data.notificationDot is not None:
update["notificationDot"] = data.notificationDot
if data.notified is not None:
update["notified"] = list(set(data.notified))
if data.usageReason is not None:
update["usageReason"] = data.usageReason
if data.integrations is not None:
update["integrations"] = data.integrations
if data.otherIntegrations is not None:
update["otherIntegrations"] = data.otherIntegrations
if data.selectedStoreListingVersionId is not None:
update["selectedStoreListingVersionId"] = data.selectedStoreListingVersionId
if data.agentInput is not None:
update["agentInput"] = Json(data.agentInput)
if data.onboardingAgentExecutionId is not None:
update["onboardingAgentExecutionId"] = data.onboardingAgentExecutionId
return await UserOnboarding.prisma().upsert(
where={"userId": user_id},
data={
"create": {"userId": user_id, **update}, # type: ignore
"create": {"userId": user_id, **update},
"update": update,
},
)
async def reward_user(user_id: str, step: OnboardingStep):
async with db.locked_transaction(f"usr_trx_{user_id}-reward"):
reward = 0
match step:
# Reward the user when they click New Run during onboarding
# This is because they need credits before scheduling a run (next step)
case OnboardingStep.AGENT_NEW_RUN:
reward = 300
case OnboardingStep.GET_RESULTS:
reward = 300
case OnboardingStep.MARKETPLACE_ADD_AGENT:
reward = 100
case OnboardingStep.MARKETPLACE_RUN_AGENT:
reward = 100
case OnboardingStep.BUILDER_SAVE_AGENT:
reward = 100
case OnboardingStep.BUILDER_RUN_AGENT:
reward = 100
if reward == 0:
return
onboarding = await get_user_onboarding(user_id)
# Skip if already rewarded
if step in onboarding.rewardedFor:
return
onboarding.rewardedFor.append(step)
await user_credit.onboarding_reward(user_id, reward, step)
await UserOnboarding.prisma().update(
where={"userId": user_id},
data={
"completedSteps": list(set(onboarding.completedSteps + [step])),
"rewardedFor": onboarding.rewardedFor,
},
)
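For context on the amounts in the match statement above, a quick tally of the maximum reward a user can collect from the onboarding steps (values copied from the code; the mapping below is only for illustration):

```python
rewards = {
    "AGENT_NEW_RUN": 300,
    "GET_RESULTS": 300,
    "MARKETPLACE_ADD_AGENT": 100,
    "MARKETPLACE_RUN_AGENT": 100,
    "BUILDER_SAVE_AGENT": 100,
    "BUILDER_RUN_AGENT": 100,
}
assert sum(rewards.values()) == 1000  # total credits available from onboarding rewards
```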
def clean_and_split(text: str) -> list[str]:
"""
Removes all special characters from a string, truncates it to 100 characters,
@@ -170,6 +197,20 @@ def calculate_points(
return int(points)
def get_credentials_blocks() -> dict[str, str]:
# Returns a dictionary of block id to credentials field name
creds: dict[str, str] = {}
blocks = get_blocks()
for id, block in blocks.items():
for field_name, field_info in block().input_schema.model_fields.items():
if field_info.annotation == CredentialsMetaInput:
creds[id] = field_name
return creds
CREDENTIALS_FIELDS: dict[str, str] = get_credentials_blocks()
async def get_recommended_agents(user_id: str) -> list[StoreAgentDetails]:
user_onboarding = await get_user_onboarding(user_id)
categories = REASON_MAPPING.get(user_onboarding.usageReason or "", [])
@@ -193,31 +234,74 @@ async def get_recommended_agents(user_id: str) -> list[StoreAgentDetails]:
for word in user_onboarding.integrations
]
agents = await prisma.models.StoreAgent.prisma().find_many(
storeAgents = await prisma.models.StoreAgent.prisma().find_many(
where=prisma.types.StoreAgentWhereInput(**where_clause),
order=[
{"featured": "desc"},
{"runs": "desc"},
{"rating": "desc"},
],
take=100,
)
if len(agents) < 2:
agents += await prisma.models.StoreAgent.prisma().find_many(
agentListings = await prisma.models.StoreListingVersion.prisma().find_many(
where={
"id": {"in": [agent.storeListingVersionId for agent in storeAgents]},
},
include={"AgentGraph": True},
)
for listing in agentListings:
agent = listing.AgentGraph
if agent is None:
continue
graph = GraphModel.from_db(agent)
# Remove agents with empty input schema
if not graph.input_schema:
storeAgents = [
a for a in storeAgents if a.storeListingVersionId != listing.id
]
continue
# Remove agents with empty credentials
# Get nodes from this agent that have credentials
nodes = await prisma.models.AgentNode.prisma().find_many(
where={
"listing_id": {"not_in": [agent.listing_id for agent in agents]},
"agentGraphId": agent.id,
"agentBlockId": {"in": list(CREDENTIALS_FIELDS.keys())},
},
)
for node in nodes:
block_id = node.agentBlockId
field_name = CREDENTIALS_FIELDS[block_id]
# If there are no credentials or they are empty, remove the agent
# FIXME ignores default values
if (
field_name not in node.constantInput
or node.constantInput[field_name] is None
):
storeAgents = [
a for a in storeAgents if a.storeListingVersionId != listing.id
]
break
# If there are fewer than 2 agents, add more agents to the list
if len(storeAgents) < 2:
storeAgents += await prisma.models.StoreAgent.prisma().find_many(
where={
"listing_id": {"not_in": [agent.listing_id for agent in storeAgents]},
},
order=[
{"featured": "desc"},
{"runs": "desc"},
{"rating": "desc"},
],
take=2 - len(agents),
take=2 - len(storeAgents),
)
# Calculate points for the first 30 agents and choose the top 2
# Calculate points for the first X agents and choose the top 2
agent_points = []
for agent in agents[:50]:
for agent in storeAgents[:POINTS_AGENT_COUNT]:
points = calculate_points(
agent, categories, custom, user_onboarding.integrations
)
@@ -245,3 +329,10 @@ async def get_recommended_agents(user_id: str) -> list[StoreAgentDetails]:
)
for agent in recommended_agents
]
async def onboarding_enabled() -> bool:
count = await prisma.models.StoreAgent.prisma().count(take=MIN_AGENT_COUNT + 1)
# Onboarding is enabled if there are at least 2 agents in the store
return count >= MIN_AGENT_COUNT

View File

@@ -1,8 +1,6 @@
import asyncio
import json
import logging
from abc import ABC, abstractmethod
from datetime import datetime
from typing import Any, AsyncGenerator, Generator, Generic, Optional, TypeVar
from pydantic import BaseModel
@@ -14,13 +12,6 @@ from backend.data import redis
logger = logging.getLogger(__name__)
class DateTimeEncoder(json.JSONEncoder):
def default(self, o):
if isinstance(o, datetime):
return o.isoformat()
return super().default(o)
M = TypeVar("M", bound=BaseModel)
@@ -32,8 +23,12 @@ class BaseRedisEventBus(Generic[M], ABC):
def event_bus_name(self) -> str:
pass
@property
def Message(self) -> type["_EventPayloadWrapper[M]"]:
return _EventPayloadWrapper[self.Model]
def _serialize_message(self, item: M, channel_key: str) -> tuple[str, str]:
message = json.dumps(item.model_dump(), cls=DateTimeEncoder)
message = self.Message(payload=item).model_dump_json()
channel_name = f"{self.event_bus_name}/{channel_key}"
logger.debug(f"[{channel_name}] Publishing an event to Redis {message}")
return message, channel_name
@@ -43,9 +38,8 @@ class BaseRedisEventBus(Generic[M], ABC):
if msg["type"] != message_type:
return None
try:
data = json.loads(msg["data"])
logger.debug(f"Consuming an event from Redis {data}")
return self.Model(**data)
logger.debug(f"[{channel_key}] Consuming an event from Redis {msg['data']}")
return self.Message.model_validate_json(msg["data"]).payload
except Exception as e:
logger.error(f"Failed to parse event result from Redis {msg} {e}")
@@ -57,9 +51,16 @@ class BaseRedisEventBus(Generic[M], ABC):
return pubsub, full_channel_name
class RedisEventBus(BaseRedisEventBus[M], ABC):
Model: type[M]
class _EventPayloadWrapper(BaseModel, Generic[M]):
"""
Wrapper model to allow `RedisEventBus.Model` to be a discriminated union
of multiple event types.
"""
payload: M
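A standalone sketch of why the wrapper helps: a bare `Union` is not a `BaseModel`, so it has no `model_validate_json`; wrapping it in a single-field model restores (de)serialization and allows a discriminated union. The event types below are invented for illustration.

```python
from typing import Literal, Union
from pydantic import BaseModel, Field

class RunEvent(BaseModel):
    kind: Literal["run"] = "run"
    graph_id: str

class StopEvent(BaseModel):
    kind: Literal["stop"] = "stop"
    reason: str

class Wrapper(BaseModel):
    payload: Union[RunEvent, StopEvent] = Field(discriminator="kind")

raw = Wrapper(payload=StopEvent(reason="user cancelled")).model_dump_json()
assert isinstance(Wrapper.model_validate_json(raw).payload, StopEvent)
```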
class RedisEventBus(BaseRedisEventBus[M], ABC):
@property
def connection(self) -> redis.Redis:
return redis.get_redis()
@@ -85,8 +86,6 @@ class RedisEventBus(BaseRedisEventBus[M], ABC):
class AsyncRedisEventBus(BaseRedisEventBus[M], ABC):
Model: type[M]
@property
async def connection(self) -> redis.AsyncRedis:
return await redis.get_redis_async()

View File

@@ -4,10 +4,18 @@ from enum import Enum
from typing import Awaitable, Optional
import aio_pika
import aio_pika.exceptions as aio_ex
import pika
import pika.adapters.blocking_connection
from pika.exceptions import AMQPError
from pika.spec import BasicProperties
from pydantic import BaseModel
from tenacity import (
retry,
retry_if_exception_type,
stop_after_attempt,
wait_random_exponential,
)
from backend.util.retry import conn_retry
from backend.util.settings import Settings
@@ -161,6 +169,12 @@ class SyncRabbitMQ(RabbitMQBase):
routing_key=queue.routing_key or queue.name,
)
@retry(
retry=retry_if_exception_type((AMQPError, ConnectionError)),
wait=wait_random_exponential(multiplier=1, max=5),
stop=stop_after_attempt(5),
reraise=True,
)
def publish_message(
self,
routing_key: str,
@@ -258,6 +272,12 @@ class AsyncRabbitMQ(RabbitMQBase):
exchange, routing_key=queue.routing_key or queue.name
)
@retry(
retry=retry_if_exception_type((aio_ex.AMQPError, ConnectionError)),
wait=wait_random_exponential(multiplier=1, max=5),
stop=stop_after_attempt(5),
reraise=True,
)
async def publish_message(
self,
routing_key: str,

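Not part of the diff: the same retry policy applied to a free-standing function, to show what the decorators above do (retry on AMQP/connection errors up to five attempts with jittered exponential backoff, then re-raise).

```python
from pika.exceptions import AMQPError
from tenacity import (
    retry,
    retry_if_exception_type,
    stop_after_attempt,
    wait_random_exponential,
)

@retry(
    retry=retry_if_exception_type((AMQPError, ConnectionError)),
    wait=wait_random_exponential(multiplier=1, max=5),
    stop=stop_after_attempt(5),
    reraise=True,
)
def publish_with_retry(publish_fn) -> None:
    publish_fn()  # re-raised to the caller after the 5th failed attempt
```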
View File

@@ -11,7 +11,7 @@ from fastapi import HTTPException
from prisma import Json
from prisma.enums import NotificationType
from prisma.models import User
from prisma.types import UserUpdateInput
from prisma.types import JsonFilter, UserCreateInput, UserUpdateInput
from backend.data.db import prisma
from backend.data.model import UserIntegrations, UserMetadata, UserMetadataRaw
@@ -36,11 +36,11 @@ async def get_or_create_user(user_data: dict) -> User:
user = await prisma.user.find_unique(where={"id": user_id})
if not user:
user = await prisma.user.create(
data={
"id": user_id,
"email": user_email,
"name": user_data.get("user_metadata", {}).get("name"),
}
data=UserCreateInput(
id=user_id,
email=user_email,
name=user_data.get("user_metadata", {}).get("name"),
)
)
return User.model_validate(user)
@@ -84,11 +84,11 @@ async def create_default_user() -> Optional[User]:
user = await prisma.user.find_unique(where={"id": DEFAULT_USER_ID})
if not user:
user = await prisma.user.create(
data={
"id": DEFAULT_USER_ID,
"email": "default@example.com",
"name": "Default User",
}
data=UserCreateInput(
id=DEFAULT_USER_ID,
email="default@example.com",
name="Default User",
)
)
return User.model_validate(user)
@@ -135,16 +135,21 @@ async def migrate_and_encrypt_user_integrations():
"""Migrate integration credentials and OAuth states from metadata to integrations column."""
users = await User.prisma().find_many(
where={
"metadata": {
"path": ["integration_credentials"],
"not": Json({"a": "yolo"}), # bogus value works to check if key exists
} # type: ignore
"metadata": cast(
JsonFilter,
{
"path": ["integration_credentials"],
"not": Json(
{"a": "yolo"}
), # bogus value works to check if key exists
},
)
}
)
logger.info(f"Migrating integration credentials for {len(users)} users")
for user in users:
raw_metadata = cast(UserMetadataRaw, user.metadata)
raw_metadata = cast(dict, user.metadata)
metadata = UserMetadata.model_validate(raw_metadata)
# Get existing integrations data
@@ -160,7 +165,6 @@ async def migrate_and_encrypt_user_integrations():
await update_user_integrations(user_id=user.id, data=integrations)
# Remove from metadata
raw_metadata = dict(raw_metadata)
raw_metadata.pop("integration_credentials", None)
raw_metadata.pop("integration_oauth_states", None)

View File

@@ -1,15 +1,12 @@
from backend.app import run_processes
from backend.executor import DatabaseManager, ExecutionManager
from backend.executor import ExecutionManager
def main():
"""
Run all the processes required for the AutoGPT-server REST API.
"""
run_processes(
DatabaseManager(),
ExecutionManager(),
)
run_processes(ExecutionManager())
if __name__ == "__main__":

View File

@@ -1,16 +1,18 @@
from backend.data.credit import get_user_credit_model
import logging
from backend.data import db
from backend.data.credit import UsageTransactionMetadata, get_user_credit_model
from backend.data.execution import (
ExecutionResult,
NodeExecutionEntry,
RedisExecutionEventBus,
create_graph_execution,
get_execution_results,
get_incomplete_executions,
get_latest_execution,
update_execution_status,
get_graph_execution,
get_incomplete_node_executions,
get_latest_node_execution,
get_node_execution_results,
update_graph_execution_start_time,
update_graph_execution_stats,
update_node_execution_stats,
update_node_execution_status,
update_node_execution_status_batch,
upsert_execution_input,
upsert_execution_output,
)
@@ -20,44 +22,65 @@ from backend.data.graph import (
get_graph_metadata,
get_node,
)
from backend.data.notifications import (
create_or_add_to_user_notification_batch,
empty_user_notification_batch,
get_all_batches_by_type,
get_user_notification_batch,
get_user_notification_oldest_message_in_batch,
)
from backend.data.user import (
get_active_user_ids_in_timerange,
get_user_email_by_id,
get_user_email_verification,
get_user_integrations,
get_user_metadata,
get_user_notification_preference,
update_user_integrations,
update_user_metadata,
)
from backend.util.service import AppService, expose, exposed_run_and_wait
from backend.util.service import AppService, exposed_run_and_wait
from backend.util.settings import Config
config = Config()
_user_credit_model = get_user_credit_model()
logger = logging.getLogger(__name__)
async def _spend_credits(entry: NodeExecutionEntry) -> int:
return await _user_credit_model.spend_credits(entry, 0, 0)
async def _spend_credits(
user_id: str, cost: int, metadata: UsageTransactionMetadata
) -> int:
return await _user_credit_model.spend_credits(user_id, cost, metadata)
class DatabaseManager(AppService):
def __init__(self):
super().__init__()
self.use_db = True
self.use_redis = True
self.event_queue = RedisExecutionEventBus()
def run_service(self) -> None:
logger.info(f"[{self.service_name}] ⏳ Connecting to Database...")
self.run_and_wait(db.connect())
super().run_service()
def cleanup(self):
super().cleanup()
logger.info(f"[{self.service_name}] ⏳ Disconnecting Database...")
self.run_and_wait(db.disconnect())
@classmethod
def get_port(cls) -> int:
return config.database_api_port
@expose
def send_execution_update(self, execution_result: ExecutionResult):
self.event_queue.publish(execution_result)
# Executions
get_graph_execution = exposed_run_and_wait(get_graph_execution)
create_graph_execution = exposed_run_and_wait(create_graph_execution)
get_execution_results = exposed_run_and_wait(get_execution_results)
get_incomplete_executions = exposed_run_and_wait(get_incomplete_executions)
get_latest_execution = exposed_run_and_wait(get_latest_execution)
update_execution_status = exposed_run_and_wait(update_execution_status)
get_node_execution_results = exposed_run_and_wait(get_node_execution_results)
get_incomplete_node_executions = exposed_run_and_wait(
get_incomplete_node_executions
)
get_latest_node_execution = exposed_run_and_wait(get_latest_node_execution)
update_node_execution_status = exposed_run_and_wait(update_node_execution_status)
update_node_execution_status_batch = exposed_run_and_wait(
update_node_execution_status_batch
)
update_graph_execution_start_time = exposed_run_and_wait(
update_graph_execution_start_time
)
@@ -80,3 +103,24 @@ class DatabaseManager(AppService):
update_user_metadata = exposed_run_and_wait(update_user_metadata)
get_user_integrations = exposed_run_and_wait(get_user_integrations)
update_user_integrations = exposed_run_and_wait(update_user_integrations)
# User Comms - async
get_active_user_ids_in_timerange = exposed_run_and_wait(
get_active_user_ids_in_timerange
)
get_user_email_by_id = exposed_run_and_wait(get_user_email_by_id)
get_user_email_verification = exposed_run_and_wait(get_user_email_verification)
get_user_notification_preference = exposed_run_and_wait(
get_user_notification_preference
)
# Notifications - async
create_or_add_to_user_notification_batch = exposed_run_and_wait(
create_or_add_to_user_notification_batch
)
empty_user_notification_batch = exposed_run_and_wait(empty_user_notification_batch)
get_all_batches_by_type = exposed_run_and_wait(get_all_batches_by_type)
get_user_notification_batch = exposed_run_and_wait(get_user_notification_batch)
get_user_notification_oldest_message_in_batch = exposed_run_and_wait(
get_user_notification_oldest_message_in_batch
)
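A sketch of the calling side (assuming the usual `get_service_client(ServiceClass)` pattern used elsewhere in this codebase): on the executor, the exposed coroutines above are invoked synchronously through the service client, e.g. the new `spend_credits` signature.

```python
from backend.executor import DatabaseManager
from backend.executor.utils import UsageTransactionMetadata
from backend.util.service import get_service_client

db_client = get_service_client(DatabaseManager)  # assumed client-construction pattern
remaining_balance = db_client.spend_credits(
    user_id="user-123",  # hypothetical
    cost=5,
    metadata=UsageTransactionMetadata(graph_exec_id="graph-exec-456"),
)
```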

View File

@@ -5,14 +5,17 @@ import os
import signal
import sys
import threading
import time
from concurrent.futures import Future, ProcessPoolExecutor
from contextlib import contextmanager
from multiprocessing.pool import AsyncResult, Pool
from typing import TYPE_CHECKING, Any, Generator, Optional, TypeVar, cast
from typing import TYPE_CHECKING, Any, Generator, TypeVar, cast
from pika.adapters.blocking_connection import BlockingChannel
from pika.spec import Basic, BasicProperties
from redis.lock import Lock as RedisLock
from backend.blocks.basic import AgentOutputBlock
from backend.blocks.io import AgentOutputBlock
from backend.data.model import GraphExecutionStats, NodeExecutionStats
from backend.data.notifications import (
AgentRunData,
@@ -26,42 +29,40 @@ if TYPE_CHECKING:
from backend.executor import DatabaseManager
from backend.notifications.notifications import NotificationManager
from autogpt_libs.utils.cache import thread_cached
from autogpt_libs.utils.cache import clear_thread_cache, thread_cached
from backend.blocks.agent import AgentExecutorBlock
from backend.data import redis
from backend.data.block import (
Block,
BlockData,
BlockInput,
BlockSchema,
BlockType,
get_block,
)
from backend.data.block import BlockData, BlockInput, BlockSchema, get_block
from backend.data.execution import (
ExecutionQueue,
ExecutionResult,
ExecutionStatus,
GraphExecution,
GraphExecutionEntry,
NodeExecutionEntry,
merge_execution_input,
parse_execution_output,
NodeExecutionResult,
)
from backend.data.graph import Link, Node
from backend.executor.utils import (
GRAPH_EXECUTION_CANCEL_QUEUE_NAME,
GRAPH_EXECUTION_QUEUE_NAME,
CancelExecutionEvent,
UsageTransactionMetadata,
block_usage_cost,
execution_usage_cost,
get_execution_event_bus,
get_execution_queue,
parse_execution_output,
validate_exec,
)
from backend.data.graph import GraphModel, Link, Node
from backend.integrations.creds_manager import IntegrationCredentialsManager
from backend.util import json
from backend.util.decorator import error_logged, time_measured
from backend.util.file import clean_exec_files
from backend.util.logging import configure_logging
from backend.util.process import set_service_name
from backend.util.service import (
AppService,
close_service_client,
expose,
get_service_client,
)
from backend.util.process import AppProcess, set_service_name
from backend.util.service import close_service_client, get_service_client
from backend.util.settings import Settings
from backend.util.type import convert
logger = logging.getLogger(__name__)
settings = Settings()
@@ -86,7 +87,7 @@ class LogMetadata:
"node_id": node_id,
"block_name": block_name,
}
self.prefix = f"[ExecutionManager|uid:{user_id}|gid:{graph_id}|nid:{node_id}]|geid:{graph_eid}|nid:{node_eid}|{block_name}]"
self.prefix = f"[ExecutionManager|uid:{user_id}|gid:{graph_id}|nid:{node_id}]|geid:{graph_eid}|neid:{node_eid}|{block_name}]"
def info(self, msg: str, **extra):
msg = self._wrap(msg, **extra)
@@ -144,17 +145,22 @@ def execute_node(
node_exec_id = data.node_exec_id
node_id = data.node_id
def update_execution(status: ExecutionStatus) -> ExecutionResult:
exec_update = db_client.update_execution_status(node_exec_id, status)
db_client.send_execution_update(exec_update)
def update_execution_status(status: ExecutionStatus) -> NodeExecutionResult:
"""Sets status and fetches+broadcasts the latest state of the node execution"""
exec_update = db_client.update_node_execution_status(node_exec_id, status)
send_execution_update(exec_update)
return exec_update
node = db_client.get_node(node_id)
node_block = get_block(node.block_id)
if not node_block:
logger.error(f"Block {node.block_id} not found.")
return
node_block = node.block
def push_output(output_name: str, output_data: Any) -> None:
db_client.upsert_execution_output(
node_exec_id=node_exec_id,
output_name=output_name,
output_data=output_data,
)
log_metadata = LogMetadata(
user_id=user_id,
@@ -169,8 +175,8 @@ def execute_node(
input_data, error = validate_exec(node, data.data, resolve_input=False)
if input_data is None:
log_metadata.error(f"Skip execution, input validation error: {error}")
db_client.upsert_execution_output(node_exec_id, "error", error)
update_execution(ExecutionStatus.FAILED)
push_output("error", error)
update_execution_status(ExecutionStatus.FAILED)
return
# Re-shape the input data for agent block.
@@ -182,8 +188,8 @@ def execute_node(
# Execute the node
input_data_str = json.dumps(input_data)
input_size = len(input_data_str)
log_metadata.info("Executed node with input", input=input_data_str)
update_execution(ExecutionStatus.RUNNING)
log_metadata.debug("Executed node with input", input=input_data_str)
update_execution_status(ExecutionStatus.RUNNING)
# Inject extra execution arguments for the blocks via kwargs
extra_exec_kwargs: dict = {
@@ -206,19 +212,15 @@ def execute_node(
extra_exec_kwargs[field_name] = credentials
output_size = 0
cost = 0
try:
# Charge the user for the execution before running the block.
cost = db_client.spend_credits(data)
outputs: dict[str, Any] = {}
for output_name, output_data in node_block.execute(
input_data, **extra_exec_kwargs
):
output_data = json.convert_pydantic_to_json(output_data)
output_size += len(json.dumps(output_data))
log_metadata.info("Node produced output", **{output_name: output_data})
db_client.upsert_execution_output(node_exec_id, output_name, output_data)
log_metadata.debug("Node produced output", **{output_name: output_data})
push_output(output_name, output_data)
outputs[output_name] = output_data
for execution in _enqueue_next_nodes(
db_client=db_client,
@@ -231,13 +233,12 @@ def execute_node(
):
yield execution
# Update execution status and spend credits
update_execution(ExecutionStatus.COMPLETED)
update_execution_status(ExecutionStatus.COMPLETED)
except Exception as e:
error_msg = str(e)
db_client.upsert_execution_output(node_exec_id, "error", error_msg)
update_execution(ExecutionStatus.FAILED)
push_output("error", error_msg)
update_execution_status(ExecutionStatus.FAILED)
for execution in _enqueue_next_nodes(
db_client=db_client,
@@ -266,7 +267,6 @@ def execute_node(
)
execution_stats.input_size = input_size
execution_stats.output_size = output_size
execution_stats.cost = cost
def _enqueue_next_nodes(
@@ -281,10 +281,10 @@ def _enqueue_next_nodes(
def add_enqueued_execution(
node_exec_id: str, node_id: str, block_id: str, data: BlockInput
) -> NodeExecutionEntry:
exec_update = db_client.update_execution_status(
exec_update = db_client.update_node_execution_status(
node_exec_id, ExecutionStatus.QUEUED, data
)
db_client.send_execution_update(exec_update)
send_execution_update(exec_update)
return NodeExecutionEntry(
user_id=user_id,
graph_exec_id=graph_exec_id,
@@ -326,7 +326,7 @@ def _enqueue_next_nodes(
if link.is_static and link.sink_name not in next_node_input
}
if static_link_names and (
latest_execution := db_client.get_latest_execution(
latest_execution := db_client.get_latest_node_execution(
next_node_id, graph_exec_id
)
):
@@ -359,7 +359,7 @@ def _enqueue_next_nodes(
# If link is static, there could be some incomplete executions waiting for it.
# Load and complete the missing input data, then try to re-enqueue them.
for iexec in db_client.get_incomplete_executions(
for iexec in db_client.get_incomplete_node_executions(
next_node_id, graph_exec_id
):
idata = iexec.input_data
@@ -396,60 +396,6 @@ def _enqueue_next_nodes(
]
def validate_exec(
node: Node,
data: BlockInput,
resolve_input: bool = True,
) -> tuple[BlockInput | None, str]:
"""
Validate the input data for a node execution.
Args:
node: The node to execute.
data: The input data for the node execution.
resolve_input: Whether to resolve dynamic pins into dict/list/object.
Returns:
A tuple of the validated data and the block name.
If the data is invalid, the first element will be None, and the second element
will be an error message.
If the data is valid, the first element will be the resolved input data, and
the second element will be the block name.
"""
node_block: Block | None = get_block(node.block_id)
if not node_block:
return None, f"Block for {node.block_id} not found."
schema = node_block.input_schema
# Convert non-matching data types to the expected input schema.
for name, data_type in schema.__annotations__.items():
if (value := data.get(name)) and (type(value) is not data_type):
data[name] = convert(value, data_type)
# Input data (without default values) should contain all required fields.
error_prefix = f"Input data missing or mismatch for `{node_block.name}`:"
if missing_links := schema.get_missing_links(data, node.input_links):
return None, f"{error_prefix} unpopulated links {missing_links}"
# Merge input data with default values and resolve dynamic dict/list/object pins.
input_default = schema.get_input_defaults(node.input_default)
data = {**input_default, **data}
if resolve_input:
data = merge_execution_input(data)
# Input data post-merge should contain all required fields from the schema.
if missing_input := schema.get_missing_input(data):
return None, f"{error_prefix} missing input {missing_input}"
# Last validation: Validate the input values against the schema.
if error := schema.get_mismatch_error(data):
error_message = f"{error_prefix} {error}"
logger.error(error_message)
return None, error_message
return data, node_block.name
class Executor:
"""
This class contains event handlers for the process pool executor events.
@@ -626,25 +572,79 @@ class Executor:
node_eid="*",
block_name="-",
)
cls.db_client.update_graph_execution_start_time(graph_exec.graph_exec_id)
exec_meta = cls.db_client.update_graph_execution_start_time(
graph_exec.graph_exec_id
)
if exec_meta is None:
logger.warning(
f"Skipped graph execution {graph_exec.graph_exec_id}, the graph execution is not found or not currently in the QUEUED state."
)
return
send_execution_update(exec_meta)
timing_info, (exec_stats, status, error) = cls._on_graph_execution(
graph_exec, cancel, log_metadata
)
exec_stats.walltime = timing_info.wall_time
exec_stats.cputime = timing_info.cpu_time
exec_stats.error = error
exec_stats.error = str(error)
if isinstance(exec_stats.error, Exception):
exec_stats.error = str(exec_stats.error)
result = cls.db_client.update_graph_execution_stats(
if graph_exec_result := cls.db_client.update_graph_execution_stats(
graph_exec_id=graph_exec.graph_exec_id,
status=status,
stats=exec_stats,
)
cls.db_client.send_execution_update(result)
):
send_execution_update(graph_exec_result)
cls._handle_agent_run_notif(graph_exec, exec_stats)
@classmethod
def _charge_usage(
cls,
node_exec: NodeExecutionEntry,
execution_count: int,
execution_stats: GraphExecutionStats,
) -> int:
block = get_block(node_exec.block_id)
if not block:
logger.error(f"Block {node_exec.block_id} not found.")
return execution_count
cost, matching_filter = block_usage_cost(block=block, input_data=node_exec.data)
if cost > 0:
cls.db_client.spend_credits(
user_id=node_exec.user_id,
cost=cost,
metadata=UsageTransactionMetadata(
graph_exec_id=node_exec.graph_exec_id,
graph_id=node_exec.graph_id,
node_exec_id=node_exec.node_exec_id,
node_id=node_exec.node_id,
block_id=node_exec.block_id,
block=block.name,
input=matching_filter,
),
)
execution_stats.cost += cost
cost, execution_count = execution_usage_cost(execution_count)
if cost > 0:
cls.db_client.spend_credits(
user_id=node_exec.user_id,
cost=cost,
metadata=UsageTransactionMetadata(
graph_exec_id=node_exec.graph_exec_id,
graph_id=node_exec.graph_id,
input={
"execution_count": execution_count,
"charge": "Execution Cost",
},
),
)
execution_stats.cost += cost
return execution_count
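Related, a small sketch of using `block_usage_cost` on its own to preview what a node run would be charged, before any transaction is committed (the helper name, block id, and input are hypothetical):

```python
from backend.data.block import get_block
from backend.executor.utils import block_usage_cost

def preview_block_cost(block_id: str, input_data: dict) -> int:
    block = get_block(block_id)
    if not block:
        return 0
    cost, _matching_filter = block_usage_cost(block=block, input_data=input_data)
    return cost  # credits that would be charged for this run
```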
@classmethod
@time_measured
def _on_graph_execution(
@@ -660,15 +660,19 @@ class Executor:
Exception | None: The error that occurred during the execution, if any.
"""
log_metadata.info(f"Start graph execution {graph_exec.graph_exec_id}")
exec_stats = GraphExecutionStats()
execution_stats = GraphExecutionStats()
execution_status = ExecutionStatus.RUNNING
error = None
finished = False
def cancel_handler():
nonlocal execution_status
while not cancel.is_set():
cancel.wait(1)
if finished:
return
execution_status = ExecutionStatus.TERMINATED
cls.executor.terminate()
log_metadata.info(f"Terminated graph execution {graph_exec.graph_exec_id}")
cls._init_node_executor_pool()
@@ -679,44 +683,51 @@ class Executor:
try:
queue = ExecutionQueue[NodeExecutionEntry]()
for node_exec in graph_exec.start_node_execs:
exec_update = cls.db_client.update_execution_status(
node_exec.node_exec_id, ExecutionStatus.QUEUED, node_exec.data
)
cls.db_client.send_execution_update(exec_update)
queue.add(node_exec)
exec_cost_counter = 0
running_executions: dict[str, AsyncResult] = {}
low_balance_error: Optional[InsufficientBalanceError] = None
def make_exec_callback(exec_data: NodeExecutionEntry):
def callback(result: object):
running_executions.pop(exec_data.node_id)
if not isinstance(result, NodeExecutionStats):
return
nonlocal exec_stats, low_balance_error
exec_stats.node_count += 1
exec_stats.nodes_cputime += result.cputime
exec_stats.nodes_walltime += result.walltime
exec_stats.cost += result.cost
nonlocal execution_stats
execution_stats.node_count += 1
execution_stats.nodes_cputime += result.cputime
execution_stats.nodes_walltime += result.walltime
if (err := result.error) and isinstance(err, Exception):
exec_stats.node_error_count += 1
execution_stats.node_error_count += 1
if isinstance(err, InsufficientBalanceError):
low_balance_error = err
if _graph_exec := cls.db_client.update_graph_execution_stats(
graph_exec_id=exec_data.graph_exec_id,
status=execution_status,
stats=execution_stats,
):
send_execution_update(_graph_exec)
else:
logger.error(
"Callback for "
f"finished node execution #{exec_data.node_exec_id} "
"could not update execution stats "
f"for graph execution #{exec_data.graph_exec_id}; "
f"triggered while graph exec status = {execution_status}"
)
return callback
while not queue.empty():
if cancel.is_set():
return exec_stats, ExecutionStatus.TERMINATED, error
execution_status = ExecutionStatus.TERMINATED
return execution_stats, execution_status, error
exec_data = queue.get()
queued_node_exec = queue.get()
# Avoid parallel execution of the same node.
execution = running_executions.get(exec_data.node_id)
execution = running_executions.get(queued_node_exec.node_id)
if execution and not execution.ready():
# TODO (performance improvement):
# Wait for the completion of the same node execution is blocking.
@@ -725,13 +736,55 @@ class Executor:
execution.wait()
log_metadata.debug(
f"Dispatching node execution {exec_data.node_exec_id} "
f"for node {exec_data.node_id}",
f"Dispatching node execution {queued_node_exec.node_exec_id} "
f"for node {queued_node_exec.node_id}",
)
running_executions[exec_data.node_id] = cls.executor.apply_async(
try:
exec_cost_counter = cls._charge_usage(
node_exec=queued_node_exec,
execution_count=exec_cost_counter + 1,
execution_stats=execution_stats,
)
except InsufficientBalanceError as error:
node_exec_id = queued_node_exec.node_exec_id
cls.db_client.upsert_execution_output(
node_exec_id=node_exec_id,
output_name="error",
output_data=str(error),
)
execution_status = ExecutionStatus.FAILED
exec_update = cls.db_client.update_node_execution_status(
node_exec_id, execution_status
)
send_execution_update(exec_update)
cls._handle_low_balance_notif(
graph_exec.user_id,
graph_exec.graph_id,
execution_stats,
error,
)
raise
# Add credentials input overrides
node_id = queued_node_exec.node_id
if (node_creds_map := graph_exec.node_credentials_input_map) and (
node_field_creds_map := node_creds_map.get(node_id)
):
queued_node_exec.data.update(
{
field_name: creds_meta.model_dump()
for field_name, creds_meta in node_field_creds_map.items()
}
)
# Initiate node execution
running_executions[queued_node_exec.node_id] = cls.executor.apply_async(
cls.on_node_execution,
(queue, exec_data),
callback=make_exec_callback(exec_data),
(queue, queued_node_exec),
callback=make_exec_callback(queued_node_exec),
)
# Avoid terminating graph execution when some nodes are still running.
@@ -741,7 +794,8 @@ class Executor:
)
for node_id, execution in list(running_executions.items()):
if cancel.is_set():
return exec_stats, ExecutionStatus.TERMINATED, error
execution_status = ExecutionStatus.TERMINATED
return execution_stats, execution_status, error
if not queue.empty():
break # yield to parent loop to execute new queue items
@@ -751,32 +805,24 @@ class Executor:
log_metadata.info(f"Finished graph execution {graph_exec.graph_exec_id}")
if isinstance(low_balance_error, InsufficientBalanceError):
cls._handle_low_balance_notif(
graph_exec.user_id,
graph_exec.graph_id,
exec_stats,
low_balance_error,
)
raise low_balance_error
except Exception as e:
log_metadata.exception(
f"Failed graph execution {graph_exec.graph_exec_id}: {e}"
)
error = e
finally:
if error:
log_metadata.error(
f"Failed graph execution {graph_exec.graph_exec_id}: {error}"
)
execution_status = ExecutionStatus.FAILED
else:
execution_status = ExecutionStatus.COMPLETED
if not cancel.is_set():
finished = True
cancel.set()
cancel_thread.join()
clean_exec_files(graph_exec.graph_exec_id)
return (
exec_stats,
ExecutionStatus.FAILED if error else ExecutionStatus.COMPLETED,
error,
)
return execution_stats, execution_status, error
@classmethod
def _handle_agent_run_notif(
@@ -787,7 +833,10 @@ class Executor:
metadata = cls.db_client.get_graph_metadata(
graph_exec.graph_id, graph_exec.graph_version
)
outputs = cls.db_client.get_execution_results(graph_exec.graph_exec_id)
outputs = cls.db_client.get_node_execution_results(
graph_exec.graph_exec_id,
block_ids=[AgentOutputBlock().id],
)
named_outputs = [
{
@@ -795,7 +844,6 @@ class Executor:
for key, value in output.output_data.items()
}
for output in outputs
if output.block_id == AgentOutputBlock().id
]
event = NotificationEventDTO(
@@ -840,222 +888,178 @@ class Executor:
)
class ExecutionManager(AppService):
class ExecutionManager(AppProcess):
def __init__(self):
super().__init__()
self.use_redis = True
self.use_supabase = True
self.pool_size = settings.config.num_graph_workers
self.queue = ExecutionQueue[GraphExecutionEntry]()
self.running = True
self.active_graph_runs: dict[str, tuple[Future, threading.Event]] = {}
@classmethod
def get_port(cls) -> int:
return settings.config.execution_manager_port
def run_service(self):
from backend.integrations.credentials_store import IntegrationCredentialsStore
def run(self):
retry_count_max = settings.config.execution_manager_loop_max_retry
retry_count = 0
self.credentials_store = IntegrationCredentialsStore()
for retry_count in range(retry_count_max):
try:
self._run()
except Exception as e:
if not self.running:
break
logger.exception(
f"[{self.service_name}] Error in execution manager: {e}"
)
if retry_count >= retry_count_max:
logger.error(
f"[{self.service_name}] Max retries reached ({retry_count_max}), exiting..."
)
break
else:
logger.info(
f"[{self.service_name}] Retrying execution loop in {retry_count} seconds..."
)
time.sleep(retry_count)
def _run(self):
logger.info(f"[{self.service_name}] ⏳ Spawn max-{self.pool_size} workers...")
self.executor = ProcessPoolExecutor(
max_workers=self.pool_size,
initializer=Executor.on_graph_executor_start,
)
sync_manager = multiprocessing.Manager()
logger.info(
f"[{self.service_name}] Started with max-{self.pool_size} graph workers"
logger.info(f"[{self.service_name}] ⏳ Connecting to Redis...")
redis.connect()
# Consume Cancel & Run execution requests.
clear_thread_cache(get_execution_queue)
channel = get_execution_queue().get_channel()
channel.basic_qos(prefetch_count=self.pool_size)
channel.basic_consume(
queue=GRAPH_EXECUTION_CANCEL_QUEUE_NAME,
on_message_callback=self._handle_cancel_message,
auto_ack=True,
)
while True:
graph_exec_data = self.queue.get()
graph_exec_id = graph_exec_data.graph_exec_id
logger.debug(
f"[ExecutionManager] Dispatching graph execution {graph_exec_id}"
)
cancel_event = sync_manager.Event()
future = self.executor.submit(
Executor.on_graph_execution, graph_exec_data, cancel_event
)
self.active_graph_runs[graph_exec_id] = (future, cancel_event)
future.add_done_callback(
lambda _: self.active_graph_runs.pop(graph_exec_id, None)
channel.basic_consume(
queue=GRAPH_EXECUTION_QUEUE_NAME,
on_message_callback=self._handle_run_message,
auto_ack=False,
)
logger.info(f"[{self.service_name}] Ready to consume messages...")
channel.start_consuming()
def _handle_cancel_message(
self,
channel: BlockingChannel,
method: Basic.Deliver,
properties: BasicProperties,
body: bytes,
):
"""
Called whenever we receive a CANCEL message from the queue.
(With auto_ack=True, message is considered 'acked' automatically.)
"""
try:
request = CancelExecutionEvent.model_validate_json(body)
graph_exec_id = request.graph_exec_id
if not graph_exec_id:
logger.warning(
f"[{self.service_name}] Cancel message missing 'graph_exec_id'"
)
return
if graph_exec_id not in self.active_graph_runs:
logger.debug(
f"[{self.service_name}] Cancel received for {graph_exec_id} but not active."
)
return
_, cancel_event = self.active_graph_runs[graph_exec_id]
logger.info(f"[{self.service_name}] Received cancel for {graph_exec_id}")
if not cancel_event.is_set():
cancel_event.set()
else:
logger.debug(
f"[{self.service_name}] Cancel already set for {graph_exec_id}"
)
except Exception as e:
logger.exception(f"Error handling cancel message: {e}")
def _handle_run_message(
self,
channel: BlockingChannel,
method: Basic.Deliver,
properties: BasicProperties,
body: bytes,
):
delivery_tag = method.delivery_tag
try:
graph_exec_entry = GraphExecutionEntry.model_validate_json(body)
except Exception as e:
logger.error(f"[{self.service_name}] Could not parse run message: {e}")
channel.basic_nack(delivery_tag, requeue=False)
return
graph_exec_id = graph_exec_entry.graph_exec_id
logger.info(
f"[{self.service_name}] Received RUN for graph_exec_id={graph_exec_id}"
)
if graph_exec_id in self.active_graph_runs:
logger.warning(
f"[{self.service_name}] Graph {graph_exec_id} already running; rejecting duplicate run."
)
channel.basic_nack(delivery_tag, requeue=False)
return
cancel_event = multiprocessing.Manager().Event()
future = self.executor.submit(
Executor.on_graph_execution, graph_exec_entry, cancel_event
)
self.active_graph_runs[graph_exec_id] = (future, cancel_event)
def _on_run_done(f: Future):
logger.info(f"[{self.service_name}] Run completed for {graph_exec_id}")
try:
self.active_graph_runs.pop(graph_exec_id, None)
if f.exception():
logger.error(
f"[{self.service_name}] Execution for {graph_exec_id} failed: {f.exception()}"
)
channel.connection.add_callback_threadsafe(
lambda: channel.basic_nack(delivery_tag, requeue=False)
)
else:
channel.connection.add_callback_threadsafe(
lambda: channel.basic_ack(delivery_tag)
)
except Exception as e:
logger.error(f"[{self.service_name}] Error acknowledging message: {e}")
future.add_done_callback(_on_run_done)
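The run-queue consumer only acks or nacks once the worker future completes, and those acknowledgements must be issued on the connection's I/O thread. A minimal standalone pika sketch of that pattern (placeholder connection; the real service gets its channel from the `SyncRabbitMQ` wrapper):

```python
# Sketch of the thread-safe ack/nack pattern used by _on_run_done above.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

def finish_from_worker_thread(delivery_tag: int, ok: bool) -> None:
    # basic_ack/basic_nack are not thread-safe, so hand them back to the
    # connection's I/O loop instead of calling them from the worker thread.
    if ok:
        connection.add_callback_threadsafe(lambda: channel.basic_ack(delivery_tag))
    else:
        connection.add_callback_threadsafe(
            lambda: channel.basic_nack(delivery_tag, requeue=False)
        )
```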
def cleanup(self):
logger.info(f"[{__class__.__name__}] ⏳ Shutting down graph executor pool...")
super().cleanup()
logger.info(f"[{self.service_name}] ⏳ Shutting down service loop...")
self.running = False
logger.info(f"[{self.service_name}] ⏳ Shutting down RabbitMQ channel...")
get_execution_queue().get_channel().stop_consuming()
logger.info(f"[{self.service_name}] ⏳ Shutting down graph executor pool...")
self.executor.shutdown(cancel_futures=True)
super().cleanup()
logger.info(f"[{self.service_name}] ⏳ Disconnecting Redis...")
redis.disconnect()
@property
def db_client(self) -> "DatabaseManager":
return get_db_client()
@expose
def add_execution(
self,
graph_id: str,
data: BlockInput,
user_id: str,
graph_version: Optional[int] = None,
preset_id: str | None = None,
) -> GraphExecutionEntry:
graph: GraphModel | None = self.db_client.get_graph(
graph_id=graph_id, user_id=user_id, version=graph_version
)
if not graph:
raise ValueError(f"Graph #{graph_id} not found.")
graph.validate_graph(for_run=True)
self._validate_node_input_credentials(graph, user_id)
nodes_input = []
for node in graph.starting_nodes:
input_data = {}
block = get_block(node.block_id)
# Invalid block & Note block should never be executed.
if not block or block.block_type == BlockType.NOTE:
continue
# Extract request input data, and assign it to the input pin.
if block.block_type == BlockType.INPUT:
input_name = node.input_default.get("name")
if input_name and input_name in data:
input_data = {"value": data[input_name]}
# Extract webhook payload, and assign it to the input pin
webhook_payload_key = f"webhook_{node.webhook_id}_payload"
if (
block.block_type in (BlockType.WEBHOOK, BlockType.WEBHOOK_MANUAL)
and node.webhook_id
):
if webhook_payload_key not in data:
raise ValueError(
f"Node {block.name} #{node.id} webhook payload is missing"
)
input_data = {"payload": data[webhook_payload_key]}
input_data, error = validate_exec(node, input_data)
if input_data is None:
raise ValueError(error)
else:
nodes_input.append((node.id, input_data))
if not nodes_input:
raise ValueError(
"No starting nodes found for the graph, make sure an AgentInput or blocks with no inbound links are present as starting nodes."
)
graph_exec_id, node_execs = self.db_client.create_graph_execution(
graph_id=graph_id,
graph_version=graph.version,
nodes_input=nodes_input,
user_id=user_id,
preset_id=preset_id,
)
starting_node_execs = []
for node_exec in node_execs:
starting_node_execs.append(
NodeExecutionEntry(
user_id=user_id,
graph_exec_id=node_exec.graph_exec_id,
graph_id=node_exec.graph_id,
node_exec_id=node_exec.node_exec_id,
node_id=node_exec.node_id,
block_id=node_exec.block_id,
data=node_exec.input_data,
)
)
graph_exec = GraphExecutionEntry(
user_id=user_id,
graph_id=graph_id,
graph_version=graph_version or 0,
graph_exec_id=graph_exec_id,
start_node_execs=starting_node_execs,
)
self.queue.add(graph_exec)
return graph_exec
@expose
def cancel_execution(self, graph_exec_id: str) -> None:
"""
Mechanism:
1. Set the cancel event
2. Graph executor's cancel handler thread detects the event, terminates workers,
reinitializes worker pool, and returns.
3. Update execution statuses in DB and set `error` outputs to `"TERMINATED"`.
"""
if graph_exec_id not in self.active_graph_runs:
raise Exception(
f"Graph execution #{graph_exec_id} not active/running: "
"possibly already completed/cancelled."
)
future, cancel_event = self.active_graph_runs[graph_exec_id]
if cancel_event.is_set():
return
cancel_event.set()
future.result()
# Update the status of the unfinished node executions
node_execs = self.db_client.get_execution_results(graph_exec_id)
for node_exec in node_execs:
if node_exec.status not in (
ExecutionStatus.COMPLETED,
ExecutionStatus.FAILED,
):
exec_update = self.db_client.update_execution_status(
node_exec.node_exec_id, ExecutionStatus.TERMINATED
)
self.db_client.send_execution_update(exec_update)
def _validate_node_input_credentials(self, graph: GraphModel, user_id: str):
"""Checks all credentials for all nodes of the graph"""
for node in graph.nodes:
block = get_block(node.block_id)
if not block:
raise ValueError(f"Unknown block {node.block_id} for node #{node.id}")
# Find any fields of type CredentialsMetaInput
credentials_fields = cast(
type[BlockSchema], block.input_schema
).get_credentials_fields()
if not credentials_fields:
continue
for field_name, credentials_meta_type in credentials_fields.items():
credentials_meta = credentials_meta_type.model_validate(
node.input_default[field_name]
)
# Fetch the corresponding Credentials and perform sanity checks
credentials = self.credentials_store.get_creds_by_id(
user_id, credentials_meta.id
)
if not credentials:
raise ValueError(
f"Unknown credentials #{credentials_meta.id} "
f"for node #{node.id} input '{field_name}'"
)
if (
credentials.provider != credentials_meta.provider
or credentials.type != credentials_meta.type
):
logger.warning(
f"Invalid credentials #{credentials.id} for node #{node.id}: "
"type/provider mismatch: "
f"{credentials_meta.type}<>{credentials.type};"
f"{credentials_meta.provider}<>{credentials.provider}"
)
raise ValueError(
f"Invalid credentials #{credentials.id} for node #{node.id}: "
"type/provider mismatch"
)
# ------- UTILITIES ------- #
@@ -1074,6 +1078,10 @@ def get_notification_service() -> "NotificationManager":
return get_service_client(NotificationManager)
def send_execution_update(entry: GraphExecution | NodeExecutionResult):
return get_execution_event_bus().publish(entry)
@contextmanager
def synchronized(key: str, timeout: int = 60):
lock: RedisLock = redis.get_redis().lock(f"lock:{key}", timeout=timeout)

View File

@@ -16,7 +16,7 @@ from pydantic import BaseModel
from sqlalchemy import MetaData, create_engine
from backend.data.block import BlockInput
from backend.executor.manager import ExecutionManager
from backend.executor import utils as execution_utils
from backend.notifications.notifications import NotificationManager
from backend.util.service import AppService, expose, get_service_client
from backend.util.settings import Config
@@ -57,11 +57,6 @@ def job_listener(event):
log(f"Job {event.job_id} completed successfully.")
@thread_cached
def get_execution_client() -> ExecutionManager:
return get_service_client(ExecutionManager)
@thread_cached
def get_notification_client():
from backend.notifications import NotificationManager
@@ -73,9 +68,9 @@ def execute_graph(**kwargs):
args = ExecutionJobArgs(**kwargs)
try:
log(f"Executing recurring job for graph #{args.graph_id}")
get_execution_client().add_execution(
execution_utils.add_graph_execution(
graph_id=args.graph_id,
data=args.input_data,
inputs=args.input_data,
user_id=args.user_id,
graph_version=args.graph_version,
)
@@ -164,11 +159,6 @@ class Scheduler(AppService):
def db_pool_size(cls) -> int:
return config.scheduler_db_pool_size
@property
@thread_cached
def execution_client(self) -> ExecutionManager:
return get_service_client(ExecutionManager)
@property
@thread_cached
def notification_client(self) -> NotificationManager:
@@ -176,7 +166,7 @@ class Scheduler(AppService):
def run_service(self):
load_dotenv()
db_schema, db_url = _extract_schema_from_url(os.getenv("DATABASE_URL"))
db_schema, db_url = _extract_schema_from_url(os.getenv("DIRECT_URL"))
self.scheduler = BlockingScheduler(
jobstores={
Jobstores.EXECUTION.value: SQLAlchemyJobStore(
@@ -206,6 +196,12 @@ class Scheduler(AppService):
self.scheduler.add_listener(job_listener, EVENT_JOB_EXECUTED | EVENT_JOB_ERROR)
self.scheduler.start()
def cleanup(self):
super().cleanup()
logger.info(f"[{self.service_name}] ⏳ Shutting down scheduler...")
if self.scheduler:
self.scheduler.shutdown(wait=False)
@expose
def add_execution_schedule(
self,

View File

@@ -0,0 +1,748 @@
import logging
from typing import TYPE_CHECKING, Any, Optional, cast
from autogpt_libs.utils.cache import thread_cached
from pydantic import BaseModel
from backend.data.block import (
Block,
BlockData,
BlockInput,
BlockSchema,
BlockType,
get_block,
)
from backend.data.block_cost_config import BLOCK_COSTS
from backend.data.cost import BlockCostType
from backend.data.execution import (
AsyncRedisExecutionEventBus,
ExecutionStatus,
GraphExecutionStats,
GraphExecutionWithNodes,
RedisExecutionEventBus,
create_graph_execution,
update_graph_execution_stats,
update_node_execution_status_batch,
)
from backend.data.graph import GraphModel, Node, get_graph
from backend.data.model import CredentialsMetaInput
from backend.data.rabbitmq import (
AsyncRabbitMQ,
Exchange,
ExchangeType,
Queue,
RabbitMQConfig,
SyncRabbitMQ,
)
from backend.util.exceptions import NotFoundError
from backend.util.mock import MockObject
from backend.util.service import get_service_client
from backend.util.settings import Config
from backend.util.type import convert
if TYPE_CHECKING:
from backend.executor import DatabaseManager
from backend.integrations.credentials_store import IntegrationCredentialsStore
config = Config()
logger = logging.getLogger(__name__)
# ============ Resource Helpers ============ #
@thread_cached
def get_execution_event_bus() -> RedisExecutionEventBus:
return RedisExecutionEventBus()
@thread_cached
def get_async_execution_event_bus() -> AsyncRedisExecutionEventBus:
return AsyncRedisExecutionEventBus()
@thread_cached
def get_execution_queue() -> SyncRabbitMQ:
client = SyncRabbitMQ(create_execution_queue_config())
client.connect()
return client
@thread_cached
async def get_async_execution_queue() -> AsyncRabbitMQ:
client = AsyncRabbitMQ(create_execution_queue_config())
await client.connect()
return client
@thread_cached
def get_integration_credentials_store() -> "IntegrationCredentialsStore":
from backend.integrations.credentials_store import IntegrationCredentialsStore
return IntegrationCredentialsStore()
@thread_cached
def get_db_client() -> "DatabaseManager":
from backend.executor import DatabaseManager
return get_service_client(DatabaseManager)
# ============ Execution Cost Helpers ============ #
class UsageTransactionMetadata(BaseModel):
graph_exec_id: str | None = None
graph_id: str | None = None
node_id: str | None = None
node_exec_id: str | None = None
block_id: str | None = None
block: str | None = None
input: BlockInput | None = None
def execution_usage_cost(execution_count: int) -> tuple[int, int]:
"""
Calculate the cost of executing a graph based on the number of executions.
Args:
execution_count: Number of executions
Returns:
Tuple of cost amount and remaining execution count
"""
return (
execution_count
// config.execution_cost_count_threshold
* config.execution_cost_per_threshold,
execution_count % config.execution_cost_count_threshold,
)
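For reference, a worked example of the threshold-based execution charge above, using made-up config values (the real thresholds come from `Config`):

```python
# Hypothetical values for illustration; real ones come from backend.util.settings.Config.
execution_cost_count_threshold = 100  # charge once per 100 node executions
execution_cost_per_threshold = 5      # amount charged each time the threshold is crossed

def usage_cost(execution_count: int) -> tuple[int, int]:
    # Mirrors execution_usage_cost: (cost to charge now, remaining count to carry over)
    return (
        execution_count // execution_cost_count_threshold * execution_cost_per_threshold,
        execution_count % execution_cost_count_threshold,
    )

assert usage_cost(99) == (0, 99)    # below threshold: nothing charged yet
assert usage_cost(100) == (5, 0)    # threshold reached: charge once, counter resets
assert usage_cost(250) == (10, 50)  # two thresholds crossed, 50 executions carried over
```

`_charge_usage` feeds the second element back in as `exec_cost_counter`, so the counter keeps accumulating across node executions within a graph run.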
def block_usage_cost(
block: Block,
input_data: BlockInput,
data_size: float = 0,
run_time: float = 0,
) -> tuple[int, BlockInput]:
"""
Calculate the cost of using a block based on the input data and the block type.
Args:
block: Block object
input_data: Input data for the block
data_size: Size of the input data in bytes
run_time: Execution time of the block in seconds
Returns:
Tuple of cost amount and cost filter
"""
block_costs = BLOCK_COSTS.get(type(block))
if not block_costs:
return 0, {}
for block_cost in block_costs:
if not _is_cost_filter_match(block_cost.cost_filter, input_data):
continue
if block_cost.cost_type == BlockCostType.RUN:
return block_cost.cost_amount, block_cost.cost_filter
if block_cost.cost_type == BlockCostType.SECOND:
return (
int(run_time * block_cost.cost_amount),
block_cost.cost_filter,
)
if block_cost.cost_type == BlockCostType.BYTE:
return (
int(data_size * block_cost.cost_amount),
block_cost.cost_filter,
)
return 0, {}
def _is_cost_filter_match(cost_filter: BlockInput, input_data: BlockInput) -> bool:
"""
Filter rules:
- If cost_filter is an object, then check if cost_filter is the subset of input_data
- Otherwise, check if cost_filter is equal to input_data.
- Undefined, null, and empty string are considered as equal.
"""
if not isinstance(cost_filter, dict) or not isinstance(input_data, dict):
return cost_filter == input_data
return all(
(not input_data.get(k) and not v)
or (input_data.get(k) and _is_cost_filter_match(v, input_data[k]))
for k, v in cost_filter.items()
)
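A small illustration of the subset-matching rules above, restated as a standalone helper with made-up filter and input values:

```python
# Standalone restatement of the matching rules, for illustration only.
def cost_filter_match(cost_filter, input_data) -> bool:
    if not isinstance(cost_filter, dict) or not isinstance(input_data, dict):
        return cost_filter == input_data
    return all(
        (not input_data.get(k) and not v)
        or (input_data.get(k) and cost_filter_match(v, input_data[k]))
        for k, v in cost_filter.items()
    )

# The filter must be a subset of the input for a cost entry to apply:
assert cost_filter_match({"model": "gpt-4o"}, {"model": "gpt-4o", "prompt": "hi"})
assert not cost_filter_match({"model": "gpt-4o"}, {"model": "o3", "prompt": "hi"})
# Undefined/None/empty values on both sides are treated as equal:
assert cost_filter_match({"api_key": None}, {"model": "gpt-4o"})
```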
# ============ Execution Input Helpers ============ #
LIST_SPLIT = "_$_"
DICT_SPLIT = "_#_"
OBJC_SPLIT = "_@_"
def parse_execution_output(output: BlockData, name: str) -> Any | None:
"""
Extracts partial output data by name from a given BlockData.
The function supports extracting data from lists, dictionaries, and objects
using specific naming conventions:
- For lists: <output_name>_$_<index>
- For dictionaries: <output_name>_#_<key>
- For objects: <output_name>_@_<attribute>
Args:
output (BlockData): A tuple containing the output name and data.
name (str): The name used to extract specific data from the output.
Returns:
Any | None: The extracted data if found, otherwise None.
Examples:
>>> output = ("result", [10, 20, 30])
>>> parse_execution_output(output, "result_$_1")
20
>>> output = ("config", {"key1": "value1", "key2": "value2"})
>>> parse_execution_output(output, "config_#_key1")
'value1'
>>> class Sample:
... attr1 = "value1"
... attr2 = "value2"
>>> output = ("object", Sample())
>>> parse_execution_output(output, "object_@_attr1")
'value1'
"""
output_name, output_data = output
if name == output_name:
return output_data
if name.startswith(f"{output_name}{LIST_SPLIT}"):
index = int(name.split(LIST_SPLIT)[1])
if not isinstance(output_data, list) or len(output_data) <= index:
return None
return output_data[index]
if name.startswith(f"{output_name}{DICT_SPLIT}"):
index = name.split(DICT_SPLIT)[1]
if not isinstance(output_data, dict) or index not in output_data:
return None
return output_data[index]
if name.startswith(f"{output_name}{OBJC_SPLIT}"):
index = name.split(OBJC_SPLIT)[1]
if isinstance(output_data, object) and hasattr(output_data, index):
return getattr(output_data, index)
return None
return None
def validate_exec(
node: Node,
data: BlockInput,
resolve_input: bool = True,
) -> tuple[BlockInput | None, str]:
"""
Validate the input data for a node execution.
Args:
node: The node to execute.
data: The input data for the node execution.
resolve_input: Whether to resolve dynamic pins into dict/list/object.
Returns:
A tuple of the validated data and the block name.
If the data is invalid, the first element will be None, and the second element
will be an error message.
If the data is valid, the first element will be the resolved input data, and
the second element will be the block name.
"""
node_block: Block | None = get_block(node.block_id)
if not node_block:
return None, f"Block for {node.block_id} not found."
schema = node_block.input_schema
# Convert non-matching data types to the expected input schema.
for name, data_type in schema.__annotations__.items():
if (value := data.get(name)) and (type(value) is not data_type):
data[name] = convert(value, data_type)
# Input data (without default values) should contain all required fields.
error_prefix = f"Input data missing or mismatch for `{node_block.name}`:"
if missing_links := schema.get_missing_links(data, node.input_links):
return None, f"{error_prefix} unpopulated links {missing_links}"
# Merge input data with default values and resolve dynamic dict/list/object pins.
input_default = schema.get_input_defaults(node.input_default)
data = {**input_default, **data}
if resolve_input:
data = merge_execution_input(data)
# Input data post-merge should contain all required fields from the schema.
if missing_input := schema.get_missing_input(data):
return None, f"{error_prefix} missing input {missing_input}"
# Last validation: Validate the input values against the schema.
if error := schema.get_mismatch_error(data):
error_message = f"{error_prefix} {error}"
logger.error(error_message)
return None, error_message
return data, node_block.name
def merge_execution_input(data: BlockInput) -> BlockInput:
"""
Merges dynamic input pins into a single list, dictionary, or object based on naming patterns.
This function processes input keys that follow specific patterns to merge them into a unified structure:
- `<input_name>_$_<index>` for list inputs.
- `<input_name>_#_<index>` for dictionary inputs.
- `<input_name>_@_<index>` for object inputs.
Args:
data (BlockInput): A dictionary containing input keys and their corresponding values.
Returns:
BlockInput: A dictionary with merged inputs.
Raises:
ValueError: If a list index is not an integer.
Examples:
>>> data = {
... "list_$_0": "a",
... "list_$_1": "b",
... "dict_#_key1": "value1",
... "dict_#_key2": "value2",
... "object_@_attr1": "value1",
... "object_@_attr2": "value2"
... }
>>> merge_execution_input(data)
{
"list": ["a", "b"],
"dict": {"key1": "value1", "key2": "value2"},
"object": <MockObject attr1="value1" attr2="value2">
}
"""
# Merge all input with <input_name>_$_<index> into a single list.
items = list(data.items())
for key, value in items:
if LIST_SPLIT not in key:
continue
name, index = key.split(LIST_SPLIT)
if not index.isdigit():
raise ValueError(f"Invalid key: {key}, #{index} index must be an integer.")
data[name] = data.get(name, [])
if int(index) >= len(data[name]):
# Pad list with empty string on missing indices.
data[name].extend([""] * (int(index) - len(data[name]) + 1))
data[name][int(index)] = value
# Merge all input with <input_name>_#_<index> into a single dict.
for key, value in items:
if DICT_SPLIT not in key:
continue
name, index = key.split(DICT_SPLIT)
data[name] = data.get(name, {})
data[name][index] = value
# Merge all input with <input_name>_@_<index> into a single object.
for key, value in items:
if OBJC_SPLIT not in key:
continue
name, index = key.split(OBJC_SPLIT)
if name not in data or not isinstance(data[name], object):
data[name] = MockObject()
setattr(data[name], index, value)
return data
def _validate_node_input_credentials(
graph: GraphModel,
user_id: str,
node_credentials_input_map: Optional[
dict[str, dict[str, CredentialsMetaInput]]
] = None,
):
"""Checks all credentials for all nodes of the graph"""
for node in graph.nodes:
block = node.block
# Find any fields of type CredentialsMetaInput
credentials_fields = cast(
type[BlockSchema], block.input_schema
).get_credentials_fields()
if not credentials_fields:
continue
for field_name, credentials_meta_type in credentials_fields.items():
if (
node_credentials_input_map
and (node_credentials_inputs := node_credentials_input_map.get(node.id))
and field_name in node_credentials_inputs
):
credentials_meta = node_credentials_input_map[node.id][field_name]
elif field_name in node.input_default:
credentials_meta = credentials_meta_type.model_validate(
node.input_default[field_name]
)
else:
raise ValueError(
f"Credentials absent for {block.name} node #{node.id} "
f"input '{field_name}'"
)
# Fetch the corresponding Credentials and perform sanity checks
credentials = get_integration_credentials_store().get_creds_by_id(
user_id, credentials_meta.id
)
if not credentials:
raise ValueError(
f"Unknown credentials #{credentials_meta.id} "
f"for node #{node.id} input '{field_name}'"
)
if (
credentials.provider != credentials_meta.provider
or credentials.type != credentials_meta.type
):
logger.warning(
f"Invalid credentials #{credentials.id} for node #{node.id}: "
"type/provider mismatch: "
f"{credentials_meta.type}<>{credentials.type};"
f"{credentials_meta.provider}<>{credentials.provider}"
)
raise ValueError(
f"Invalid credentials #{credentials.id} for node #{node.id}: "
"type/provider mismatch"
)
def make_node_credentials_input_map(
graph: GraphModel,
graph_credentials_input: dict[str, CredentialsMetaInput],
) -> dict[str, dict[str, CredentialsMetaInput]]:
"""
Maps credentials for an execution to the correct nodes.
Params:
graph: The graph to be executed.
graph_credentials_input: A (graph_input_name, credentials_meta) map.
Returns:
dict[node_id, dict[field_name, CredentialsMetaInput]]: Node credentials input map.
"""
result: dict[str, dict[str, CredentialsMetaInput]] = {}
# Get aggregated credentials fields for the graph
graph_cred_inputs = graph.aggregate_credentials_inputs()
for graph_input_name, (_, compatible_node_fields) in graph_cred_inputs.items():
# Best-effort map: skip missing items
if graph_input_name not in graph_credentials_input:
continue
# Use passed-in credentials for all compatible node input fields
for node_id, node_field_name in compatible_node_fields:
if node_id not in result:
result[node_id] = {}
result[node_id][node_field_name] = graph_credentials_input[graph_input_name]
return result
def construct_node_execution_input(
graph: GraphModel,
user_id: str,
graph_inputs: BlockInput,
node_credentials_input_map: Optional[
dict[str, dict[str, CredentialsMetaInput]]
] = None,
) -> list[tuple[str, BlockInput]]:
"""
Validates and prepares the input data for executing a graph.
This function checks the graph for starting nodes, validates the input data
against the schema, and resolves dynamic input pins into a single list,
dictionary, or object.
Args:
graph (GraphModel): The graph model to execute.
user_id (str): The ID of the user executing the graph.
data (BlockInput): The input data for the graph execution.
node_credentials_map: `dict[node_id, dict[input_name, CredentialsMetaInput]]`
Returns:
list[tuple[str, BlockInput]]: A list of tuples, each containing the node ID and
the corresponding input data for that node.
"""
graph.validate_graph(for_run=True)
_validate_node_input_credentials(graph, user_id, node_credentials_input_map)
nodes_input = []
for node in graph.starting_nodes:
input_data = {}
block = node.block
# Note block should never be executed.
if block.block_type == BlockType.NOTE:
continue
# Extract request input data, and assign it to the input pin.
if block.block_type == BlockType.INPUT:
input_name = node.input_default.get("name")
if input_name and input_name in graph_inputs:
input_data = {"value": graph_inputs[input_name]}
# Extract webhook payload, and assign it to the input pin
webhook_payload_key = f"webhook_{node.webhook_id}_payload"
if (
block.block_type in (BlockType.WEBHOOK, BlockType.WEBHOOK_MANUAL)
and node.webhook_id
):
if webhook_payload_key not in graph_inputs:
raise ValueError(
f"Node {block.name} #{node.id} webhook payload is missing"
)
input_data = {"payload": graph_inputs[webhook_payload_key]}
# Apply node credentials overrides
if node_credentials_input_map and (
node_credentials := node_credentials_input_map.get(node.id)
):
input_data.update({k: v.model_dump() for k, v in node_credentials.items()})
input_data, error = validate_exec(node, input_data)
if input_data is None:
raise ValueError(error)
else:
nodes_input.append((node.id, input_data))
if not nodes_input:
raise ValueError(
"No starting nodes found for the graph, make sure an AgentInput or blocks with no inbound links are present as starting nodes."
)
return nodes_input
# ============ Execution Queue Helpers ============ #
class CancelExecutionEvent(BaseModel):
graph_exec_id: str
GRAPH_EXECUTION_EXCHANGE = Exchange(
name="graph_execution",
type=ExchangeType.DIRECT,
durable=True,
auto_delete=False,
)
GRAPH_EXECUTION_QUEUE_NAME = "graph_execution_queue"
GRAPH_EXECUTION_ROUTING_KEY = "graph_execution.run"
GRAPH_EXECUTION_CANCEL_EXCHANGE = Exchange(
name="graph_execution_cancel",
type=ExchangeType.FANOUT,
durable=True,
auto_delete=True,
)
GRAPH_EXECUTION_CANCEL_QUEUE_NAME = "graph_execution_cancel_queue"
def create_execution_queue_config() -> RabbitMQConfig:
"""
Define two exchanges and queues:
- 'graph_execution' (DIRECT) for run tasks.
- 'graph_execution_cancel' (FANOUT) for cancel requests.
"""
run_queue = Queue(
name=GRAPH_EXECUTION_QUEUE_NAME,
exchange=GRAPH_EXECUTION_EXCHANGE,
routing_key=GRAPH_EXECUTION_ROUTING_KEY,
durable=True,
auto_delete=False,
)
cancel_queue = Queue(
name=GRAPH_EXECUTION_CANCEL_QUEUE_NAME,
exchange=GRAPH_EXECUTION_CANCEL_EXCHANGE,
routing_key="", # not used for FANOUT
durable=True,
auto_delete=False,
)
return RabbitMQConfig(
vhost="/",
exchanges=[GRAPH_EXECUTION_EXCHANGE, GRAPH_EXECUTION_CANCEL_EXCHANGE],
queues=[run_queue, cancel_queue],
)
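As a sketch of how these declarations fit together: a cancel request can be published to the FANOUT exchange like this (module path assumed from the scheduler's `backend.executor.utils` import; the `publish_message` call mirrors the one in `add_graph_execution` below):

```python
# Sketch only, not part of the diff. Assumes this module is backend/executor/utils.py.
from backend.executor.utils import (
    GRAPH_EXECUTION_CANCEL_EXCHANGE,
    CancelExecutionEvent,
    get_execution_queue,
)

def request_cancel(graph_exec_id: str) -> None:
    # FANOUT exchange: the routing key is ignored, so every ExecutionManager
    # instance receives the event and sets the cancel flag if it owns the run.
    get_execution_queue().publish_message(
        routing_key="",
        message=CancelExecutionEvent(graph_exec_id=graph_exec_id).model_dump_json(),
        exchange=GRAPH_EXECUTION_CANCEL_EXCHANGE,
    )
```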
async def add_graph_execution_async(
graph_id: str,
user_id: str,
inputs: BlockInput,
preset_id: Optional[str] = None,
graph_version: Optional[int] = None,
graph_credentials_inputs: Optional[dict[str, CredentialsMetaInput]] = None,
) -> GraphExecutionWithNodes:
"""
Adds a graph execution to the queue and returns the execution entry.
Args:
graph_id: The ID of the graph to execute.
user_id: The ID of the user executing the graph.
inputs: The input data for the graph execution.
preset_id: The ID of the preset to use.
graph_version: The version of the graph to execute.
graph_credentials_inputs: Credentials inputs to use in the execution.
Keys should map to the keys generated by `GraphModel.aggregate_credentials_inputs`.
Returns:
GraphExecutionEntry: The entry for the graph execution.
Raises:
ValueError: If the graph is not found or if there are validation errors.
""" # noqa
graph: GraphModel | None = await get_graph(
graph_id=graph_id, user_id=user_id, version=graph_version
)
if not graph:
raise NotFoundError(f"Graph #{graph_id} not found.")
node_credentials_input_map = (
make_node_credentials_input_map(graph, graph_credentials_inputs)
if graph_credentials_inputs
else None
)
graph_exec = await create_graph_execution(
user_id=user_id,
graph_id=graph_id,
graph_version=graph.version,
starting_nodes_input=construct_node_execution_input(
graph=graph,
user_id=user_id,
graph_inputs=inputs,
node_credentials_input_map=node_credentials_input_map,
),
preset_id=preset_id,
)
try:
queue = await get_async_execution_queue()
graph_exec_entry = graph_exec.to_graph_execution_entry()
if node_credentials_input_map:
graph_exec_entry.node_credentials_input_map = node_credentials_input_map
await queue.publish_message(
routing_key=GRAPH_EXECUTION_ROUTING_KEY,
message=graph_exec_entry.model_dump_json(),
exchange=GRAPH_EXECUTION_EXCHANGE,
)
bus = get_async_execution_event_bus()
await bus.publish(graph_exec)
return graph_exec
except Exception as e:
logger.error(f"Unable to publish graph #{graph_id} exec #{graph_exec.id}: {e}")
await update_node_execution_status_batch(
[node_exec.node_exec_id for node_exec in graph_exec.node_executions],
ExecutionStatus.FAILED,
)
await update_graph_execution_stats(
graph_exec_id=graph_exec.id,
status=ExecutionStatus.FAILED,
stats=GraphExecutionStats(error=str(e)),
)
raise
def add_graph_execution(
graph_id: str,
user_id: str,
inputs: BlockInput,
preset_id: Optional[str] = None,
graph_version: Optional[int] = None,
graph_credentials_inputs: Optional[dict[str, CredentialsMetaInput]] = None,
) -> GraphExecutionWithNodes:
"""
Adds a graph execution to the queue and returns the execution entry.
Args:
graph_id: The ID of the graph to execute.
user_id: The ID of the user executing the graph.
inputs: The input data for the graph execution.
preset_id: The ID of the preset to use.
graph_version: The version of the graph to execute.
graph_credentials_inputs: Credentials inputs to use in the execution.
Keys should map to the keys generated by `GraphModel.aggregate_credentials_inputs`.
Returns:
GraphExecutionEntry: The entry for the graph execution.
Raises:
ValueError: If the graph is not found or if there are validation errors.
"""
db = get_db_client()
graph: GraphModel | None = db.get_graph(
graph_id=graph_id, user_id=user_id, version=graph_version
)
if not graph:
raise NotFoundError(f"Graph #{graph_id} not found.")
node_credentials_input_map = (
make_node_credentials_input_map(graph, graph_credentials_inputs)
if graph_credentials_inputs
else None
)
graph_exec = db.create_graph_execution(
user_id=user_id,
graph_id=graph_id,
graph_version=graph.version,
starting_nodes_input=construct_node_execution_input(
graph=graph,
user_id=user_id,
graph_inputs=inputs,
node_credentials_input_map=node_credentials_input_map,
),
preset_id=preset_id,
)
try:
queue = get_execution_queue()
graph_exec_entry = graph_exec.to_graph_execution_entry()
if node_credentials_input_map:
graph_exec_entry.node_credentials_input_map = node_credentials_input_map
queue.publish_message(
routing_key=GRAPH_EXECUTION_ROUTING_KEY,
message=graph_exec_entry.model_dump_json(),
exchange=GRAPH_EXECUTION_EXCHANGE,
)
bus = get_execution_event_bus()
bus.publish(graph_exec)
return graph_exec
except Exception as e:
logger.error(f"Unable to publish graph #{graph_id} exec #{graph_exec.id}: {e}")
db.update_node_execution_status_batch(
[node_exec.node_exec_id for node_exec in graph_exec.node_executions],
ExecutionStatus.FAILED,
)
db.update_graph_execution_stats(
graph_exec_id=graph_exec.id,
status=ExecutionStatus.FAILED,
stats=GraphExecutionStats(error=str(e)),
)
raise
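For reference, this synchronous helper is what the scheduler's `execute_graph` job calls; a minimal caller looks roughly like this (IDs and inputs are placeholders):

```python
# Usage sketch only; placeholder IDs and inputs.
from backend.executor import utils as execution_utils  # import style used by the scheduler

graph_exec = execution_utils.add_graph_execution(
    graph_id="<graph-id>",
    user_id="<user-id>",
    inputs={"topic": "weekly report"},
)
# The DB record is created first; the run is then published to RabbitMQ.
print(graph_exec.id)
```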

View File

@@ -161,6 +161,14 @@ smartlead_credentials = APIKeyCredentials(
expires_at=None,
)
google_maps_credentials = APIKeyCredentials(
id="9aa1bde0-4947-4a70-a20c-84daa3850d52",
provider="google_maps",
api_key=SecretStr(settings.secrets.google_maps_api_key),
title="Use Credits for Google Maps",
expires_at=None,
)
zerobounce_credentials = APIKeyCredentials(
id="63a6e279-2dc2-448e-bf57-85776f7176dc",
provider="zerobounce",
@@ -190,6 +198,7 @@ DEFAULT_CREDENTIALS = [
apollo_credentials,
smartlead_credentials,
zerobounce_credentials,
google_maps_credentials,
]
@@ -263,6 +272,8 @@ class IntegrationCredentialsStore:
all_credentials.append(smartlead_credentials)
if settings.secrets.zerobounce_api_key:
all_credentials.append(zerobounce_credentials)
if settings.secrets.google_maps_api_key:
all_credentials.append(google_maps_credentials)
return all_credentials
def get_creds_by_id(self, user_id: str, credentials_id: str) -> Credentials | None:

View File

@@ -10,6 +10,7 @@ from backend.data import redis
from backend.data.model import Credentials
from backend.integrations.credentials_store import IntegrationCredentialsStore
from backend.integrations.oauth import HANDLERS_BY_NAME
from backend.integrations.providers import ProviderName
from backend.util.exceptions import MissingConfigError
from backend.util.settings import Settings
@@ -153,12 +154,13 @@ class IntegrationCredentialsManager:
self.store.locks.release_all_locks()
def _get_provider_oauth_handler(provider_name: str) -> "BaseOAuthHandler":
def _get_provider_oauth_handler(provider_name_str: str) -> "BaseOAuthHandler":
provider_name = ProviderName(provider_name_str)
if provider_name not in HANDLERS_BY_NAME:
raise KeyError(f"Unknown provider '{provider_name}'")
client_id = getattr(settings.secrets, f"{provider_name}_client_id")
client_secret = getattr(settings.secrets, f"{provider_name}_client_secret")
client_id = getattr(settings.secrets, f"{provider_name.value}_client_id")
client_secret = getattr(settings.secrets, f"{provider_name.value}_client_secret")
if not (client_id and client_secret):
raise MissingConfigError(
f"Integration with provider '{provider_name}' is not configured",

View File

@@ -11,6 +11,7 @@ class ProviderName(str, Enum):
E2B = "e2b"
EXA = "exa"
FAL = "fal"
GENERIC_WEBHOOK = "generic_webhook"
GITHUB = "github"
GOOGLE = "google"
GOOGLE_MAPS = "google_maps"

View File

@@ -1,22 +1,45 @@
from typing import TYPE_CHECKING
from .compass import CompassWebhookManager
from .github import GithubWebhooksManager
from .slant3d import Slant3DWebhooksManager
if TYPE_CHECKING:
from ..providers import ProviderName
from ._base import BaseWebhooksManager
# --8<-- [start:WEBHOOK_MANAGERS_BY_NAME]
WEBHOOK_MANAGERS_BY_NAME: dict["ProviderName", type["BaseWebhooksManager"]] = {
handler.PROVIDER_NAME: handler
for handler in [
CompassWebhookManager,
GithubWebhooksManager,
Slant3DWebhooksManager,
]
}
# --8<-- [end:WEBHOOK_MANAGERS_BY_NAME]
_WEBHOOK_MANAGERS: dict["ProviderName", type["BaseWebhooksManager"]] = {}
__all__ = ["WEBHOOK_MANAGERS_BY_NAME"]
# --8<-- [start:load_webhook_managers]
def load_webhook_managers() -> dict["ProviderName", type["BaseWebhooksManager"]]:
if _WEBHOOK_MANAGERS:
return _WEBHOOK_MANAGERS
from .compass import CompassWebhookManager
from .generic import GenericWebhooksManager
from .github import GithubWebhooksManager
from .slant3d import Slant3DWebhooksManager
_WEBHOOK_MANAGERS.update(
{
handler.PROVIDER_NAME: handler
for handler in [
CompassWebhookManager,
GithubWebhooksManager,
Slant3DWebhooksManager,
GenericWebhooksManager,
]
}
)
return _WEBHOOK_MANAGERS
# --8<-- [end:load_webhook_managers]
def get_webhook_manager(provider_name: "ProviderName") -> "BaseWebhooksManager":
return load_webhook_managers()[provider_name]()
def supports_webhooks(provider_name: "ProviderName") -> bool:
return provider_name in load_webhook_managers()
__all__ = ["get_webhook_manager", "supports_webhooks"]
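The webhook handlers (next file) call these helpers instead of importing the old registry dict directly; the intended pattern is roughly:

```python
# Usage sketch mirroring how the handlers below use the lazy-loading helpers.
from backend.integrations.providers import ProviderName
from backend.integrations.webhooks import get_webhook_manager, supports_webhooks

provider = ProviderName.GITHUB
if not supports_webhooks(provider):
    raise ValueError(f"Provider {provider} does not support webhooks")
manager = get_webhook_manager(provider)  # managers are imported lazily on first call
```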

View File

@@ -0,0 +1,29 @@
import logging
from fastapi import Request
from strenum import StrEnum
from backend.data import integrations
from backend.integrations.providers import ProviderName
from ._manual_base import ManualWebhookManagerBase
logger = logging.getLogger(__name__)
class GenericWebhookType(StrEnum):
PLAIN = "plain"
class GenericWebhooksManager(ManualWebhookManagerBase):
PROVIDER_NAME = ProviderName.GENERIC_WEBHOOK
WebhookType = GenericWebhookType
@classmethod
async def validate_payload(
cls, webhook: integrations.Webhook, request: Request
) -> tuple[dict, str]:
payload = await request.json()
event_type = GenericWebhookType.PLAIN
return payload, event_type

View File

@@ -1,9 +1,9 @@
import logging
from typing import TYPE_CHECKING, Callable, Optional, cast
from backend.data.block import BlockSchema, BlockWebhookConfig, get_block
from backend.data.block import BlockSchema, BlockWebhookConfig
from backend.data.graph import set_node_webhook
from backend.integrations.webhooks import WEBHOOK_MANAGERS_BY_NAME
from backend.integrations.webhooks import get_webhook_manager, supports_webhooks
if TYPE_CHECKING:
from backend.data.graph import GraphModel, NodeModel
@@ -29,12 +29,7 @@ async def on_graph_activate(
# Compare nodes in new_graph_version with previous_graph_version
updated_nodes = []
for new_node in graph.nodes:
block = get_block(new_node.block_id)
if not block:
raise ValueError(
f"Node #{new_node.id} is instance of unknown block #{new_node.block_id}"
)
block_input_schema = cast(BlockSchema, block.input_schema)
block_input_schema = cast(BlockSchema, new_node.block.input_schema)
node_credentials = None
if (
@@ -75,12 +70,7 @@ async def on_graph_deactivate(
"""
updated_nodes = []
for node in graph.nodes:
block = get_block(node.block_id)
if not block:
raise ValueError(
f"Node #{node.id} is instance of unknown block #{node.block_id}"
)
block_input_schema = cast(BlockSchema, block.input_schema)
block_input_schema = cast(BlockSchema, node.block.input_schema)
node_credentials = None
if (
@@ -113,17 +103,13 @@ async def on_node_activate(
) -> "NodeModel":
"""Hook to be called when the node is activated/created"""
block = get_block(node.block_id)
if not block:
raise ValueError(
f"Node #{node.id} is instance of unknown block #{node.block_id}"
)
block = node.block
if not block.webhook_config:
return node
provider = block.webhook_config.provider
if provider not in WEBHOOK_MANAGERS_BY_NAME:
if not supports_webhooks(provider):
raise ValueError(
f"Block #{block.id} has webhook_config for provider {provider} "
"which does not support webhooks"
@@ -133,7 +119,7 @@ async def on_node_activate(
f"Activating webhook node #{node.id} with config {block.webhook_config}"
)
webhooks_manager = WEBHOOK_MANAGERS_BY_NAME[provider]()
webhooks_manager = get_webhook_manager(provider)
if auto_setup_webhook := isinstance(block.webhook_config, BlockWebhookConfig):
try:
@@ -224,23 +210,19 @@ async def on_node_deactivate(
"""Hook to be called when node is deactivated/deleted"""
logger.debug(f"Deactivating node #{node.id}")
block = get_block(node.block_id)
if not block:
raise ValueError(
f"Node #{node.id} is instance of unknown block #{node.block_id}"
)
block = node.block
if not block.webhook_config:
return node
provider = block.webhook_config.provider
if provider not in WEBHOOK_MANAGERS_BY_NAME:
if not supports_webhooks(provider):
raise ValueError(
f"Block #{block.id} has webhook_config for provider {provider} "
"which does not support webhooks"
)
webhooks_manager = WEBHOOK_MANAGERS_BY_NAME[provider]()
webhooks_manager = get_webhook_manager(provider)
if node.webhook_id:
logger.debug(f"Node #{node.id} has webhook_id {node.webhook_id}")

View File

@@ -9,6 +9,7 @@ from autogpt_libs.utils.cache import thread_cached
from prisma.enums import NotificationType
from pydantic import BaseModel
from backend.data import rabbitmq
from backend.data.notifications import (
BaseSummaryData,
BaseSummaryParams,
@@ -23,23 +24,12 @@ from backend.data.notifications import (
SummaryParamsEventModel,
WeeklySummaryData,
WeeklySummaryParams,
create_or_add_to_user_notification_batch,
empty_user_notification_batch,
get_all_batches_by_type,
get_batch_delay,
get_notif_data_type,
get_summary_params_type,
get_user_notification_batch,
get_user_notification_oldest_message_in_batch,
)
from backend.data.rabbitmq import Exchange, ExchangeType, Queue, RabbitMQConfig
from backend.data.user import (
generate_unsubscribe_link,
get_active_user_ids_in_timerange,
get_user_email_by_id,
get_user_email_verification,
get_user_notification_preference,
)
from backend.data.user import generate_unsubscribe_link
from backend.notifications.email import EmailSender
from backend.util.service import AppService, expose, get_service_client
from backend.util.settings import Settings
@@ -123,16 +113,36 @@ def get_scheduler():
return get_service_client(Scheduler)
@thread_cached
def get_db():
from backend.executor.database import DatabaseManager
return get_service_client(DatabaseManager)
class NotificationManager(AppService):
"""Service for handling notifications with batching support"""
def __init__(self):
super().__init__()
self.use_db = True
self.rabbitmq_config = create_notification_config()
self.running = True
self.email_sender = EmailSender()
@property
def rabbit(self) -> rabbitmq.AsyncRabbitMQ:
"""Access the RabbitMQ service. Will raise if not configured."""
if not self.rabbitmq_service:
raise RuntimeError("RabbitMQ not configured for this service")
return self.rabbitmq_service
@property
def rabbit_config(self) -> rabbitmq.RabbitMQConfig:
"""Access the RabbitMQ config. Will raise if not configured."""
if not self.rabbitmq_config:
raise RuntimeError("RabbitMQ not configured for this service")
return self.rabbitmq_config
@classmethod
def get_port(cls) -> int:
return settings.config.notification_service_port
@@ -160,11 +170,9 @@ class NotificationManager(AppService):
processed_count = 0
current_time = datetime.now(tz=timezone.utc)
start_time = current_time - timedelta(days=7)
users = self.run_and_wait(
get_active_user_ids_in_timerange(
end_time=current_time.isoformat(),
start_time=start_time.isoformat(),
)
users = get_db().get_active_user_ids_in_timerange(
end_time=current_time.isoformat(),
start_time=start_time.isoformat(),
)
for user in users:
@@ -194,84 +202,84 @@ class NotificationManager(AppService):
for notification_type in notification_types:
# Get all batches for this notification type
batches = self.run_and_wait(get_all_batches_by_type(notification_type))
batches = get_db().get_all_batches_by_type(notification_type)
for batch in batches:
# Check if batch has aged out
oldest_message = self.run_and_wait(
get_user_notification_oldest_message_in_batch(
batch.userId, notification_type
oldest_message = (
get_db().get_user_notification_oldest_message_in_batch(
batch.user_id, notification_type
)
)
if not oldest_message:
# this should never happen
logger.error(
f"Batch for user {batch.userId} and type {notification_type} has no oldest message whichshould never happen!!!!!!!!!!!!!!!!"
f"Batch for user {batch.user_id} and type {notification_type} has no oldest message whichshould never happen!!!!!!!!!!!!!!!!"
)
continue
max_delay = get_batch_delay(notification_type)
# If batch has aged out, process it
if oldest_message.createdAt + max_delay < current_time:
recipient_email = self.run_and_wait(
get_user_email_by_id(batch.userId)
)
if oldest_message.created_at + max_delay < current_time:
recipient_email = get_db().get_user_email_by_id(batch.user_id)
if not recipient_email:
logger.error(
f"User email not found for user {batch.userId}"
f"User email not found for user {batch.user_id}"
)
continue
should_send = self._should_email_user_based_on_preference(
batch.userId, notification_type
batch.user_id, notification_type
)
if not should_send:
logger.debug(
f"User {batch.userId} does not want to receive {notification_type} notifications"
f"User {batch.user_id} does not want to receive {notification_type} notifications"
)
# Clear the batch
self.run_and_wait(
empty_user_notification_batch(
batch.userId, notification_type
)
get_db().empty_user_notification_batch(
batch.user_id, notification_type
)
continue
batch_data = self.run_and_wait(
get_user_notification_batch(batch.userId, notification_type)
batch_data = get_db().get_user_notification_batch(
batch.user_id, notification_type
)
if not batch_data or not batch_data.notifications:
logger.error(
f"Batch data not found for user {batch.userId}"
f"Batch data not found for user {batch.user_id}"
)
# Clear the batch
self.run_and_wait(
empty_user_notification_batch(
batch.userId, notification_type
)
get_db().empty_user_notification_batch(
batch.user_id, notification_type
)
continue
unsub_link = generate_unsubscribe_link(batch.userId)
events = [
NotificationEventModel[
get_notif_data_type(db_event.type)
].model_validate(
{
"user_id": batch.userId,
"type": db_event.type,
"data": db_event.data,
"created_at": db_event.createdAt,
}
)
for db_event in batch_data.notifications
]
unsub_link = generate_unsubscribe_link(batch.user_id)
events = []
for db_event in batch_data.notifications:
try:
events.append(
NotificationEventModel[
get_notif_data_type(db_event.type)
].model_validate(
{
"user_id": batch.user_id,
"type": db_event.type,
"data": db_event.data,
"created_at": db_event.created_at,
}
)
)
except Exception as e:
logger.error(
f"Error parsing notification event: {e=}, {db_event=}"
)
continue
logger.info(f"{events=}")
self.email_sender.send_templated(
@@ -282,10 +290,8 @@ class NotificationManager(AppService):
)
# Clear the batch
self.run_and_wait(
empty_user_notification_batch(
batch.userId, notification_type
)
get_db().empty_user_notification_batch(
batch.user_id, notification_type
)
processed_count += 1
@@ -377,14 +383,16 @@ class NotificationManager(AppService):
self, user_id: str, event_type: NotificationType
) -> bool:
"""Check if a user wants to receive a notification based on their preferences and email verification status"""
validated_email = self.run_and_wait(get_user_email_verification(user_id))
preference = self.run_and_wait(
get_user_notification_preference(user_id)
).preferences.get(event_type, True)
validated_email = get_db().get_user_email_verification(user_id)
preference = (
get_db()
.get_user_notification_preference(user_id)
.preferences.get(event_type, True)
)
# only if both are true, should we email this person
return validated_email and preference
async def _gather_summary_data(
def _gather_summary_data(
self, user_id: str, event_type: NotificationType, params: BaseSummaryParams
) -> BaseSummaryData:
"""Gathers the data to build a summary notification"""
@@ -464,13 +472,13 @@ class NotificationManager(AppService):
else:
raise ValueError("Invalid event type or params")
async def _should_batch(
def _should_batch(
self, user_id: str, event_type: NotificationType, event: NotificationEventModel
) -> bool:
await create_or_add_to_user_notification_batch(user_id, event_type, event)
get_db().create_or_add_to_user_notification_batch(user_id, event_type, event)
oldest_message = await get_user_notification_oldest_message_in_batch(
oldest_message = get_db().get_user_notification_oldest_message_in_batch(
user_id, event_type
)
if not oldest_message:
@@ -478,7 +486,7 @@ class NotificationManager(AppService):
f"Batch for user {user_id} and type {event_type} has no oldest message whichshould never happen!!!!!!!!!!!!!!!!"
)
return False
oldest_age = oldest_message.createdAt
oldest_age = oldest_message.created_at
max_delay = get_batch_delay(event_type)
@@ -527,7 +535,7 @@ class NotificationManager(AppService):
model = parsed.model
logger.debug(f"Processing immediate notification: {model}")
recipient_email = self.run_and_wait(get_user_email_by_id(event.user_id))
recipient_email = get_db().get_user_email_by_id(event.user_id)
if not recipient_email:
logger.error(f"User email not found for user {event.user_id}")
return False
@@ -564,7 +572,7 @@ class NotificationManager(AppService):
model = parsed.model
logger.info(f"Processing batch notification: {model}")
recipient_email = self.run_and_wait(get_user_email_by_id(event.user_id))
recipient_email = get_db().get_user_email_by_id(event.user_id)
if not recipient_email:
logger.error(f"User email not found for user {event.user_id}")
return False
@@ -578,16 +586,12 @@ class NotificationManager(AppService):
)
return True
should_send = self.run_and_wait(
self._should_batch(event.user_id, event.type, model)
)
should_send = self._should_batch(event.user_id, event.type, model)
if not should_send:
logger.info("Batch not old enough to send")
return False
batch = self.run_and_wait(
get_user_notification_batch(event.user_id, event.type)
)
batch = get_db().get_user_notification_batch(event.user_id, event.type)
if not batch or not batch.notifications:
logger.error(f"Batch not found for user {event.user_id}")
return False
@@ -601,7 +605,7 @@ class NotificationManager(AppService):
"user_id": event.user_id,
"type": db_event.type,
"data": db_event.data,
"created_at": db_event.createdAt,
"created_at": db_event.created_at,
}
)
for db_event in batch.notifications
@@ -614,7 +618,7 @@ class NotificationManager(AppService):
user_unsub_link=unsub_link,
)
# only empty the batch if we sent the email successfully
self.run_and_wait(empty_user_notification_batch(event.user_id, event.type))
get_db().empty_user_notification_batch(event.user_id, event.type)
return True
except Exception as e:
logger.exception(f"Error processing notification for batch queue: {e}")
@@ -631,7 +635,7 @@ class NotificationManager(AppService):
logger.info(f"Processing summary notification: {model}")
recipient_email = self.run_and_wait(get_user_email_by_id(event.user_id))
recipient_email = get_db().get_user_email_by_id(event.user_id)
if not recipient_email:
logger.error(f"User email not found for user {event.user_id}")
return False
@@ -644,8 +648,8 @@ class NotificationManager(AppService):
)
return True
summary_data = self.run_and_wait(
self._gather_summary_data(event.user_id, event.type, model.data)
summary_data = self._gather_summary_data(
event.user_id, event.type, model.data
)
unsub_link = generate_unsubscribe_link(event.user_id)
@@ -685,6 +689,8 @@ class NotificationManager(AppService):
except QueueEmpty:
logger.debug(f"Queue {error_queue_name} empty")
except TimeoutError:
logger.debug(f"Queue {error_queue_name} timed out")
except Exception as e:
if message:
logger.error(
@@ -692,15 +698,19 @@ class NotificationManager(AppService):
)
self.run_and_wait(message.reject(requeue=False))
else:
logger.error(
f"Error in notification service loop, message unable to be rejected, and will have to be manually removed to free space in the queue: {e}"
logger.exception(
f"Error in notification service loop, message unable to be rejected, and will have to be manually removed to free space in the queue: {e=}"
)
def run_service(self):
logger.info(f"[{self.service_name}] ⏳ Configuring RabbitMQ...")
self.rabbitmq_service = rabbitmq.AsyncRabbitMQ(self.rabbitmq_config)
self.run_and_wait(self.rabbitmq_service.connect())
logger.info(f"[{self.service_name}] Started notification service")
# Set up scheduler for batch processing of all notification types
# this can be changed later to spawn differnt cleanups on different schedules
# this can be changed later to spawn different cleanups on different schedules
try:
get_scheduler().add_batched_notification_schedule(
notification_types=list(NotificationType),
@@ -762,3 +772,5 @@ class NotificationManager(AppService):
"""Cleanup service resources"""
self.running = False
super().cleanup()
logger.info(f"[{self.service_name}] ⏳ Disconnecting RabbitMQ...")
self.run_and_wait(self.rabbitmq_service.disconnect())
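The recurring change in this file is replacing `self.run_and_wait(<async DB call>)` with a direct call on a synchronous database-manager client. A minimal sketch of how such an accessor could be shaped, assuming `get_db()` returns one cached client per process; `DatabaseManagerClient` and its caching are stand-ins, not taken from this diff:

```python
# Sketch only, not the project's actual helper: a cached, synchronous DB
# accessor of the shape used above. DatabaseManagerClient is a hypothetical
# stand-in for the real service client.
from functools import lru_cache


class DatabaseManagerClient:
    def get_user_email_by_id(self, user_id: str) -> str | None:
        # The real client forwards this to the database service and blocks
        # until the result is available; here we just return a dummy value.
        return f"{user_id}@example.invalid"


@lru_cache(maxsize=1)
def get_db() -> DatabaseManagerClient:
    # One shared client per process; its methods are plain blocking calls,
    # so callers no longer wrap coroutines in self.run_and_wait(...).
    return DatabaseManagerClient()


print(get_db().get_user_email_by_id("user-123"))
```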

View File

@@ -2,8 +2,17 @@ from typing import Dict, Set
from fastapi import WebSocket
from backend.data import execution
from backend.server.model import Methods, WsMessage
from backend.data.execution import (
ExecutionEventType,
GraphExecutionEvent,
NodeExecutionEvent,
)
from backend.server.model import WSMessage, WSMethod
_EVENT_TYPE_TO_METHOD_MAP: dict[ExecutionEventType, WSMethod] = {
ExecutionEventType.GRAPH_EXEC_UPDATE: WSMethod.GRAPH_EXECUTION_EVENT,
ExecutionEventType.NODE_EXEC_UPDATE: WSMethod.NODE_EXECUTION_EVENT,
}
class ConnectionManager:
@@ -11,37 +20,96 @@ class ConnectionManager:
self.active_connections: Set[WebSocket] = set()
self.subscriptions: Dict[str, Set[WebSocket]] = {}
async def connect(self, websocket: WebSocket):
async def connect_socket(self, websocket: WebSocket):
await websocket.accept()
self.active_connections.add(websocket)
def disconnect(self, websocket: WebSocket):
def disconnect_socket(self, websocket: WebSocket):
self.active_connections.remove(websocket)
for subscribers in self.subscriptions.values():
subscribers.discard(websocket)
async def subscribe(self, graph_id: str, graph_version: int, websocket: WebSocket):
key = f"{graph_id}_{graph_version}"
if key not in self.subscriptions:
self.subscriptions[key] = set()
self.subscriptions[key].add(websocket)
async def subscribe_graph_exec(
self, *, user_id: str, graph_exec_id: str, websocket: WebSocket
) -> str:
return await self._subscribe(
_graph_exec_channel_key(user_id, graph_exec_id=graph_exec_id), websocket
)
async def unsubscribe(
self, graph_id: str, graph_version: int, websocket: WebSocket
):
key = f"{graph_id}_{graph_version}"
if key in self.subscriptions:
self.subscriptions[key].discard(websocket)
if not self.subscriptions[key]:
del self.subscriptions[key]
async def subscribe_graph_execs(
self, *, user_id: str, graph_id: str, websocket: WebSocket
) -> str:
return await self._subscribe(
_graph_execs_channel_key(user_id, graph_id=graph_id), websocket
)
async def send_execution_result(self, result: execution.ExecutionResult):
key = f"{result.graph_id}_{result.graph_version}"
if key in self.subscriptions:
message = WsMessage(
method=Methods.EXECUTION_EVENT,
channel=key,
data=result.model_dump(),
async def unsubscribe_graph_exec(
self, *, user_id: str, graph_exec_id: str, websocket: WebSocket
) -> str | None:
return await self._unsubscribe(
_graph_exec_channel_key(user_id, graph_exec_id=graph_exec_id), websocket
)
async def unsubscribe_graph_execs(
self, *, user_id: str, graph_id: str, websocket: WebSocket
) -> str | None:
return await self._unsubscribe(
_graph_execs_channel_key(user_id, graph_id=graph_id), websocket
)
async def send_execution_update(
self, exec_event: GraphExecutionEvent | NodeExecutionEvent
) -> int:
graph_exec_id = (
exec_event.id
if isinstance(exec_event, GraphExecutionEvent)
else exec_event.graph_exec_id
)
n_sent = 0
channels: set[str] = {
# Send update to listeners for this graph execution
_graph_exec_channel_key(exec_event.user_id, graph_exec_id=graph_exec_id)
}
if isinstance(exec_event, GraphExecutionEvent):
# Send update to listeners for all executions of this graph
channels.add(
_graph_execs_channel_key(
exec_event.user_id, graph_id=exec_event.graph_id
)
)
for channel in channels.intersection(self.subscriptions.keys()):
message = WSMessage(
method=_EVENT_TYPE_TO_METHOD_MAP[exec_event.event_type],
channel=channel,
data=exec_event.model_dump(),
).model_dump_json()
for connection in self.subscriptions[key]:
for connection in self.subscriptions[channel]:
await connection.send_text(message)
n_sent += 1
return n_sent
async def _subscribe(self, channel_key: str, websocket: WebSocket) -> str:
if channel_key not in self.subscriptions:
self.subscriptions[channel_key] = set()
self.subscriptions[channel_key].add(websocket)
return channel_key
async def _unsubscribe(self, channel_key: str, websocket: WebSocket) -> str | None:
if channel_key in self.subscriptions:
self.subscriptions[channel_key].discard(websocket)
if not self.subscriptions[channel_key]:
del self.subscriptions[channel_key]
return channel_key
return None
def _graph_exec_channel_key(user_id: str, *, graph_exec_id: str) -> str:
return f"{user_id}|graph_exec#{graph_exec_id}"
def _graph_execs_channel_key(user_id: str, *, graph_id: str) -> str:
return f"{user_id}|graph#{graph_id}|executions"

View File

@@ -2,7 +2,6 @@ import logging
from collections import defaultdict
from typing import Annotated, Any, Dict, List, Optional, Sequence
from autogpt_libs.utils.cache import thread_cached
from fastapi import APIRouter, Body, Depends, HTTPException
from prisma.enums import AgentExecutionStatus, APIKeyPermission
from typing_extensions import TypedDict
@@ -12,18 +11,11 @@ from backend.data import execution as execution_db
from backend.data import graph as graph_db
from backend.data.api_key import APIKey
from backend.data.block import BlockInput, CompletedBlockOutput
from backend.data.execution import ExecutionResult
from backend.executor import ExecutionManager
from backend.data.execution import NodeExecutionResult
from backend.executor.utils import add_graph_execution_async
from backend.server.external.middleware import require_permission
from backend.util.service import get_service_client
from backend.util.settings import Settings
@thread_cached
def execution_manager_client() -> ExecutionManager:
return get_service_client(ExecutionManager)
settings = Settings()
logger = logging.getLogger(__name__)
@@ -53,7 +45,7 @@ class GraphExecutionResult(TypedDict):
output: Optional[List[Dict[str, str]]]
def get_outputs_with_names(results: List[ExecutionResult]) -> List[Dict[str, str]]:
def get_outputs_with_names(results: list[NodeExecutionResult]) -> list[dict[str, str]]:
outputs = []
for result in results:
if "output" in result.output_data:
@@ -71,7 +63,7 @@ def get_outputs_with_names(results: List[ExecutionResult]) -> List[Dict[str, str
)
def get_graph_blocks() -> Sequence[dict[Any, Any]]:
blocks = [block() for block in backend.data.block.get_blocks().values()]
return [b.to_dict() for b in blocks]
return [b.to_dict() for b in blocks if not b.disabled]
@v1_router.post(
@@ -98,20 +90,20 @@ def execute_graph_block(
path="/graphs/{graph_id}/execute/{graph_version}",
tags=["graphs"],
)
def execute_graph(
async def execute_graph(
graph_id: str,
graph_version: int,
node_input: Annotated[dict[str, Any], Body(..., embed=True, default_factory=dict)],
api_key: APIKey = Depends(require_permission(APIKeyPermission.EXECUTE_GRAPH)),
) -> dict[str, Any]:
try:
graph_exec = execution_manager_client().add_execution(
graph_id,
graph_version=graph_version,
data=node_input,
graph_exec = await add_graph_execution_async(
graph_id=graph_id,
user_id=api_key.user_id,
inputs=node_input,
graph_version=graph_version,
)
return {"id": graph_exec.graph_exec_id}
return {"id": graph_exec.id}
except Exception as e:
msg = str(e).encode().decode("unicode_escape")
raise HTTPException(status_code=400, detail=msg)
@@ -130,7 +122,7 @@ async def get_graph_execution_results(
if not graph:
raise HTTPException(status_code=404, detail=f"Graph #{graph_id} not found.")
results = await execution_db.get_execution_results(graph_exec_id)
results = await execution_db.get_node_execution_results(graph_exec_id)
last_result = results[-1] if results else None
execution_status = (
last_result.status if last_result else AgentExecutionStatus.INCOMPLETE
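A hedged example of calling the rewritten `execute_graph` route from outside: the path comes from the decorator above, but the base URL, auth header name, and IDs are placeholders, and the real deployment's API prefix may differ.

```python
# Illustrative client call; base URL, header name, and IDs are placeholders.
import requests

response = requests.post(
    "https://example.invalid/graphs/GRAPH_ID/execute/1",
    headers={"X-API-Key": "YOUR_API_KEY"},  # header name assumed, not from this diff
    # Body(..., embed=True) means the payload is nested under "node_input".
    json={"node_input": {"topic": "hello"}},
    timeout=30,
)
response.raise_for_status()
print(response.json()["id"])  # the route now returns the graph execution's id
```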

View File

@@ -1,8 +1,10 @@
import asyncio
import logging
from typing import TYPE_CHECKING, Annotated, Literal
from typing import TYPE_CHECKING, Annotated, Awaitable, Literal
from fastapi import APIRouter, Body, Depends, HTTPException, Path, Query, Request
from pydantic import BaseModel, Field
from starlette.status import HTTP_404_NOT_FOUND
from backend.data.graph import set_node_webhook
from backend.data.integrations import (
@@ -13,13 +15,12 @@ from backend.data.integrations import (
wait_for_webhook_event,
)
from backend.data.model import Credentials, CredentialsType, OAuth2Credentials
from backend.executor.manager import ExecutionManager
from backend.executor.utils import add_graph_execution_async
from backend.integrations.creds_manager import IntegrationCredentialsManager
from backend.integrations.oauth import HANDLERS_BY_NAME
from backend.integrations.providers import ProviderName
from backend.integrations.webhooks import WEBHOOK_MANAGERS_BY_NAME
from backend.util.exceptions import NeedConfirmation
from backend.util.service import get_service_client
from backend.integrations.webhooks import get_webhook_manager
from backend.util.exceptions import NeedConfirmation, NotFoundError
from backend.util.settings import Settings
if TYPE_CHECKING:
@@ -281,8 +282,14 @@ async def webhook_ingress_generic(
webhook_id: Annotated[str, Path(title="Our ID for the webhook")],
):
logger.debug(f"Received {provider.value} webhook ingress for ID {webhook_id}")
webhook_manager = WEBHOOK_MANAGERS_BY_NAME[provider]()
webhook = await get_webhook(webhook_id)
webhook_manager = get_webhook_manager(provider)
try:
webhook = await get_webhook(webhook_id)
except NotFoundError as e:
logger.warning(f"Webhook payload received for unknown webhook: {e}")
raise HTTPException(
status_code=HTTP_404_NOT_FOUND, detail=f"Webhook #{webhook_id} not found"
) from e
logger.debug(f"Webhook #{webhook_id}: {webhook}")
payload, event_type = await webhook_manager.validate_payload(webhook, request)
logger.debug(
@@ -302,19 +309,22 @@ async def webhook_ingress_generic(
if not webhook.attached_nodes:
return
executor = get_service_client(ExecutionManager)
executions: list[Awaitable] = []
for node in webhook.attached_nodes:
logger.debug(f"Webhook-attached node: {node}")
if not node.is_triggered_by_event_type(event_type):
logger.debug(f"Node #{node.id} doesn't trigger on event {event_type}")
continue
logger.debug(f"Executing graph #{node.graph_id} node #{node.id}")
executor.add_execution(
graph_id=node.graph_id,
graph_version=node.graph_version,
data={f"webhook_{webhook_id}_payload": payload},
user_id=webhook.user_id,
executions.append(
add_graph_execution_async(
user_id=webhook.user_id,
graph_id=node.graph_id,
graph_version=node.graph_version,
inputs={f"webhook_{webhook_id}_payload": payload},
)
)
asyncio.gather(*executions)
@router.post("/webhooks/{webhook_id}/ping")
@@ -323,7 +333,7 @@ async def webhook_ping(
user_id: Annotated[str, Depends(get_user_id)], # require auth
):
webhook = await get_webhook(webhook_id)
webhook_manager = WEBHOOK_MANAGERS_BY_NAME[webhook.provider]()
webhook_manager = get_webhook_manager(webhook.provider)
credentials = (
creds_manager.get(user_id, webhook.credentials_id)
@@ -358,14 +368,6 @@ async def remove_all_webhooks_for_credentials(
NeedConfirmation: If any of the webhooks are still in use and `force` is `False`
"""
webhooks = await get_all_webhooks_by_creds(credentials.id)
if credentials.provider not in WEBHOOK_MANAGERS_BY_NAME:
if webhooks:
logger.error(
f"Credentials #{credentials.id} for provider {credentials.provider} "
f"are attached to {len(webhooks)} webhooks, "
f"but there is no available WebhooksHandler for {credentials.provider}"
)
return
if any(w.attached_nodes for w in webhooks) and not force:
raise NeedConfirmation(
"Some webhooks linked to these credentials are still in use by an agent"
@@ -376,7 +378,7 @@ async def remove_all_webhooks_for_credentials(
await set_node_webhook(node.id, None)
# Prune the webhook
webhook_manager = WEBHOOK_MANAGERS_BY_NAME[credentials.provider]()
webhook_manager = get_webhook_manager(ProviderName(credentials.provider))
success = await webhook_manager.prune_webhook_if_dangling(
webhook.id, credentials
)
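The ingress handler above now collects awaitables and hands them to a single `asyncio.gather(...)` call, dispatching all matching node executions together instead of queuing them one by one through a service client. A minimal, self-contained sketch of that pattern; the coroutine is a stand-in for `add_graph_execution_async`, and here the gather is awaited so the example finishes deterministically:

```python
import asyncio


async def queue_execution(node_id: str) -> str:
    # Stand-in for add_graph_execution_async: pretend to queue one graph run.
    await asyncio.sleep(0)
    return f"queued {node_id}"


async def dispatch(node_ids: list[str]) -> list[str]:
    # Collect the awaitables first, then run them concurrently in one gather,
    # mirroring the loop over webhook.attached_nodes above.
    executions = [queue_execution(n) for n in node_ids]
    return await asyncio.gather(*executions)


print(asyncio.run(dispatch(["node-1", "node-2"])))
```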

View File

@@ -1,31 +1,36 @@
import enum
from typing import Any, List, Optional, Union
from typing import Any, Optional
import pydantic
import backend.data.graph
from backend.data.api_key import APIKeyPermission, APIKeyWithoutHash
from backend.data.graph import Graph
class Methods(enum.Enum):
SUBSCRIBE = "subscribe"
class WSMethod(enum.Enum):
SUBSCRIBE_GRAPH_EXEC = "subscribe_graph_execution"
SUBSCRIBE_GRAPH_EXECS = "subscribe_graph_executions"
UNSUBSCRIBE = "unsubscribe"
EXECUTION_EVENT = "execution_event"
GRAPH_EXECUTION_EVENT = "graph_execution_event"
NODE_EXECUTION_EVENT = "node_execution_event"
ERROR = "error"
HEARTBEAT = "heartbeat"
class WsMessage(pydantic.BaseModel):
method: Methods
data: Optional[Union[dict[str, Any], list[Any], str]] = None
class WSMessage(pydantic.BaseModel):
method: WSMethod
data: Optional[dict[str, Any] | list[Any] | str] = None
success: bool | None = None
channel: str | None = None
error: str | None = None
class ExecutionSubscription(pydantic.BaseModel):
class WSSubscribeGraphExecutionRequest(pydantic.BaseModel):
graph_exec_id: str
class WSSubscribeGraphExecutionsRequest(pydantic.BaseModel):
graph_id: str
graph_version: int
class ExecuteGraphResponse(pydantic.BaseModel):
@@ -33,12 +38,12 @@ class ExecuteGraphResponse(pydantic.BaseModel):
class CreateGraph(pydantic.BaseModel):
graph: backend.data.graph.Graph
graph: Graph
class CreateAPIKeyRequest(pydantic.BaseModel):
name: str
permissions: List[APIKeyPermission]
permissions: list[APIKeyPermission]
description: Optional[str] = None
@@ -52,7 +57,7 @@ class SetGraphActiveVersion(pydantic.BaseModel):
class UpdatePermissionsRequest(pydantic.BaseModel):
permissions: List[APIKeyPermission]
permissions: list[APIKeyPermission]
class Pagination(pydantic.BaseModel):
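For illustration, the wire shape a client sends with the renamed subscription methods and request models above; the payloads are written as plain dictionaries so the snippet stays self-contained, and the IDs are placeholders:

```python
# Illustrative WebSocket payloads using the renamed methods above; IDs are
# placeholders and the dicts mirror WSMessage plus the two subscribe requests.
import json

subscribe_one_execution = {
    "method": "subscribe_graph_execution",
    "data": {"graph_exec_id": "exec-42"},
}
subscribe_all_executions = {
    "method": "subscribe_graph_executions",
    "data": {"graph_id": "graph-7"},
}

print(json.dumps(subscribe_one_execution))
print(json.dumps(subscribe_all_executions))
```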

View File

@@ -11,22 +11,24 @@ from autogpt_libs.feature_flag.client import (
initialize_launchdarkly,
shutdown_launchdarkly,
)
from autogpt_libs.logging.utils import generate_uvicorn_config
import backend.data.block
import backend.data.db
import backend.data.graph
import backend.data.user
import backend.server.integrations.router
import backend.server.routers.postmark.postmark
import backend.server.routers.v1
import backend.server.v2.admin.store_admin_routes
import backend.server.v2.library.db
import backend.server.v2.library.model
import backend.server.v2.library.routes
import backend.server.v2.otto.routes
import backend.server.v2.postmark.postmark
import backend.server.v2.store.model
import backend.server.v2.store.routes
import backend.util.service
import backend.util.settings
from backend.blocks.llm import LlmModel
from backend.data.model import Credentials
from backend.integrations.providers import ProviderName
from backend.server.external.api import external_app
@@ -55,6 +57,7 @@ async def lifespan_context(app: fastapi.FastAPI):
await backend.data.block.initialize_blocks()
await backend.data.user.migrate_and_encrypt_user_integrations()
await backend.data.graph.fix_llm_provider_credentials()
await backend.data.graph.migrate_llm_models(LlmModel.GPT4O)
with launch_darkly_context():
yield
await backend.data.db.disconnect()
@@ -99,6 +102,11 @@ app.include_router(backend.server.routers.v1.v1_router, tags=["v1"], prefix="/ap
app.include_router(
backend.server.v2.store.routes.router, tags=["v2"], prefix="/api/store"
)
app.include_router(
backend.server.v2.admin.store_admin_routes.router,
tags=["v2", "admin"],
prefix="/api/store",
)
app.include_router(
backend.server.v2.library.routes.router, tags=["v2"], prefix="/api/library"
)
@@ -107,8 +115,8 @@ app.include_router(
)
app.include_router(
backend.server.v2.postmark.postmark.router,
tags=["v2", "email"],
backend.server.routers.postmark.postmark.router,
tags=["v1", "email"],
prefix="/api/email",
)
@@ -133,8 +141,13 @@ class AgentServer(backend.util.service.AppProcess):
server_app,
host=backend.util.settings.Config().agent_api_host,
port=backend.util.settings.Config().agent_api_port,
log_config=generate_uvicorn_config(),
)
def cleanup(self):
super().cleanup()
logger.info(f"[{self.service_name}] ⏳ Shutting down Agent Server...")
@staticmethod
async def test_execute_graph(
graph_id: str,
@@ -142,11 +155,12 @@ class AgentServer(backend.util.service.AppProcess):
graph_version: Optional[int] = None,
node_input: Optional[dict[str, Any]] = None,
):
return backend.server.routers.v1.execute_graph(
return await backend.server.routers.v1.execute_graph(
user_id=user_id,
graph_id=graph_id,
graph_version=graph_version,
node_input=node_input or {},
inputs=node_input or {},
credentials_inputs={},
)
@staticmethod
@@ -154,9 +168,10 @@ class AgentServer(backend.util.service.AppProcess):
graph_id: str,
graph_version: int,
user_id: str,
for_export: bool = False,
):
return await backend.server.routers.v1.get_graph(
graph_id, user_id, graph_version
graph_id, user_id, graph_version, for_export
)
@staticmethod
@@ -168,21 +183,15 @@ class AgentServer(backend.util.service.AppProcess):
@staticmethod
async def test_get_graph_run_status(graph_exec_id: str, user_id: str):
execution = await backend.data.graph.get_execution_meta(
from backend.data.execution import get_graph_execution_meta
execution = await get_graph_execution_meta(
user_id=user_id, execution_id=graph_exec_id
)
if not execution:
raise ValueError(f"Execution {graph_exec_id} not found")
return execution.status
@staticmethod
async def test_get_graph_run_results(
graph_id: str, graph_exec_id: str, user_id: str
):
return await backend.server.routers.v1.get_graph_execution(
graph_id, graph_exec_id, user_id
)
@staticmethod
async def test_delete_graph(graph_id: str, user_id: str):
await backend.server.v2.library.db.delete_library_agent_by_graph_id(
@@ -249,12 +258,16 @@ class AgentServer(backend.util.service.AppProcess):
):
return await backend.server.v2.store.routes.create_submission(request, user_id)
### ADMIN ###
@staticmethod
async def test_review_store_listing(
request: backend.server.v2.store.model.ReviewSubmissionRequest,
user: autogpt_libs.auth.models.User,
):
return await backend.server.v2.store.routes.review_submission(request, user)
return await backend.server.v2.admin.store_admin_routes.review_submission(
request.store_listing_version_id, request, user
)
@staticmethod
def test_create_credentials(
@@ -262,7 +275,9 @@ class AgentServer(backend.util.service.AppProcess):
provider: ProviderName,
credentials: Credentials,
) -> Credentials:
return backend.server.integrations.router.create_credentials(
from backend.server.integrations.router import create_credentials
return create_credentials(
user_id=user_id, provider=provider, credentials=credentials
)
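For context, a minimal FastAPI sketch of the include pattern used in this file, showing how admin routes can share the `/api/store` prefix with the public store routes; the app and routers below are stand-ins, not the project's modules:

```python
from fastapi import APIRouter, FastAPI

app = FastAPI()
store_router = APIRouter()
admin_router = APIRouter()


@store_router.get("/agents")
def list_agents() -> list[str]:
    return []


@admin_router.post("/submissions/{version_id}/review")
def review_submission(version_id: str) -> dict[str, str]:
    return {"reviewed": version_id}


# Both routers share the /api/store prefix; tags only affect the OpenAPI docs.
app.include_router(store_router, tags=["v2"], prefix="/api/store")
app.include_router(admin_router, tags=["v2", "admin"], prefix="/api/store")
```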

Some files were not shown because too many files have changed in this diff.