Compare commits

...

118 Commits

Author SHA1 Message Date
Nicholas Tindle
8b0688d962 Update client.ts 2025-02-13 16:20:24 -06:00
Reinier van der Leer
ce1d63c517 feat(backend): Library v2 Agents and Presets (#9258)
- Blocked by #9267

This re-introduces changes from the following PRs with fixes:
- #9218
- #9211

### Changes 🏗️

- See #9218
- See #9211

Fixes:
- Fix Prisma query statements in `v2.library.db`
- Fix creation of (library) agents
- Fix test cleanup of (library) agents
- Fix handling and passing of `node_input` parameters

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Create & run a new agent
  - [x] Update & run an existing agent
2025-02-13 17:30:58 +00:00
Zamil Majdy
b5b9a008bf feat(backend): Migrate json encoded string columns into a native json column (#9475)
### Changes 🏗️

Due to the legacy of SQLite usage, some of the JSON columns are actually string columns holding stringified JSON. The scope of this PR is migrating those columns into native JSON columns.
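
A minimal sketch of what such a migration could boil down to, assuming a PostgreSQL backend; the table and column names are illustrative, and the real migration is generated by Prisma:

```python
# Illustrative only: convert a stringified-JSON text column to a native JSONB
# column. Table/column names are hypothetical; psycopg2 stands in for the
# project's actual Prisma migration tooling.
import psycopg2

MIGRATION_SQL = """
ALTER TABLE "AgentNode"
    ALTER COLUMN "constantInput" TYPE JSONB
    USING "constantInput"::jsonb;
"""

def migrate(dsn: str) -> None:
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(MIGRATION_SQL)  # the transaction commits when the block exits
```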

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-13 12:02:53 +00:00
Nicholas Tindle
7e04fbd25f feat(backend): schema updates, migration, queries for Email Notification Service (#9445)

The email service has the following requirements:
- Email users on a scheduled basis when activity has happened on their account -> we need a way to get active users and the executions that happened while they were active
- Allow users to configure which emails they get -> we need a user preference
- Get a user's email by ID so that we can email them -> pretty self-explanatory

We need to add a few new backend queries and DB models for the notification service to start handling these details. This is the first set of those changes, based on experience building the app service.

### Changes 🏗️

- Adds a new DB model `UserNotificationPreferences` with a related migration
- Adds a new DB model `NotificationEvent` with a related migration to track which notifications we've sent and how many (how much we store here is open to change depending on what data limits we want)
- Adds a new DB model `UserNotificationBatch` with a related migration to handle batching of similar notifications
- Adds queries to get users and executions by `datetime` ranges passed as ISO strings (see the sketch below)
- Adds the new queries to the `DatabaseManager` and exposes them to the other `AppService`s
- Exposes all new queries plus the existing `get_user_by_id`
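
A minimal sketch of what one of the datetime-range queries could look like, assuming Prisma Client Python; the model and field names (`agentgraphexecution`, `createdAt`) are illustrative, not necessarily the actual schema:

```python
# Illustrative only: fetch executions within a datetime range. The range arrives
# as ISO strings at the service boundary and is parsed before querying.
from datetime import datetime

from prisma import Prisma


async def get_executions_in_range(db: Prisma, start_iso: str, end_iso: str):
    start = datetime.fromisoformat(start_iso)
    end = datetime.fromisoformat(end_iso)
    return await db.agentgraphexecution.find_many(
        where={"createdAt": {"gte": start, "lte": end}},
    )
```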

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] I extracted these changes from a working implementation of the
service, and tested they don't bring down the service by being uncalled
by running the standard agent tests we do on release

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-02-12 22:54:16 +00:00
Bently
016ec0ff6b fix(frontend) update PublishAgentAwaitingReview router push path (#9471)
Updates the PublishAgentAwaitingReview `router.push` path: it was going to ``/marketplace/dashboard`` but should go to ``/profile/dashboard``.

### Changes 🏗️


8181ee8cd1/autogpt_platform/frontend/src/components/agptui/composite/PublishAgentPopout.tsx (L265-L267)

to be

```tsx
		onViewProgress={() => {
		  router.push("/profile/dashboard");
		  handleClose();
		}}
```
2025-02-12 15:32:47 +00:00
Zamil Majdy
3b8cde6d11 feat(block): Add batch matched result and its count on ExtractTextInformationBlock (#9470)
### Changes 🏗️

Introduced `matched_result` & `matched_count` on ExtractTextInformationBlock: a batched list of all matched results from a block execution, and its count.
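
A minimal sketch of the idea; the output names `matched_result` and `matched_count` come from the PR, while the function shape and the per-match output name are illustrative:

```python
# Illustrative only: emit each match individually, plus the whole batch of
# matches and how many there were.
import re
from typing import Any, Iterator, Tuple


def run_extract(text: str, pattern: str, group: int = 0) -> Iterator[Tuple[str, Any]]:
    matches = [m.group(group) for m in re.finditer(pattern, text)]
    for match in matches:
        yield "match", match              # per-match output (name illustrative)
    yield "matched_result", matches       # new: all matches as a single batch
    yield "matched_count", len(matches)   # new: number of matches found
```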

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-12 12:22:26 +00:00
Keith Webber
d050a3f77c docs: Provide feedback when cloning submodules (#9448)

### Changes 🏗️
Added `--progress` to the `git submodule update` command, so that cloning progress is shown and the step does not appear to hang.

### Checklist 📋

#### For code changes:
- [x] No code change, just docs.
 

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [x] No configuration change, just docs.
 

<details>
  <summary> Provide feedback when cloning submodules </summary>

  - Updating submodules now shows each submodule's cloning progress
 
</details>

Co-authored-by: Your Name <you@example.com>
2025-02-12 03:43:52 +00:00
Reinier van der Leer
1626bf9e16 fix(backend): Support Python 3.10 (#9468)
- Resolves #9467

### Changes 🏗️

- Loosen Python version requirement to include v3.10

Also, fixed a few issues in pyproject.toml:
- Re-sort dependency list
- Update `autogpt-platform-backend` package version to match latest
release
2025-02-12 02:02:00 +00:00
Andy Hooker
40613fe23e fix(frontend): Update user profile from marketplace to appropriate profile route (#9465)
### Background
Resolves: #9313

The application incorrectly nests the user's profile settings under the `/marketplace` route instead of the appropriate `/profile` route.

This PR relocates the `(user)` directory from `/marketplace` to `/profile`.

### Changes 🏗️

1. Refactored the `(user)` directory:
   - Moved it from `/marketplace/(user)` to `profile/(user)`

2. Updated the Sidebar and Navbar components:
   - Changed the existing marketplace routes to the new profile routes

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:

<details>
  <summary>Test Plan</summary>
  
  - [ ] Navigate to the route of `profile/` and observe the moved page.
- [ ] Navigate to the route of `profile/integrations` and observe the
moved page.
- [ ] Navigate to the route of `profile/api_keys` and observe the moved
page.
- [ ] Navigate to the route of `profile/profile` and observe the moved
page.
- [ ] Navigate to the route of `profile/settings` and observe the moved
page.
- [ ] Navigate to the route of `profile/credits` and observe the moved
page.
</details>
2025-02-11 11:25:15 +00:00
Andy Hooker
6eee9206f7 fix(market): Market featured agent card (#9463)
### Background
Resolves: #9313

The marketplace's featured agents section has a bug: when you hover over a featured agent's card, an incorrect background color is applied to the description.

### Changes 🏗️

1. Refactored `FeaturedStoreCard` to `FeaturedAgentCard`:
   - Condensed props and leveraged the `StoreAgent` type from the API
   - Removed the onClick handler from props, as it is not JSON serializable and is not in line with NextJS best practices
   - Used the built-in Card components from ShadCN to minimize custom styling
   - Optimized images by using the Image component from NextJS

2. Enhanced the `FeaturedCardSection` component:
   - Removed extensive prop passing and leveraged the agent itself with the `StoreAgent` type
   - Implemented Link from NextJS to better handle routing and removed the `useRouter` implementation
   - Removed the unnecessary handleCardClick method

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:

<details>
  <summary>Test Plan</summary>

  - [ ] Go to the landing page of the application or /marketplace
  - [ ] Scroll to the featured agents section
  - [ ] Move the mouse over each of the cards and observe the image disappearing and the text being shown
  - [ ] Observe that the background color of the text that replaced the image matches that of the card
</details>
2025-02-10 22:01:29 +00:00
Muhammad Safi
64050faef6 feature(block): Add XML Parser Block (#9450)
- Updated pyproject.toml for the new dependency gravitasml
- Updated poetry.lock

- Issue #9317 stated that the addition of an XMLParserBlock is required.
- It was suggested to use gravitasml as the external package for parsing.
- Changes were incorporated and tested as per the requirements.

### Changes 🏗️
- Added xml_parser.py (see the sketch below)
- Updated pyproject.toml
- Updated poetry.lock for the dependency changes
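
A minimal sketch of what such a block's core parsing step could look like. For illustration it uses Python's standard-library `xml.etree.ElementTree` rather than gravitasml, and all names here are hypothetical rather than taken from xml_parser.py:

```python
# Illustrative only: parse an XML string into a nested dict, the kind of output
# an XML parser block could yield. Repeated child tags collapse in this sketch.
import xml.etree.ElementTree as ET
from typing import Any, Dict


def parse_xml(xml_text: str) -> Dict[str, Any]:
    def to_dict(elem: ET.Element) -> Any:
        children = list(elem)
        if not children:
            return elem.text
        return {child.tag: to_dict(child) for child in children}

    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as e:
        raise ValueError(f"Invalid XML: {e}") from e
    return {root.tag: to_dict(root)}
```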

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-02-10 21:00:09 +00:00
Krzysztof Czerwinski
00c312d02c feat(platform): Schedule specific agent version (#9444)
Currently, scheduling always uses the newest version of an agent.

### Changes 🏗️

This PR allows scheduling any graph version by adding a number input to the schedule popup. The number is automatically set to the newest version when an agent is chosen.

<img width="533" alt="Screenshot 2025-02-07 at 5 05 56 PM"
src="https://github.com/user-attachments/assets/357b8810-6f02-4066-b7a3-824d9bfd62af"
/>

- Update the API so it accepts a graph version (see the sketch below)
- Update the schedule popup so it lets the user input a version number
- Open and schedule the correct agent
- Add a `Version` column to the schedules table
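
A minimal sketch of the version handling described above, assuming a pydantic request model; the field names and the validation rule mirror the test plan ("schedule version between 1 and max version", "reject incorrect version") but are otherwise illustrative:

```python
# Illustrative only: a version-aware schedule request and the defaulting logic
# (fall back to the newest version when none is supplied).
from typing import Optional

from pydantic import BaseModel


class ScheduleRequest(BaseModel):
    graph_id: str
    cron: str
    graph_version: Optional[int] = None  # None -> schedule the latest version


def resolve_version(req: ScheduleRequest, latest_version: int) -> int:
    """Pick the version to schedule: the one the user entered, or the newest."""
    if req.graph_version is None:
        return latest_version
    if not 1 <= req.graph_version <= latest_version:
        raise ValueError(f"Invalid graph version: {req.graph_version}")
    return req.graph_version
```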

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Can schedule version between 1 and max version
  - [x] Reject incorrect version
  - [x] Table shows proper version
  - [x] Removing schedule works
2025-02-10 13:24:25 +00:00
Nicholas Tindle
610be988c4 feat(backend): attach rabbitmq to the AppService (#9438)
### Changes 🏗️
For emailing, we need to make a new AppService (NotificationManager) that will require us to have RabbitMQ as a dependency. This PR adds the backing data library to make that happen, registers it with the AppService base class, and connects to RabbitMQ when we spawn a service.

- Adds a rabbitmq library following the existing standard set by the redis library
- Adds rabbitmq to the service
- Adds the RabbitMQ client library (pika) so that we can connect to RabbitMQ

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] I tested by adding the diff below to the executor's `@expose`d `add_execution`, and verified via the UI that the messages show up in the queue as expected and that the agent executes and behaves as normal

![image](https://github.com/user-attachments/assets/3ebfb850-d482-4b11-901c-d3bf3397a346)

```diff
      diff --git a/autogpt_platform/backend/backend/executor/manager.py b/autogpt_platform/backend/backend/executor/manager.py
index 1d965e012..4cd5b403c 100644
--- a/autogpt_platform/backend/backend/executor/manager.py
+++ b/autogpt_platform/backend/backend/executor/manager.py
@@ -18,7 +18,7 @@ if TYPE_CHECKING:
 from autogpt_libs.utils.cache import thread_cached
 
 from backend.blocks.agent import AgentExecutorBlock
-from backend.data import redis
+from backend.data import rabbitmq, redis
 from backend.data.block import (
     Block,
     BlockData,
@@ -750,6 +750,19 @@ class ExecutionManager(AppService):
     def __init__(self):
         super().__init__()
         self.use_redis = True
+        self.use_rabbitmq = rabbitmq.RabbitMQConfig(
+            exchanges=[
+                rabbitmq.Exchange(name="execution", type=rabbitmq.ExchangeType.FANOUT),
+            ],
+            queues=[
+                rabbitmq.Queue(
+                    name="execution",
+                    exchange=rabbitmq.Exchange(
+                        name="execution", type=rabbitmq.ExchangeType.FANOUT
+                    ),
+                ),
+            ],
+        )
         self.use_supabase = True
         self.pool_size = settings.config.num_graph_workers
         self.queue = ExecutionQueue[GraphExecutionEntry]()
@@ -876,6 +889,12 @@ class ExecutionManager(AppService):
         )
         self.queue.add(graph_exec)
 
+        # test rabbitmq
+        self.rabbit.publish_message(
+            exchange=self.rabbit_config.exchanges[0],
+            routing_key=self.rabbit_config.exchanges[0].name,
+            message=graph_exec_id,
+        )
         return graph_exec
 
     @expose
```
2025-02-08 21:30:30 +00:00
Nicholas Tindle
1a4ba533ca feat(infra): add rabbitmq to docker compose (#9437)
We want to use RabbitMQ for email (and the executor in the future) to ensure message delivery -- something we currently lack with Redis. This PR adds RabbitMQ to the docker-compose setup with defaults, so that when we start bringing services up, they have the backing infrastructure to do so.

### Changes 🏗️

- Adds a rabbitmq container (with exposed API and management ports)
- Adds .env.example config for the backend
- Adds docker-compose config for the backend and passes the variables around as needed (see the connection sketch below)
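
A minimal sketch of how a backend service could connect to the new container using pika, assuming the default credentials and ports are supplied via environment variables; the variable names and queue name are illustrative:

```python
# Illustrative only: connect to the RabbitMQ container added by this PR and
# publish a test message.
import os

import pika


def publish_test_message(body: str = "hello") -> None:
    credentials = pika.PlainCredentials(
        os.getenv("RABBITMQ_DEFAULT_USER", "guest"),
        os.getenv("RABBITMQ_DEFAULT_PASS", "guest"),
    )
    params = pika.ConnectionParameters(
        host=os.getenv("RABBITMQ_HOST", "localhost"),
        port=int(os.getenv("RABBITMQ_PORT", "5672")),
        credentials=credentials,
    )
    with pika.BlockingConnection(params) as connection:
        channel = connection.channel()
        channel.queue_declare(queue="healthcheck", durable=True)
        channel.basic_publish(exchange="", routing_key="healthcheck", body=body)
```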

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Start the container using docker compose deps subset

#### For configuration changes:
- [x] `.env.example` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my
changes
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**)
2025-02-07 15:40:17 +00:00
Krzysztof Czerwinski
56a307d048 fix(frontend): Fix beads when output is array (#9439)
An array output, e.g. `Item` in the Step Through Items block, doesn't show the correct number of beads; this PR fixes that.
2025-02-07 14:42:51 +00:00
Krzysztof Czerwinski
a5ad90f09b fix(frontend): Fix issue when pasting blocks (#9443)
When a block is pasted, the original block's inputs behave as if disconnected: the input is shown despite there being a connection. This PR fixes that.
2025-02-07 14:42:08 +00:00
Krzysztof Czerwinski
a315b3fc41 fix(frontend): Prevent exception when Stripe env var is missing (#9441)
Prevent exception when `NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY` is missing
and update state management in `useCredits`.
2025-02-07 12:37:06 +00:00
Krzysztof Czerwinski
1a1fe7c0b7 feat(platform): Support opening graphs with version and execution id (#9332)
Currently it's only possible to open the latest version of a graph from the monitor, and node execution results are only visible when running it manually. This PR adds the ability to open running and finished graph executions in the builder.

### Changes 🏗️

Builder now handles graph version and execution ID in addition to graph
ID when opening a graph. When an execution ID is provided, node
execution results are fetched and subscribed to in real time. This makes
it possible to open a graph that is already executing and see both
existing node execution data and real-time updates (if it's still
running).

- Use graph version and execution id on the builder page and in
`useAgentGraph`
- Use graph version on the `execute_graph` endpoint
- Use graph version on the websockets to distinguish between versions
- Move `formatEdgeID` to utils; it's used in `useAgentGraph.ts` and in
`Flow.tsx`

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] Opening finished execution restores node results
- [x] Opening running execution restores results and continues to run
properly
  - [x] Results are separate for each graph across multiple tabs

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-02-07 10:16:30 +00:00
Swifty
c693875951 platform(fix): Improve performance of builder (#9435)
1. Remove isHovered / onMouseEnter / onMouseLeave state updates
2. Wrap Custom Node in React.memo
3. Avoid re-renders for context menus
2025-02-07 10:38:50 +01:00
Krzysztof Czerwinski
797916cf14 feat(frontend): Show toast on low credit balance, rename Credits page to Billing (#9428)

### Changes 🏗️

- If a node fails to execute and the error contains `Insufficient balance`, show a toast with a link to the Billing page
- Rename the `Credits` page to `Billing`
<img width="398" alt="Screenshot 2025-02-05 at 2 27 36 PM"
src="https://github.com/user-attachments/assets/6a5b4db0-a579-4607-a6bd-d5cf9229092f"
/>

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2025-02-06 16:12:56 +00:00
dependabot[bot]
5d8fe1e184 chore(backend/deps): bump google-cloud-storage from 2.19.0 to 3.0.0 in /autogpt_platform/backend (#9399)
Bumps
[google-cloud-storage](https://github.com/googleapis/python-storage)
from 2.19.0 to 3.0.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/python-storage/releases">google-cloud-storage's
releases</a>.</em></p>
<blockquote>
<h2>v3.0.0</h2>
<h2><a
href="https://github.com/googleapis/python-storage/compare/v2.19.0...v3.0.0">3.0.0</a>
(2025-01-28)</h2>
<h3>⚠ BREAKING CHANGES</h3>
<p>Please consult the README for details on this major version
release.</p>
<ul>
<li>The default checksum strategy for uploads has changed from None to
&quot;auto&quot; (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1383">#1383</a>)</li>
<li>The default checksum strategy for downloads has changed from
&quot;md5&quot; to &quot;auto&quot; (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1383">#1383</a>)</li>
<li>Deprecated positional argument &quot;num_retries&quot; has been
removed (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1377">#1377</a>)</li>
<li>Deprecated argument &quot;text_mode&quot; has been removed (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1379">#1379</a>)</li>
<li>Blob.download_to_filename() now deletes the empty destination file
on a 404 (<a
href="https://redirect.github.com/googleapis/python-storage/pull/1394">#1394</a>)</li>
<li>Media operations now use the same retry backoff, timeout and custom
predicate system as non-media operations, which may slightly impact
default retry behavior (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1385">#1385</a>)</li>
<li>Retries are now enabled by default for uploads, blob deletes and
blob metadata updates (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1400">#1400</a>)</li>
</ul>
<h3>Features</h3>
<ul>
<li>Add &quot;auto&quot; checksum option and make default (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1383">#1383</a>)
(<a
href="5375fa0738">5375fa0</a>)</li>
<li>Blob.download_to_filename() deletes the empty destination file on a
404 (<a
href="https://redirect.github.com/googleapis/python-storage/pull/1394">#1394</a>)
(<a
href="066be2db78">066be2d</a>)</li>
<li>Enable custom predicates for media operations (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1385">#1385</a>)
(<a
href="f3517bfcb9">f3517bf</a>)</li>
<li>Integrate google-resumable-media (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1283">#1283</a>)
(<a
href="bd917b49d2">bd917b4</a>)</li>
<li>Retry by default for uploads, blob deletes, metadata updates (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1400">#1400</a>)
(<a
href="0426005175">0426005</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>Cancel upload when BlobWriter exits with exception (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1243">#1243</a>)
(<a
href="df107d20a7">df107d2</a>)</li>
<li>Changed name of methods <code>Blob.from_string()</code> and
<code>Bucket.from_string()</code> to <code>from_uri()</code> (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1335">#1335</a>)
(<a
href="58c1d03819">58c1d03</a>)</li>
<li>Correctly calculate starting offset for retries of ranged reads (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1376">#1376</a>)
(<a
href="7b6c9a0fb3">7b6c9a0</a>)</li>
<li>Filter download_kwargs in BlobReader (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1411">#1411</a>)
(<a
href="0c21210450">0c21210</a>)</li>
<li>Remove deprecated num_retries argument (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1377">#1377</a>)
(<a
href="58b5040933">58b5040</a>)</li>
<li>Remove deprecated text_mode argument (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1379">#1379</a>)
(<a
href="4d20a8efa8">4d20a8e</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Correct formatting and update README.rst (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1427">#1427</a>)
(<a
href="2945853977">2945853</a>)</li>
<li>Fix issue with exceptions.py documentation (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1328">#1328</a>)
(<a
href="22b8c304af">22b8c30</a>)</li>
</ul>
<h2>v3.0.0rc1</h2>
<h2><a
href="https://github.com/googleapis/python-storage/compare/v2.19.0...v3.0.0rc1">3.0.0rc1</a>
(2024-12-12)</h2>
<h3>⚠ BREAKING CHANGES</h3>
<ul>
<li>The default checksum strategy for uploads has changed from None to
&quot;auto&quot; (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1383">#1383</a>)
(5375fa0)</li>
<li>The default checksum strategy for downloads has changed from
&quot;md5&quot; to &quot;auto&quot; (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1383">#1383</a>)
(5375fa0)</li>
<li>Deprecated positional argument &quot;num_retries&quot; has been
removed (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1377">#1377</a>)
(58b5040)</li>
<li>Deprecated argument &quot;text_mode&quot; has been removed (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1379">#1379</a>)
(4d20a8e)</li>
<li>Media operation retries now work identically to other retries, which
may impact default retry settings (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1385">#1385</a>)
(f3517bf)</li>
<li>Blob.download_to_filename() deletes the empty destination file on a
404</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/python-storage/blob/main/CHANGELOG.md">google-cloud-storage's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/googleapis/python-storage/compare/v2.19.0...v3.0.0">3.0.0</a>
(2025-01-28)</h2>
<h3>⚠ BREAKING CHANGES</h3>
<p>Please consult the README for details on this major version
release.</p>
<ul>
<li>The default checksum strategy for uploads has changed from None to
&quot;auto&quot; (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1383">#1383</a>)</li>
<li>The default checksum strategy for downloads has changed from
&quot;md5&quot; to &quot;auto&quot; (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1383">#1383</a>)</li>
<li>Deprecated positional argument &quot;num_retries&quot; has been
removed (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1377">#1377</a>)</li>
<li>Deprecated argument &quot;text_mode&quot; has been removed (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1379">#1379</a>)</li>
<li>Blob.download_to_filename() now deletes the empty destination file
on a 404 (<a
href="https://redirect.github.com/googleapis/python-storage/pull/1394">#1394</a>)</li>
<li>Media operations now use the same retry backoff, timeout and custom
predicate system as non-media operations, which may slightly impact
default retry behavior (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1385">#1385</a>)</li>
<li>Retries are now enabled by default for uploads, blob deletes and
blob metadata updates (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1400">#1400</a>)</li>
</ul>
<h3>Features</h3>
<ul>
<li>Add &quot;auto&quot; checksum option and make default (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1383">#1383</a>)
(<a
href="5375fa0738">5375fa0</a>)</li>
<li>Blob.download_to_filename() deletes the empty destination file on a
404 (<a
href="https://redirect.github.com/googleapis/python-storage/pull/1394">#1394</a>)
(<a
href="066be2db78">066be2d</a>)</li>
<li>Enable custom predicates for media operations (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1385">#1385</a>)
(<a
href="f3517bfcb9">f3517bf</a>)</li>
<li>Integrate google-resumable-media (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1283">#1283</a>)
(<a
href="bd917b49d2">bd917b4</a>)</li>
<li>Retry by default for uploads, blob deletes, metadata updates (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1400">#1400</a>)
(<a
href="0426005175">0426005</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>Cancel upload when BlobWriter exits with exception (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1243">#1243</a>)
(<a
href="df107d20a7">df107d2</a>)</li>
<li>Changed name of methods <code>Blob.from_string()</code> and
<code>Bucket.from_string()</code> to <code>from_uri()</code> (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1335">#1335</a>)
(<a
href="58c1d03819">58c1d03</a>)</li>
<li>Correctly calculate starting offset for retries of ranged reads (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1376">#1376</a>)
(<a
href="7b6c9a0fb3">7b6c9a0</a>)</li>
<li>Filter download_kwargs in BlobReader (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1411">#1411</a>)
(<a
href="0c21210450">0c21210</a>)</li>
<li>Remove deprecated num_retries argument (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1377">#1377</a>)
(<a
href="58b5040933">58b5040</a>)</li>
<li>Remove deprecated text_mode argument (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1379">#1379</a>)
(<a
href="4d20a8efa8">4d20a8e</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Correct formatting and update README.rst (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1427">#1427</a>)
(<a
href="2945853977">2945853</a>)</li>
<li>Fix issue with exceptions.py documentation (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1328">#1328</a>)
(<a
href="22b8c304af">22b8c30</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="f2cc9c5a2b"><code>f2cc9c5</code></a>
chore(main): release 3.0.0 (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1393">#1393</a>)</li>
<li><a
href="71455bcc7f"><code>71455bc</code></a>
samples: add OTel Tracing quickstart (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1371">#1371</a>)</li>
<li><a
href="2945853977"><code>2945853</code></a>
Docs: Correct formatting and update README.rst (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1427">#1427</a>)</li>
<li><a
href="0426005175"><code>0426005</code></a>
feat: Retry by default for uploads, blob deletes, metadata updates (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1400">#1400</a>)</li>
<li><a
href="0c21210450"><code>0c21210</code></a>
fix: filter download_kwargs in BlobReader (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1411">#1411</a>)</li>
<li><a
href="c2a2ce58c7"><code>c2a2ce5</code></a>
chore(python): exclude .github/workflows/unittest.yml in renovate config
(<a
href="https://redirect.github.com/googleapis/python-storage/issues/1407">#1407</a>)</li>
<li><a
href="ffa0734708"><code>ffa0734</code></a>
chore(deps): update all dependencies (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1405">#1405</a>)</li>
<li><a
href="2e94ad0eba"><code>2e94ad0</code></a>
chore(python): Update the python version in docs presubmit to use 3.10
(<a
href="https://redirect.github.com/googleapis/python-storage/issues/1403">#1403</a>)</li>
<li><a
href="4e9a382dd9"><code>4e9a382</code></a>
feat!: Release as 3.0.0 (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1396">#1396</a>)</li>
<li><a
href="066be2db78"><code>066be2d</code></a>
feat: download_to_filename deletes the empty file on a 404 (<a
href="https://redirect.github.com/googleapis/python-storage/issues/1394">#1394</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/googleapis/python-storage/compare/v2.19.0...v3.0.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=google-cloud-storage&package-manager=pip&previous-version=2.19.0&new-version=3.0.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-02-06 14:15:36 +00:00
Swifty
8181ee8cd1 platform(fix): Fix missing Profiles (#9424)
### What's This PR About?

This PR makes a few simple improvements to how user profiles are handled
in the app:

- **Always Have a Profile:**  
If a user doesn't already have a profile, the system now automatically
creates one with some default info (including a fun, randomly generated
username). This way, you never end up with a missing profile.

- **Better Profile Updates:**  
  Removes the creation of profiles on failed get requests
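
A minimal sketch of the get-or-create behavior described above, assuming Prisma Client Python; the model accessor, field names, and username generator are illustrative:

```python
# Illustrative only: always return a profile, creating one with default info
# (including a randomly generated username) when the user doesn't have one yet.
import secrets

from prisma import Prisma

ADJECTIVES = ["swift", "clever", "brave", "curious"]
NOUNS = ["otter", "falcon", "badger", "lynx"]


def random_username() -> str:
    return f"{secrets.choice(ADJECTIVES)}-{secrets.choice(NOUNS)}-{secrets.randbelow(10_000)}"


async def get_or_create_profile(db: Prisma, user_id: str):
    profile = await db.profile.find_unique(where={"userId": user_id})
    if profile is None:
        profile = await db.profile.create(
            data={"userId": user_id, "username": random_username(), "name": ""}
        )
    return profile
```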
2025-02-06 11:20:32 +01:00
Zamil Majdy
6183ed5a63 fix(frontend): Add user note for automatic refill feature 2025-02-06 15:45:49 +07:00
dependabot[bot]
4b76aae1c9 chore(libs/deps): bump the production-dependencies group across 1 directory with 4 updates (#9432)
Bumps the production-dependencies group with 4 updates in the
/autogpt_platform/autogpt_libs directory:
[google-cloud-logging](https://github.com/googleapis/python-logging),
[pydantic](https://github.com/pydantic/pydantic),
[pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) and
[supabase](https://github.com/supabase/supabase-py).

Updates `google-cloud-logging` from 3.11.3 to 3.11.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/python-logging/releases">google-cloud-logging's
releases</a>.</em></p>
<blockquote>
<h2>v3.11.4</h2>
<h2><a
href="https://github.com/googleapis/python-logging/compare/v3.11.3...v3.11.4">3.11.4</a>
(2025-01-22)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Made <code>write_entries</code> raise <code>ValueError</code> on
<code>ParseError</code>s (<a
href="https://redirect.github.com/googleapis/python-logging/issues/958">#958</a>)
(<a
href="5309478c05">5309478</a>)</li>
<li>Require proto-plus &gt;= 1.25 for Python 3.13 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/955">#955</a>)
(<a
href="7baed8e968">7baed8e</a>)</li>
<li>Require proto-plus &gt;= 1.25 for Python 3.13 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/955">#955</a>)
(<a
href="002b1fcb39">002b1fc</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/python-logging/blob/main/CHANGELOG.md">google-cloud-logging's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/googleapis/python-logging/compare/v3.11.3...v3.11.4">3.11.4</a>
(2025-01-22)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Made <code>write_entries</code> raise <code>ValueError</code> on
<code>ParseError</code>s (<a
href="https://redirect.github.com/googleapis/python-logging/issues/958">#958</a>)
(<a
href="5309478c05">5309478</a>)</li>
<li>Require proto-plus &gt;= 1.25 for Python 3.13 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/955">#955</a>)
(<a
href="7baed8e968">7baed8e</a>)</li>
<li>Require proto-plus &gt;= 1.25 for Python 3.13 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/955">#955</a>)
(<a
href="002b1fcb39">002b1fc</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="c47946fee0"><code>c47946f</code></a>
chore(main): release 3.11.4 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/956">#956</a>)</li>
<li><a
href="5309478c05"><code>5309478</code></a>
fix: Made <code>write_entries</code> raise <code>ValueError</code> on
<code>ParseError</code>s (<a
href="https://redirect.github.com/googleapis/python-logging/issues/958">#958</a>)</li>
<li><a
href="dfcce1f5ed"><code>dfcce1f</code></a>
chore(python): exclude .github/workflows/unittest.yml in renovate config
(<a
href="https://redirect.github.com/googleapis/python-logging/issues/959">#959</a>)</li>
<li><a
href="8c2c5a798a"><code>8c2c5a7</code></a>
chore(python): update dependencies in .kokoro/docker/docs (<a
href="https://redirect.github.com/googleapis/python-logging/issues/957">#957</a>)</li>
<li><a
href="7baed8e968"><code>7baed8e</code></a>
fix: require proto-plus &gt;= 1.25 for Python 3.13 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/955">#955</a>)</li>
<li><a
href="002b1fcb39"><code>002b1fc</code></a>
fix: require proto-plus &gt;= 1.25 for Python 3.13 (<a
href="https://redirect.github.com/googleapis/python-logging/issues/955">#955</a>)</li>
<li><a
href="439eaa5b2f"><code>439eaa5</code></a>
chore(python): update dependencies in .kokoro/docker/docs (<a
href="https://redirect.github.com/googleapis/python-logging/issues/954">#954</a>)</li>
<li>See full diff in <a
href="https://github.com/googleapis/python-logging/compare/v3.11.3...v3.11.4">compare
view</a></li>
</ul>
</details>
<br />

Updates `pydantic` from 2.10.5 to 2.10.6
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/releases">pydantic's
releases</a>.</em></p>
<blockquote>
<h2>v2.10.6 2025-01-23</h2>
<h2>What's Changed</h2>
<h3>Fixes</h3>
<ul>
<li>Fix JSON Schema reference collection with <code>'examples'</code>
keys by <a href="https://github.com/Viicos"><code>@​Viicos</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/11325">#11325</a></li>
<li>Fix url python serialization by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11331">#11331</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic/compare/v2.10.5...v2.10.6">https://github.com/pydantic/pydantic/compare/v2.10.5...v2.10.6</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/blob/main/HISTORY.md">pydantic's
changelog</a>.</em></p>
<blockquote>
<h2>v2.10.6 (2025-01-23)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.10.6">GitHub
release</a></p>
<h3>What's Changed</h3>
<h4>Fixes</h4>
<ul>
<li>Fix JSON Schema reference collection with <code>'examples'</code>
keys by <a href="https://github.com/Viicos"><code>@​Viicos</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/11325">#11325</a></li>
<li>Fix url python serialization by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11331">#11331</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="df05e69a8a"><code>df05e69</code></a>
Bump version to v2.10.6 (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11334">#11334</a>)</li>
<li><a
href="416082625a"><code>4160826</code></a>
Fix url python serialization (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11331">#11331</a>)</li>
<li><a
href="f94e842692"><code>f94e842</code></a>
Fix JSON Schema reference collection with
<code>&quot;examples&quot;</code> keys (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11325">#11325</a>)</li>
<li>See full diff in <a
href="https://github.com/pydantic/pydantic/compare/v2.10.5...v2.10.6">compare
view</a></li>
</ul>
</details>
<br />

Updates `pytest-asyncio` from 0.25.2 to 0.25.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pytest-dev/pytest-asyncio/releases">pytest-asyncio's
releases</a>.</em></p>
<blockquote>
<h2>pytest-asyncio 0.25.3</h2>
<ul>
<li>Avoid errors in cleanup of async generators when event loop is
already closed <a
href="https://redirect.github.com/pytest-dev/pytest-asyncio/issues/1040">#1040</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7c501923b0"><code>7c50192</code></a>
fix: Avoid errors in cleanup of async generators when event loop is
already c...</li>
<li>See full diff in <a
href="https://github.com/pytest-dev/pytest-asyncio/compare/v0.25.2...v0.25.3">compare
view</a></li>
</ul>
</details>
<br />

Updates `supabase` from 2.11.0 to 2.13.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-py/releases">supabase's
releases</a>.</em></p>
<blockquote>
<h2>v2.13.0</h2>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.12.0...v2.13.0">2.13.0</a>
(2025-02-04)</h2>
<h3>Features</h3>
<ul>
<li><strong>realtime:</strong> bump realtime from 2.2.0 to 2.3.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1049">#1049</a>)
(<a
href="2347401770">2347401</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>auth:</strong> bump gotrue from 2.11.2 to 2.11.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1051">#1051</a>)
(<a
href="4a2bb9e73e">4a2bb9e</a>)</li>
<li><strong>functions:</strong> bump supafunc from 0.9.2 to 0.9.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1052">#1052</a>)
(<a
href="29fed38015">29fed38</a>)</li>
<li><strong>storage:</strong> bump storage3 from 0.11.1 to 0.11.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1050">#1050</a>)
(<a
href="8c5d48f51f">8c5d48f</a>)</li>
<li>update SupabaseAuthClient to use super instead of calling base class
(<a
href="https://redirect.github.com/supabase/supabase-py/issues/1045">#1045</a>)
(<a
href="3efb4a678b">3efb4a6</a>)</li>
</ul>
<h2>v2.12.0</h2>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.11.0...v2.12.0">2.12.0</a>
(2025-01-24)</h2>
<h3>Features</h3>
<ul>
<li><strong>realtime:</strong> bump realtime from 2.1.0 to 2.2.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1037">#1037</a>)
(<a
href="0e5eed6f47">0e5eed6</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>auth:</strong> bump gotrue from 2.11.1 to 2.11.2 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1036">#1036</a>)
(<a
href="f7c87b9010">f7c87b9</a>)</li>
<li><strong>functions:</strong> bump supafunc from 0.9.0 to 0.9.2 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1029">#1029</a>)
(<a
href="f53f7ff58c">f53f7ff</a>)</li>
<li><strong>postgrest:</strong> bump postgrest from 0.19.1 to 0.19.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1040">#1040</a>)
(<a
href="c117a57a3f">c117a57</a>)</li>
<li><strong>storage:</strong> bump storage3 from 0.11.0 to 0.11.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1035">#1035</a>)
(<a
href="17942a2a6a">17942a2</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-py/blob/main/CHANGELOG.md">supabase's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.12.0...v2.13.0">2.13.0</a>
(2025-02-04)</h2>
<h3>Features</h3>
<ul>
<li><strong>realtime:</strong> bump realtime from 2.2.0 to 2.3.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1049">#1049</a>)
(<a
href="2347401770">2347401</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>auth:</strong> bump gotrue from 2.11.2 to 2.11.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1051">#1051</a>)
(<a
href="4a2bb9e73e">4a2bb9e</a>)</li>
<li><strong>functions:</strong> bump supafunc from 0.9.2 to 0.9.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1052">#1052</a>)
(<a
href="29fed38015">29fed38</a>)</li>
<li><strong>storage:</strong> bump storage3 from 0.11.1 to 0.11.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1050">#1050</a>)
(<a
href="8c5d48f51f">8c5d48f</a>)</li>
<li>update SupabaseAuthClient to use super instead of calling base class
(<a
href="https://redirect.github.com/supabase/supabase-py/issues/1045">#1045</a>)
(<a
href="3efb4a678b">3efb4a6</a>)</li>
</ul>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.11.0...v2.12.0">2.12.0</a>
(2025-01-24)</h2>
<h3>Features</h3>
<ul>
<li><strong>realtime:</strong> bump realtime from 2.1.0 to 2.2.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1037">#1037</a>)
(<a
href="0e5eed6f47">0e5eed6</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>auth:</strong> bump gotrue from 2.11.1 to 2.11.2 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1036">#1036</a>)
(<a
href="f7c87b9010">f7c87b9</a>)</li>
<li><strong>functions:</strong> bump supafunc from 0.9.0 to 0.9.2 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1029">#1029</a>)
(<a
href="f53f7ff58c">f53f7ff</a>)</li>
<li><strong>postgrest:</strong> bump postgrest from 0.19.1 to 0.19.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1040">#1040</a>)
(<a
href="c117a57a3f">c117a57</a>)</li>
<li><strong>storage:</strong> bump storage3 from 0.11.0 to 0.11.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1035">#1035</a>)
(<a
href="17942a2a6a">17942a2</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="c70802e55c"><code>c70802e</code></a>
chore(main): release 2.13.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1046">#1046</a>)</li>
<li><a
href="614cacc6a9"><code>614cacc</code></a>
chore(tests): increase coverage (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1053">#1053</a>)</li>
<li><a
href="29fed38015"><code>29fed38</code></a>
fix(functions): bump supafunc from 0.9.2 to 0.9.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1052">#1052</a>)</li>
<li><a
href="4a2bb9e73e"><code>4a2bb9e</code></a>
fix(auth): bump gotrue from 2.11.2 to 2.11.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1051">#1051</a>)</li>
<li><a
href="8c5d48f51f"><code>8c5d48f</code></a>
fix(storage): bump storage3 from 0.11.1 to 0.11.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1050">#1050</a>)</li>
<li><a
href="2347401770"><code>2347401</code></a>
feat(realtime): bump realtime from 2.2.0 to 2.3.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1049">#1049</a>)</li>
<li><a
href="4237977463"><code>4237977</code></a>
chore(deps-dev): bump black from 24.10.0 to 25.1.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1044">#1044</a>)</li>
<li><a
href="489043f03a"><code>489043f</code></a>
chore(deps-dev): bump pytest-asyncio from 0.25.2 to 0.25.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1043">#1043</a>)</li>
<li><a
href="b007da45c6"><code>b007da4</code></a>
chore(deps-dev): bump commitizen from 4.1.0 to 4.1.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1042">#1042</a>)</li>
<li><a
href="9a8713125e"><code>9a87131</code></a>
chore(ci): pipeline using same version of python for all tests (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1047">#1047</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/supabase/supabase-py/compare/v2.11.0...v2.13.0">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-02-05 22:39:04 +00:00
dependabot[bot]
f6e395f36e chore(frontend/deps-dev): bump the development-dependencies group in /autogpt_platform/frontend with 13 updates (#9401)
Bumps the development-dependencies group in /autogpt_platform/frontend
with 13 updates:

| Package | From | To |
| --- | --- | --- |
| [@playwright/test](https://github.com/microsoft/playwright) | `1.50.0`
| `1.50.1` |
|
[@storybook/addon-a11y](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/a11y)
| `8.5.2` | `8.5.3` |
|
[@storybook/addon-essentials](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/essentials)
| `8.5.2` | `8.5.3` |
|
[@storybook/addon-interactions](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/interactions)
| `8.5.2` | `8.5.3` |
|
[@storybook/addon-links](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/links)
| `8.5.2` | `8.5.3` |
|
[@storybook/addon-onboarding](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/onboarding)
| `8.5.2` | `8.5.3` |
|
[@storybook/blocks](https://github.com/storybookjs/storybook/tree/HEAD/code/lib/blocks)
| `8.5.2` | `8.5.3` |
|
[@storybook/nextjs](https://github.com/storybookjs/storybook/tree/HEAD/code/frameworks/nextjs)
| `8.5.2` | `8.5.3` |
|
[@storybook/react](https://github.com/storybookjs/storybook/tree/HEAD/code/renderers/react)
| `8.5.2` | `8.5.3` |
|
[@storybook/test](https://github.com/storybookjs/storybook/tree/HEAD/code/lib/test)
| `8.5.2` | `8.5.3` |
|
[@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node)
| `22.10.10` | `22.13.0` |
| [chromatic](https://github.com/chromaui/chromatic-cli) | `11.25.1` |
`11.25.2` |
|
[storybook](https://github.com/storybookjs/storybook/tree/HEAD/code/lib/cli)
| `8.5.2` | `8.5.3` |

Updates `@playwright/test` from 1.50.0 to 1.50.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/microsoft/playwright/releases"><code>@​playwright/test</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v1.50.1</h2>
<h3>Highlights</h3>
<p><a
href="https://redirect.github.com/microsoft/playwright/issues/34483">microsoft/playwright#34483</a>
- [Feature]: single aria snapshot for different engines/browsers
<a
href="https://redirect.github.com/microsoft/playwright/issues/34497">microsoft/playwright#34497</a>
- [Bug]: Firefox not handling keepalive: true fetch requests
<a
href="https://redirect.github.com/microsoft/playwright/issues/34504">microsoft/playwright#34504</a>
- [Bug]: update snapshots not creating good diffs
<a
href="https://redirect.github.com/microsoft/playwright/issues/34507">microsoft/playwright#34507</a>
- [Bug]: snapshotPathTemplate doesnt work when multiple projects
<a
href="https://redirect.github.com/microsoft/playwright/issues/34462">microsoft/playwright#34462</a>
- [Bug]: updateSnapshots &quot;changed&quot; throws an error</p>
<h2>Browser Versions</h2>
<ul>
<li>Chromium 133.0.6943.16</li>
<li>Mozilla Firefox 134.0</li>
<li>WebKit 18.2</li>
</ul>
<p>This version was also tested against the following stable
channels:</p>
<ul>
<li>Google Chrome 132</li>
<li>Microsoft Edge 132</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="dbc685ca98"><code>dbc685c</code></a>
chore: mark v1.50.1 (<a
href="https://redirect.github.com/microsoft/playwright/issues/34575">#34575</a>)</li>
<li><a
href="13d80f184e"><code>13d80f1</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34560">#34560</a>):
chore(docs): clarify connection method via BrowserType.c...</li>
<li><a
href="159210da82"><code>159210d</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34556">#34556</a>):
fix(toMatchAriaSnapshot): fail test run when updating mi...</li>
<li><a
href="fbad9f7ff7"><code>fbad9f7</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34537">#34537</a>):
feat: per-assertion snapshot path template in config (<a
href="https://redirect.github.com/microsoft/playwright/issues/3">#3</a>...</li>
<li><a
href="67313faaa7"><code>67313fa</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34550">#34550</a>):
roll follow-ups for .NET and Python</li>
<li><a
href="4b7794b37a"><code>4b7794b</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34544">#34544</a>):
fix(aria): disregard text area textContent</li>
<li><a
href="1efbedd3b3"><code>1efbedd</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34535">#34535</a>):
Revert &quot;Reapply &quot;fix(har timing): record connect timing
...</li>
<li><a
href="1e258e0894"><code>1e258e0</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34420">#34420</a>):
chore(deps): bump vite from 5.4.6 to 5.4.14 (<a
href="https://redirect.github.com/microsoft/playwright/issues/34539">#34539</a>)</li>
<li><a
href="7be4ef58de"><code>7be4ef5</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34522">#34522</a>):
test: fetch request through socks proxy over ipv4</li>
<li><a
href="7b3e590289"><code>7b3e590</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34530">#34530</a>):
fix(firefox): disable fetch keep-alive for now before a ...</li>
<li>Additional commits viewable in <a
href="https://github.com/microsoft/playwright/compare/v1.50.0...v1.50.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-a11y` from 8.5.2 to 8.5.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/addon-a11y</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.3</h2>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/addon-a11y</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="81d183f3ab"><code>81d183f</code></a>
Bump version from &quot;8.5.2&quot; to &quot;8.5.3&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.3/code/addons/a11y">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-essentials` from 8.5.2 to 8.5.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/addon-essentials</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.3</h2>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/addon-essentials</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="81d183f3ab"><code>81d183f</code></a>
Bump version from &quot;8.5.2&quot; to &quot;8.5.3&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.3/code/addons/essentials">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-interactions` from 8.5.2 to 8.5.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/addon-interactions</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.3</h2>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/addon-interactions</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="81d183f3ab"><code>81d183f</code></a>
Bump version from &quot;8.5.2&quot; to &quot;8.5.3&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.3/code/addons/interactions">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-links` from 8.5.2 to 8.5.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/addon-links</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.3</h2>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/addon-links</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="81d183f3ab"><code>81d183f</code></a>
Bump version from &quot;8.5.2&quot; to &quot;8.5.3&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.3/code/addons/links">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-onboarding` from 8.5.2 to 8.5.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/addon-onboarding</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.3</h2>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/addon-onboarding</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="81d183f3ab"><code>81d183f</code></a>
Bump version from &quot;8.5.2&quot; to &quot;8.5.3&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.3/code/addons/onboarding">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/blocks` from 8.5.2 to 8.5.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/blocks</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.3</h2>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/blocks</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="81d183f3ab"><code>81d183f</code></a>
Bump version from &quot;8.5.2&quot; to &quot;8.5.3&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.3/code/lib/blocks">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/nextjs` from 8.5.2 to 8.5.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/nextjs</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.3</h2>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/nextjs</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="81d183f3ab"><code>81d183f</code></a>
Bump version from &quot;8.5.2&quot; to &quot;8.5.3&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.3/code/frameworks/nextjs">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/react` from 8.5.2 to 8.5.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/react</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.3</h2>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/react</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="81d183f3ab"><code>81d183f</code></a>
Bump version from &quot;8.5.2&quot; to &quot;8.5.3&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.3/code/renderers/react">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/test` from 8.5.2 to 8.5.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/test</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.3</h2>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/test</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="81d183f3ab"><code>81d183f</code></a>
Bump version from &quot;8.5.2&quot; to &quot;8.5.3&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.3/code/lib/test">compare
view</a></li>
</ul>
</details>
<br />

Updates `@types/node` from 22.10.10 to 22.13.0
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node">compare
view</a></li>
</ul>
</details>
<br />

Updates `chromatic` from 11.25.1 to 11.25.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/chromaui/chromatic-cli/releases">chromatic's
releases</a>.</em></p>
<blockquote>
<h2>v11.25.2</h2>
<h4>🐛 Bug Fix</h4>
<ul>
<li>Add additional rspack builder entrypoint <a
href="https://redirect.github.com/chromaui/chromatic-cli/pull/1147">#1147</a>
(<a href="https://github.com/jmhobbs"><code>@​jmhobbs</code></a>)</li>
<li>Account for accessibility change counts in UI <a
href="https://redirect.github.com/chromaui/chromatic-cli/pull/1145">#1145</a>
(<a href="https://github.com/jmhobbs"><code>@​jmhobbs</code></a>)</li>
</ul>
<h4>Authors: 1</h4>
<ul>
<li>John Hobbs (<a
href="https://github.com/jmhobbs"><code>@​jmhobbs</code></a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/chromaui/chromatic-cli/blob/main/CHANGELOG.md">chromatic's
changelog</a>.</em></p>
<blockquote>
<h1>v11.25.2 (Thu Jan 30 2025)</h1>
<h4>🐛 Bug Fix</h4>
<ul>
<li>Add additional rspack builder entrypoint <a
href="https://redirect.github.com/chromaui/chromatic-cli/pull/1147">#1147</a>
(<a href="https://github.com/jmhobbs"><code>@​jmhobbs</code></a>)</li>
<li>Account for accessibility change counts in UI <a
href="https://redirect.github.com/chromaui/chromatic-cli/pull/1145">#1145</a>
(<a href="https://github.com/jmhobbs"><code>@​jmhobbs</code></a>)</li>
</ul>
<h4>Authors: 1</h4>
<ul>
<li>John Hobbs (<a
href="https://github.com/jmhobbs"><code>@​jmhobbs</code></a>)</li>
</ul>
<hr />
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9f1324772c"><code>9f13247</code></a>
Bump version to: 11.25.2 [skip ci]</li>
<li><a
href="00daacc6ad"><code>00daacc</code></a>
Update CHANGELOG.md [skip ci]</li>
<li><a
href="9047fb72d6"><code>9047fb7</code></a>
Merge pull request <a
href="https://redirect.github.com/chromaui/chromatic-cli/issues/1147">#1147</a>
from chromaui/jmhobbs/cap-2681-chromatic-fails-to-tr...</li>
<li><a
href="2dfe359084"><code>2dfe359</code></a>
Add additional rspack builder entrypoint</li>
<li><a
href="71abbccabe"><code>71abbcc</code></a>
Merge pull request <a
href="https://redirect.github.com/chromaui/chromatic-cli/issues/1145">#1145</a>
from chromaui/jmhobbs/cap-2647-update-cli-messaging-...</li>
<li><a
href="7af2d57b81"><code>7af2d57</code></a>
Pluralize was/were on total changes.</li>
<li><a
href="33140257ec"><code>3314025</code></a>
Move to multiline format for changes.</li>
<li><a
href="28eaae5955"><code>28eaae5</code></a>
Account for accessibility change counts in UI</li>
<li>See full diff in <a
href="https://github.com/chromaui/chromatic-cli/compare/v11.25.1...v11.25.2">compare
view</a></li>
</ul>
</details>
<br />

Updates `storybook` from 8.5.2 to 8.5.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases">storybook's
releases</a>.</em></p>
<blockquote>
<h2>v8.5.3</h2>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md">storybook's
changelog</a>.</em></p>
<blockquote>
<h2>8.5.3</h2>
<ul>
<li>Preview: Add <code>globals</code> to <code>extract()</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30415">#30415</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Vite: Fix add component UI invalidation - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30438">#30438</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="81d183f3ab"><code>81d183f</code></a>
Bump version from &quot;8.5.2&quot; to &quot;8.5.3&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.3/code/lib/cli">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-02-05 22:08:15 +00:00
dependabot[bot]
9c2d19cdb5 chore(frontend/deps): bump the production-dependencies group in /autogpt_platform/frontend with 4 updates (#9400)
Bumps the production-dependencies group in /autogpt_platform/frontend
with 4 updates:
[@sentry/nextjs](https://github.com/getsentry/sentry-javascript),
[@stripe/stripe-js](https://github.com/stripe/stripe-js),
[launchdarkly-react-client-sdk](https://github.com/launchdarkly/react-client-sdk)
and [recharts](https://github.com/recharts/recharts).

Updates `@sentry/nextjs` from 8.51.0 to 8.54.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-javascript/releases"><code>@​sentry/nextjs</code>'s
releases</a>.</em></p>
<blockquote>
<h2>8.54.0</h2>
<ul>
<li>feat(v8/deps): Upgrade all OpenTelemetry dependencies (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15098">#15098</a>)</li>
<li>fix(node/v8): Add compatibility layer for Prisma v5 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15210">#15210</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/nwalters512"><code>@​nwalters512</code></a>.
Thank you for your contribution!</p>
<h2>Bundle size 📦</h2>
<table>
<thead>
<tr>
<th>Path</th>
<th>Size</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>@​sentry/browser</code></td>
<td>23.3 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> - with treeshaking flags</td>
<td>23.17 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing)</td>
<td>35.9 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay)</td>
<td>73.27 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay) - with
treeshaking flags</td>
<td>66.71 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay with
Canvas)</td>
<td>77.57 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay, Feedback)</td>
<td>89.5 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Feedback)</td>
<td>39.51 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. sendFeedback)</td>
<td>27.91 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. FeedbackAsync)</td>
<td>32.71 KB</td>
</tr>
<tr>
<td><code>@​sentry/react</code></td>
<td>25.98 KB</td>
</tr>
<tr>
<td><code>@​sentry/react</code> (incl. Tracing)</td>
<td>38.71 KB</td>
</tr>
<tr>
<td><code>@​sentry/vue</code></td>
<td>27.58 KB</td>
</tr>
<tr>
<td><code>@​sentry/vue</code> (incl. Tracing)</td>
<td>37.75 KB</td>
</tr>
<tr>
<td><code>@​sentry/svelte</code></td>
<td>23.46 KB</td>
</tr>
<tr>
<td>CDN Bundle</td>
<td>24.49 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing)</td>
<td>37.6 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay)</td>
<td>72.9 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay, Feedback)</td>
<td>78.23 KB</td>
</tr>
<tr>
<td>CDN Bundle - uncompressed</td>
<td>71.92 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing) - uncompressed</td>
<td>111.52 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay) - uncompressed</td>
<td>225.78 KB</td>
</tr>
<tr>
<td>CDN Bundle (incl. Tracing, Replay, Feedback) - uncompressed</td>
<td>238.88 KB</td>
</tr>
<tr>
<td><code>@​sentry/nextjs</code> (client)</td>
<td>38.96 KB</td>
</tr>
<tr>
<td><code>@​sentry/sveltekit</code> (client)</td>
<td>36.4 KB</td>
</tr>
<tr>
<td><code>@​sentry/node</code></td>
<td>162.85 KB</td>
</tr>
<tr>
<td><code>@​sentry/node</code> - without tracing</td>
<td>99.14 KB</td>
</tr>
<tr>
<td><code>@​sentry/aws-serverless</code></td>
<td>131.23 KB</td>
</tr>
</tbody>
</table>
<h2>8.53.0</h2>
<ul>
<li>feat(v8/nuxt): Add <code>url</code> to
<code>SourcemapsUploadOptions</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15202">#15202</a>)</li>
<li>fix(v8/react): <code>fromLocation</code> can be undefined in
Tanstack Router Instrumentation (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15237">#15237</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/tannerlinsley"><code>@​tannerlinsley</code></a>.
Thank you for your contribution!</p>
<h2>Bundle size 📦</h2>
<table>
<thead>
<tr>
<th>Path</th>
<th>Size</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>@​sentry/browser</code></td>
<td>23.3 KB</td>
</tr>
</tbody>
</table>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-javascript/blob/8.54.0/CHANGELOG.md"><code>@​sentry/nextjs</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.54.0</h2>
<ul>
<li>feat(v8/deps): Upgrade all OpenTelemetry dependencies (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15098">#15098</a>)</li>
<li>fix(node/v8): Add compatibility layer for Prisma v5 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15210">#15210</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/nwalters512"><code>@​nwalters512</code></a>.
Thank you for your contribution!</p>
<h2>8.53.0</h2>
<ul>
<li>feat(v8/nuxt): Add <code>url</code> to
<code>SourcemapsUploadOptions</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15202">#15202</a>)</li>
<li>fix(v8/react): <code>fromLocation</code> can be undefined in
Tanstack Router Instrumentation (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15237">#15237</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/tannerlinsley"><code>@​tannerlinsley</code></a>.
Thank you for your contribution!</p>
<h2>8.52.1</h2>
<ul>
<li>fix(v8/nextjs): Fix nextjs build warning (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15226">#15226</a>)</li>
<li>ref(v8/browser): Add protocol attributes to resource spans <a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15224">#15224</a></li>
<li>ref(v8/core): Don't set <code>this.name</code> to
<code>new.target.prototype.constructor.name</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15222">#15222</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/Zen-cronic"><code>@​Zen-cronic</code></a>.
Thank you for your contribution!</p>
<h2>8.52.0</h2>
<h3>Important Changes</h3>
<ul>
<li><strong>feat(solidstart): Add <code>withSentry</code> wrapper for
SolidStart config (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15135">#15135</a>)</strong></li>
</ul>
<p>To enable the SolidStart SDK, wrap your SolidStart Config with
<code>withSentry</code>. The <code>sentrySolidStartVite</code> plugin is
now automatically
added by <code>withSentry</code> and you can pass the Sentry build-time
options like this:</p>
<pre lang="js"><code>import { defineConfig } from
'@solidjs/start/config';
import { withSentry } from '@sentry/solidstart';
<p>export default defineConfig(<br />
withSentry(<br />
{<br />
/* Your SolidStart config options... */<br />
},<br />
{<br />
// Options for setting up source maps<br />
org: process.env.SENTRY_ORG,<br />
project: process.env.SENTRY_PROJECT,<br />
authToken: process.env.SENTRY_AUTH_TOKEN,<br />
},<br />
),<br />
);<br />
</code></pre></p>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="e9a45fe6eb"><code>e9a45fe</code></a>
release: 8.54.0</li>
<li><a
href="ff4fb5ffed"><code>ff4fb5f</code></a>
meta(changelog): Update changelog for 8.54.0 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15261">#15261</a>)</li>
<li><a
href="b6a4a4a036"><code>b6a4a4a</code></a>
fix(node/v8): Add compatibility layer for Prisma v5 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15210">#15210</a>)</li>
<li><a
href="3673689441"><code>3673689</code></a>
feat(v8/deps): Upgrade all OpenTelemetry dependencies (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15098">#15098</a>)</li>
<li><a
href="13aadde725"><code>13aadde</code></a>
Merge branch 'release/8.53.0' into v8</li>
<li><a
href="3d8b132fda"><code>3d8b132</code></a>
release: 8.53.0</li>
<li><a
href="f08e6131da"><code>f08e613</code></a>
meta: Update Changelog for 8.53.0 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15243">#15243</a>)</li>
<li><a
href="ef4f210781"><code>ef4f210</code></a>
fix(v8/react): From location can be undefined in Tanstack Router
Instrumentat...</li>
<li><a
href="4df37594f6"><code>4df3759</code></a>
feat(v8/nuxt): Add <code>url</code> to
<code>SourcemapsUploadOptions</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15202">#15202</a>)</li>
<li><a
href="36bdc47676"><code>36bdc47</code></a>
Merge branch 'release/8.52.1' into v8</li>
<li>Additional commits viewable in <a
href="https://github.com/getsentry/sentry-javascript/compare/8.51.0...8.54.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `@stripe/stripe-js` from 5.5.0 to 5.6.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/stripe/stripe-js/releases"><code>@​stripe/stripe-js</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v5.6.0</h2>
<!-- raw HTML omitted -->
<!-- raw HTML omitted -->
<h3>New features</h3>
<h3>Fixes</h3>
<ul>
<li>Fix runServerUpdate type (<a
href="https://redirect.github.com/stripe/stripe-js/issues/712">#712</a>)</li>
<li>Push release commit and tag before creating release (<a
href="https://redirect.github.com/stripe/stripe-js/issues/707">#707</a>)</li>
</ul>
<h3>Changed</h3>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="085deba188"><code>085deba</code></a>
v5.6.0</li>
<li><a
href="d337871030"><code>d337871</code></a>
Fix runServerUpdate type (<a
href="https://redirect.github.com/stripe/stripe-js/issues/712">#712</a>)</li>
<li><a
href="31aef3556c"><code>31aef35</code></a>
Push release commit and tag before creating release (<a
href="https://redirect.github.com/stripe/stripe-js/issues/707">#707</a>)</li>
<li><a
href="f93c985d67"><code>f93c985</code></a>
v5.5.0</li>
<li>See full diff in <a
href="https://github.com/stripe/stripe-js/compare/v5.5.0...v5.6.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `launchdarkly-react-client-sdk` from 3.6.0 to 3.6.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/launchdarkly/react-client-sdk/releases">launchdarkly-react-client-sdk's
releases</a>.</em></p>
<blockquote>
<h2>launchdarkly-react-client-sdk: v3.6.1</h2>
<h2><a
href="https://github.com/launchdarkly/react-client-sdk/compare/launchdarkly-react-client-sdk-v3.6.0...launchdarkly-react-client-sdk-v3.6.1">3.6.1</a>
(2025-01-30)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>update package.json to support react 19 (<a
href="https://redirect.github.com/launchdarkly/react-client-sdk/issues/341">#341</a>)
(<a
href="2c0fc335eb">2c0fc33</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/launchdarkly/react-client-sdk/blob/main/CHANGELOG.md">launchdarkly-react-client-sdk's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/launchdarkly/react-client-sdk/compare/launchdarkly-react-client-sdk-v3.6.0...launchdarkly-react-client-sdk-v3.6.1">3.6.1</a>
(2025-01-30)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>update package.json to support react 19 (<a
href="https://redirect.github.com/launchdarkly/react-client-sdk/issues/341">#341</a>)
(<a
href="2c0fc335eb">2c0fc33</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="090c8db0a2"><code>090c8db</code></a>
chore(main): release launchdarkly-react-client-sdk 3.6.1 (<a
href="https://redirect.github.com/launchdarkly/react-client-sdk/issues/344">#344</a>)</li>
<li><a
href="2c0fc335eb"><code>2c0fc33</code></a>
fix: update package.json to support react 19 (<a
href="https://redirect.github.com/launchdarkly/react-client-sdk/issues/341">#341</a>)</li>
<li>See full diff in <a
href="https://github.com/launchdarkly/react-client-sdk/compare/launchdarkly-react-client-sdk-v3.6.0...launchdarkly-react-client-sdk-v3.6.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `recharts` from 2.15.0 to 2.15.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/recharts/recharts/releases">recharts's
releases</a>.</em></p>
<blockquote>
<h2>v2.15.1</h2>
<h2>What's Changed</h2>
<p>Quick patch release, nothing crazy going on here.</p>
<p>In the meantime please help us test recharts 3.0 alpha <a
href="https://redirect.github.com/recharts/recharts/issues/5445">recharts/recharts#5445</a>
🚀</p>
<h4>Fix</h4>
<ul>
<li><code>Legend - Typescript</code>: add <code>dataKey</code> type to
legend formatter props by <a
href="https://github.com/lucasassisrosa"><code>@​lucasassisrosa</code></a>
in <a
href="https://redirect.github.com/recharts/recharts/pull/5511">recharts/recharts#5511</a>.
Fixes <a
href="https://redirect.github.com/recharts/recharts/issues/5508">recharts/recharts#5508</a></li>
</ul>
<h4>Chore</h4>
<ul>
<li>Make sure <code>react-smooth</code> version is up to date in
package.json for R19 support by <a
href="https://github.com/acomanescu"><code>@​acomanescu</code></a> in <a
href="https://redirect.github.com/recharts/recharts/pull/5422">recharts/recharts#5422</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/acomanescu"><code>@​acomanescu</code></a> made
their first contribution in <a
href="https://redirect.github.com/recharts/recharts/pull/5422">recharts/recharts#5422</a></li>
<li><a
href="https://github.com/lucasassisrosa"><code>@​lucasassisrosa</code></a>
made their first contribution in <a
href="https://redirect.github.com/recharts/recharts/pull/5511">recharts/recharts#5511</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/recharts/recharts/compare/v2.15.0...v2.15.1">https://github.com/recharts/recharts/compare/v2.15.0...v2.15.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3ecaab0e88"><code>3ecaab0</code></a>
2.15.1</li>
<li><a
href="786cda6f02"><code>786cda6</code></a>
feat: Add the dataKey type to legend formatter props (<a
href="https://redirect.github.com/recharts/recharts/issues/5511">#5511</a>)</li>
<li><a
href="a3cf0247f9"><code>a3cf024</code></a>
chore: update react-smooth version to support React 19 (<a
href="https://redirect.github.com/recharts/recharts/issues/5422">#5422</a>)</li>
<li>See full diff in <a
href="https://github.com/recharts/recharts/compare/v2.15.0...v2.15.1">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-02-05 22:07:52 +00:00
Aarushi
277a896a83 feat(platform/external-api): Enhance the output from the external API on agent output (#9430)
This PR does two main things: 

1) Fixes the nodes block to show the names of the pins rather than the generic
word "output".
2) Adds an output block that shows the name and result of the agent
output blocks. This improves the readability of the API.

### Changes 🏗️

1) Updated the nodes block to show the name of the input pin
2) Added an output block to show all the AgentOutputBlock outputs
3) Added a status field to the agent graph execution


Example results:

```
{
    "execution_id": "ea6b12ac-36f5-4f19-bc94-0df36f028c29",
    "status": "COMPLETED",
    "nodes": [
        {
            "node_id": "83cd909d-ff35-43c2-bdc9-a2f5142bd145",
            "input": "what is the capital of australia?",
            "output": {
                "result": [
                    "what is the capital of australia?"
                ]
            }
        },
        {
            "node_id": "6bdf81fc-2d56-4e32-82d5-24f8c5645cb5",
            "input": {
                "model": "gpt-4o",
                "credentials": {
                    "id": "604d8e22-3e24-4451-93eb-b17a276d3b8c",
                    "title": "openai",
                    "provider": "openai",
                    "type": "api_key"
                },
                "prompt": "what is the capital of australia?"
            },
            "output": {
                "response": [
                    "The capital of Australia is Canberra."
                ],
                "prompt": [
                    "[{\"role\": \"user\", \"content\": \"what is the capital of australia?\"}]"
                ]
            }
        },
        {
            "node_id": "55aa132a-c298-4cb6-9afe-39a56e492ab6",
            "input": "The capital of Australia is Canberra.",
            "output": {
                "output": [
                    "The capital of Australia is Canberra."
                ],
                "name": [
                    "result"
                ]
            }
        }
    ],
    "output": [
        {
            "result": "The capital of Australia is Canberra."
        }
    ]
}
```
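
For illustration only, a minimal sketch of how a client might consume this enriched response and read the named agent outputs. The base URL, endpoint path, and auth header below are assumptions for the example, not the documented external API surface.

```python
import requests

# Assumed values for illustration only; the real base URL, path, and
# auth header of the external API may differ.
BASE_URL = "https://api.example.com/external-api/v1"
API_KEY = "your-api-key"
execution_id = "ea6b12ac-36f5-4f19-bc94-0df36f028c29"

# Fetch the execution result (hypothetical endpoint path).
resp = requests.get(
    f"{BASE_URL}/graphs/executions/{execution_id}",
    headers={"X-API-Key": API_KEY},
    timeout=30,
)
resp.raise_for_status()
result = resp.json()

# The top-level "status" field tells us whether the run finished.
if result["status"] == "COMPLETED":
    # "output" lists each agent output as a {name: value} entry, so the
    # caller no longer has to dig through the per-node "nodes" entries.
    for item in result["output"]:
        for name, value in item.items():
            print(f"{name}: {value}")
```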

### Checklist 📋

#### For code changes:
Testing:
- Create an agent
- Run it via the API
- Fetch the results
2025-02-05 22:06:48 +00:00
dependabot[bot]
bd9c0d741a chore(backend/deps): bump the production-dependencies group across 1 directory with 5 updates (#9434)
Bumps the production-dependencies group with 5 updates in the
/autogpt_platform/backend directory:

| Package | From | To |
| --- | --- | --- |
| [e2b-code-interpreter](https://github.com/e2b-dev/code-interpreter) |
`1.0.4` | `1.0.5` |
| [fastapi](https://github.com/fastapi/fastapi) | `0.115.7` | `0.115.8`
|
| [groq](https://github.com/groq/groq-python) | `0.15.0` | `0.18.0` |
| [openai](https://github.com/openai/openai-python) | `1.60.2` |
`1.61.1` |
| [supabase](https://github.com/supabase/supabase-py) | `2.12.0` |
`2.13.0` |


Updates `e2b-code-interpreter` from 1.0.4 to 1.0.5
<details>
<summary>Commits</summary>
<ul>
<li><a
href="007e2e5ae1"><code>007e2e5</code></a>
Merge pull request <a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/54">#54</a>
from e2b-dev/fix-bug-in-e2b-incompability</li>
<li><a
href="54935958aa"><code>5493595</code></a>
Add changeset</li>
<li><a
href="1b76bbe059"><code>1b76bbe</code></a>
Remove init method</li>
<li><a
href="bfa11701e7"><code>bfa1170</code></a>
[skip ci]Bump package manually</li>
<li><a
href="83f08213fb"><code>83f0821</code></a>
Skip on package.json change - it's just version bump</li>
<li><a
href="f14a2a5f6f"><code>f14a2a5</code></a>
Add to_dict() method in python SDK</li>
<li><a
href="daba22b9c7"><code>daba22b</code></a>
[skip ci] Release new versions</li>
<li><a
href="f8493025c2"><code>f849302</code></a>
Merge pull request <a
href="https://redirect.github.com/e2b-dev/code-interpreter/issues/53">#53</a>
from e2b-dev/update-r-to-44-e2b-1449</li>
<li><a
href="31aabd355a"><code>31aabd3</code></a>
Update Dockerfile</li>
<li><a
href="33ff495b17"><code>33ff495</code></a>
Add changeset</li>
<li>Additional commits viewable in <a
href="https://github.com/e2b-dev/code-interpreter/compare/@e2b/code-interpreter@1.0.4...@e2b/code-interpreter-python@1.0.5">compare
view</a></li>
</ul>
</details>
<br />

Updates `fastapi` from 0.115.7 to 0.115.8
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/fastapi/fastapi/releases">fastapi's
releases</a>.</em></p>
<blockquote>
<h2>0.115.8</h2>
<h3>Fixes</h3>
<ul>
<li>🐛 Fix <code>OAuth2PasswordRequestForm</code> and
<code>OAuth2PasswordRequestFormStrict</code> fixed
<code>grant_type</code> &quot;password&quot; RegEx. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/9783">#9783</a>
by <a
href="https://github.com/skarfie123"><code>@​skarfie123</code></a>.</li>
</ul>
<h3>Refactors</h3>
<ul>
<li> Simplify tests for body_multiple_params . PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13237">#13237</a>
by <a
href="https://github.com/alejsdev"><code>@​alejsdev</code></a>.</li>
<li>♻️ Move duplicated code portion to a static method in the
<code>APIKeyBase</code> super class. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/3142">#3142</a>
by <a
href="https://github.com/ShahriyarR"><code>@​ShahriyarR</code></a>.</li>
<li> Simplify tests for request_files. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13182">#13182</a>
by <a
href="https://github.com/alejsdev"><code>@​alejsdev</code></a>.</li>
</ul>
<h3>Docs</h3>
<ul>
<li>📝 Change the word &quot;unwrap&quot; to &quot;unpack&quot; in
<code>docs/en/docs/tutorial/extra-models.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13061">#13061</a>
by <a
href="https://github.com/timothy-jeong"><code>@​timothy-jeong</code></a>.</li>
<li>📝 Update Request Body's <code>tutorial002</code> to deal with
<code>tax=0</code> case. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13230">#13230</a>
by <a href="https://github.com/togogh"><code>@​togogh</code></a>.</li>
<li>👥 Update FastAPI People - Experts. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13269">#13269</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
</ul>
<h3>Translations</h3>
<ul>
<li>🌐 Add Japanese translation for
<code>docs/ja/docs/environment-variables.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13226">#13226</a>
by <a
href="https://github.com/k94-ishi"><code>@​k94-ishi</code></a>.</li>
<li>🌐 Add Russian translation for
<code>docs/ru/docs/advanced/async-tests.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13227">#13227</a>
by <a
href="https://github.com/Rishat-F"><code>@​Rishat-F</code></a>.</li>
<li>🌐 Update Russian translation for
<code>docs/ru/docs/tutorial/dependencies/dependencies-in-path-operation-decorators.md</code>.
PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13252">#13252</a>
by <a
href="https://github.com/Rishat-F"><code>@​Rishat-F</code></a>.</li>
<li>🌐 Add Russian translation for
<code>docs/ru/docs/tutorial/bigger-applications.md</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13154">#13154</a>
by <a href="https://github.com/alv2017"><code>@​alv2017</code></a>.</li>
</ul>
<h3>Internal</h3>
<ul>
<li>⬆️ Add support for Python 3.13. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13274">#13274</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li>⬆️ Upgrade AnyIO max version for tests, new range:
<code>&gt;=3.2.1,&lt;5.0.0</code>. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13273">#13273</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li>🔧 Update Sponsors badges. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13271">#13271</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li>♻️ Fix <code>notify_translations.py</code> empty env var handling
for PR label events vs workflow_dispatch. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13272">#13272</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li>♻️ Refactor and move <code>scripts/notify_translations.py</code>, no
need for a custom GitHub Action. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13270">#13270</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li>🔨 Update FastAPI People Experts script, refactor and optimize data
fetching to handle rate limits. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13267">#13267</a>
by <a
href="https://github.com/tiangolo"><code>@​tiangolo</code></a>.</li>
<li>⬆ Bump pypa/gh-action-pypi-publish from 1.12.3 to 1.12.4. PR <a
href="https://redirect.github.com/fastapi/fastapi/pull/13251">#13251</a>
by <a
href="https://github.com/apps/dependabot"><code>@​dependabot[bot]</code></a>.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7128971f1d"><code>7128971</code></a>
🔖 Release version 0.115.8</li>
<li><a
href="55f8a446c7"><code>55f8a44</code></a>
📝 Update release notes</li>
<li><a
href="83ab6ac957"><code>83ab6ac</code></a>
📝 Change the word &quot;unwrap&quot; to &quot;unpack&quot; in
`docs/en/docs/tutorial/extra-models...</li>
<li><a
href="3d02a920ab"><code>3d02a92</code></a>
📝 Update release notes</li>
<li><a
href="1b00f8ae78"><code>1b00f8a</code></a>
 Simplify tests for body_multiple_params (<a
href="https://redirect.github.com/fastapi/fastapi/issues/13237">#13237</a>)</li>
<li><a
href="d97647fd57"><code>d97647f</code></a>
📝 Update release notes</li>
<li><a
href="9667ce87a9"><code>9667ce8</code></a>
📝 Update Request Body's <code>tutorial002</code> to deal with
<code>tax=0</code> case (<a
href="https://redirect.github.com/fastapi/fastapi/issues/13230">#13230</a>)</li>
<li><a
href="0541693bc7"><code>0541693</code></a>
📝 Update release notes</li>
<li><a
href="041b2e1c46"><code>041b2e1</code></a>
📝 Update release notes</li>
<li><a
href="30b270be9a"><code>30b270b</code></a>
♻️ Move duplicated code portion to a static method in the
<code>APIKeyBase</code> super ...</li>
<li>Additional commits viewable in <a
href="https://github.com/fastapi/fastapi/compare/0.115.7...0.115.8">compare
view</a></li>
</ul>
</details>
<br />

Updates `groq` from 0.15.0 to 0.18.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/groq/groq-python/releases">groq's
releases</a>.</em></p>
<blockquote>
<h2>v0.18.0</h2>
<h2>0.18.0 (2025-02-05)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.17.0...v0.18.0">v0.17.0...v0.18.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> Add batch API (<a
href="https://redirect.github.com/groq/groq-python/issues/191">#191</a>)
(<a
href="367a744f46">367a744</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> bummp ruff dependency (<a
href="https://redirect.github.com/groq/groq-python/issues/190">#190</a>)
(<a
href="61678fc5fd">61678fc</a>)</li>
<li><strong>internal:</strong> change default timeout to an int (<a
href="https://redirect.github.com/groq/groq-python/issues/188">#188</a>)
(<a
href="348e152671">348e152</a>)</li>
</ul>
<h2>v0.17.0</h2>
<h2>0.17.0 (2025-02-03)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.16.0...v0.17.0">v0.16.0...v0.17.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/185">#185</a>)
(<a
href="e2373395cf">e237339</a>)</li>
</ul>
<h2>v0.16.0</h2>
<h2>0.16.0 (2025-01-29)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.15.0...v0.16.0">v0.15.0...v0.16.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/183">#183</a>)
(<a
href="a5cdbc5af7">a5cdbc5</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/177">#177</a>)
(<a
href="01e63041c8">01e6304</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/180">#180</a>)
(<a
href="5c8db1a979">5c8db1a</a>)</li>
<li><strong>internal:</strong> minor formatting changes (<a
href="https://redirect.github.com/groq/groq-python/issues/182">#182</a>)
(<a
href="2c4e409fe0">2c4e409</a>)</li>
<li><strong>internal:</strong> minor style changes (<a
href="https://redirect.github.com/groq/groq-python/issues/181">#181</a>)
(<a
href="77c752ab1a">77c752a</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li><strong>raw responses:</strong> fix duplicate <code>the</code> (<a
href="https://redirect.github.com/groq/groq-python/issues/179">#179</a>)
(<a
href="a28cbd863d">a28cbd8</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/groq/groq-python/blob/main/CHANGELOG.md">groq's
changelog</a>.</em></p>
<blockquote>
<h2>0.18.0 (2025-02-05)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.17.0...v0.18.0">v0.17.0...v0.18.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> Add batch API (<a
href="https://redirect.github.com/groq/groq-python/issues/191">#191</a>)
(<a
href="367a744f46">367a744</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> bummp ruff dependency (<a
href="https://redirect.github.com/groq/groq-python/issues/190">#190</a>)
(<a
href="61678fc5fd">61678fc</a>)</li>
<li><strong>internal:</strong> change default timeout to an int (<a
href="https://redirect.github.com/groq/groq-python/issues/188">#188</a>)
(<a
href="348e152671">348e152</a>)</li>
</ul>
<h2>0.17.0 (2025-02-03)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.16.0...v0.17.0">v0.16.0...v0.17.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/185">#185</a>)
(<a
href="e2373395cf">e237339</a>)</li>
</ul>
<h2>0.16.0 (2025-01-29)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.15.0...v0.16.0">v0.15.0...v0.16.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/183">#183</a>)
(<a
href="a5cdbc5af7">a5cdbc5</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/177">#177</a>)
(<a
href="01e63041c8">01e6304</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/180">#180</a>)
(<a
href="5c8db1a979">5c8db1a</a>)</li>
<li><strong>internal:</strong> minor formatting changes (<a
href="https://redirect.github.com/groq/groq-python/issues/182">#182</a>)
(<a
href="2c4e409fe0">2c4e409</a>)</li>
<li><strong>internal:</strong> minor style changes (<a
href="https://redirect.github.com/groq/groq-python/issues/181">#181</a>)
(<a
href="77c752ab1a">77c752a</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li><strong>raw responses:</strong> fix duplicate <code>the</code> (<a
href="https://redirect.github.com/groq/groq-python/issues/179">#179</a>)
(<a
href="a28cbd863d">a28cbd8</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b74ce9e301"><code>b74ce9e</code></a>
release: 0.18.0 (<a
href="https://redirect.github.com/groq/groq-python/issues/189">#189</a>)</li>
<li><a
href="3cee54eece"><code>3cee54e</code></a>
release: 0.17.0 (<a
href="https://redirect.github.com/groq/groq-python/issues/186">#186</a>)</li>
<li><a
href="10566c3847"><code>10566c3</code></a>
release: 0.16.0 (<a
href="https://redirect.github.com/groq/groq-python/issues/178">#178</a>)</li>
<li>See full diff in <a
href="https://github.com/groq/groq-python/compare/v0.15.0...v0.18.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `openai` from 1.60.2 to 1.61.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/openai/openai-python/releases">openai's
releases</a>.</em></p>
<blockquote>
<h2>v1.61.1</h2>
<h2>1.61.1 (2025-02-05)</h2>
<p>Full Changelog: <a
href="https://github.com/openai/openai-python/compare/v1.61.0...v1.61.1">v1.61.0...v1.61.1</a></p>
<h3>Bug Fixes</h3>
<ul>
<li><strong>api/types:</strong> correct audio duration &amp; role types
(<a
href="https://redirect.github.com/openai/openai-python/issues/2091">#2091</a>)
(<a
href="afcea4891f">afcea48</a>)</li>
<li><strong>cli/chat:</strong> only send params when set (<a
href="https://redirect.github.com/openai/openai-python/issues/2077">#2077</a>)
(<a
href="688b223d9a">688b223</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> bummp ruff dependency (<a
href="https://redirect.github.com/openai/openai-python/issues/2080">#2080</a>)
(<a
href="b7a80b1994">b7a80b1</a>)</li>
<li><strong>internal:</strong> change default timeout to an int (<a
href="https://redirect.github.com/openai/openai-python/issues/2079">#2079</a>)
(<a
href="d3df1c6ca0">d3df1c6</a>)</li>
</ul>
<h2>v1.61.0</h2>
<h2>1.61.0 (2025-01-31)</h2>
<p>Full Changelog: <a
href="https://github.com/openai/openai-python/compare/v1.60.2...v1.61.0">v1.60.2...v1.61.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> add o3-mini (<a
href="https://redirect.github.com/openai/openai-python/issues/2067">#2067</a>)
(<a
href="12b87a4a1e">12b87a4</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>types:</strong> correct metadata type + other fixes (<a
href="12b87a4a1e">12b87a4</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>helpers:</strong> section links (<a
href="ef8d3cce40">ef8d3cc</a>)</li>
<li><strong>types:</strong> fix Metadata types (<a
href="82d3156e74">82d3156</a>)</li>
<li>update api.md (<a
href="https://redirect.github.com/openai/openai-python/issues/2063">#2063</a>)
(<a
href="21964f00fb">21964f0</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li><strong>readme:</strong> current section links (<a
href="https://redirect.github.com/openai/openai-python/issues/2055">#2055</a>)
(<a
href="ef8d3cce40">ef8d3cc</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/openai/openai-python/blob/main/CHANGELOG.md">openai's
changelog</a>.</em></p>
<blockquote>
<h2>1.61.1 (2025-02-05)</h2>
<p>Full Changelog: <a
href="https://github.com/openai/openai-python/compare/v1.61.0...v1.61.1">v1.61.0...v1.61.1</a></p>
<h3>Bug Fixes</h3>
<ul>
<li><strong>api/types:</strong> correct audio duration &amp; role types
(<a
href="https://redirect.github.com/openai/openai-python/issues/2091">#2091</a>)
(<a
href="afcea4891f">afcea48</a>)</li>
<li><strong>cli/chat:</strong> only send params when set (<a
href="https://redirect.github.com/openai/openai-python/issues/2077">#2077</a>)
(<a
href="688b223d9a">688b223</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> bummp ruff dependency (<a
href="https://redirect.github.com/openai/openai-python/issues/2080">#2080</a>)
(<a
href="b7a80b1994">b7a80b1</a>)</li>
<li><strong>internal:</strong> change default timeout to an int (<a
href="https://redirect.github.com/openai/openai-python/issues/2079">#2079</a>)
(<a
href="d3df1c6ca0">d3df1c6</a>)</li>
</ul>
<h2>1.61.0 (2025-01-31)</h2>
<p>Full Changelog: <a
href="https://github.com/openai/openai-python/compare/v1.60.2...v1.61.0">v1.60.2...v1.61.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> add o3-mini (<a
href="https://redirect.github.com/openai/openai-python/issues/2067">#2067</a>)
(<a
href="12b87a4a1e">12b87a4</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>types:</strong> correct metadata type + other fixes (<a
href="12b87a4a1e">12b87a4</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>helpers:</strong> section links (<a
href="ef8d3cce40">ef8d3cc</a>)</li>
<li><strong>types:</strong> fix Metadata types (<a
href="82d3156e74">82d3156</a>)</li>
<li>update api.md (<a
href="https://redirect.github.com/openai/openai-python/issues/2063">#2063</a>)
(<a
href="21964f00fb">21964f0</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li><strong>readme:</strong> current section links (<a
href="https://redirect.github.com/openai/openai-python/issues/2055">#2055</a>)
(<a
href="ef8d3cce40">ef8d3cc</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7193688e36"><code>7193688</code></a>
release: 1.61.1</li>
<li><a
href="f344db250a"><code>f344db2</code></a>
fix(api/types): correct audio duration &amp; role types (<a
href="https://redirect.github.com/openai/openai-python/issues/2091">#2091</a>)</li>
<li><a
href="6afde0dc85"><code>6afde0d</code></a>
chore(internal): bummp ruff dependency (<a
href="https://redirect.github.com/openai/openai-python/issues/2080">#2080</a>)</li>
<li><a
href="5a1a412b77"><code>5a1a412</code></a>
chore(internal): change default timeout to an int (<a
href="https://redirect.github.com/openai/openai-python/issues/2079">#2079</a>)</li>
<li><a
href="c27e8cc997"><code>c27e8cc</code></a>
fix(cli/chat): only send params when set (<a
href="https://redirect.github.com/openai/openai-python/issues/2077">#2077</a>)</li>
<li><a
href="7a6517d81e"><code>7a6517d</code></a>
release: 1.61.0</li>
<li><a
href="b56b357e60"><code>b56b357</code></a>
chore(types): fix Metadata types</li>
<li><a
href="fdd52476b5"><code>fdd5247</code></a>
feat(api): add o3-mini (<a
href="https://redirect.github.com/openai/openai-python/issues/2067">#2067</a>)</li>
<li><a
href="a99096823a"><code>a990968</code></a>
Revert &quot;fix(parsing): don't validate input tools in the
asynchronous `.parse(...</li>
<li><a
href="d779e40bc9"><code>d779e40</code></a>
chore: update api.md (<a
href="https://redirect.github.com/openai/openai-python/issues/2063">#2063</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/openai/openai-python/compare/v1.60.2...v1.61.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `supabase` from 2.12.0 to 2.13.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-py/releases">supabase's
releases</a>.</em></p>
<blockquote>
<h2>v2.13.0</h2>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.12.0...v2.13.0">2.13.0</a>
(2025-02-04)</h2>
<h3>Features</h3>
<ul>
<li><strong>realtime:</strong> bump realtime from 2.2.0 to 2.3.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1049">#1049</a>)
(<a
href="2347401770">2347401</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>auth:</strong> bump gotrue from 2.11.2 to 2.11.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1051">#1051</a>)
(<a
href="4a2bb9e73e">4a2bb9e</a>)</li>
<li><strong>functions:</strong> bump supafunc from 0.9.2 to 0.9.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1052">#1052</a>)
(<a
href="29fed38015">29fed38</a>)</li>
<li><strong>storage:</strong> bump storage3 from 0.11.1 to 0.11.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1050">#1050</a>)
(<a
href="8c5d48f51f">8c5d48f</a>)</li>
<li>update SupabaseAuthClient to use super instead of calling base class
(<a
href="https://redirect.github.com/supabase/supabase-py/issues/1045">#1045</a>)
(<a
href="3efb4a678b">3efb4a6</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-py/blob/main/CHANGELOG.md">supabase's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.12.0...v2.13.0">2.13.0</a>
(2025-02-04)</h2>
<h3>Features</h3>
<ul>
<li><strong>realtime:</strong> bump realtime from 2.2.0 to 2.3.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1049">#1049</a>)
(<a
href="2347401770">2347401</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>auth:</strong> bump gotrue from 2.11.2 to 2.11.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1051">#1051</a>)
(<a
href="4a2bb9e73e">4a2bb9e</a>)</li>
<li><strong>functions:</strong> bump supafunc from 0.9.2 to 0.9.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1052">#1052</a>)
(<a
href="29fed38015">29fed38</a>)</li>
<li><strong>storage:</strong> bump storage3 from 0.11.1 to 0.11.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1050">#1050</a>)
(<a
href="8c5d48f51f">8c5d48f</a>)</li>
<li>update SupabaseAuthClient to use super instead of calling base class
(<a
href="https://redirect.github.com/supabase/supabase-py/issues/1045">#1045</a>)
(<a
href="3efb4a678b">3efb4a6</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="c70802e55c"><code>c70802e</code></a>
chore(main): release 2.13.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1046">#1046</a>)</li>
<li><a
href="614cacc6a9"><code>614cacc</code></a>
chore(tests): increase coverage (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1053">#1053</a>)</li>
<li><a
href="29fed38015"><code>29fed38</code></a>
fix(functions): bump supafunc from 0.9.2 to 0.9.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1052">#1052</a>)</li>
<li><a
href="4a2bb9e73e"><code>4a2bb9e</code></a>
fix(auth): bump gotrue from 2.11.2 to 2.11.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1051">#1051</a>)</li>
<li><a
href="8c5d48f51f"><code>8c5d48f</code></a>
fix(storage): bump storage3 from 0.11.1 to 0.11.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1050">#1050</a>)</li>
<li><a
href="2347401770"><code>2347401</code></a>
feat(realtime): bump realtime from 2.2.0 to 2.3.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1049">#1049</a>)</li>
<li><a
href="4237977463"><code>4237977</code></a>
chore(deps-dev): bump black from 24.10.0 to 25.1.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1044">#1044</a>)</li>
<li><a
href="489043f03a"><code>489043f</code></a>
chore(deps-dev): bump pytest-asyncio from 0.25.2 to 0.25.3 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1043">#1043</a>)</li>
<li><a
href="b007da45c6"><code>b007da4</code></a>
chore(deps-dev): bump commitizen from 4.1.0 to 4.1.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1042">#1042</a>)</li>
<li><a
href="9a8713125e"><code>9a87131</code></a>
chore(ci): pipeline using same version of python for all tests (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1047">#1047</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/supabase/supabase-py/compare/v2.12.0...v2.13.0">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-02-05 21:52:03 +00:00
dependabot[bot]
c098a8d659 chore(frontend/deps): bump framer-motion from 11.16.0 to 12.0.11 in /autogpt_platform/frontend (#9403)
Bumps [framer-motion](https://github.com/motiondivision/motion) from
11.16.0 to 12.0.11.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/motiondivision/motion/blob/main/CHANGELOG.md">framer-motion's
changelog</a>.</em></p>
<blockquote>
<h2>[12.0.11] 2025-02-03</h2>
<h3>Fixed</h3>
<ul>
<li>Moving <code>updateSVGDimensions</code> to its own file to help with
tree-shaking.</li>
</ul>
<h2>[12.0.10] 2025-02-03</h2>
<h3>Fixed</h3>
<ul>
<li>Providing <code>MotionValue</code> to <code>motion</code> component
from <code>motion/react-client</code> entrypoint.</li>
</ul>
<h2>[12.0.9] 2025-02-03</h2>
<h3>Fixed</h3>
<ul>
<li>Removing React from bundle.</li>
</ul>
<h2>[12.0.8] 2025-02-03</h2>
<h3>Fixed</h3>
<ul>
<li>Infer type of <code>children</code> prop for
<code>motion.create</code>.</li>
</ul>
<h2>[12.0.7] 2025-01-28</h2>
<h3>Fixed</h3>
<ul>
<li>Fixed SVG transform animations via <code>animate</code>.</li>
</ul>
<h2>[12.0.6] 2025-01-27</h2>
<h3>Fixed</h3>
<ul>
<li>Discard layout projection snapshots if 0x0.</li>
</ul>
<h2>[12.0.5] 2025-01-24</h2>
<h3>Fixed</h3>
<ul>
<li>Fix scale correction for CSS variables.</li>
</ul>
<h2>[12.0.4] 2025-01-24</h2>
<h3>Fixed</h3>
<ul>
<li>Add scale correction for CSS variables.</li>
</ul>
<h2>[12.0.3] 2025-01-23</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="ac626f5287"><code>ac626f5</code></a>
v12.0.11</li>
<li><a
href="f0172c942b"><code>f0172c9</code></a>
Latest</li>
<li><a
href="4855269501"><code>4855269</code></a>
Fixing type imports</li>
<li><a
href="553429c4c9"><code>553429c</code></a>
Latest</li>
<li><a
href="1b2e573055"><code>1b2e573</code></a>
v12.0.10</li>
<li><a
href="cfcfe30984"><code>cfcfe30</code></a>
Latest</li>
<li><a
href="97f7cb26fc"><code>97f7cb2</code></a>
Latest</li>
<li><a
href="cbeafb2aed"><code>cbeafb2</code></a>
v12.0.9</li>
<li><a
href="e70c1ba012"><code>e70c1ba</code></a>
Latest</li>
<li><a
href="90f083ed7b"><code>90f083e</code></a>
Latest (<a
href="https://redirect.github.com/motiondivision/motion/issues/3043">#3043</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/motiondivision/motion/compare/v11.16.0...v12.0.11">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=framer-motion&package-manager=npm_and_yarn&previous-version=11.16.0&new-version=12.0.11)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-05 21:47:28 +00:00
Zamil Majdy
1d30e401fe fix(backend): Charge user credits before its block execution (#9427)
### Changes 🏗️

Instead of letting the user execute the block and only charging (or failing) it
post-execution, we can charge the user first and execute the block afterward,
as sketched below the trade-off list.
The trade-offs:
* We can't charge for blocks whose cost is based on execution time.
* We will also charge for failed block executions.
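
A minimal sketch of the charge-first ordering, assuming a hypothetical
`credit_store.try_charge` helper (none of these names come from the actual backend):

```python
class InsufficientCredits(Exception):
    pass


def execute_block(user_id: str, block, inputs: dict, credit_store):
    # Charge the static block cost up front; time-based costs can no longer
    # be charged this way (first trade-off above).
    cost = block.static_cost
    if not credit_store.try_charge(user_id, cost):
        raise InsufficientCredits(f"{cost} credits required to run {block.name}")
    # The charge is kept even if the block raises (second trade-off above).
    return block.run(inputs)
```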

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-05 15:39:41 +00:00
Zamil Majdy
22536de71f feat(backend): Avoid multiple auto-top-ups within the same execution (#9426)
### Changes 🏗️

This PR makes auto-top-up more reliable (see the sketch after this list):
* Auto top-up will not run more than once within a single execution.
* Auto top-up will still run even if a previous attempt failed.
* Auto top-up will never be called twice or triggered when the balance
is above the threshold, even with concurrent block executions.
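
A rough sketch of the intended behaviour, using an in-process lock and a
per-execution marker; the names and storage are illustrative, not the backend's
actual implementation:

```python
import threading

_top_up_lock = threading.Lock()
_topped_up_executions: set[str] = set()  # executions that already attempted a top-up


def maybe_auto_top_up(user_id: str, execution_id: str, threshold: int,
                      amount: int, credit_store) -> None:
    with _top_up_lock:  # serializes concurrent block executions of the same run
        if execution_id in _topped_up_executions:
            return  # at most one top-up per execution
        if credit_store.get_balance(user_id) >= threshold:
            return  # only top up when the balance is below the threshold
        _topped_up_executions.add(execution_id)
    credit_store.top_up(user_id, amount)  # a failure here does not block future executions
```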

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-05 15:39:23 +00:00
Zamil Majdy
0915879049 feat(backend): Allow promo coupon for credits top-up 2025-02-05 18:28:01 +07:00
Zamil Majdy
243122311c feat(frontend): Use USD value instead of cents credit value (#9423)
The use of credit points (equivalent to cents) can cause confusion, so this PR
removes them from the UI and shows USD values instead (a small formatting
sketch follows the screenshots below).

### Changes 🏗️

Show USD values on:
* credit page
* block cost
* balance button in the navbar

<img width="1440" alt="image"
src="https://github.com/user-attachments/assets/2b6a18b0-f89d-48bf-84bd-0ea6aa9182f7"
/>

<img width="1440" alt="image"
src="https://github.com/user-attachments/assets/7545b496-0e7c-49ea-afc1-a3d1fd3289fb"
/>
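
A small formatting sketch of the cents-to-USD conversion described above
(illustrative only; the actual UI code lives in the frontend):

```python
def format_credits_as_usd(credits: int) -> str:
    """One credit is treated as one US cent, per the description above."""
    return f"${credits / 100:,.2f}"


assert format_credits_as_usd(500) == "$5.00"
assert format_credits_as_usd(123456) == "$1,234.56"
```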



### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-05 10:40:57 +00:00
Krzysztof Czerwinski
533d120e98 feat(backend): Update llm models (#9390)
Some models are missing, and existing model metadata is incorrect.

### Changes 🏗️

- Add models, update metadata and pricing.
- Add `max_output_tokens` to metadata; `None` indicates that the max output
tokens are unspecified in the provider docs
- Added models:
  - OpenAI `o3-mini` and `o1`
  - Anthropic `claude-3-5-haiku-latest`
  - Groq `llama-3.3-70b-versatile`, `deepseek-r1-distill-llama-70b`
  - Ollama `llama3.3`
- For OpenRouter models, use the lowest max output tokens among the providers
that serve the model (so it works for all of them)
- Use `max_output_tokens`, falling back to 4096 when it is `None` (see the
sketch after this list)
- Removed models (no longer working):
  - `gemma-7b-it`
  - `llama-3.1-70b-versatile`
  - `llama-3.1-405b-reasoning`
- Rename the LLM enum member from `GEMINI_FLASH_1_5_8B` to `GEMINI_FLASH_1_5`
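
A hedged sketch of the `max_output_tokens` fallback described above; the
dataclass, token values, and model entries are illustrative, not the backend's
actual definitions:

```python
from dataclasses import dataclass
from typing import Optional

DEFAULT_MAX_OUTPUT_TOKENS = 4096  # fallback when the provider docs are silent


@dataclass
class LlmModelMeta:
    provider: str
    max_output_tokens: Optional[int]  # None = unspecified in provider docs


MODELS = {
    "o3-mini": LlmModelMeta("openai", 100_000),              # illustrative value
    "claude-3-5-haiku-latest": LlmModelMeta("anthropic", 8_192),
    "llama3.3": LlmModelMeta("ollama", None),
}


def effective_max_output_tokens(model_name: str) -> int:
    meta = MODELS[model_name]
    return meta.max_output_tokens or DEFAULT_MAX_OUTPUT_TOKENS
```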

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

---------

Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2025-02-05 09:53:34 +00:00
Zamil Majdy
58cadeb3b9 feat(frontend): Fix wordings for auto top-up feature (#9419)
### Changes 🏗️

Make the auto top-up wording clearer:

<img width="547" alt="Screenshot 2025-02-05 at 1 38 16 AM"
src="https://github.com/user-attachments/assets/9a902442-1815-4a38-af39-d7d4b0e120f0"
/>

<img width="555" alt="Screenshot 2025-02-05 at 1 40 29 AM"
src="https://github.com/user-attachments/assets/4c9c9cdc-2d73-45f2-8a8d-f7ae7f97541b"
/>

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-05 10:27:10 +07:00
Zamil Majdy
9151211d2a fix(backend): Set the minimum auto top-up amount to 500 credits (#9418)
### Changes 🏗️

* Set the minimum auto top-up amount to 500 credits (a validation sketch
follows this list).
* Add a description of the minimum and its dollar value to the input box.
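
A validation sketch under the assumptions that amounts are expressed in credits
and that 0 means "disabled"; the real endpoint and error handling differ:

```python
MIN_AUTO_TOP_UP_CREDITS = 500  # 500 credits == $5.00


def validate_auto_top_up_amount(amount: int) -> int:
    # Assumption: 0 disables auto top-up and is therefore allowed.
    if amount != 0 and amount < MIN_AUTO_TOP_UP_CREDITS:
        raise ValueError(
            f"Auto top-up amount must be at least {MIN_AUTO_TOP_UP_CREDITS} credits "
            f"(${MIN_AUTO_TOP_UP_CREDITS / 100:.2f})."
        )
    return amount
```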

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-04 17:01:33 +00:00
Zamil Majdy
8e68e20fef fix(backend): Fix return URL of billing portal when platform_base_url != frontend_base_url (#9417)
### Changes 🏗️

Fix the return URL of the billing portal when `platform_base_url` !=
`frontend_base_url`.
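
A hedged sketch of the fix's intent, assuming the return URL should be built
from the frontend base URL when it differs from the platform (API) base URL;
the setting names and the `/credits` path are hypothetical:

```python
def billing_portal_return_url(frontend_base_url: str | None, platform_base_url: str) -> str:
    # Prefer the frontend base URL so the user lands back on the UI,
    # even when the API is served from a different host.
    base = frontend_base_url or platform_base_url
    return f"{base.rstrip('/')}/credits"  # hypothetical return path
```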

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-04 16:15:29 +00:00
Nicholas Tindle
a7d545cd5d Merge branch 'master' into dev 2025-02-04 08:45:34 -06:00
Nicholas Tindle
0fbabe690a Merge branch 'master' into dev 2025-02-04 08:43:52 -06:00
Zamil Majdy
cdd2d5696c fix(backend): Fix doubly reported produced output (#9412)
https://github.com/Significant-Gravitas/AutoGPT/pull/9340/files#diff-1b278ebf10a9da0fb5030010222b3a6df2b05a5463cad428cd6c38a1541b0f73R210-R219
introduced a bug where `spend_credit` and `update_execution` are called
inside the loop instead of at the end of the execution.

### Changes 🏗️

De-indented the `spend_credit` and `update_execution` calls so they run
outside the loop, as in the sketch below.
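
A simplified before/after sketch of the indentation fix; function and attribute
names are illustrative, not the executor's real code:

```python
# Before (buggy): charged and reported once per produced output.
def run_node_buggy(node, credit, execution):
    for output in node.execute():
        yield output
        credit.spend_credit(node)          # runs N times
        execution.update_execution(node)   # output reported N times


# After (fixed): charge and report once, after the loop has finished.
def run_node_fixed(node, credit, execution):
    for output in node.execute():
        yield output
    credit.spend_credit(node)
    execution.update_execution(node)
```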

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-04 14:42:19 +00:00
Zamil Majdy
f44453be6e fix(backend): Fix transaction history listing on older transaction with no metadata 2025-02-03 16:33:18 +01:00
Nicholas Tindle
4302c5d60a feat: add screenshotone block (#9308)
I want to be able to screenshot things

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->
Adds ScreenshotOne blocks with the ability to capture screenshots of web pages
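
A rough sketch of the kind of request such a block makes, based on
ScreenshotOne's public `take` endpoint; treat the exact parameters as an
assumption and check the ScreenshotOne docs:

```python
import requests


def take_screenshot(access_key: str, target_url: str) -> bytes:
    response = requests.get(
        "https://api.screenshotone.com/take",  # assumed public endpoint
        params={"access_key": access_key, "url": target_url, "format": "png"},
        timeout=60,
    )
    response.raise_for_status()
    return response.content  # raw PNG bytes
```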

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Built agent with it

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2025-02-03 14:46:26 +00:00
Nicholas Tindle
53aea8908a fix(backend): fix missing agent object requirement (#9380) 2025-02-03 07:42:54 -06:00
Zamil Majdy
7b50e9bd77 fix(backend): Fix return url post top-up (#9392)
### Changes 🏗️

Fix return URL post-top-up, when FRONT_END_BASE_URL is used.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-03 12:31:33 +00:00
Zamil Majdy
3de982792e fix(backend): Fix broken top-up flow (#9391)
https://github.com/Significant-Gravitas/AutoGPT/pull/9296 caused these
errors:

<img width="440" alt="image"
src="https://github.com/user-attachments/assets/f100619b-1a4c-44fb-b961-e74210894a91"
/>

<img width="411" alt="image"
src="https://github.com/user-attachments/assets/0c1a8aff-b14f-4ea8-8ae9-b8928c8511cf"
/>



### Changes 🏗️

Removed customer email & return_url.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-03 10:58:48 +00:00
Krzysztof Czerwinski
74b8b45e0a fix(frontend): Fix typing for dict/dict[Any, Any] SchemaField (#9383)
The type system flags this entry as impossible, but the code path is
reachable, and without it some blocks are broken (e.g. `Step Through
Items`)

### Changes 🏗️

- Restore `case "object"` in `NodeGenericInputField` 
- Update `BlockIOKVSubSchema` type, so ts type checking doesn't complain

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-01 20:32:34 +00:00
Krzysztof Czerwinski
5bdd8c252e fix(frontend): Show feedback if user exists on sign up (#9389)
Currently, if a user tries to sign up with an email that is already
associated with an existing account, they are redirected to login without
any feedback.

### Changes 🏗️

- Show the feedback "User with this email already exists" on signup when the
user is already registered
- Otherwise, the beta user waitlist message is shown as before

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-02-01 15:15:17 +00:00
Muhammad Safi
8e33af6d99 Marketplace UI changes. (#9381)
<!-- Clearly explain the need for these changes: -->
- Consolidated PRs #8967 and #8968.
- Fixed issues #9370 and #9371.
- Fixed padding for StoreCard.

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Muhammad Safi <muhammadsafi@Muhammads-MacBook-Pro.local>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2025-01-31 09:45:36 +00:00
Muhammad Safi
f481de173d Fixed font for Heading Issue #8971 (#9358)
<!-- Clearly explain the need for these changes: -->
- Changed the font weight to semibold.
- Adjusted the size to 48px.
- Set the line-height to 54px.
### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Muhammad Safi <muhammadsafi@Muhammads-MacBook-Pro.local>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2025-01-30 20:53:30 +00:00
Ethan Lee
24306a16bd Update README.md (#9379)
<!-- Clearly explain the need for these changes: -->

### Changes 🏗️

Update the backend README to clarify where to run Docker.

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-30 18:04:58 +00:00
Abhimanyu Yadav
f5bf36cd97 feat(blocks) : Authentication for todoist block (#9319)
- resolves - #9303 

Implement OAuth authentication for Todoist with a basic authentication
mechanism for testing purposes. The basic authentication will be removed
in the next block PR.

> I’ve been using the official Python SDK for Todoist, called
todoist-api-python.
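
A minimal usage sketch of that SDK (assuming the v2 `todoist-api-python`
interface); in the platform the token would come from the OAuth credentials
added by this PR:

```python
from todoist_api_python.api import TodoistAPI


def list_task_contents(access_token: str) -> list[str]:
    api = TodoistAPI(access_token)
    tasks = api.get_tasks()  # active tasks for the authenticated user
    return [task.content for task in tasks]
```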
2025-01-30 16:09:00 +00:00
Zamil Majdy
1f83a8c61a feat(platform): Implement User Credit Transaction History (#9291)
<img width="1129" alt="image"
src="https://github.com/user-attachments/assets/94aab319-7755-413b-91e1-4ad1ba383b83"
/>


### Changes 🏗️

Add a transaction history table on the user credits page.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Krzysztof Czerwinski <kpczerwinski@gmail.com>
2025-01-30 21:15:04 +07:00
Abhimanyu Yadav
a44c9333d3 feat(block) : Todoist rest api blocks (#9369)
- Resolves - #9303 and #9304
- Depends on - https://github.com/Significant-Gravitas/AutoGPT/pull/9319

### Blocks list

Block Name | What It Does | Manually Tested
-- | -- | --
Todoist Create Label | Creates a new label in Todoist | 
Todoist List Labels | Retrieves all personal labels from Todoist | 
Todoist Get Label | Retrieves a specific label by ID | 
Todoist Create Task | Creates a new task in Todoist | 
Todoist Get Tasks | Retrieves active tasks from Todoist | 
Todoist Update Task | Updates an existing task | 
Todoist Close Task | Completes/closes a task | 
Todoist Reopen Task | Reopens a completed task | 
Todoist Delete Task | Permanently deletes a task | 
Todoist List Projects | Retrieves all projects from Todoist | 
Todoist Create Project | Creates a new project in Todoist | 
Todoist Get Project | Retrieves details for a specific project | 
Todoist Update Project | Updates an existing project | 
Todoist Delete Project | Deletes a project and its contents | 
Todoist List Collaborators | Retrieves collaborators on a project | 
Todoist List Sections | Retrieves sections from Todoist | 
Todoist Get Section | Retrieves details for a specific section | 
Todoist Delete Section | Deletes a section and its tasks | 
Todoist Create Comment | Creates a new comment on a task or project | 
Todoist Get Comments | Retrieves all comments for a task or project | 
Todoist Get Comment | Retrieves a specific comment by ID | 
Todoist Update Comment | Updates an existing comment | 
Todoist Delete Comment | Deletes a comment | 

> I’ve only created action blocks in Todoist because webhooks can only
be manually created [we can't do it programmatically right now]. I’ve
already emailed Todoist for help, but they haven’t replied yet. Once I
receive a reply, I’ll create a pull request for webhook triggers in
Todoist.
2025-01-30 13:33:29 +00:00
Krzysztof Czerwinski
bd27ce5f26 fix(frontend): Center reset password page (#9377)
### Changes 🏗️

- Add `div` to recenter page elements

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-30 12:30:59 +00:00
Pratim Sadhu
e82df96e56 Fix: Allow further zooming out in the builder (#9325) (#9368)
<!-- Clearly explain the need for these changes: -->
The current minimum zoom level restricts zooming out fully. I reduced
the minZoom level to allow more flexible zooming out for users.

### Changes 🏗️
Reduced minZoom from 0.2 to 0.1 in the ReactFlow component to allow
further zooming out.

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Tested with different elements and zoom ranges to confirm the
transition is smooth
- [x] Checked that the canvas and all nodes remain unchanged and
interactive

Co-authored-by: Bently <tomnoon9@gmail.com>
2025-01-30 09:51:56 +00:00
Nicholas Tindle
b03e3e47a2 feat(blocks): add text replace block (#9366)
<!-- Clearly explain the need for these changes: -->
A user in Discord requested a replace-text block

### Changes 🏗️
- Adds a replace-text block (a minimal sketch follows)
<!-- Concisely describe all of the changes made in this pull request:
-->
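
The core operation, as a plain-Python sketch (the platform's block wrapper and
schemas are omitted):

```python
def replace_text(text: str, old: str, new: str) -> str:
    """Replace every occurrence of `old` in `text` with `new`."""
    return text.replace(old, new)


assert replace_text("hello world", "world", "AutoGPT") == "hello AutoGPT"
```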

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Passed unit tests

Co-authored-by: Bently <tomnoon9@gmail.com>
2025-01-30 09:33:59 +00:00
dependabot[bot]
a4b962462c chore(backend/deps): bump the production-dependencies group across 1 directory with 9 updates (#9364)
Bumps the production-dependencies group with 9 updates in the
/autogpt_platform/backend directory:

| Package | From | To |
| --- | --- | --- |
| [anthropic](https://github.com/anthropics/anthropic-sdk-python) |
`0.40.0` | `0.45.2` |
|
[google-api-python-client](https://github.com/googleapis/google-api-python-client)
| `2.159.0` | `2.160.0` |
| [groq](https://github.com/groq/groq-python) | `0.13.1` | `0.15.0` |
| [openai](https://github.com/openai/openai-python) | `1.60.0` |
`1.60.2` |
| [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) |
`0.25.2` | `0.25.3` |
| [sentry-sdk](https://github.com/getsentry/sentry-python) | `2.19.2` |
`2.20.0` |
| [stripe](https://github.com/stripe/stripe-python) | `11.4.1` |
`11.5.0` |
| [supabase](https://github.com/supabase/supabase-py) | `2.11.0` |
`2.12.0` |
| [mem0ai](https://github.com/mem0ai/mem0) | `0.1.44` | `0.1.48` |


Updates `anthropic` from 0.40.0 to 0.45.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/anthropics/anthropic-sdk-python/releases">anthropic's
releases</a>.</em></p>
<blockquote>
<h2>v0.45.2</h2>
<h2>0.45.2 (2025-01-27)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.45.1...v0.45.2">v0.45.1...v0.45.2</a></p>
<h3>Bug Fixes</h3>
<ul>
<li><strong>streaming:</strong> avoid invalid deser type error (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/845">#845</a>)
(<a
href="72a2585680">72a2585</a>)</li>
</ul>
<h2>v0.45.1</h2>
<h2>0.45.1 (2025-01-27)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.45.0...v0.45.1">v0.45.0...v0.45.1</a></p>
<h3>Bug Fixes</h3>
<ul>
<li><strong>streaming:</strong> accumulate citations (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/844">#844</a>)
(<a
href="e665f2fefd">e665f2f</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>docs:</strong> updates (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/841">#841</a>)
(<a
href="fb10a7d658">fb10a7d</a>)</li>
</ul>
<h2>v0.45.0</h2>
<h2>0.45.0 (2025-01-23)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.44.0...v0.45.0">v0.44.0...v0.45.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> add citations (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/839">#839</a>)
(<a
href="2ec74b6ff1">2ec74b6</a>)</li>
<li><strong>client:</strong> support results endpoint (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/835">#835</a>)
(<a
href="5dd88bf2d2">5dd88bf</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> minor formatting changes (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/838">#838</a>)
(<a
href="31eb826deb">31eb826</a>)</li>
</ul>
<h2>v0.44.0</h2>
<h2>0.44.0 (2025-01-21)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.43.1...v0.44.0">v0.43.1...v0.44.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>streaming:</strong> add request_id getter (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/831">#831</a>)
(<a
href="fb397e0851">fb397e0</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/anthropics/anthropic-sdk-python/blob/main/CHANGELOG.md">anthropic's
changelog</a>.</em></p>
<blockquote>
<h2>0.45.2 (2025-01-27)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.45.1...v0.45.2">v0.45.1...v0.45.2</a></p>
<h3>Bug Fixes</h3>
<ul>
<li><strong>streaming:</strong> avoid invalid deser type error (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/845">#845</a>)
(<a
href="72a2585680">72a2585</a>)</li>
</ul>
<h2>0.45.1 (2025-01-27)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.45.0...v0.45.1">v0.45.0...v0.45.1</a></p>
<h3>Bug Fixes</h3>
<ul>
<li><strong>streaming:</strong> accumulate citations (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/844">#844</a>)
(<a
href="e665f2fefd">e665f2f</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>docs:</strong> updates (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/841">#841</a>)
(<a
href="fb10a7d658">fb10a7d</a>)</li>
</ul>
<h2>0.45.0 (2025-01-23)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.44.0...v0.45.0">v0.44.0...v0.45.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> add citations (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/839">#839</a>)
(<a
href="2ec74b6ff1">2ec74b6</a>)</li>
<li><strong>client:</strong> support results endpoint (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/835">#835</a>)
(<a
href="5dd88bf2d2">5dd88bf</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> minor formatting changes (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/838">#838</a>)
(<a
href="31eb826deb">31eb826</a>)</li>
</ul>
<h2>0.44.0 (2025-01-21)</h2>
<p>Full Changelog: <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.43.1...v0.44.0">v0.43.1...v0.44.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>streaming:</strong> add request_id getter (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/831">#831</a>)
(<a
href="fb397e0851">fb397e0</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>tests:</strong> make test_get_platform less flaky (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/830">#830</a>)
(<a
href="f2c10cae0c">f2c10ca</a>)</li>
</ul>
<h3>Chores</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="0f9ccca8e2"><code>0f9ccca</code></a>
release: 0.45.2</li>
<li><a
href="37acfbcda5"><code>37acfbc</code></a>
fix(streaming): avoid invalid deser type error (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/845">#845</a>)</li>
<li><a
href="d0f98a4e6b"><code>d0f98a4</code></a>
release: 0.45.1</li>
<li><a
href="872c614851"><code>872c614</code></a>
fix(streaming): accumulate citations (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/844">#844</a>)</li>
<li><a
href="14bf8fe88f"><code>14bf8fe</code></a>
chore(docs): updates (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/841">#841</a>)</li>
<li><a
href="c5102baffb"><code>c5102ba</code></a>
release: 0.45.0</li>
<li><a
href="67aa83e5d5"><code>67aa83e</code></a>
feat(api): add citations (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/839">#839</a>)</li>
<li><a
href="bb1f52bfba"><code>bb1f52b</code></a>
chore(internal): minor formatting changes (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/838">#838</a>)</li>
<li><a
href="4b9140284b"><code>4b91402</code></a>
feat(client): support results endpoint (<a
href="https://redirect.github.com/anthropics/anthropic-sdk-python/issues/835">#835</a>)</li>
<li><a
href="d212ec9f6d"><code>d212ec9</code></a>
release: 0.44.0</li>
<li>Additional commits viewable in <a
href="https://github.com/anthropics/anthropic-sdk-python/compare/v0.40.0...v0.45.2">compare
view</a></li>
</ul>
</details>
<br />
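
For orientation on the streaming changes called out above (the `request_id` getter added in 0.44.0 and the citation-accumulation fixes in 0.45.1/0.45.2), here is a minimal, hedged sketch of the SDK's streaming helper. The model name is a placeholder, and the `request_id` attribute access is an assumption based on the 0.44.0 release note rather than code from this repository.

```python
from anthropic import Anthropic

client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment

with client.messages.stream(
    model="claude-3-5-sonnet-latest",  # placeholder model name
    max_tokens=256,
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
) as stream:
    # Print text deltas as they arrive.
    for text in stream.text_stream:
        print(text, end="", flush=True)
    final_message = stream.get_final_message()
    print("\nstop reason:", final_message.stop_reason)
    # 0.44.0 adds a request_id getter on the stream; the attribute name is
    # assumed from the release note, so fall back to None if it differs.
    print("request id:", getattr(stream, "request_id", None))
```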

Updates `google-api-python-client` from 2.159.0 to 2.160.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/googleapis/google-api-python-client/releases">google-api-python-client's
releases</a>.</em></p>
<blockquote>
<h2>v2.160.0</h2>
<h2><a
href="https://github.com/googleapis/google-api-python-client/compare/v2.159.0...v2.160.0">2.160.0</a>
(2025-01-21)</h2>
<h3>Features</h3>
<ul>
<li><strong>accesscontextmanager:</strong> Update the api <a href="8b40ee6938">8b40ee6</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>adsenseplatform:</strong> Update the api <a href="04355c7c7c">04355c7</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>aiplatform:</strong> Update the api <a href="24228d4886">24228d4</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>analyticsadmin:</strong> Update the api <a href="dff2a84a82">dff2a84</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>analyticshub:</strong> Update the api <a href="0208b0b028">0208b0b</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>androidenterprise:</strong> Update the api <a href="b7865bd3ff">b7865bd</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>classroom:</strong> Update the api <a href="ef72b5f7f9">ef72b5f</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>cloudbuild:</strong> Update the api <a href="41e76d1b7e">41e76d1</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>compute:</strong> Update the api <a href="48c508dffa">48c508d</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>dialogflow:</strong> Update the api <a href="6c3ff85115">6c3ff85</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>discoveryengine:</strong> Update the api <a href="9afd49fbbf">9afd49f</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>displayvideo:</strong> Update the api <a href="63b01f3147">63b01f3</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>file:</strong> Update the api <a href="e7bf3e1cc7">e7bf3e1</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>gkehub:</strong> Update the api <a href="aa81a39a6f">aa81a39</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>integrations:</strong> Update the api <a href="da21dd8beb">da21dd8</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>merchantapi:</strong> Update the api <a href="c66e25ffad">c66e25f</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>migrationcenter:</strong> Update the api <a href="06c1759266">06c1759</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>monitoring:</strong> Update the api <a href="6d1dc83375">6d1dc83</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>netapp:</strong> Update the api <a href="628b723392">628b723</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>networkmanagement:</strong> Update the api <a href="45d70c19e9">45d70c1</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>redis:</strong> Update the api <a href="a866933256">a866933</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>spanner:</strong> Update the api <a href="9540ac50fc">9540ac5</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
<li><strong>websecurityscanner:</strong> Update the api <a href="5eae43739b">5eae437</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>secretmanager:</strong> Update the api <a href="477de50ccc">477de50</a> (<a href="165d3b5ad0">165d3b5</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="6ba5c06d12"><code>6ba5c06</code></a>
chore(main): release 2.160.0 (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2552">#2552</a>)</li>
<li><a
href="165d3b5ad0"><code>165d3b5</code></a>
chore: Update discovery artifacts (<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2551">#2551</a>)</li>
<li><a
href="f4b3014212"><code>f4b3014</code></a>
chore(python): exclude .github/workflows/unittest.yml in renovate config
(<a
href="https://redirect.github.com/googleapis/google-api-python-client/issues/2546">#2546</a>)</li>
<li>See full diff in <a
href="https://github.com/googleapis/google-api-python-client/compare/v2.159.0...v2.160.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `groq` from 0.13.1 to 0.15.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/groq/groq-python/releases">groq's
releases</a>.</em></p>
<blockquote>
<h2>v0.15.0</h2>
<h2>0.15.0 (2025-01-11)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.14.0...v0.15.0">v0.14.0...v0.15.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/175">#175</a>)
(<a
href="61cffbc78a">61cffbc</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>correctly handle deserialising <code>cls</code> fields (<a
href="https://redirect.github.com/groq/groq-python/issues/174">#174</a>)
(<a
href="0b2e997ce4">0b2e997</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/172">#172</a>)
(<a
href="d6ecadaa24">d6ecada</a>)</li>
</ul>
<h2>v0.14.0</h2>
<h2>0.14.0 (2025-01-09)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.13.1...v0.14.0">v0.13.1...v0.14.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/163">#163</a>)
(<a
href="43a7a5b048">43a7a5b</a>)</li>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/167">#167</a>)
(<a
href="5016206e46">5016206</a>)</li>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/170">#170</a>)
(<a
href="2b35e952e1">2b35e95</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>client:</strong> only call .close() when needed (<a
href="https://redirect.github.com/groq/groq-python/issues/169">#169</a>)
(<a
href="6a0ec576de">6a0ec57</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li>add missing isclass check (<a
href="https://redirect.github.com/groq/groq-python/issues/166">#166</a>)
(<a
href="9cb1e72737">9cb1e72</a>)</li>
<li><strong>internal:</strong> bump httpx dependency (<a
href="https://redirect.github.com/groq/groq-python/issues/168">#168</a>)
(<a
href="c260ae969c">c260ae9</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/158">#158</a>)
(<a
href="85b5765b2b">85b5765</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/160">#160</a>)
(<a
href="8b87c4d657">8b87c4d</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/164">#164</a>)
(<a
href="d7b6be5f4b">d7b6be5</a>)</li>
<li><strong>internal:</strong> fix some typos (<a
href="https://redirect.github.com/groq/groq-python/issues/162">#162</a>)
(<a
href="32482ae691">32482ae</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li><strong>readme:</strong> example snippet for client context manager
(<a
href="https://redirect.github.com/groq/groq-python/issues/161">#161</a>)
(<a
href="b7bfd15768">b7bfd15</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/groq/groq-python/blob/main/CHANGELOG.md">groq's
changelog</a>.</em></p>
<blockquote>
<h2>0.15.0 (2025-01-11)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.14.0...v0.15.0">v0.14.0...v0.15.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/175">#175</a>)
(<a
href="61cffbc78a">61cffbc</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>correctly handle deserialising <code>cls</code> fields (<a
href="https://redirect.github.com/groq/groq-python/issues/174">#174</a>)
(<a
href="0b2e997ce4">0b2e997</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/172">#172</a>)
(<a
href="d6ecadaa24">d6ecada</a>)</li>
</ul>
<h2>0.14.0 (2025-01-09)</h2>
<p>Full Changelog: <a
href="https://github.com/groq/groq-python/compare/v0.13.1...v0.14.0">v0.13.1...v0.14.0</a></p>
<h3>Features</h3>
<ul>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/163">#163</a>)
(<a
href="43a7a5b048">43a7a5b</a>)</li>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/167">#167</a>)
(<a
href="5016206e46">5016206</a>)</li>
<li><strong>api:</strong> api update (<a
href="https://redirect.github.com/groq/groq-python/issues/170">#170</a>)
(<a
href="2b35e952e1">2b35e95</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>client:</strong> only call .close() when needed (<a
href="https://redirect.github.com/groq/groq-python/issues/169">#169</a>)
(<a
href="6a0ec576de">6a0ec57</a>)</li>
</ul>
<h3>Chores</h3>
<ul>
<li>add missing isclass check (<a
href="https://redirect.github.com/groq/groq-python/issues/166">#166</a>)
(<a
href="9cb1e72737">9cb1e72</a>)</li>
<li><strong>internal:</strong> bump httpx dependency (<a
href="https://redirect.github.com/groq/groq-python/issues/168">#168</a>)
(<a
href="c260ae969c">c260ae9</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/158">#158</a>)
(<a
href="85b5765b2b">85b5765</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/160">#160</a>)
(<a
href="8b87c4d657">8b87c4d</a>)</li>
<li><strong>internal:</strong> codegen related update (<a
href="https://redirect.github.com/groq/groq-python/issues/164">#164</a>)
(<a
href="d7b6be5f4b">d7b6be5</a>)</li>
<li><strong>internal:</strong> fix some typos (<a
href="https://redirect.github.com/groq/groq-python/issues/162">#162</a>)
(<a
href="32482ae691">32482ae</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li><strong>readme:</strong> example snippet for client context manager
(<a
href="https://redirect.github.com/groq/groq-python/issues/161">#161</a>)
(<a
href="b7bfd15768">b7bfd15</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="2b91340106"><code>2b91340</code></a>
release: 0.15.0 (<a
href="https://redirect.github.com/groq/groq-python/issues/173">#173</a>)</li>
<li><a
href="f6e2c46aa4"><code>f6e2c46</code></a>
release: 0.14.0 (<a
href="https://redirect.github.com/groq/groq-python/issues/159">#159</a>)</li>
<li>See full diff in <a
href="https://github.com/groq/groq-python/compare/v0.13.1...v0.15.0">compare
view</a></li>
</ul>
</details>
<br />
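
The 0.14.0 notes above mention only calling `.close()` when needed and a README snippet for using the client as a context manager. Here is a minimal sketch of that usage pattern, assuming `GROQ_API_KEY` is set in the environment; the model name is a placeholder, not something taken from this repository.

```python
from groq import Groq

# Using the client as a context manager closes the underlying HTTP resources
# when the block exits, which pairs with the 0.14.0 "only call .close() when
# needed" fix.
with Groq() as client:  # reads GROQ_API_KEY from the environment
    completion = client.chat.completions.create(
        model="llama-3.3-70b-versatile",  # placeholder model name
        messages=[{"role": "user", "content": "Summarize HTTP keep-alive in one line."}],
    )
    print(completion.choices[0].message.content)
```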

Updates `openai` from 1.60.0 to 1.60.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/openai/openai-python/releases">openai's
releases</a>.</em></p>
<blockquote>
<h2>v1.60.2</h2>
<h2>1.60.2 (2025-01-27)</h2>
<p>Full Changelog: <a
href="https://github.com/openai/openai-python/compare/v1.60.1...v1.60.2">v1.60.1...v1.60.2</a></p>
<h3>Bug Fixes</h3>
<ul>
<li><strong>parsing:</strong> don't validate input tools in the
asynchronous <code>.parse()</code> method (<a
href="6fcfe73cd3">6fcfe73</a>)</li>
</ul>
<h2>v1.60.1</h2>
<h2>1.60.1 (2025-01-24)</h2>
<p>Full Changelog: <a
href="https://github.com/openai/openai-python/compare/v1.60.0...v1.60.1">v1.60.0...v1.60.1</a></p>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> minor formatting changes (<a
href="https://redirect.github.com/openai/openai-python/issues/2050">#2050</a>)
(<a
href="9c44192be5">9c44192</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li><strong>examples/azure:</strong> add async snippet (<a
href="https://redirect.github.com/openai/openai-python/issues/1787">#1787</a>)
(<a
href="f60eda1c1e">f60eda1</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/openai/openai-python/blob/main/CHANGELOG.md">openai's
changelog</a>.</em></p>
<blockquote>
<h2>1.60.2 (2025-01-27)</h2>
<p>Full Changelog: <a
href="https://github.com/openai/openai-python/compare/v1.60.1...v1.60.2">v1.60.1...v1.60.2</a></p>
<h3>Bug Fixes</h3>
<ul>
<li><strong>parsing:</strong> don't validate input tools in the
asynchronous <code>.parse()</code> method (<a
href="6fcfe73cd3">6fcfe73</a>)</li>
</ul>
<h2>1.60.1 (2025-01-24)</h2>
<p>Full Changelog: <a
href="https://github.com/openai/openai-python/compare/v1.60.0...v1.60.1">v1.60.0...v1.60.1</a></p>
<h3>Chores</h3>
<ul>
<li><strong>internal:</strong> minor formatting changes (<a
href="https://redirect.github.com/openai/openai-python/issues/2050">#2050</a>)
(<a
href="9c44192be5">9c44192</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li><strong>examples/azure:</strong> add async snippet (<a
href="https://redirect.github.com/openai/openai-python/issues/1787">#1787</a>)
(<a
href="f60eda1c1e">f60eda1</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="d16e6edde5"><code>d16e6ed</code></a>
release: 1.60.2</li>
<li><a
href="257d79e8a0"><code>257d79e</code></a>
fix(parsing): don't validate input tools in the asynchronous
<code>.parse()</code> method</li>
<li><a
href="b95be16e7c"><code>b95be16</code></a>
release: 1.60.1</li>
<li><a
href="27d0e67b1d"><code>27d0e67</code></a>
chore(internal): minor formatting changes (<a
href="https://redirect.github.com/openai/openai-python/issues/2050">#2050</a>)</li>
<li><a
href="abc5459c75"><code>abc5459</code></a>
docs(examples/azure): add async snippet (<a
href="https://redirect.github.com/openai/openai-python/issues/1787">#1787</a>)</li>
<li>See full diff in <a
href="https://github.com/openai/openai-python/compare/v1.60.0...v1.60.2">compare
view</a></li>
</ul>
</details>
<br />
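
The 1.60.2 fix above concerns the asynchronous `.parse()` structured-output helper. Here is a minimal sketch of that call path, assuming `OPENAI_API_KEY` is set; the model name and the `Step` schema are placeholders, not code from this repository.

```python
import asyncio

from openai import AsyncOpenAI
from pydantic import BaseModel


class Step(BaseModel):
    """Placeholder response schema for the structured-output helper."""

    title: str
    detail: str


async def main() -> None:
    client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment
    # The async .parse() helper is the method whose tool validation 1.60.2 fixes.
    completion = await client.beta.chat.completions.parse(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": "Give one step for resetting a password."}],
        response_format=Step,
    )
    print(completion.choices[0].message.parsed)


asyncio.run(main())
```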

Updates `pytest-asyncio` from 0.25.2 to 0.25.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pytest-dev/pytest-asyncio/releases">pytest-asyncio's
releases</a>.</em></p>
<blockquote>
<h2>pytest-asyncio 0.25.3</h2>
<ul>
<li>Avoid errors in cleanup of async generators when event loop is
already closed <a
href="https://redirect.github.com/pytest-dev/pytest-asyncio/issues/1040">#1040</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7c501923b0"><code>7c50192</code></a>
fix: Avoid errors in cleanup of async generators when event loop is
already c...</li>
<li>See full diff in <a
href="https://github.com/pytest-dev/pytest-asyncio/compare/v0.25.2...v0.25.3">compare
view</a></li>
</ul>
</details>
<br />
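
The 0.25.3 entry above is about cleanup of async generators after the event loop has closed. Below is a minimal sketch of the kind of async-generator fixture whose teardown path that fix hardens; the fixture and test are illustrative only.

```python
import asyncio

import pytest
import pytest_asyncio


@pytest_asyncio.fixture
async def connection():
    # Everything after `yield` is teardown code for this async-generator
    # fixture; this late-running cleanup is what 0.25.3 makes safe when the
    # event loop is already closed.
    conn = object()  # stand-in for an async resource
    yield conn
    await asyncio.sleep(0)  # pretend to release the resource


@pytest.mark.asyncio
async def test_uses_connection(connection):
    assert connection is not None
```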

Updates `sentry-sdk` from 2.19.2 to 2.20.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-python/releases">sentry-sdk's
releases</a>.</em></p>
<blockquote>
<h2>2.20.0</h2>
<ul>
<li>
<p><strong>New integration:</strong> Add <a
href="https://typer.tiangolo.com/">Typer</a> integration (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3869">#3869</a>)
by <a
href="https://github.com/patrick91"><code>@​patrick91</code></a></p>
<p>For more information, see the documentation for the <a
href="https://docs.sentry.io/platforms/python/integrations/typer/">TyperIntegration</a>.</p>
</li>
<li>
<p><strong>New integration:</strong> Add <a
href="https://www.getunleash.io/">Unleash</a> feature flagging
integration (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3888">#3888</a>)
by <a href="https://github.com/aliu39"><code>@​aliu39</code></a></p>
<p>For more information, see the documentation for the <a
href="https://docs.sentry.io/platforms/python/integrations/unleash/">UnleashIntegration</a>.</p>
</li>
<li>
<p>Add custom tracking of feature flag evaluations (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3860">#3860</a>)
by <a href="https://github.com/aliu39"><code>@​aliu39</code></a></p>
</li>
<li>
<p>Feature Flags: Register LD hook in setup instead of init, and don't
check for initialization (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3890">#3890</a>)
by <a href="https://github.com/aliu39"><code>@​aliu39</code></a></p>
</li>
<li>
<p>Feature Flags: Moved adding of <code>flags</code> context into Scope
(<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3917">#3917</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></p>
</li>
<li>
<p>Create a separate group for feature flag test suites (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3911">#3911</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></p>
</li>
<li>
<p>Fix flaky LaunchDarkly tests (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3896">#3896</a>)
by <a href="https://github.com/aliu39"><code>@​aliu39</code></a></p>
</li>
<li>
<p>Fix LRU cache copying (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3883">#3883</a>)
by <a href="https://github.com/ffelixg"><code>@​ffelixg</code></a></p>
</li>
<li>
<p>Fix cache pollution from mutable reference (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3887">#3887</a>)
by <a
href="https://github.com/cmanallen"><code>@​cmanallen</code></a></p>
</li>
<li>
<p>Centralize minimum version checking (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3910">#3910</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></p>
</li>
<li>
<p>Support SparkIntegration activation after SparkContext created (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3411">#3411</a>)
by <a
href="https://github.com/seyoon-lim"><code>@​seyoon-lim</code></a></p>
</li>
<li>
<p>Preserve ARQ enqueue_job <strong>kwdefaults</strong> after patching
(<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3903">#3903</a>)
by <a href="https://github.com/danmr"><code>@​danmr</code></a></p>
</li>
<li>
<p>Add Github workflow to comment on issues when a fix was released (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3866">#3866</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></p>
</li>
<li>
<p>Update test matrix for Sanic (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3904">#3904</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></p>
</li>
<li>
<p>Rename scripts (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3885">#3885</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></p>
</li>
<li>
<p>Fix CI (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3878">#3878</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></p>
</li>
<li>
<p>Treat <code>potel-base</code> as release branch in CI (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3912">#3912</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></p>
</li>
<li>
<p>build(deps): bump actions/create-github-app-token from 1.11.0 to
1.11.1 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3893">#3893</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></p>
</li>
<li>
<p>build(deps): bump codecov/codecov-action from 5.0.7 to 5.1.1 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3867">#3867</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></p>
</li>
<li>
<p>build(deps): bump codecov/codecov-action from 5.1.1 to 5.1.2 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3892">#3892</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></p>
</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-python/blob/master/CHANGELOG.md">sentry-sdk's
changelog</a>.</em></p>
<blockquote>
<h2>2.20.0</h2>
<ul>
<li>
<p><strong>New integration:</strong> Add <a
href="https://typer.tiangolo.com/">Typer</a> integration (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3869">#3869</a>)
by <a
href="https://github.com/patrick91"><code>@​patrick91</code></a></p>
<p>For more information, see the documentation for the <a
href="https://docs.sentry.io/platforms/python/integrations/typer/">TyperIntegration</a>.</p>
</li>
<li>
<p><strong>New integration:</strong> Add <a
href="https://www.getunleash.io/">Unleash</a> feature flagging
integration (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3888">#3888</a>)
by <a href="https://github.com/aliu39"><code>@​aliu39</code></a></p>
<p>For more information, see the documentation for the <a
href="https://docs.sentry.io/platforms/python/integrations/unleash/">UnleashIntegration</a>.</p>
</li>
<li>
<p>Add custom tracking of feature flag evaluations (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3860">#3860</a>)
by <a href="https://github.com/aliu39"><code>@​aliu39</code></a></p>
</li>
<li>
<p>Feature Flags: Register LD hook in setup instead of init, and don't
check for initialization (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3890">#3890</a>)
by <a href="https://github.com/aliu39"><code>@​aliu39</code></a></p>
</li>
<li>
<p>Feature Flags: Moved adding of <code>flags</code> context into Scope
(<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3917">#3917</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></p>
</li>
<li>
<p>Create a separate group for feature flag test suites (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3911">#3911</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></p>
</li>
<li>
<p>Fix flaky LaunchDarkly tests (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3896">#3896</a>)
by <a href="https://github.com/aliu39"><code>@​aliu39</code></a></p>
</li>
<li>
<p>Fix LRU cache copying (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3883">#3883</a>)
by <a href="https://github.com/ffelixg"><code>@​ffelixg</code></a></p>
</li>
<li>
<p>Fix cache pollution from mutable reference (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3887">#3887</a>)
by <a
href="https://github.com/cmanallen"><code>@​cmanallen</code></a></p>
</li>
<li>
<p>Centralize minimum version checking (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3910">#3910</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></p>
</li>
<li>
<p>Support SparkIntegration activation after SparkContext created (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3411">#3411</a>)
by <a
href="https://github.com/seyoon-lim"><code>@​seyoon-lim</code></a></p>
</li>
<li>
<p>Preserve ARQ enqueue_job <strong>kwdefaults</strong> after patching
(<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3903">#3903</a>)
by <a href="https://github.com/danmr"><code>@​danmr</code></a></p>
</li>
<li>
<p>Add Github workflow to comment on issues when a fix was released (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3866">#3866</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></p>
</li>
<li>
<p>Update test matrix for Sanic (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3904">#3904</a>)
by <a
href="https://github.com/antonpirker"><code>@​antonpirker</code></a></p>
</li>
<li>
<p>Rename scripts (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3885">#3885</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></p>
</li>
<li>
<p>Fix CI (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3878">#3878</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></p>
</li>
<li>
<p>Treat <code>potel-base</code> as release branch in CI (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3912">#3912</a>)
by <a
href="https://github.com/sentrivana"><code>@​sentrivana</code></a></p>
</li>
<li>
<p>build(deps): bump actions/create-github-app-token from 1.11.0 to
1.11.1 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3893">#3893</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></p>
</li>
<li>
<p>build(deps): bump codecov/codecov-action from 5.0.7 to 5.1.1 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3867">#3867</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></p>
</li>
<li>
<p>build(deps): bump codecov/codecov-action from 5.1.1 to 5.1.2 (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3892">#3892</a>)
by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a></p>
</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4e0505ea5c"><code>4e0505e</code></a>
Updated changelog</li>
<li><a
href="ca68a7f3fb"><code>ca68a7f</code></a>
release: 2.20.0</li>
<li><a
href="2ee194c0d4"><code>2ee194c</code></a>
feat(flags): remove Unleash get_variant patching code (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3914">#3914</a>)</li>
<li><a
href="288f69a962"><code>288f69a</code></a>
Moved adding of <code>flags</code> context into Scope (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3917">#3917</a>)</li>
<li><a
href="9f9ff345c6"><code>9f9ff34</code></a>
tests: Create a separate group for feature flag suites (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3911">#3911</a>)</li>
<li><a
href="fa241c3425"><code>fa241c3</code></a>
Treat potel-base as release branch in CI (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3912">#3912</a>)</li>
<li><a
href="be5327356f"><code>be53273</code></a>
Centralize minimum version checking (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3910">#3910</a>)</li>
<li><a
href="4432e26a45"><code>4432e26</code></a>
Small contribution docs update (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3909">#3909</a>)</li>
<li><a
href="c6a89d64db"><code>c6a89d6</code></a>
feat(flags): add Unleash feature flagging integration (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3888">#3888</a>)</li>
<li><a
href="bf65ede421"><code>bf65ede</code></a>
ref(flags): Beter naming for featureflags module and identifier (<a
href="https://redirect.github.com/getsentry/sentry-python/issues/3902">#3902</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/getsentry/sentry-python/compare/2.19.2...2.20.0">compare
view</a></li>
</ul>
</details>
<br />
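
The 2.20.0 notes above add Typer and Unleash integrations. Below is a minimal sketch of enabling them; the import paths are assumed from Sentry's usual `<name>Integration` convention (verify against the linked integration docs), and the DSN is a placeholder.

```python
import sentry_sdk

# Import paths assumed from Sentry's integration naming convention; verify
# against the TyperIntegration / UnleashIntegration docs linked above.
from sentry_sdk.integrations.typer import TyperIntegration
from sentry_sdk.integrations.unleash import UnleashIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[
        TyperIntegration(),    # new in 2.20.0: report errors raised in Typer CLIs
        UnleashIntegration(),  # new in 2.20.0: record Unleash feature-flag evaluations
    ],
    traces_sample_rate=0.1,
)
```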

Updates `stripe` from 11.4.1 to 11.5.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/stripe/stripe-python/releases">stripe's
releases</a>.</em></p>
<blockquote>
<h2>v11.5.0</h2>
<h2>11.5.0 - 2025-01-27</h2>
<ul>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1443">#1443</a>
Update generated code
<ul>
<li>Add support for <code>pay_by_bank_payments</code> on resource class
<code>stripe.Account.Capabilities</code> and parameter class
<code>stripe.Account.CreateParamsCapabilities</code></li>
<li>Add support for <code>directorship_declaration</code> on resource
class <code>stripe.Account.Company</code> and parameter classes
<code>stripe.Account.CreateParamsCompany</code> and
<code>stripe.Token.CreateParamsAccountCompany</code></li>
<li>Add support for <code>ownership_exemption_reason</code> on resource
class <code>stripe.Account.Company</code> and parameter classes
<code>stripe.Account.CreateParamsCompany</code> and
<code>stripe.Token.CreateParamsAccountCompany</code></li>
<li>Add support for <code>proof_of_ultimate_beneficial_ownership</code>
on parameter class
<code>stripe.Account.CreateParamsDocuments</code></li>
<li>Add support for <code>financial_account</code> on resource classes
<code>stripe.AccountSession.Components</code> and
<code>stripe.treasury.OutboundTransfer.DestinationPaymentMethodDetails</code>
and parameter class
<code>stripe.AccountSession.CreateParamsComponents</code></li>
<li>Add support for <code>issuing_card</code> on resource class
<code>stripe.AccountSession.Components</code> and parameter class
<code>stripe.AccountSession.CreateParamsComponents</code></li>
<li>Add support for <code>advice_code</code> on resource classes
<code>stripe.Charge.Outcome</code>,
<code>stripe.Invoice.LastFinalizationError</code>,
<code>stripe.PaymentIntent.LastPaymentError</code>,
<code>stripe.SetupAttempt.SetupError</code>, and
<code>stripe.SetupIntent.LastSetupError</code></li>
<li>Add support for <code>country</code> on resource classes
<code>stripe.Charge.PaymentMethodDetails.Paypal</code>,
<code>stripe.ConfirmationToken.PaymentMethodPreview.Paypal</code>, and
<code>stripe.PaymentMethod.Paypal</code></li>
<li>Add support for <code>pay_by_bank</code> on resource classes
<code>stripe.Charge.PaymentMethodDetails</code>,
<code>stripe.ConfirmationToken.PaymentMethodPreview</code>, and
<code>stripe.PaymentIntent.PaymentMethodOptions</code>, parameter
classes
<code>stripe.ConfirmationToken.CreateParamsPaymentMethodData</code>,
<code>stripe.PaymentIntent.ConfirmParamsPaymentMethodData</code>,
<code>stripe.PaymentIntent.ConfirmParamsPaymentMethodOptions</code>,
<code>stripe.PaymentIntent.CreateParamsPaymentMethodData</code>,
<code>stripe.PaymentIntent.CreateParamsPaymentMethodOptions</code>,
<code>stripe.PaymentIntent.ModifyParamsPaymentMethodData</code>,
<code>stripe.PaymentIntent.ModifyParamsPaymentMethodOptions</code>,
<code>stripe.PaymentMethod.CreateParams</code>,
<code>stripe.PaymentMethod.ModifyParams</code>,
<code>stripe.PaymentMethodConfiguration.CreateParams</code>,
<code>stripe.PaymentMethodConfiguration.ModifyParams</code>,
<code>stripe.SetupIntent.ConfirmParamsPaymentMethodData</code>,
<code>stripe.SetupIntent.CreateParamsPaymentMethodData</code>,
<code>stripe.SetupIntent.ModifyParamsPaymentMethodData</code>, and
<code>stripe.checkout.Session.CreateParamsPaymentMethodOptions</code>,
and resources <code>stripe.PaymentMethod</code> and
<code>stripe.PaymentMethodConfiguration</code></li>
<li>Add support for <code>phone_number_collection</code> on parameter
class <code>stripe.PaymentLink.ModifyParams</code></li>
<li>Add support for <code>discounts</code> on resource
<code>stripe.checkout.Session</code></li>
<li>Add support for <code>jpy</code> on parameter classes
<code>stripe.terminal.Configuration.CreateParamsTipping</code> and
<code>stripe.terminal.Configuration.ModifyParamsTipping</code> and
resource class <code>stripe.terminal.Configuration.Tipping</code></li>
<li>Add support for <code>nickname</code> on parameter classes
<code>stripe.treasury.FinancialAccount.CreateParams</code> and
<code>stripe.treasury.FinancialAccount.ModifyParams</code> and resource
<code>stripe.treasury.FinancialAccount</code></li>
<li>Add support for <code>forwarding_settings</code> on parameter class
<code>stripe.treasury.FinancialAccount.ModifyParams</code></li>
<li>Add support for <code>_cls_close</code> on resource
<code>stripe.treasury.FinancialAccount</code></li>
<li>Add support for <code>close</code> on resource
<code>stripe.treasury.FinancialAccount</code></li>
<li>Add support for <code>is_default</code> on resource
<code>stripe.treasury.FinancialAccount</code></li>
<li>Add support for <code>destination_payment_method_data</code> on
parameter class
<code>stripe.treasury.OutboundTransfer.CreateParams</code></li>
<li>Add support for <code>outbound_transfer</code> on resource class
<code>stripe.treasury.ReceivedCredit.LinkedFlows.SourceFlowDetails</code></li>
<li>Add support for <code>SD</code> on enums
<code>stripe.checkout.Session.ShippingAddressCollection.allowed_countries</code>,
<code>stripe.checkout.Session.CreateParamsShippingAddressCollection.allowed_countries</code>,
<code>stripe.PaymentLink.ShippingAddressCollection.allowed_countries</code>,
<code>stripe.PaymentLink.CreateParamsShippingAddressCollection.allowed_countries</code>,
and
<code>stripe.PaymentLink.ModifyParamsShippingAddressCollection.allowed_countries</code></li>
<li>Add support for <code>pay_by_bank</code> on enums
<code>stripe.checkout.Session.CreateParams.payment_method_types</code>,
<code>stripe.ConfirmationToken.PaymentMethodPreview.type</code>,
<code>stripe.ConfirmationToken.CreateParamsPaymentMethodData.type</code>,
<code>stripe.Customer.ListPaymentMethodsParams.type</code>,
<code>stripe.PaymentIntent.ConfirmParamsPaymentMethodData.type</code>,
<code>stripe.PaymentIntent.CreateParamsPaymentMethodData.type</code>,
<code>stripe.PaymentIntent.ModifyParamsPaymentMethodData.type</code>,
<code>stripe.PaymentLink.payment_method_types</code>,
<code>stripe.PaymentLink.CreateParams.payment_method_types</code>,
<code>stripe.PaymentLink.ModifyParams.payment_method_types</code>,
<code>stripe.PaymentMethod.type</code>,
<code>stripe.PaymentMethod.CreateParams.type</code>,
<code>stripe.PaymentMethod.ListParams.type</code>,
<code>stripe.SetupIntent.ConfirmParamsPaymentMethodData.type</code>,
<code>stripe.SetupIntent.CreateParamsPaymentMethodData.type</code>, and
<code>stripe.SetupIntent.ModifyParamsPaymentMethodData.type</code></li>
<li>Add support for <code>financial_account</code> on enum
<code>stripe.treasury.OutboundTransfer.DestinationPaymentMethodDetails.type</code></li>
<li>Add support for <code>outbound_transfer</code> on enums
<code>stripe.treasury.ReceivedCredit.LinkedFlows.SourceFlowDetails.type</code>
and
<code>stripe.treasury.ReceivedCredit.ListParamsLinkedFlows.source_flow_type</code></li>
<li>Add support for <code>2025-01-27.acacia</code> on enum
<code>stripe.WebhookEndpoint.CreateParams.api_version</code></li>
<li>Change type of <code>pretax_credit_amounts</code> on
<code>stripe.CreditNote</code> and
<code>stripe.CreditNoteLineItem</code> from
<code>Optional[List[PretaxCreditAmount]]</code> to
<code>List[PretaxCreditAmount]</code></li>
</ul>
</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1451">#1451</a>
Upgrade to download-artifact@v4</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1448">#1448</a>
Updated upload artifact ci action</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1446">#1446</a>
add just to publish CI</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1444">#1444</a>
Added CONTRIBUTING.md file</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1445">#1445</a>
minor justfile fixes &amp; pin CI version</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1440">#1440</a>
add justfile, update readme, remove coveralls</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1442">#1442</a>
Fix V2 ListObject.data type hint
<ul>
<li>Change <code>stripe.v2.ListObject.data</code> type hint from
<code>List[StripeObject]</code> to <code>List[T]</code> where T is the
specific stripe object contained within the list</li>
</ul>
</li>
</ul>
<p>See <a
href="https://github.com/stripe/stripe-python/blob/v11.5.0b3/CHANGELOG.md">the
changelog for more details</a>.</p>
<h2>v11.5.0b3</h2>
<ul>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1447">#1447</a>
Update generated code for beta
<ul>
<li>Remove support for <code>stripe_account</code> on resource classes
<code>stripe.terminal.Reader.Action.CollectPaymentMethod</code>,
<code>stripe.terminal.Reader.Action.ConfirmPaymentIntent</code>,
<code>stripe.terminal.Reader.Action.ProcessPaymentIntent</code>, and
<code>stripe.terminal.Reader.Action.RefundPayment</code></li>
</ul>
</li>
</ul>
<p>See <a
href="https://github.com/stripe/stripe-python/blob/v11.5.0b3/CHANGELOG.md">the
changelog for more details</a>.</p>
<h2>v11.5.0b2</h2>
<ul>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1439">#1439</a>
Update generated code for beta
<ul>
<li>Add support for <code>pay_by_bank_payments</code> on resource class
<code>stripe.Account.Capabilities</code> and parameter class
<code>stripe.Account.CreateParamsCapabilities</code></li>
<li>Add support for <code>directorship_declaration</code> on parameter
classes <code>stripe.Account.CreateParamsCompany</code> and
<code>stripe.Token.CreateParamsAccountCompany</code></li>
<li>Add support for <code>proof_of_ultimate_beneficial_ownership</code>
on parameter class
<code>stripe.Account.CreateParamsDocuments</code></li>
</ul>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/stripe/stripe-python/blob/master/CHANGELOG.md">stripe's
changelog</a>.</em></p>
<blockquote>
<h2>11.5.0 - 2025-01-27</h2>
<ul>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1443">#1443</a>
Update generated code
<ul>
<li>Add support for <code>pay_by_bank_payments</code> on resource class
<code>stripe.Account.Capabilities</code> and parameter class
<code>stripe.Account.CreateParamsCapabilities</code></li>
<li>Add support for <code>directorship_declaration</code> on resource
class <code>stripe.Account.Company</code> and parameter classes
<code>stripe.Account.CreateParamsCompany</code> and
<code>stripe.Token.CreateParamsAccountCompany</code></li>
<li>Add support for <code>ownership_exemption_reason</code> on resource
class <code>stripe.Account.Company</code> and parameter classes
<code>stripe.Account.CreateParamsCompany</code> and
<code>stripe.Token.CreateParamsAccountCompany</code></li>
<li>Add support for <code>proof_of_ultimate_beneficial_ownership</code>
on parameter class
<code>stripe.Account.CreateParamsDocuments</code></li>
<li>Add support for <code>financial_account</code> on resource classes
<code>stripe.AccountSession.Components</code> and
<code>stripe.treasury.OutboundTransfer.DestinationPaymentMethodDetails</code>
and parameter class
<code>stripe.AccountSession.CreateParamsComponents</code></li>
<li>Add support for <code>issuing_card</code> on resource class
<code>stripe.AccountSession.Components</code> and parameter class
<code>stripe.AccountSession.CreateParamsComponents</code></li>
<li>Add support for <code>advice_code</code> on resource classes
<code>stripe.Charge.Outcome</code>,
<code>stripe.Invoice.LastFinalizationError</code>,
<code>stripe.PaymentIntent.LastPaymentError</code>,
<code>stripe.SetupAttempt.SetupError</code>, and
<code>stripe.SetupIntent.LastSetupError</code></li>
<li>Add support for <code>country</code> on resource classes
<code>stripe.Charge.PaymentMethodDetails.Paypal</code>,
<code>stripe.ConfirmationToken.PaymentMethodPreview.Paypal</code>, and
<code>stripe.PaymentMethod.Paypal</code></li>
<li>Add support for <code>pay_by_bank</code> on resource classes
<code>stripe.Charge.PaymentMethodDetails</code>,
<code>stripe.ConfirmationToken.PaymentMethodPreview</code>, and
<code>stripe.PaymentIntent.PaymentMethodOptions</code>, parameter
classes
<code>stripe.ConfirmationToken.CreateParamsPaymentMethodData</code>,
<code>stripe.PaymentIntent.ConfirmParamsPaymentMethodData</code>,
<code>stripe.PaymentIntent.ConfirmParamsPaymentMethodOptions</code>,
<code>stripe.PaymentIntent.CreateParamsPaymentMethodData</code>,
<code>stripe.PaymentIntent.CreateParamsPaymentMethodOptions</code>,
<code>stripe.PaymentIntent.ModifyParamsPaymentMethodData</code>,
<code>stripe.PaymentIntent.ModifyParamsPaymentMethodOptions</code>,
<code>stripe.PaymentMethod.CreateParams</code>,
<code>stripe.PaymentMethod.ModifyParams</code>,
<code>stripe.PaymentMethodConfiguration.CreateParams</code>,
<code>stripe.PaymentMethodConfiguration.ModifyParams</code>,
<code>stripe.SetupIntent.ConfirmParamsPaymentMethodData</code>,
<code>stripe.SetupIntent.CreateParamsPaymentMethodData</code>,
<code>stripe.SetupIntent.ModifyParamsPaymentMethodData</code>, and
<code>stripe.checkout.Session.CreateParamsPaymentMethodOptions</code>,
and resources <code>stripe.PaymentMethod</code> and
<code>stripe.PaymentMethodConfiguration</code></li>
<li>Add support for <code>phone_number_collection</code> on parameter
class <code>stripe.PaymentLink.ModifyParams</code></li>
<li>Add support for <code>discounts</code> on resource
<code>stripe.checkout.Session</code></li>
<li>Add support for <code>jpy</code> on parameter classes
<code>stripe.terminal.Configuration.CreateParamsTipping</code> and
<code>stripe.terminal.Configuration.ModifyParamsTipping</code> and
resource class <code>stripe.terminal.Configuration.Tipping</code></li>
<li>Add support for <code>nickname</code> on parameter classes
<code>stripe.treasury.FinancialAccount.CreateParams</code> and
<code>stripe.treasury.FinancialAccount.ModifyParams</code> and resource
<code>stripe.treasury.FinancialAccount</code></li>
<li>Add support for <code>forwarding_settings</code> on parameter class
<code>stripe.treasury.FinancialAccount.ModifyParams</code></li>
<li>Add support for <code>_cls_close</code> on resource
<code>stripe.treasury.FinancialAccount</code></li>
<li>Add support for <code>close</code> on resource
<code>stripe.treasury.FinancialAccount</code></li>
<li>Add support for <code>is_default</code> on resource
<code>stripe.treasury.FinancialAccount</code></li>
<li>Add support for <code>destination_payment_method_data</code> on
parameter class
<code>stripe.treasury.OutboundTransfer.CreateParams</code></li>
<li>Add support for <code>outbound_transfer</code> on resource class
<code>stripe.treasury.ReceivedCredit.LinkedFlows.SourceFlowDetails</code></li>
<li>Add support for <code>SD</code> on enums
<code>stripe.checkout.Session.ShippingAddressCollection.allowed_countries</code>,
<code>stripe.checkout.Session.CreateParamsShippingAddressCollection.allowed_countries</code>,
<code>stripe.PaymentLink.ShippingAddressCollection.allowed_countries</code>,
<code>stripe.PaymentLink.CreateParamsShippingAddressCollection.allowed_countries</code>,
and
<code>stripe.PaymentLink.ModifyParamsShippingAddressCollection.allowed_countries</code></li>
<li>Add support for <code>pay_by_bank</code> on enums
<code>stripe.checkout.Session.CreateParams.payment_method_types</code>,
<code>stripe.ConfirmationToken.PaymentMethodPreview.type</code>,
<code>stripe.ConfirmationToken.CreateParamsPaymentMethodData.type</code>,
<code>stripe.Customer.ListPaymentMethodsParams.type</code>,
<code>stripe.PaymentIntent.ConfirmParamsPaymentMethodData.type</code>,
<code>stripe.PaymentIntent.CreateParamsPaymentMethodData.type</code>,
<code>stripe.PaymentIntent.ModifyParamsPaymentMethodData.type</code>,
<code>stripe.PaymentLink.payment_method_types</code>,
<code>stripe.PaymentLink.CreateParams.payment_method_types</code>,
<code>stripe.PaymentLink.ModifyParams.payment_method_types</code>,
<code>stripe.PaymentMethod.type</code>,
<code>stripe.PaymentMethod.CreateParams.type</code>,
<code>stripe.PaymentMethod.ListParams.type</code>,
<code>stripe.SetupIntent.ConfirmParamsPaymentMethodData.type</code>,
<code>stripe.SetupIntent.CreateParamsPaymentMethodData.type</code>, and
<code>stripe.SetupIntent.ModifyParamsPaymentMethodData.type</code></li>
<li>Add support for <code>financial_account</code> on enum
<code>stripe.treasury.OutboundTransfer.DestinationPaymentMethodDetails.type</code></li>
<li>Add support for <code>outbound_transfer</code> on enums
<code>stripe.treasury.ReceivedCredit.LinkedFlows.SourceFlowDetails.type</code>
and
<code>stripe.treasury.ReceivedCredit.ListParamsLinkedFlows.source_flow_type</code></li>
<li>Add support for <code>2025-01-27.acacia</code> on enum
<code>stripe.WebhookEndpoint.CreateParams.api_version</code></li>
<li>Change type of <code>pretax_credit_amounts</code> on
<code>stripe.CreditNote</code> and
<code>stripe.CreditNoteLineItem</code> from
<code>Optional[List[PretaxCreditAmount]]</code> to
<code>List[PretaxCreditAmount]</code></li>
</ul>
</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1451">#1451</a>
Upgrade to download-artifact@v4</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1448">#1448</a>
Updated upload artifact ci action</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1446">#1446</a>
add just to publish CI</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1444">#1444</a>
Added CONTRIBUTING.md file</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1445">#1445</a>
minor justfile fixes &amp; pin CI version</li>
<li><a
href="https://redirect.github.com/stripe/stripe-python/pull/1440">#1440</a>
add justfile, update readme, remove coveralls</li>
<li><a href="https://redirect...

_Description has been truncated_

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-01-29 17:08:02 +00:00
Krzysztof Czerwinski
1b69bcbce2 fix(platform): PAYG system fixes (#9296)
1. Stripe complains about the lack of a return URL in checkout.
2. Only the frontend prevents users from topping up amounts of less than $5.

### Changes 🏗️

- Set `return_url` and `customer_email` when creating the checkout session.
- Change `amount` to `credit_amount` and use credits instead of USD for
clarity when requesting a top-up.
- Accept only top-ups of 500+ credits ($5) that are multiples of 100 credits
*on the backend* (see the sketch below).
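
A minimal sketch of what the backend rules above could look like with the `stripe` Python SDK. The helper name, the embedded-checkout `ui_mode` (which is what accepts a `return_url`), the 1 credit = 1 cent mapping, and the URL are all illustrative assumptions, not the platform's actual implementation.

```python
import stripe

stripe.api_key = "sk_test_..."  # placeholder secret key


def create_topup_checkout(user_email: str, credit_amount: int) -> stripe.checkout.Session:
    """Illustrative helper: validate a top-up request and open a checkout session."""
    # Backend-side validation mirroring the rules above: at least 500 credits
    # ($5) and a multiple of 100 credits.
    if credit_amount < 500 or credit_amount % 100 != 0:
        raise ValueError("credit_amount must be at least 500 and a multiple of 100")

    return stripe.checkout.Session.create(
        mode="payment",
        ui_mode="embedded",  # assumption: embedded checkout, which accepts return_url
        return_url="https://example.com/credits?session_id={CHECKOUT_SESSION_ID}",  # placeholder URL
        customer_email=user_email,  # now passed along with the session
        line_items=[{
            "quantity": 1,
            "price_data": {
                "currency": "usd",
                "unit_amount": credit_amount,  # cents, under the 1 credit = 1 cent assumption
                "product_data": {"name": f"{credit_amount} platform credits"},
            },
        }],
    )
```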

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-01-29 16:56:05 +00:00
dependabot[bot]
ef118eff34 chore(libs/deps-dev): bump ruff from 0.9.2 to 0.9.3 in /autogpt_platform/autogpt_libs in the development-dependencies group (#9343)
Bumps the development-dependencies group in
/autogpt_platform/autogpt_libs with 1 update:
[ruff](https://github.com/astral-sh/ruff).

Updates `ruff` from 0.9.2 to 0.9.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.9.3</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Argument <code>fail_stop</code> in DAG has
been renamed as <code>fail_fast</code> (<code>AIR302</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15633">#15633</a>)</li>
<li>[<code>airflow</code>] Extend <code>AIR303</code> with more symbols
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/15611">#15611</a>)</li>
<li>[<code>flake8-bandit</code>] Report all references to suspicious
functions (<code>S3</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15541">#15541</a>)</li>
<li>[<code>flake8-pytest-style</code>] Do not emit diagnostics for empty
<code>for</code> loops (<code>PT012</code>, <code>PT031</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15542">#15542</a>)</li>
<li>[<code>flake8-simplify</code>] Avoid double negations
(<code>SIM103</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15562">#15562</a>)</li>
<li>[<code>pyflakes</code>] Fix infinite loop with unused local import
in <code>__init__.py</code> (<code>F401</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15517">#15517</a>)</li>
<li>[<code>pylint</code>] Do not report methods with only one
<code>EM101</code>-compatible <code>raise</code> (<code>PLR6301</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/15507">#15507</a>)</li>
<li>[<code>pylint</code>] Implement
<code>redefined-slots-in-subclass</code> (<code>W0244</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/9640">#9640</a>)</li>
<li>[<code>pyupgrade</code>] Add rules to use PEP 695 generics in
classes and functions (<code>UP046</code>, <code>UP047</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15565">#15565</a>,
<a
href="https://redirect.github.com/astral-sh/ruff/pull/15659">#15659</a>)</li>
<li>[<code>refurb</code>] Implement <code>for-loop-writes</code>
(<code>FURB122</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/10630">#10630</a>)</li>
<li>[<code>ruff</code>] Implement <code>needless-else</code> clause
(<code>RUF047</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15051">#15051</a>)</li>
<li>[<code>ruff</code>] Implement <code>starmap-zip</code>
(<code>RUF058</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15483">#15483</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-bugbear</code>] Do not raise error if keyword argument
is present and target-python version is less or equals than 3.9
(<code>B903</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15549">#15549</a>)</li>
<li>[<code>flake8-comprehensions</code>] strip parentheses around
generators in <code>unnecessary-generator-set</code> (<code>C401</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/15553">#15553</a>)</li>
<li>[<code>flake8-pytest-style</code>] Rewrite references to
<code>.exception</code> (<code>PT027</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15680">#15680</a>)</li>
<li>[<code>flake8-simplify</code>] Mark fixes as unsafe
(<code>SIM201</code>, <code>SIM202</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15626">#15626</a>)</li>
<li>[<code>flake8-type-checking</code>] Fix some safe fixes being
labeled unsafe (<code>TC006</code>,<code>TC008</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15638">#15638</a>)</li>
<li>[<code>isort</code>] Omit trailing whitespace in
<code>unsorted-imports</code> (<code>I001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15518">#15518</a>)</li>
<li>[<code>pydoclint</code>] Allow ignoring one line docstrings for
<code>DOC</code> rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/13302">#13302</a>)</li>
<li>[<code>pyflakes</code>] Apply redefinition fixes by source code
order (<code>F811</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15575">#15575</a>)</li>
<li>[<code>pyflakes</code>] Avoid removing too many imports in
<code>redefined-while-unused</code> (<code>F811</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15585">#15585</a>)</li>
<li>[<code>pyflakes</code>] Group redefinition fixes by source statement
(<code>F811</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15574">#15574</a>)</li>
<li>[<code>pylint</code>] Include name of base class in message for
<code>redefined-slots-in-subclass</code> (<code>W0244</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15559">#15559</a>)</li>
<li>[<code>ruff</code>] Update fix for <code>RUF055</code> to use
<code>var == value</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15605">#15605</a>)</li>
</ul>
<h3>Formatter</h3>
<ul>
<li>Fix bracket spacing for single-element tuples in f-string
expressions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15537">#15537</a>)</li>
<li>Fix unstable f-string formatting for expressions containing a
trailing comma (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15545">#15545</a>)</li>
</ul>
<h3>Performance</h3>
<ul>
<li>Avoid quadratic membership check in import fixes (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15576">#15576</a>)</li>
</ul>
<h3>Server</h3>
<ul>
<li>Allow <code>unsafe-fixes</code> settings for code actions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15666">#15666</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>[<code>flake8-bandit</code>] Add missing single-line/dotall regex
flag (<code>S608</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15654">#15654</a>)</li>
<li>[<code>flake8-import-conventions</code>] Fix infinite loop between
<code>ICN001</code> and <code>I002</code> (<code>ICN001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15480">#15480</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
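
As a concrete illustration of the new `UP046`/`UP047` pyupgrade rules mentioned in the release notes above, this is the kind of rewrite they propose, from `TypeVar`-based generics to PEP 695 syntax (Python 3.12+). Both forms are shown side by side; the names are illustrative.

```python
from typing import Generic, TypeVar

T = TypeVar("T")


# Pre-PEP 695 style that UP046 (classes) and UP047 (functions) flag:
class OldBox(Generic[T]):
    def __init__(self, item: T) -> None:
        self.item = item


def old_first(items: list[T]) -> T:
    return items[0]


# PEP 695 style (Python 3.12+) the rules rewrite towards:
class Box[T]:
    def __init__(self, item: T) -> None:
        self.item = item


def first[T](items: list[T]) -> T:
    return items[0]
```
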
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.9.3</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Argument <code>fail_stop</code> in DAG has
been renamed as <code>fail_fast</code> (<code>AIR302</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15633">#15633</a>)</li>
<li>[<code>airflow</code>] Extend <code>AIR303</code> with more symbols
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/15611">#15611</a>)</li>
<li>[<code>flake8-bandit</code>] Report all references to suspicious
functions (<code>S3</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15541">#15541</a>)</li>
<li>[<code>flake8-pytest-style</code>] Do not emit diagnostics for empty
<code>for</code> loops (<code>PT012</code>, <code>PT031</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15542">#15542</a>)</li>
<li>[<code>flake8-simplify</code>] Avoid double negations
(<code>SIM103</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15562">#15562</a>)</li>
<li>[<code>pyflakes</code>] Fix infinite loop with unused local import
in <code>__init__.py</code> (<code>F401</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15517">#15517</a>)</li>
<li>[<code>pylint</code>] Do not report methods with only one
<code>EM101</code>-compatible <code>raise</code> (<code>PLR6301</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/15507">#15507</a>)</li>
<li>[<code>pylint</code>] Implement
<code>redefined-slots-in-subclass</code> (<code>W0244</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/9640">#9640</a>)</li>
<li>[<code>pyupgrade</code>] Add rules to use PEP 695 generics in
classes and functions (<code>UP046</code>, <code>UP047</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15565">#15565</a>,
<a
href="https://redirect.github.com/astral-sh/ruff/pull/15659">#15659</a>)</li>
<li>[<code>refurb</code>] Implement <code>for-loop-writes</code>
(<code>FURB122</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/10630">#10630</a>)</li>
<li>[<code>ruff</code>] Implement <code>needless-else</code> clause
(<code>RUF047</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15051">#15051</a>)</li>
<li>[<code>ruff</code>] Implement <code>starmap-zip</code>
(<code>RUF058</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15483">#15483</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-bugbear</code>] Do not raise error if keyword argument
is present and target-python version is less or equals than 3.9
(<code>B903</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15549">#15549</a>)</li>
<li>[<code>flake8-comprehensions</code>] strip parentheses around
generators in <code>unnecessary-generator-set</code> (<code>C401</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/15553">#15553</a>)</li>
<li>[<code>flake8-pytest-style</code>] Rewrite references to
<code>.exception</code> (<code>PT027</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15680">#15680</a>)</li>
<li>[<code>flake8-simplify</code>] Mark fixes as unsafe
(<code>SIM201</code>, <code>SIM202</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15626">#15626</a>)</li>
<li>[<code>flake8-type-checking</code>] Fix some safe fixes being
labeled unsafe (<code>TC006</code>,<code>TC008</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15638">#15638</a>)</li>
<li>[<code>isort</code>] Omit trailing whitespace in
<code>unsorted-imports</code> (<code>I001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15518">#15518</a>)</li>
<li>[<code>pydoclint</code>] Allow ignoring one line docstrings for
<code>DOC</code> rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/13302">#13302</a>)</li>
<li>[<code>pyflakes</code>] Apply redefinition fixes by source code
order (<code>F811</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15575">#15575</a>)</li>
<li>[<code>pyflakes</code>] Avoid removing too many imports in
<code>redefined-while-unused</code> (<code>F811</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15585">#15585</a>)</li>
<li>[<code>pyflakes</code>] Group redefinition fixes by source statement
(<code>F811</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15574">#15574</a>)</li>
<li>[<code>pylint</code>] Include name of base class in message for
<code>redefined-slots-in-subclass</code> (<code>W0244</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15559">#15559</a>)</li>
<li>[<code>ruff</code>] Update fix for <code>RUF055</code> to use
<code>var == value</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15605">#15605</a>)</li>
</ul>
<h3>Formatter</h3>
<ul>
<li>Fix bracket spacing for single-element tuples in f-string
expressions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15537">#15537</a>)</li>
<li>Fix unstable f-string formatting for expressions containing a
trailing comma (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15545">#15545</a>)</li>
</ul>
<h3>Performance</h3>
<ul>
<li>Avoid quadratic membership check in import fixes (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15576">#15576</a>)</li>
</ul>
<h3>Server</h3>
<ul>
<li>Allow <code>unsafe-fixes</code> settings for code actions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15666">#15666</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>[<code>flake8-bandit</code>] Add missing single-line/dotall regex
flag (<code>S608</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15654">#15654</a>)</li>
<li>[<code>flake8-import-conventions</code>] Fix infinite loop between
<code>ICN001</code> and <code>I002</code> (<code>ICN001</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15480">#15480</a>)</li>
<li>[<code>flake8-simplify</code>] Do not emit diagnostics for
expressions inside string type annotations (<code>SIM222</code>,
<code>SIM223</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15405">#15405</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="90589372da"><code>9058937</code></a>
Fix grep for version number in docker build (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15699">#15699</a>)</li>
<li><a
href="b5ffb404de"><code>b5ffb40</code></a>
Bump version to 0.9.3 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15698">#15698</a>)</li>
<li><a
href="cffd1866ce"><code>cffd186</code></a>
Preserve raw string prefix and escapes (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15694">#15694</a>)</li>
<li><a
href="569060f46c"><code>569060f</code></a>
[<code>flake8-pytest-style</code>] Rewrite references to
<code>.exception</code> (<code>PT027</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15680">#15680</a>)</li>
<li><a
href="15394a8028"><code>15394a8</code></a>
[red-knot] MDTests: Do not depend on precise public-symbol type
inference (<a
href="https://redirect.github.com/astral-sh/ruff/issues/1">#1</a>...</li>
<li><a
href="fc2ebea736"><code>fc2ebea</code></a>
[red-knot] Make <code>infer.rs</code> unit tests independent of public
symbol inference ...</li>
<li><a
href="43160b4c3e"><code>43160b4</code></a>
Tidy knot CLI tests (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15685">#15685</a>)</li>
<li><a
href="0173738eef"><code>0173738</code></a>
[red-knot] Port comprehension tests to Markdown (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15688">#15688</a>)</li>
<li><a
href="05ea77b1d4"><code>05ea77b</code></a>
Create Unknown rule diagnostics with a source range (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15648">#15648</a>)</li>
<li><a
href="1e790d3885"><code>1e790d3</code></a>
[red-knot] Port 'deferred annotations' unit tests to Markdown (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15686">#15686</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.9.2...0.9.3">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=pip&previous-version=0.9.2&new-version=0.9.3)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-01-29 16:47:47 +00:00
Nicholas Tindle
f1bc9d1581 Update .deepsource.toml 2025-01-29 10:31:57 -06:00
Zamil Majdy
f67060fd8f fix(backend): Fix get balance error on user with null running balance (#9365)
### Changes 🏗️

The current state can cause this error:
```
    if (snapshot_time.year, snapshot_time.month) == (cur_time.year, cur_time.month):
        ^^^^^^^^^^^^^^^^^^
AttributeError: 'str' object has no attribute 'year'
```

This is caused by the timestamp return value possibly being a string rather than a `datetime`.
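
For illustration only, a minimal sketch of the kind of guard this fix implies: normalize a timestamp that may arrive from the database as an ISO-8601 string before the `(year, month)` comparison. The helper names here are hypothetical, not the functions used in the codebase.

```python
from datetime import datetime, timezone


def as_datetime(value: datetime | str) -> datetime:
    """Return a datetime, parsing ISO-8601 strings if necessary (hypothetical helper)."""
    return datetime.fromisoformat(value) if isinstance(value, str) else value


def is_same_month(snapshot_time: datetime | str, cur_time: datetime | None = None) -> bool:
    # Normalize before comparing, so a string timestamp no longer raises
    # AttributeError: 'str' object has no attribute 'year'
    snapshot = as_datetime(snapshot_time)
    now = cur_time or datetime.now(timezone.utc)
    return (snapshot.year, snapshot.month) == (now.year, now.month)
```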

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-29 14:25:50 +00:00
Zamil Majdy
c31a2ec565 fix(backend): Remove hardcoded auto-top-up config definition on set_auto_top_up (#9361)
### Changes 🏗️

Make `set_auto_top_up` accept an `AutoTopUpConfig` object and eliminate the
hardcoded JSON value.
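
As a rough sketch of the shape of this change (the field names and storage helper below are assumptions, not the project's actual code), the config travels as a typed model and is serialized in one place instead of being hand-built as JSON inside `set_auto_top_up`:

```python
from pydantic import BaseModel

# In-memory stand-in for the user metadata store; purely illustrative.
_USER_METADATA: dict[str, dict] = {}


class AutoTopUpConfig(BaseModel):
    threshold: int  # balance below which a top-up triggers (assumed field name)
    amount: int     # credits added per top-up (assumed field name)


def set_auto_top_up(user_id: str, config: AutoTopUpConfig) -> None:
    # The JSON value is derived from the typed model instead of being hardcoded here.
    _USER_METADATA.setdefault(user_id, {})["auto_top_up"] = config.model_dump()


def get_auto_top_up(user_id: str) -> AutoTopUpConfig | None:
    raw = _USER_METADATA.get(user_id, {}).get("auto_top_up")
    return AutoTopUpConfig(**raw) if raw else None
```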

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-29 13:59:06 +00:00
Nicholas Tindle
97a26dba7e fix(frontend): add typechecks and fix existing type errors in frontend (#9336)
<!-- Clearly explain the need for these changes: -->
We want to be able to use typechecking and catch errors before they occur.
This PR helps enable that by fixing the existing type errors and, hopefully,
not introducing new ones.

### Changes 🏗️
- adds check to ci
- disables some code points
- fixes lots of type errors
- fixes a bunch of the stories

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] added types
  - [x] Ran some of the stories
  - [x] Asked all the relevant parties for manual checks

---------

Co-authored-by: SwiftyOS <craigswift13@gmail.com>
2025-01-29 12:32:35 +00:00
Zamil Majdy
5e2043b774 fix(backend): Fix failing test_block_credit_reset when it's executed at the end of 31-days month (#9360)
Currently, the test fails on Jan 29 because Feb 29 does not exist (2025 is not
a leap year).

### Changes 🏗️

Force the use of the first day of the month when getting the current time.
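
A minimal sketch of the idea, assuming the test builds its reference time from `datetime.now()`: pinning the day to the 1st means month arithmetic can never land on a day (the 29th–31st) that the next month lacks.

```python
from datetime import datetime, timezone


def month_safe_now() -> datetime:
    """Return the current UTC time with the day forced to the 1st (illustrative helper)."""
    return datetime.now(timezone.utc).replace(day=1)
```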

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-01-29 12:17:40 +00:00
Nicholas Tindle
f37957d71f Merge branch 'master' into dev 2025-01-29 06:11:09 -06:00
Nicholas Tindle
e199e54791 ci(repo): deepsource config 2025-01-29 05:47:01 -06:00
Ritik Dutta
33c718a9d7 Fix broken connections when copying and pasting an agent (#9138)
### Summary
This pull request fixes the issue of broken connections when copying and
pasting an agent. The connections are now preserved during the
copy-paste operation.

### Changes
- Modified the `useCopyPaste.ts` file to ensure that connections are
preserved during the copy-paste operation.

### Steps to Reproduce
1. Copy a small section from an Agent that features links to outside the
area you're selecting (use shift + click & drag).
2. Go to a fresh builder canvas in a new tab.
3. Paste the copied section.
4. Try to save the Agent; previously, it would fail due to the outgoing
link being broken.

### Testing
- Verified that connections remain intact when copying and pasting
agents.
- Tested by saving the agent to ensure no broken links.

### Related Issue
Fixes
[#9137](https://github.com/Significant-Gravitas/AutoGPT/issues/9137)

### Notes
Please review the changes and let me know if any further adjustments are
needed.

---------

Co-authored-by: Swifty <craigswift13@gmail.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Bently <tomnoon9@gmail.com>
2025-01-28 15:49:06 +00:00
Nicholas Tindle
c5747c59d7 feat: mem0 ai memory block (#9285)
We want more intelligent and less user-managed memory methods, so we are
adding mem0.


### Changes 🏗️

- Adds `user_id` to kwargs for blocks (see the sketch below)
- Adds mem0 blocks
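
For illustration, a hypothetical, much-simplified sketch of what passing `user_id` through kwargs looks like from a block's point of view; the real `Block` base class, kwarg names, and mem0 client calls in the codebase differ.

```python
class AddMemoryBlock:
    """Toy stand-in for a mem0 block; not the actual implementation."""

    def run(self, input_data: dict, **kwargs):
        user_id = kwargs["user_id"]        # injected by the executor for every block run
        run_id = kwargs.get("run_id")      # optional scoping hints; names are assumptions
        agent_id = kwargs.get("agent_id")
        # A real block would call the mem0 client here, e.g.
        # client.add(messages=[...], user_id=user_id, run_id=run_id, agent_id=agent_id)
        yield "status", {"user_id": user_id, "run_id": run_id, "agent_id": agent_id}
```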

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

- [x] document adding user_id to kwargs for blocks
- [x] Add run and agent Id as optional checkboxes that will be passed
down to mem0

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [ ] Build and submit an agent to @Torantulino and the marketplace for
a personal AI tutor based on recommendations from the mem0 team

---------

Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-01-28 15:15:29 +00:00
dependabot[bot]
e3190b6234 chore(frontend/deps): bump the production-dependencies group across 1 directory with 19 updates (#9349)
Bumps the production-dependencies group with 18 updates in the
/autogpt_platform/frontend directory:

| Package | From | To |
| --- | --- | --- |
| [@faker-js/faker](https://github.com/faker-js/faker) | `9.3.0` |
`9.4.0` |
|
[@next/third-parties](https://github.com/vercel/next.js/tree/HEAD/packages/third-parties)
| `15.1.3` | `15.1.6` |
| [@radix-ui/react-alert-dialog](https://github.com/radix-ui/primitives)
| `1.1.4` | `1.1.5` |
| [@radix-ui/react-context-menu](https://github.com/radix-ui/primitives)
| `2.2.4` | `2.2.5` |
|
[@radix-ui/react-dropdown-menu](https://github.com/radix-ui/primitives)
| `2.1.4` | `2.1.5` |
| [@radix-ui/react-popover](https://github.com/radix-ui/primitives) |
`1.1.4` | `1.1.5` |
| [@radix-ui/react-select](https://github.com/radix-ui/primitives) |
`2.1.4` | `2.1.5` |
| [@radix-ui/react-toast](https://github.com/radix-ui/primitives) |
`1.2.4` | `1.2.5` |
| [@radix-ui/react-tooltip](https://github.com/radix-ui/primitives) |
`1.1.6` | `1.1.7` |
| [@sentry/nextjs](https://github.com/getsentry/sentry-javascript) |
`8.48.0` | `8.51.0` |
| [@stripe/stripe-js](https://github.com/stripe/stripe-js) | `5.4.0` |
`5.5.0` |
| [@supabase/supabase-js](https://github.com/supabase/supabase-js) |
`2.47.10` | `2.48.1` |
|
[@xyflow/react](https://github.com/xyflow/xyflow/tree/HEAD/packages/react)
| `12.3.6` | `12.4.2` |
| [embla-carousel-react](https://github.com/davidjerleke/embla-carousel)
| `8.5.1` | `8.5.2` |
|
[lucide-react](https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react)
| `0.469.0` | `0.474.0` |
| [react-day-picker](https://github.com/gpbl/react-day-picker) | `9.5.0`
| `9.5.1` |
| [react-shepherd](https://github.com/shepherd-pro/shepherd) | `6.1.6` |
`6.1.7` |
| [uuid](https://github.com/uuidjs/uuid) | `11.0.4` | `11.0.5` |


Updates `@faker-js/faker` from 9.3.0 to 9.4.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/faker-js/faker/releases"><code>@​faker-js/faker</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v9.4.0</h2>
<h2>What's Changed</h2>
<ul>
<li>refactor: replace MersenneTwister implementation with a pure-rand
based one by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3288">faker-js/faker#3288</a></li>
<li>chore: improve variable naming of system module by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3315">faker-js/faker#3315</a></li>
<li>refactor(locale): re-moo-ved some incorrect <code>cow</code> data by
<a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3326">faker-js/faker#3326</a></li>
<li>refactor(locale): fix truncated song names with commas by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3327">faker-js/faker#3327</a></li>
<li>refactor(locale): fix bad uz street_name_part data by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3328">faker-js/faker#3328</a></li>
<li>docs(helpers): add missing example results to uniqueArray by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3321">faker-js/faker#3321</a></li>
<li>chore: align coding style of deprecated.ts by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3312">faker-js/faker#3312</a></li>
<li>refactor(locale): fix various locale data with trailing spaces by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3329">faker-js/faker#3329</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3331">faker-js/faker#3331</a></li>
<li>chore: align coding style of word module by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3319">faker-js/faker#3319</a></li>
<li>chore: align coding style of lorem module by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3313">faker-js/faker#3313</a></li>
<li>chore: improve parameter naming of internet module by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3317">faker-js/faker#3317</a></li>
<li>feat(finance): use fake patterns for transactionDescription by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3202">faker-js/faker#3202</a></li>
<li>feat(locale): add german animal types by <a
href="https://github.com/motz0815"><code>@​motz0815</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3334">faker-js/faker#3334</a></li>
<li>feat(internet): update to simplified modern user-agent list by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3324">faker-js/faker#3324</a></li>
<li>feat(location): add list of spoken languages by <a
href="https://github.com/UmairJibran"><code>@​UmairJibran</code></a> in
<a
href="https://redirect.github.com/faker-js/faker/pull/3333">faker-js/faker#3333</a></li>
<li>chore: improve variable naming of helpers module by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3316">faker-js/faker#3316</a></li>
<li>docs: fix link to randomizer guide by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3332">faker-js/faker#3332</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3337">faker-js/faker#3337</a></li>
<li>fix(finance): correct Discover card number format by <a
href="https://github.com/julsql"><code>@​julsql</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3336">faker-js/faker#3336</a></li>
<li>docs: improve examples for objectKey,objectValue,objectEntry by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3339">faker-js/faker#3339</a></li>
<li>infra(unicorn): prevent-abbreviations by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3320">faker-js/faker#3320</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3340">faker-js/faker#3340</a></li>
<li>fix: basic wildcard range handling + add more tests by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3322">faker-js/faker#3322</a></li>
<li>docs(api): add refresh button to examples by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3301">faker-js/faker#3301</a></li>
<li>fix(system): semver parts should not be limited to 0-9 by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3349">faker-js/faker#3349</a></li>
<li>fix(image): dataUri should return random type by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3347">faker-js/faker#3347</a></li>
<li>refactor(image): deprecate urlPlaceholder by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3341">faker-js/faker#3341</a></li>
<li>docs: fix examples which return string/number by <a
href="https://github.com/matthewmayer"><code>@​matthewmayer</code></a>
in <a
href="https://redirect.github.com/faker-js/faker/pull/3348">faker-js/faker#3348</a></li>
<li>docs(image): improve urlPlaceholder alternatives by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3351">faker-js/faker#3351</a></li>
<li>chore: update LICENSE file by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3350">faker-js/faker#3350</a></li>
<li>chore(deps): update
mcr.microsoft.com/devcontainers/typescript-node:22 docker digest to
9791f4a by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3354">faker-js/faker#3354</a></li>
<li>chore(deps): update dependency prettier to v3.4.2 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3355">faker-js/faker#3355</a></li>
<li>chore(deps): update all non-major dependencies by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3359">faker-js/faker#3359</a></li>
<li>chore(deps): update dependency ts-morph to v25 by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3360">faker-js/faker#3360</a></li>
<li>chore(deps): update eslint by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3357">faker-js/faker#3357</a></li>
<li>chore(deps): update devdependencies by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3356">faker-js/faker#3356</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3353">faker-js/faker#3353</a></li>
<li>chore(deps): update vitest by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3358">faker-js/faker#3358</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3366">faker-js/faker#3366</a></li>
<li>docs(finance): add seeAlsos by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3344">faker-js/faker#3344</a></li>
<li>chore(deps): update eslint by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3369">faker-js/faker#3369</a></li>
<li>docs: use embedded build for preview examples by <a
href="https://github.com/ST-DDT"><code>@​ST-DDT</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3343">faker-js/faker#3343</a></li>
<li>chore(deps): lock file maintenance by <a
href="https://github.com/renovate"><code>@​renovate</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3370">faker-js/faker#3370</a></li>
<li>refactor(locales): fix chemical element names in zh_CN by <a
href="https://github.com/tuanzisama"><code>@​tuanzisama</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3371">faker-js/faker#3371</a></li>
<li>refactor(locale): improve product_name data in en and tr by <a
href="https://github.com/k-salih"><code>@​k-salih</code></a> in <a
href="https://redirect.github.com/faker-js/faker/pull/3372">faker-js/faker#3372</a></li>
<li>chore(release): 9.4.0 by <a
href="https://github.com/fakerjs-bot"><code>@​fakerjs-bot</code></a> in
<a
href="https://redirect.github.com/faker-js/faker/pull/3362">faker-js/faker#3362</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/faker-js/faker/blob/next/CHANGELOG.md"><code>@​faker-js/faker</code>'s
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/faker-js/faker/compare/v9.3.0...v9.4.0">9.4.0</a>
(2025-01-15)</h2>
<h3>Features</h3>
<ul>
<li><strong>finance:</strong> use fake patterns for
transactionDescription (<a
href="https://redirect.github.com/faker-js/faker/issues/3202">#3202</a>)
(<a
href="5ec4a6c9dd">5ec4a6c</a>)</li>
<li><strong>internet:</strong> update to simplified modern user-agent
list (<a
href="https://redirect.github.com/faker-js/faker/issues/3324">#3324</a>)
(<a
href="3c7abb55e6">3c7abb5</a>)</li>
<li><strong>location:</strong> add list of spoken languages (<a
href="https://redirect.github.com/faker-js/faker/issues/3333">#3333</a>)
(<a
href="ff6dda94dd">ff6dda9</a>)</li>
</ul>
<h3>Changed Locales</h3>
<ul>
<li><strong>locale:</strong> fix various locale data with trailing
spaces (<a
href="https://redirect.github.com/faker-js/faker/issues/3329">#3329</a>)
(<a
href="e5eec0ed84">e5eec0e</a>)</li>
<li><strong>locale:</strong> improve product_name data in en and tr (<a
href="https://redirect.github.com/faker-js/faker/issues/3372">#3372</a>)
(<a
href="773fc1f654">773fc1f</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li><strong>animal:</strong> re-moo-ved some incorrect cow data (<a
href="https://redirect.github.com/faker-js/faker/issues/3326">#3326</a>)
(<a
href="47f835bd0d">47f835b</a>)</li>
<li>basic wildcard range handling + add more tests (<a
href="https://redirect.github.com/faker-js/faker/issues/3322">#3322</a>)
(<a
href="817f8a01d9">817f8a0</a>)</li>
<li><strong>finance:</strong> update Discover card number format to
ensure accuracy (<a
href="https://redirect.github.com/faker-js/faker/issues/3336">#3336</a>)
(<a
href="69c006344b">69c0063</a>)</li>
<li><strong>image:</strong> dataUri should return random type (<a
href="https://redirect.github.com/faker-js/faker/issues/3347">#3347</a>)
(<a
href="eceb17d257">eceb17d</a>)</li>
<li><strong>locales:</strong> update chemical element names in zh_CN (<a
href="https://redirect.github.com/faker-js/faker/issues/3371">#3371</a>)
(<a
href="6ec6f84922">6ec6f84</a>)</li>
<li><strong>location:</strong> fix bad uz street_name_part data (<a
href="https://redirect.github.com/faker-js/faker/issues/3328">#3328</a>)
(<a
href="b6132cbee6">b6132cb</a>)</li>
<li><strong>music:</strong> fix truncated song names with commas (<a
href="https://redirect.github.com/faker-js/faker/issues/3327">#3327</a>)
(<a
href="f36fc71ac4">f36fc71</a>),
closes <a
href="https://redirect.github.com/faker-js/faker/issues/996">#996</a></li>
<li><strong>system:</strong> semver parts should not be limited to 0-9
(<a
href="https://redirect.github.com/faker-js/faker/issues/3349">#3349</a>)
(<a
href="c0d92b8fa8">c0d92b8</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="192b540d70"><code>192b540</code></a>
chore(release): 9.4.0 (<a
href="https://redirect.github.com/faker-js/faker/issues/3362">#3362</a>)</li>
<li><a
href="773fc1f654"><code>773fc1f</code></a>
refactor(locale): improve product_name data in en and tr (<a
href="https://redirect.github.com/faker-js/faker/issues/3372">#3372</a>)</li>
<li><a
href="6ec6f84922"><code>6ec6f84</code></a>
fix(locales): update chemical element names in zh_CN (<a
href="https://redirect.github.com/faker-js/faker/issues/3371">#3371</a>)</li>
<li><a
href="bddce72b49"><code>bddce72</code></a>
chore(deps): lock file maintenance (<a
href="https://redirect.github.com/faker-js/faker/issues/3370">#3370</a>)</li>
<li><a
href="3bb64a230a"><code>3bb64a2</code></a>
docs: use embedded build for preview examples (<a
href="https://redirect.github.com/faker-js/faker/issues/3343">#3343</a>)</li>
<li><a
href="d2c67636c0"><code>d2c6763</code></a>
chore(deps): update eslint (<a
href="https://redirect.github.com/faker-js/faker/issues/3369">#3369</a>)</li>
<li><a
href="5d4754c51e"><code>5d4754c</code></a>
docs(finance): add seeAlsos (<a
href="https://redirect.github.com/faker-js/faker/issues/3344">#3344</a>)</li>
<li><a
href="42dade59d3"><code>42dade5</code></a>
chore(deps): lock file maintenance (<a
href="https://redirect.github.com/faker-js/faker/issues/3366">#3366</a>)</li>
<li><a
href="ece4f16dd2"><code>ece4f16</code></a>
chore(deps): update vitest (<a
href="https://redirect.github.com/faker-js/faker/issues/3358">#3358</a>)</li>
<li><a
href="5b8356bbcb"><code>5b8356b</code></a>
chore(deps): lock file maintenance (<a
href="https://redirect.github.com/faker-js/faker/issues/3353">#3353</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/faker-js/faker/compare/v9.3.0...v9.4.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `@next/third-parties` from 15.1.3 to 15.1.6
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/vercel/next.js/releases"><code>@​next/third-parties</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v15.1.6</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting bug fixes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>fix: don't memory-leak promises passed to waitUntil (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/75041">#75041</a>)</li>
<li>backport: fix prerender issue with intercepting routes +
generateStaticParams (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/75170">#75170</a>)</li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/lubieowoce"><code>@​lubieowoce</code></a> and
<a href="https://github.com/ztanner"><code>@​ztanner</code></a> for
helping!</p>
<h2>v15.1.5</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting bug fixes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>Fix missing revalidate with notFound() (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/75009">#75009</a>)</li>
<li>fix: when metadatabase is set we should not warn (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74840">#74840</a>)</li>
<li>Fix <code>@​vercel/og</code> license SPDX expression (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74745">#74745</a>)</li>
<li>fix: ts language server rule metadata should allow null (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74704">#74704</a>)</li>
<li>fix: eslint rule of using img in metadata routes (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74864">#74864</a>)</li>
<li>Fix presentation when onerror receives an event without error (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74643">#74643</a>)</li>
<li>fix fetch lock not being consistently released <a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74623">#74623</a>
(<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/75028">#75028</a>)</li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/ijjk"><code>@​ijjk</code></a>, <a
href="https://github.com/huozhi"><code>@​huozhi</code></a>, <a
href="https://github.com/matmannion"><code>@​matmannion</code></a> and
<a href="https://github.com/ztanner"><code>@​ztanner</code></a> for
helping!</p>
<h2>v15.1.4</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting bug fixes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>backport: force module format for virtual client-proxy (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74608">#74608</a>)</li>
<li>Fix prerender tags when notFound is called (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74607">#74607</a>)</li>
<li>Use provided waitUntil for pending revalidates (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74604">#74604</a>)</li>
<li>Feature: next/image: add support for images.qualities in next.config
(<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74588">#74588</a>)</li>
<li>Chore: docs: add missing search: '' on remotePatterns (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74587">#74587</a>)</li>
<li>Chore: docs: update version history of next/image (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/73923">#73923</a>)
(<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74570">#74570</a>)</li>
<li>Chore: next/image: improve imgopt api bypass detection for
unsupported images (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/third-parties/issues/74569">#74569</a>)</li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to @ and @ for helping!</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="38a6d0177e"><code>38a6d01</code></a>
v15.1.6</li>
<li><a
href="47102cae01"><code>47102ca</code></a>
v15.1.5</li>
<li><a
href="48f2588b0f"><code>48f2588</code></a>
v15.1.4</li>
<li>See full diff in <a
href="https://github.com/vercel/next.js/commits/v15.1.6/packages/third-parties">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-alert-dialog` from 1.1.4 to 1.1.5
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-context-menu` from 2.2.4 to 2.2.5
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-dialog` from 1.1.4 to 1.1.5
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-dropdown-menu` from 2.1.4 to 2.1.5
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-popover` from 1.1.4 to 1.1.5
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-select` from 2.1.4 to 2.1.5
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-toast` from 1.2.4 to 1.2.5
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@radix-ui/react-tooltip` from 1.1.6 to 1.1.7
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/radix-ui/primitives/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `@sentry/nextjs` from 8.48.0 to 8.51.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-javascript/releases"><code>@​sentry/nextjs</code>'s
releases</a>.</em></p>
<blockquote>
<h2>8.51.0</h2>
<h3>Important Changes</h3>
<ul>
<li>
<p><strong>feat(v8/node): Add <code>prismaInstrumentation</code> option
to Prisma integration as escape hatch for all Prisma versions (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15128">#15128</a>)</strong></p>
<p>This release adds a compatibility API to add support for Prisma
version 6.
To capture performance data for Prisma version 6:</p>
<ol>
<li>
<p>Install the <code>@prisma/instrumentation</code> package on version
6.</p>
</li>
<li>
<p>Pass a <code>new PrismaInstrumentation()</code> instance as exported
from <code>@prisma/instrumentation</code> to the
<code>prismaInstrumentation</code> option:</p>
<pre lang="js"><code>import { PrismaInstrumentation } from
'@prisma/instrumentation';
<p>Sentry.init({
integrations: [
prismaIntegration({
// Override the default instrumentation that Sentry uses
prismaInstrumentation: new PrismaInstrumentation(),
}),
],
});
</code></pre></p>
<p>The passed instrumentation instance will override the default
instrumentation instance the integration would use, while the
<code>prismaIntegration</code> will still ensure data compatibility for
the various Prisma versions.</p>
</li>
<li>
<p>Remove the <code>previewFeatures = [&quot;tracing&quot;]</code>
option from the client generator block of your Prisma schema.</p>
</li>
</ol>
</li>
</ul>
<h3>Other Changes</h3>
<ul>
<li>feat(v8/browser): Add <code>multiplexedtransport.js</code> CDN
bundle (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15046">#15046</a>)</li>
<li>feat(v8/browser): Add Unleash integration (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14948">#14948</a>)</li>
<li>feat(v8/deno): Deprecate Deno SDK as published on deno.land (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15121">#15121</a>)</li>
<li>feat(v8/sveltekit): Deprecate <code>fetchProxyScriptNonce</code>
option (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15011">#15011</a>)</li>
<li>fix(v8/aws-lambda): Avoid overwriting root span name (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15054">#15054</a>)</li>
<li>fix(v8/core): <code>fatal</code> events should set session as
crashed (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15073">#15073</a>)</li>
<li>fix(v8/node/nestjs): Use method on current fastify request (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15104">#15104</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/tjhiggins"><code>@​tjhiggins</code></a>, and <a
href="https://github.com/nwalters512"><code>@​nwalters512</code></a>.
Thank you for your contributions!</p>
<h2>Bundle size 📦</h2>
<table>
<thead>
<tr>
<th>Path</th>
<th>Size</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>@​sentry/browser</code></td>
<td>23.29 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> - with treeshaking flags</td>
<td>23.17 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing)</td>
<td>35.85 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay)</td>
<td>73.2 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay) - with
treeshaking flags</td>
<td>66.66 KB</td>
</tr>
<tr>
<td><code>@​sentry/browser</code> (incl. Tracing, Replay with
Canvas)</td>
<td>77.51 KB</td>
</tr>
</tbody>
</table>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/getsentry/sentry-javascript/blob/8.51.0/CHANGELOG.md"><code>@​sentry/nextjs</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.51.0</h2>
<h3>Important Changes</h3>
<ul>
<li>
<p><strong>feat(v8/node): Add <code>prismaInstrumentation</code> option
to Prisma integration as escape hatch for all Prisma versions (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15128">#15128</a>)</strong></p>
<p>This release adds a compatibility API to add support for Prisma
version 6.
To capture performance data for Prisma version 6:</p>
<ol>
<li>
<p>Install the <code>@prisma/instrumentation</code> package on version
6.</p>
</li>
<li>
<p>Pass a <code>new PrismaInstrumentation()</code> instance as exported
from <code>@prisma/instrumentation</code> to the
<code>prismaInstrumentation</code> option:</p>
<pre lang="js"><code>import { PrismaInstrumentation } from
'@prisma/instrumentation';
<p>Sentry.init({
integrations: [
prismaIntegration({
// Override the default instrumentation that Sentry uses
prismaInstrumentation: new PrismaInstrumentation(),
}),
],
});
</code></pre></p>
<p>The passed instrumentation instance will override the default
instrumentation instance the integration would use, while the
<code>prismaIntegration</code> will still ensure data compatibility for
the various Prisma versions.</p>
</li>
<li>
<p>Remove the <code>previewFeatures = [&quot;tracing&quot;]</code>
option from the client generator block of your Prisma schema.</p>
</li>
</ol>
</li>
</ul>
<h3>Other Changes</h3>
<ul>
<li>feat(v8/browser): Add <code>multiplexedtransport.js</code> CDN
bundle (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15046">#15046</a>)</li>
<li>feat(v8/browser): Add Unleash integration (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14948">#14948</a>)</li>
<li>feat(v8/deno): Deprecate Deno SDK as published on deno.land (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15121">#15121</a>)</li>
<li>feat(v8/sveltekit): Deprecate <code>fetchProxyScriptNonce</code>
option (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15011">#15011</a>)</li>
<li>fix(v8/aws-lambda): Avoid overwriting root span name (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15054">#15054</a>)</li>
<li>fix(v8/core): <code>fatal</code> events should set session as
crashed (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15073">#15073</a>)</li>
<li>fix(v8/node/nestjs): Use method on current fastify request (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/15104">#15104</a>)</li>
</ul>
<p>Work in this release was contributed by <a
href="https://github.com/tjhiggins"><code>@​tjhiggins</code></a>, and <a
href="https://github.com/nwalters512"><code>@​nwalters512</code></a>.
Thank you for your contributions!</p>
<h2>8.50.0</h2>
<ul>
<li>feat(v8/react): Add support for React Router
<code>createMemoryRouter</code> (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14985">#14985</a>)</li>
</ul>
<h2>8.49.0</h2>
<ul>
<li>feat(v8/browser): Flush offline queue on flush and browser online
event (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14969">#14969</a>)</li>
<li>feat(v8/react): Add a <code>handled</code> prop to ErrorBoundary (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14978">#14978</a>)</li>
<li>fix(profiling/v8): Don't put <code>require</code>,
<code>__filename</code> and <code>__dirname</code> on global object (<a
href="https://redirect.github.com/getsentry/sentry-javascript/pull/14952">#14952</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3a0e160267"><code>3a0e160</code></a>
release: 8.51.0</li>
<li><a
href="e9a5f000d2"><code>e9a5f00</code></a>
meta: Update Changelog for 8.51.0 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15113">#15113</a>)</li>
<li><a
href="d9138ffabf"><code>d9138ff</code></a>
feat(node/v8): Add <code>prismaInstrumentation</code> option to Prisma
integration as es...</li>
<li><a
href="d7aa93f38b"><code>d7aa93f</code></a>
feat(v8/sveltekit): Deprecate <code>fetchProxyScriptNonce</code> option
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15011">#15011</a>)</li>
<li><a
href="79c2c2a9ee"><code>79c2c2a</code></a>
feat(deno): Deprecate Deno SDK as published on deno.land (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15121">#15121</a>)</li>
<li><a
href="b0dc860004"><code>b0dc860</code></a>
feat(flags/v8): Add Unleash integration (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/14948">#14948</a>)</li>
<li><a
href="5f47fbbb2d"><code>5f47fbb</code></a>
chore(v8/deps): Deduplicate yarn.lock (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15094">#15094</a>)</li>
<li><a
href="8b870ba5b4"><code>8b870ba</code></a>
fix(v8/node/nestjs): Use method on current fastify request (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15104">#15104</a>)</li>
<li><a
href="d296ce0db3"><code>d296ce0</code></a>
fix(v8/core): <code>fatal</code> events should set session as crashed
(<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15073">#15073</a>)</li>
<li><a
href="dd2201e01e"><code>dd2201e</code></a>
chore(v8/deps): Upgrade to Vitest 2.1.8 and Vite 5.4.11 (<a
href="https://redirect.github.com/getsentry/sentry-javascript/issues/15038">#15038</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/getsentry/sentry-javascript/compare/8.48.0...8.51.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `@stripe/stripe-js` from 5.4.0 to 5.5.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/stripe/stripe-js/releases"><code>@​stripe/stripe-js</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v5.5.0</h2>
<h3>New features</h3>
<ul>
<li>Add type definition for elements group update-end event (<a
href="https://redirect.github.com/stripe/stripe-js/issues/704">#704</a>)</li>
<li>Add paypal buttonType types (<a
href="https://redirect.github.com/stripe/stripe-js/issues/703">#703</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="d251532cb4"><code>d251532</code></a>
Add type definition for elements group update-end event (<a
href="https://redirect.github.com/stripe/stripe-js/issues/704">#704</a>)</li>
<li><a
href="cf55c17944"><code>cf55c17</code></a>
Add paypal buttonType types (<a
href="https://redirect.github.com/stripe/stripe-js/issues/703">#703</a>)</li>
<li><a
href="e9989c66d7"><code>e9989c6</code></a>
v5.4.0</li>
<li>See full diff in <a
href="https://github.com/stripe/stripe-js/compare/v5.4.0...v5.5.0">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~bdaily-stripe">bdaily-stripe</a>, a new
releaser for <code>@​stripe/stripe-js</code> since your current
version.</p>
</details>
<br />

Updates `@supabase/supabase-js` from 2.47.10 to 2.48.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-js/releases"><code>@​supabase/supabase-js</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v2.48.1</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.48.0...v2.48.1">2.48.1</a>
(2025-01-24)</h2>
<h3>Bug Fixes</h3>
<ul>
<li><strong>types:</strong> bump postgrest-js 1.18.1 (<a
href="da9e26d748">da9e26d</a>),
closes <a
href="https://redirect.github.com/supabase/supabase-js/issues/1354">#1354</a></li>
</ul>
<h2>v2.48.0</h2>
<h1><a
href="https://github.com/supabase/supabase-js/compare/v2.47.16...v2.48.0">2.48.0</a>
(2025-01-20)</h1>
<h3>Features</h3>
<ul>
<li><strong>deps:</strong> bump postgrest-js to 1.18.0 (<a
href="4397e57a7c">4397e57</a>)</li>
</ul>
<h2>v2.47.16</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.47.15...v2.47.16">2.47.16</a>
(2025-01-17)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>🐛 Fix nullish coalescing operator issue in
hasCustomAuthorizationHeader (<a
href="e8cffdad0d">e8cffda</a>),
closes <a
href="https://redirect.github.com/supabase/supabase-js/issues/1338">#1338</a></li>
</ul>
<h2>v2.47.15</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.47.14...v2.47.15">2.47.15</a>
(2025-01-16)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>Make the return value of accessToken nullable (<a
href="f8e48ffe87">f8e48ff</a>)</li>
</ul>
<h2>v2.47.14</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.47.13...v2.47.14">2.47.14</a>
(2025-01-15)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>bump postgrest-js to 1.17.11 (<a
href="6822cdc14c">6822cdc</a>)</li>
</ul>
<h2>v2.47.13</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.47.12...v2.47.13">2.47.13</a>
(2025-01-14)</h2>
<h3>Bug Fixes</h3>
<ul>
<li>export PostgrestError as a class (<a
href="7ba8408183">7ba8408</a>)</li>
</ul>
<h2>v2.47.12</h2>
<h2><a
href="https://github.com/supabase/supabase-js/compare/v2.47.11...v2.47.12">2.47.12</a>
(2025-01-08)</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3316f2426d"><code>3316f24</code></a>
Merge pull request <a
href="https://redirect.github.com/supabase/supabase-js/issues/1361">#1361</a>
from supabase/fix/chore-bump-postgrest-js-1-18-1</li>
<li><a
href="da9e26d748"><code>da9e26d</code></a>
fix(types): bump postgrest-js 1.18.1</li>
<li><a
href="07cf12d03e"><code>07cf12d</code></a>
Merge pull request <a
href="https://redirect.github.com/supabase/supabase-js/issues/1358">#1358</a>
from supabase/chore/bump-postgrest-js-1-18-0</li>
<li><a
href="4397e57a7c"><code>4397e57</code></a>
feat(deps): bump postgrest-js to 1.18.0</li>
<li><a
href="76217a46f6"><code>76217a4</code></a>
Merge pull request <a
href="https://redirect.github.com/supabase/supabase-js/issues/1344">#1344</a>
from raghavyuva/fix/nullish-coalescing-operator</li>
<li><a
href="a50725dd97"><code>a50725d</code></a>
Merge pull request <a
href="https://redirect.github.com/supabase/supabase-js/issues/1332">#1332</a>
from supabase/fix/nullable-accessToken</li>
<li><a
href="1661420e1b"><code>1661420</code></a>
Merge pull request <a
href="https://redirect.github.com/supabase/supabase-js/issues/1355">#1355</a>
from supabase/chore/bump-postgrest-js-1-17-11</li>
<li><a
href="6822cdc14c"><code>6822cdc</code></a>
fix: bump postgrest-js to 1.17.11</li>
<li><a
href="7ba8408183"><code>7ba8408</code></a>
fix: export PostgrestError as a class</li>
<li><a
href="80d3c76fa6"><code>80d3c76</code></a>
fix: Bump postgrest-js to 1.17.10</li>
<li>Additional commits viewable in <a
href="https://github.com/supabase/supabase-js/compare/v2.47.10...v2.48.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `@xyflow/react` from 12.3.6 to 12.4.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/xyflow/xyflow/releases"><code>@​xyflow/react</code>'s
releases</a>.</em></p>
<blockquote>
<h2><code>@xyflow/react@12.4.2</code></h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4957">#4957</a> <a
href="fe843982bf"><code>fe843982</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! -
Narrow properties selected, selectable, deletable, draggable of
NodeProps type to be required.</p>
</li>
<li>
<p>Updated dependencies [<a
href="fe843982bf"><code>fe843982</code></a>,
<a
href="e73ef09fbc"><code>e73ef09f</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.50</code></li>
</ul>
</li>
</ul>
<h2><code>@xyflow/react@12.4.1</code></h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4949">#4949</a> <a
href="592c7eaf95"><code>592c7eaf</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! - Fix
useNodeConnection hook not returning all connected edges.</p>
</li>
<li>
<p>Updated dependencies [<a
href="592c7eaf95"><code>592c7eaf</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.49</code></li>
</ul>
</li>
</ul>
<h2><code>@xyflow/react@12.4.0</code></h2>
<h3>Minor Changes</h3>
<ul>
<li><a
href="https://redirect.github.com/xyflow/xyflow/pull/4725">#4725</a> <a
href="e10f53cf89"><code>e10f53cf</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! - Add
useNodeConnections hook to track all connections to a node. Can be
filtered by handleType and handleId.</li>
</ul>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4947">#4947</a> <a
href="868aa3f3db"><code>868aa3f3</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Export ResizeControlVariant correctly as a value.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4880">#4880</a> <a
href="e2d849dca6"><code>e2d849dc</code></a>
Thanks <a href="https://github.com/crimx"><code>@​crimx</code></a>! -
Add type check for all event targets</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4929">#4929</a> <a
href="4947f683b7"><code>4947f683</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! -
Optimize selections and take into account if edges connected to selected
nodes are actually selectable.</p>
</li>
<li>
<p>Updated dependencies [<a
href="e2d849dca6"><code>e2d849dc</code></a>,
<a
href="e10f53cf89"><code>e10f53cf</code></a>,
<a
href="4947f683b7"><code>4947f683</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.48</code></li>
</ul>
</li>
</ul>
</blockquote>
</details>
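<p>For orientation on the 12.4.0 entry above: the notes introduce <code>useNodeConnections</code> and say it can be filtered by <code>handleType</code> and <code>handleId</code>, but show no usage. The snippet below is a hedged sketch of how such a call could look inside a custom node; it is not taken from this repository, and the handle id <code>in</code> is purely illustrative.</p>
<pre lang="jsx"><code>import { Handle, Position, useNodeConnections } from '@xyflow/react';

// Sketch: a custom node that counts the edges arriving at its target handle,
// using the filters named in the release notes above.
function CounterNode() {
  const connections = useNodeConnections({
    handleType: 'target', // only edges that end at this node
    handleId: 'in',       // hypothetical handle id for illustration
  });

  return (
    &lt;div&gt;
      &lt;Handle type=&quot;target&quot; position={Position.Left} id=&quot;in&quot; /&gt;
      {connections.length} incoming connection(s)
    &lt;/div&gt;
  );
}
</code></pre>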
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/xyflow/xyflow/blob/main/packages/react/CHANGELOG.md"><code>@​xyflow/react</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>12.4.2</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4957">#4957</a> <a
href="fe843982bf"><code>fe843982</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! -
Narrow properties selected, selectable, deletable, draggable of
NodeProps type to be required.</p>
</li>
<li>
<p>Updated dependencies [<a
href="fe843982bf"><code>fe843982</code></a>,
<a
href="e73ef09fbc"><code>e73ef09f</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.50</code></li>
</ul>
</li>
</ul>
<h2>12.4.1</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4949">#4949</a> <a
href="592c7eaf95"><code>592c7eaf</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! - Fix
useNodeConnection hook not returning all connected edges.</p>
</li>
<li>
<p>Updated dependencies [<a
href="592c7eaf95"><code>592c7eaf</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.49</code></li>
</ul>
</li>
</ul>
<h2>12.4.0</h2>
<h3>Minor Changes</h3>
<ul>
<li><a
href="https://redirect.github.com/xyflow/xyflow/pull/4725">#4725</a> <a
href="e10f53cf89"><code>e10f53cf</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! - Add
useNodeConnections hook to track all connections to a node. Can be
filtered by handleType and handleId.</li>
</ul>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4947">#4947</a> <a
href="868aa3f3db"><code>868aa3f3</code></a>
Thanks <a href="https://github.com/moklick"><code>@​moklick</code></a>!
- Export ResizeControlVariant correctly as a value.</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4880">#4880</a> <a
href="e2d849dca6"><code>e2d849dc</code></a>
Thanks <a href="https://github.com/crimx"><code>@​crimx</code></a>! -
Add type check for all event targets</p>
</li>
<li>
<p><a
href="https://redirect.github.com/xyflow/xyflow/pull/4929">#4929</a> <a
href="4947f683b7"><code>4947f683</code></a>
Thanks <a
href="https://github.com/peterkogo"><code>@​peterkogo</code></a>! -
Optimize selections and take into account if edges connected to selected
nodes are actually selectable.</p>
</li>
<li>
<p>Updated dependencies [<a
href="e2d849dca6"><code>e2d849dc</code></a>,
<a
href="e10f53cf89"><code>e10f53cf</code></a>,
<a
href="4947f683b7"><code>4947f683</code></a>]:</p>
<ul>
<li><code>@xyflow/system@0.0.48</code></li>
</ul>
</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9ecb0bb6c7"><code>9ecb0bb</code></a>
chore(packages): bump</li>
<li><a
href="98b212f0d5"><code>98b212f</code></a>
Narrow NodeProps type properties selected, selectable, draggable and
deletabl...</li>
<li><a
href="562c9586d3"><code>562c958</code></a>
chore(packages): bump</li>
<li><a
href="dcf61e6042"><code>dcf61e6</code></a>
chore(packages): bump</li>
<li><a
href="bf7ef89bdb"><code>bf7ef89</code></a>
fix(types): export ResizeControlVariant correctly closes <a
href="https://github.com/xyflow/xyflow/tree/HEAD/packages/react/issues/4946">#4946</a></li>
<li><a
href="cdca1c55d1"><code>cdca1c5</code></a>
Merge pull request <a
href="https://github.com/xyflow/xyflow/tree/HEAD/packages/react/issues/4880">#4880</a>
from crimx/patch-2</li>
<li><a
href="78f9d4e645"><code>78f9d4e</code></a>
refactor(useNodeConnections): change param names</li>
<li><a
href="748c140146"><code>748c140</code></a>
fixes</li>
<li><a
href="228ea770a3"><code>228ea77</code></a>
let let be const</li>
<li><a
href="4947f683b7"><code>4947f68</code></a>
Move areSetsEqual to system</li>
<li>Additional commits viewable in <a
href="https://github.com/xyflow/xyflow/commits/@xyflow/react@12.4.2/packages/react">compare
view</a></li>
</ul>
</details>
<br />

Updates `embla-carousel-react` from 8.5.1 to 8.5.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/davidjerleke/embla-carousel/releases">embla-carousel-react's
releases</a>.</em></p>
<blockquote>
<h2>v8.5.2</h2>
<h2>Support</h2>
<p>Embla Carousel is an open source MIT licensed project. If you are
interested in <strong>supporting this project</strong> you can sponsor
it here:</p>
<ul>
<li><a href="https://github.com/sponsors/davidjerleke"><strong><code>💖
Sponsor</code></strong></a></li>
</ul>
<h2>What's Changed</h2>
<ul>
<li>[Bug]: Offset by 1 issue for canScrollNext when tab size has
fractional width by <a
href="https://github.com/yasuhiro-yamamoto"><code>@​yasuhiro-yamamoto</code></a>
in <a
href="https://redirect.github.com/davidjerleke/embla-carousel/pull/1096">davidjerleke/embla-carousel#1096</a></li>
<li>[Bug]: Autoplay never starts if page with carousel is loaded in
inactive browser tab by <a
href="https://github.com/davidjerleke"><code>@​davidjerleke</code></a>
in <a
href="https://redirect.github.com/davidjerleke/embla-carousel/pull/1101">davidjerleke/embla-carousel#1101</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/yasuhiro-yamamoto"><code>@​yasuhiro-yamamoto</code></a>
made their first contribution in <a
href="https://redirect.github.com/davidjerleke/embla-carousel/pull/1096">davidjerleke/embla-carousel#1096</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/davidjerleke/embla-carousel/compare/v8.5.1...v8.5.2">https://github.com/davidjerleke/embla-carousel/compare/v8.5.1...v8.5.2</a></p>
</blockquote>
</details>
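<p>The two fixes above concern <code>canScrollNext</code> and the Autoplay plugin. As a rough illustration of where they apply (not code from this repository), here is a minimal React usage sketch assuming the usual <code>useEmblaCarousel</code> hook and the separate <code>embla-carousel-autoplay</code> package:</p>
<pre lang="jsx"><code>import useEmblaCarousel from 'embla-carousel-react';
import Autoplay from 'embla-carousel-autoplay';

function Carousel({ slides }) {
  // Autoplay is the plugin whose "start in an inactive tab" bug was fixed.
  const [emblaRef, emblaApi] = useEmblaCarousel({ loop: false }, [Autoplay()]);

  // canScrollNext() is the call affected by the fractional-width
  // offset-by-one fix mentioned above.
  const next = () =&gt; {
    if (emblaApi?.canScrollNext()) emblaApi.scrollNext();
  };

  return (
    &lt;div className=&quot;embla&quot; ref={emblaRef}&gt;
      &lt;div className=&quot;embla__container&quot;&gt;{slides}&lt;/div&gt;
      &lt;button onClick={next}&gt;Next&lt;/button&gt;
    &lt;/div&gt;
  );
}
</code></pre>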
<details>
<summary>Commits</summary>
<ul>
<li><a
href="5ee3c4ed02"><code>5ee3c4e</code></a>
8.5.2</li>
<li><a
href="acad2f5460"><code>acad2f5</code></a>
Merge pull request <a
href="https://redirect.github.com/davidjerleke/embla-carousel/issues/1101">#1101</a>
from davidjerleke/bug/<a
href="https://redirect.github.com/davidjerleke/embla-carousel/issues/1082">#1082</a></li>
<li><a
href="6727eed531"><code>6727eed</code></a>
Bug fix for <a
href="https://redirect.github.com/davidjerleke/embla-carousel/issues/1082">#1082</a>.</li>
<li><a
href="1b019396e2"><code>1b01939</code></a>
Merge pull request <a
href="https://redirect.github.com/davidjerleke/embla-carousel/issues/1096">#1096</a>
from yasuhiro-yamamoto/fix/issue-899</li>
<li><a
href="266ef1bac1"><code>266ef1b</code></a>
Bug fix for <a
href="https://redirect.github.com/davidjerleke/embla-carousel/issues/899">#899</a>.</li>
<li><a
href="ba2c14cd12"><code>ba2c14c</code></a>
Instantly start animation progress.</li>
<li><a
href="e32119ab93"><code>e32119a</code></a>
Build docs with v8.5.1.</li>
<li>See full diff in <a
href="https://github.com/davidjerleke/embla-carousel/compare/v8.5.1...v8.5.2">compare
view</a></li>
</ul>
</details>
<br />

Updates `lucide-react` from 0.469.0 to 0.474.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/lucide-icons/lucide/releases">lucide-react's
releases</a>.</em></p>
<blockquote>
<h2>New icons 0.474.0</h2>
<h2>Modified Icons 🔨</h2>
<ul>
<li><code>expand</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2677">#2677</a>)
by <a href="https://github.com/jguddas"><code>@​jguddas</code></a></li>
</ul>
<h2>New icons 0.473.0</h2>
<h2>Modified Icons 🔨</h2>
<ul>
<li><code>package</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2706">#2706</a>)
by <a href="https://github.com/sezze"><code>@​sezze</code></a></li>
</ul>
<h2>New icons 0.472.0</h2>
<h2>New icons 🎨</h2>
<ul>
<li><code>battery-plus</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2693">#2693</a>)
by <a
href="https://github.com/Footagesus"><code>@​Footagesus</code></a></li>
<li><code>map-plus</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2697">#2697</a>)
by <a
href="https://github.com/Seanw265"><code>@​Seanw265</code></a></li>
</ul>
<h2>What's Changed</h2>
<ul>
<li>lucide-svelte: Make sure license ends up in SvelteKit bundles by <a
href="https://github.com/Lettnald"><code>@​Lettnald</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2728">lucide-icons/lucide#2728</a></li>
<li>lucide-react: Fixes aliases imports.</li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/lucide-icons/lucide/compare/0.471.1...0.472.0">https://github.com/lucide-icons/lucide/compare/0.471.1...0.472.0</a></p>
<h2>Hotfix Lucide React exports</h2>
<h2>What's Changed</h2>
<ul>
<li>fix(lucide-react) Adds type module in package.json by <a
href="https://github.com/ericfennis"><code>@​ericfennis</code></a> in <a
href="https://redirect.github.com/lucide-icons/lucide/pull/2731">lucide-icons/lucide#2731</a></li>
</ul>
<h2>Dynamic Icon component Lucide React and new icons 0.471.0</h2>
<h2>New Dynamic Icon Component (lucide-react)</h2>
<p>This is an easier approach than the previous
<code>dynamicIconImports</code> export and supports all environments.
The docs examples showing how to build a dynamic icon yourself have been
replaced by a dedicated DynamicIcon component, which fetches the icon
data itself and renders it instead of importing the icon component from
the library. This makes it more flexible across the frontend frameworks
and libraries that exist for React.</p>
<blockquote>
<p>🚨
Not recommended for regular applications that work fine with the regular
static icon components.
Using the dynamic icon component increases build time and adds a
separate bundle and a separate network request for each icon.</p>
</blockquote>
<h3>How to use</h3>
<p><code>DynamicIcon</code> is useful for applications that want to show
icons dynamically by icon name, for example when using a content
management system where icon names are stored in a database.</p>
<pre lang="jsx"><code>const App = () =&gt; (
&lt;DynamicIcon name=&quot;camera&quot; color=&quot;red&quot; size={48}
/&gt;
);
&lt;/tr&gt;&lt;/table&gt; 
</code></pre>
</blockquote>
<p>... (truncated)</p>
</details>
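<p>The quoted snippet above stops before the import. As a hedged sketch only: the usage below assumes <code>DynamicIcon</code> is exported from the <code>lucide-react/dynamic</code> entry point (an assumption, not confirmed by the quoted notes); check the lucide docs for the actual path.</p>
<pre lang="jsx"><code>// Assumption: DynamicIcon ships from the 'lucide-react/dynamic' entry point.
import { DynamicIcon } from 'lucide-react/dynamic';

// The icon name could come from a CMS or database, as described above.
const App = () =&gt; (
  &lt;DynamicIcon name=&quot;camera&quot; color=&quot;red&quot; size={48} /&gt;
);

export default App;
</code></pre>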
<details>
<summary>Commits</summary>
<ul>
<li><a
href="961404d5cc"><code>961404d</code></a>
replace <code>keyof ReactSVG</code> with <code>SVGElementType</code> (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2668">#2668</a>)</li>
<li><a
href="31c3fefc17"><code>31c3fef</code></a>
fix(lucide-react) Adds type module in package.json (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2731">#2731</a>)</li>
<li><a
href="58c2e108c3"><code>58c2e10</code></a>
feat(lucide-react): Add DynamicIcon component (<a
href="https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react/issues/2686">#2686</a>)</li>
<li>See full diff in <a
href="https://github.com/lucide-icons/lucide/commits/0.474.0/packages/lucide-react">compare
view</a></li>
</ul>
</details>
<br />

Updates `react-day-picker` from 9.5.0 to 9.5.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/gpbl/react-day-picker/releases">react-day-picker's
releases</a>.</em></p>
<blockquote>
<h2>v9.5.1</h2>
<p>This release fixes the calendar breaking its layout when passing a
<code>month</code> not included between <code>startMonth</code> and
<code>endMonth</code> props.</p>
<h2>What's Changed</h2>
<ul>
<li>fix: display calendar in a valid month when <code>month</code> prop
is invalid by <a
href="https://github.com/rodgobbi"><code>@​rodgobbi</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2672">gpbl/react-day-picker#2672</a></li>
<li>fix(test): using <code>new Date()</code> instead of
<code>today()</code> fails test by <a
href="https://github.com/gpbl"><code>@​gpbl</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2656">gpbl/react-day-picker#2656</a></li>
<li>chore(types): update <code>DateLib</code> to not import types from
date-fns by <a href="https://github.com/gpbl"><code>@​gpbl</code></a> in
<a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2655">gpbl/react-day-picker#2655</a></li>
<li>docs: fix broken <code>style.css</code> link by <a
href="https://github.com/jakedee"><code>@​jakedee</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2666">gpbl/react-day-picker#2666</a></li>
<li>docs: custom components guide to display better examples by <a
href="https://github.com/rodgobbi"><code>@​rodgobbi</code></a> in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2668">gpbl/react-day-picker#2668</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/jakedee"><code>@​jakedee</code></a> made
their first contribution in <a
href="https://redirect.github.com/gpbl/react-day-picker/pull/2666">gpbl/react-day-picker#2666</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/gpbl/react-day-picker/compare/v9.5.0...v9.5.1">https://github.com/gpbl/react-day-picker/compare/v9.5.0...v9.5.1</a></p>
</blockquote>
</details>
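<p>For context on the layout fix above: the bug shows up when a controlled <code>month</code> falls outside the <code>startMonth</code>/<code>endMonth</code> range. A minimal sketch of that prop combination (illustrative only; the dates and stylesheet path are assumptions, not code from this repository):</p>
<pre lang="jsx"><code>import { useState } from 'react';
import { DayPicker } from 'react-day-picker';
import 'react-day-picker/style.css'; // v9 stylesheet (see the docs fix above)

function BookingCalendar() {
  // If `month` were set outside [startMonth, endMonth], 9.5.0 could break the
  // layout; 9.5.1 falls back to displaying a valid month instead.
  const [month, setMonth] = useState(new Date(2025, 0));

  return (
    &lt;DayPicker
      mode=&quot;single&quot;
      month={month}
      onMonthChange={setMonth}
      startMonth={new Date(2025, 0)}
      endMonth={new Date(2025, 11)}
    /&gt;
  );
}
</code></pre>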
<details>
<summary>Commits</summary>
<ul>
<li><a
href="c1e49e606f"><code>c1e49e6</code></a>
build: bump v9.5.1</li>
<li><a
href="9fa2d26421"><code>9fa2d26</code></a>
fix: display calendar in a valid month when month prop is invalid (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2672">#2672</a>)</li>
<li><a
href="553f702cc6"><code>553f702</code></a>
build: update dev dependencies (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2670">#2670</a>)</li>
<li><a
href="0441d3e0e0"><code>0441d3e</code></a>
Merge branch 'main' of <a
href="https://github.com/gpbl/react-day-picker">https://github.com/gpbl/react-day-picker</a></li>
<li><a
href="fa66a33fa4"><code>fa66a33</code></a>
docs: update for shadcn/ui users</li>
<li><a
href="bf909c25f3"><code>bf909c2</code></a>
docs: fix custom components docs (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2668">#2668</a>)</li>
<li><a
href="6b0abf9710"><code>6b0abf9</code></a>
docs: fix broken style.css link (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2666">#2666</a>)</li>
<li><a
href="9ad9de48fb"><code>9ad9de4</code></a>
chore: upgrade copyright statements to 2025</li>
<li><a
href="1f070c3c7c"><code>1f070c3</code></a>
chore(types): update DateLib to not import types from date-fns (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2655">#2655</a>)</li>
<li><a
href="c958dd089f"><code>c958dd0</code></a>
fix(test): using <code>new Date()</code> instead of <code>today()</code>
fails test (<a
href="https://redirect.github.com/gpbl/react-day-picker/issues/2656">#2656</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/gpbl/react-day-picker/compare/v9.5.0...v9.5.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `react-shepherd` from 6.1.6 to 6.1.7
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/shepherd-pro/shepherd/releases">react-shepherd's
releases</a>.</em></p>
<blockquote>
<h2>v6.1.7-react-shepherd</h2>
<h2>Release (2025-01-13)</h2>
<p>react-shepherd 6.1.7 (patch)
shepherd.js 14.4.0 (minor)</p>
<h4>🚀 Enhancement</h4>
<ul>
<li><code>shepherd.js</code>
<ul>
<li><a
href="https://redirect.github.com/shipshapecode/shepherd/pull/3068">#3068</a>
💄 Add reset for dialog element (<a
href="https://github.com/chuckcarpenter"><code>@​chuckcarpenter</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>shepherd.js</code>, <code>cypress-tests</code>
<ul>
<li><a
href="https://redirect.github.com/shipshapecode/shepherd/pull/3083">#3083</a>
 Remove tabindex to reduce tabs for modal trap (<a
href="https://github.com/chuckcarpenter"><code>@​...

_Description has been truncated_

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-28 13:25:54 +00:00
dependabot[bot]
9845a8a22a chore(frontend/deps-dev): bump the development-dependencies group in /autogpt_platform/frontend with 15 updates (#9347)
Bumps the development-dependencies group in /autogpt_platform/frontend
with 15 updates:

| Package | From | To |
| --- | --- | --- |
| [@playwright/test](https://github.com/microsoft/playwright) | `1.49.1` | `1.50.0` |
| [@storybook/addon-a11y](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/a11y) | `8.5.0` | `8.5.2` |
| [@storybook/addon-essentials](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/essentials) | `8.5.0` | `8.5.2` |
| [@storybook/addon-interactions](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/interactions) | `8.5.0` | `8.5.2` |
| [@storybook/addon-links](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/links) | `8.5.0` | `8.5.2` |
| [@storybook/addon-onboarding](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/onboarding) | `8.5.0` | `8.5.2` |
| [@storybook/blocks](https://github.com/storybookjs/storybook/tree/HEAD/code/lib/blocks) | `8.5.0` | `8.5.2` |
| [@storybook/nextjs](https://github.com/storybookjs/storybook/tree/HEAD/code/frameworks/nextjs) | `8.5.0` | `8.5.2` |
| [@storybook/react](https://github.com/storybookjs/storybook/tree/HEAD/code/renderers/react) | `8.5.0` | `8.5.2` |
| [@storybook/test](https://github.com/storybookjs/storybook/tree/HEAD/code/lib/test) | `8.5.0` | `8.5.2` |
| [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node) | `22.10.7` | `22.10.10` |
| [chromatic](https://github.com/chromaui/chromatic-cli) | `11.25.0` | `11.25.1` |
| [eslint-config-next](https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next) | `15.1.5` | `15.1.6` |
| [prettier-plugin-tailwindcss](https://github.com/tailwindlabs/prettier-plugin-tailwindcss) | `0.6.10` | `0.6.11` |
| [storybook](https://github.com/storybookjs/storybook/tree/HEAD/code/lib/cli) | `8.5.0` | `8.5.2` |

Updates `@playwright/test` from 1.49.1 to 1.50.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/microsoft/playwright/releases"><code>@​playwright/test</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v1.50.0</h2>
<h2>Test runner</h2>
<ul>
<li>
<p>New option <a
href="https://playwright.dev/docs/api/class-test#test-step-option-timeout"><code>timeout</code></a>
allows specifying a maximum run time for an individual test step. A
timed-out step will fail the execution of the test.</p>
<pre lang="js"><code>test('some test', async ({ page }) =&gt; {
  await test.step('a step', async () =&gt; {
    // This step can time out separately from the test
  }, { timeout: 1000 });
});
</code></pre>
</li>
<li>
<p>New method <a
href="https://playwright.dev/docs/api/class-test#test-step-skip">test.step.skip()</a>
to disable execution of a test step.</p>
<pre lang="js"><code>test('some test', async ({ page }) =&gt; {
  await test.step('before running step', async () =&gt; {
    // Normal step
  });
<p>await test.step.skip('not yet ready', async () =&gt; {<br />
// This step is skipped<br />
});</p>
<p>await test.step('after running step', async () =&gt; {<br />
// This step still runs even though the previous one was skipped<br />
});<br />
});<br />
</code></pre></p>
</li>
<li>
<p>Expanded <a
href="https://playwright.dev/docs/api/class-locatorassertions#locator-assertions-to-match-aria-snapshot-2">expect(locator).toMatchAriaSnapshot()</a>
to allow storing of aria snapshots in separate YAML files.</p>
</li>
<li>
<p>Added method <a
href="https://playwright.dev/docs/api/class-locatorassertions#locator-assertions-to-have-accessible-error-message">expect(locator).toHaveAccessibleErrorMessage()</a>
to assert the Locator points to an element with a given <a
href="https://w3c.github.io/aria/#aria-errormessage">aria
errormessage</a>.</p>
</li>
<li>
<p>Option <a
href="https://playwright.dev/docs/api/class-testconfig#test-config-update-snapshots">testConfig.updateSnapshots</a>
added the configuration enum <code>changed</code>. <code>changed</code>
updates only the snapshots that have changed, whereas <code>all</code>
now updates all snapshots, regardless of whether there are any
differences.</p>
</li>
<li>
<p>New option <a
href="https://playwright.dev/docs/api/class-testconfig#test-config-update-source-method">testConfig.updateSourceMethod</a>
defines the way source code is updated when <a
href="https://playwright.dev/docs/api/class-testconfig#test-config-update-snapshots">testConfig.updateSnapshots</a>
is configured. Added <code>overwrite</code> and <code>3-way</code> modes
that write the changes into source code, on top of existing
<code>patch</code> mode that creates a patch file.</p>
<pre lang="bash"><code>npx playwright test --update-snapshots=changed
--update-source-method=3way
</code></pre>
</li>
<li>
<p>Option <a
href="https://playwright.dev/docs/api/class-testconfig#test-config-web-server">testConfig.webServer</a>
added a <code>gracefulShutdown</code> field for specifying a process
kill signal other than the default <code>SIGKILL</code>.</p>
</li>
<li>
<p>Exposed <a
href="https://playwright.dev/docs/api/class-teststep#test-step-attachments">testStep.attachments</a>
from the reporter API to allow retrieval of all attachments created by
that step.</p>
</li>
</ul>
<h2>UI updates</h2>
<ul>
<li>Updated default HTML reporter to improve display of
attachments.</li>
<li>New button for picking elements to produce aria snapshots.</li>
<li>Additional details (such as keys pressed) are now displayed
alongside action API calls in traces.</li>
<li>Display of <code>canvas</code> content in traces is error-prone.
Display is now disabled by default, and can be enabled via the
<code>Display canvas content</code> UI setting.</li>
<li><code>Call</code> and <code>Network</code> panels now display
additional time information.</li>
</ul>
<h2>Breaking</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
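<p>Of the 1.50 additions quoted above, <code>toHaveAccessibleErrorMessage()</code> and the <code>webServer.gracefulShutdown</code> field come without snippets. The config sketch below is an illustrative guess at how the latter is wired up; the <code>{ signal, timeout }</code> shape, command and URL are assumptions rather than code from this repository.</p>
<pre lang="js"><code>// playwright.config.js (sketch)
import { defineConfig } from '@playwright/test';

export default defineConfig({
  webServer: {
    command: 'npm run dev',
    url: 'http://localhost:3000',
    // Ask the dev server to exit cleanly instead of the default SIGKILL.
    gracefulShutdown: { signal: 'SIGTERM', timeout: 500 },
  },
});
</code></pre>
<p>In a spec file, the new accessibility assertion named above would then read something like <code>await expect(locator).toHaveAccessibleErrorMessage('Enter a valid email')</code>, asserting that the element's <code>aria-errormessage</code> resolves to that text.</p>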
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9d22178533"><code>9d22178</code></a>
chore: mark v1.50.0 (<a
href="https://redirect.github.com/microsoft/playwright/issues/34447">#34447</a>)</li>
<li><a
href="5d6ac9622d"><code>5d6ac96</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34442">#34442</a>):
fix(test runner): respect updateSourceMethod from the co...</li>
<li><a
href="97b76b46af"><code>97b76b4</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34440">#34440</a>):
chore(driver): roll driver to recent Node.js LTS version</li>
<li><a
href="7bbcc3c624"><code>7bbcc3c</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34353">#34353</a>):
chore: move attachment link back to tree item, make it f...</li>
<li><a
href="2811a1d4f5"><code>2811a1d</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34430">#34430</a>):
docs: switch gracefulShutdown to primarily mention SIGTE...</li>
<li><a
href="2b8d1ce260"><code>2b8d1ce</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34380">#34380</a>):
docs: release notes for v1.50 js (<a
href="https://redirect.github.com/microsoft/playwright/issues/34425">#34425</a>)</li>
<li><a
href="715eb250e7"><code>715eb25</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34409">#34409</a>):
fix(aria snapshot): make rebase work when options are sp...</li>
<li><a
href="6106ef020f"><code>6106ef0</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34410">#34410</a>):
fix(list reporter): do not break after output without tr...</li>
<li><a
href="09cd74f7b0"><code>09cd74f</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34407">#34407</a>):
fix(step.skip): show a skipped indicator in UI mode (<a
href="https://redirect.github.com/microsoft/playwright/issues/34">#34</a>...</li>
<li><a
href="cc6eb090ec"><code>cc6eb09</code></a>
cherry-pick(<a
href="https://redirect.github.com/microsoft/playwright/issues/34386">#34386</a>):
chore: step timeout improvements (<a
href="https://redirect.github.com/microsoft/playwright/issues/34387">#34387</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/microsoft/playwright/compare/v1.49.1...v1.50.0">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-a11y` from 8.5.0 to 8.5.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/addon-a11y</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.2</h2>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>v8.5.1</h2>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/addon-a11y</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7dac855e80"><code>7dac855</code></a>
Bump version from &quot;8.5.1&quot; to &quot;8.5.2&quot; [skip ci]</li>
<li><a
href="600af05703"><code>600af05</code></a>
Bump version from &quot;8.5.0&quot; to &quot;8.5.1&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.2/code/addons/a11y">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-essentials` from 8.5.0 to 8.5.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/addon-essentials</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.2</h2>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>v8.5.1</h2>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/addon-essentials</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7dac855e80"><code>7dac855</code></a>
Bump version from &quot;8.5.1&quot; to &quot;8.5.2&quot; [skip ci]</li>
<li><a
href="600af05703"><code>600af05</code></a>
Bump version from &quot;8.5.0&quot; to &quot;8.5.1&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.2/code/addons/essentials">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-interactions` from 8.5.0 to 8.5.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/addon-interactions</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.2</h2>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>v8.5.1</h2>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/addon-interactions</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7dac855e80"><code>7dac855</code></a>
Bump version from &quot;8.5.1&quot; to &quot;8.5.2&quot; [skip ci]</li>
<li><a
href="600af05703"><code>600af05</code></a>
Bump version from &quot;8.5.0&quot; to &quot;8.5.1&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.2/code/addons/interactions">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-links` from 8.5.0 to 8.5.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/addon-links</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.2</h2>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>v8.5.1</h2>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/addon-links</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7dac855e80"><code>7dac855</code></a>
Bump version from &quot;8.5.1&quot; to &quot;8.5.2&quot; [skip ci]</li>
<li><a
href="600af05703"><code>600af05</code></a>
Bump version from &quot;8.5.0&quot; to &quot;8.5.1&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.2/code/addons/links">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-onboarding` from 8.5.0 to 8.5.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/addon-onboarding</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.2</h2>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>v8.5.1</h2>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/addon-onboarding</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7dac855e80"><code>7dac855</code></a>
Bump version from &quot;8.5.1&quot; to &quot;8.5.2&quot; [skip ci]</li>
<li><a
href="600af05703"><code>600af05</code></a>
Bump version from &quot;8.5.0&quot; to &quot;8.5.1&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.2/code/addons/onboarding">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/blocks` from 8.5.0 to 8.5.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/blocks</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.2</h2>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>v8.5.1</h2>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/blocks</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7dac855e80"><code>7dac855</code></a>
Bump version from &quot;8.5.1&quot; to &quot;8.5.2&quot; [skip ci]</li>
<li><a
href="600af05703"><code>600af05</code></a>
Bump version from &quot;8.5.0&quot; to &quot;8.5.1&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.2/code/lib/blocks">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/nextjs` from 8.5.0 to 8.5.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/nextjs</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.2</h2>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>v8.5.1</h2>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/nextjs</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7dac855e80"><code>7dac855</code></a>
Bump version from &quot;8.5.1&quot; to &quot;8.5.2&quot; [skip ci]</li>
<li><a
href="600af05703"><code>600af05</code></a>
Bump version from &quot;8.5.0&quot; to &quot;8.5.1&quot; [skip ci]</li>
<li><a
href="032a7658ed"><code>032a765</code></a>
Merge pull request <a
href="https://github.com/storybookjs/storybook/tree/HEAD/code/frameworks/nextjs/issues/30333">#30333</a>
from storybookjs/interactions-to-component-2</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.2/code/frameworks/nextjs">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/react` from 8.5.0 to 8.5.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/react</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.2</h2>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>v8.5.1</h2>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/react</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7dac855e80"><code>7dac855</code></a>
Bump version from &quot;8.5.1&quot; to &quot;8.5.2&quot; [skip ci]</li>
<li><a
href="600af05703"><code>600af05</code></a>
Bump version from &quot;8.5.0&quot; to &quot;8.5.1&quot; [skip ci]</li>
<li><a
href="032a7658ed"><code>032a765</code></a>
Merge pull request <a
href="https://github.com/storybookjs/storybook/tree/HEAD/code/renderers/react/issues/30333">#30333</a>
from storybookjs/interactions-to-component-2</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.2/code/renderers/react">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/test` from 8.5.0 to 8.5.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/test</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.2</h2>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>v8.5.1</h2>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/test</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7dac855e80"><code>7dac855</code></a>
Bump version from &quot;8.5.1&quot; to &quot;8.5.2&quot; [skip ci]</li>
<li><a
href="600af05703"><code>600af05</code></a>
Bump version from &quot;8.5.0&quot; to &quot;8.5.1&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.2/code/lib/test">compare
view</a></li>
</ul>
</details>
<br />

Updates `@types/node` from 22.10.7 to 22.10.10
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node">compare
view</a></li>
</ul>
</details>
<br />

Updates `chromatic` from 11.25.0 to 11.25.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/chromaui/chromatic-cli/releases">chromatic's
releases</a>.</em></p>
<blockquote>
<h2>v11.25.1</h2>
<h4>🐛 Bug Fix</h4>
<ul>
<li>Don't normalize package.json fields <a
href="https://redirect.github.com/chromaui/chromatic-cli/pull/1143">#1143</a>
(<a href="https://github.com/codykaup"><code>@​codykaup</code></a>)</li>
</ul>
<h4>Authors: 1</h4>
<ul>
<li>Cody Kaup (<a
href="https://github.com/codykaup"><code>@​codykaup</code></a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/chromaui/chromatic-cli/blob/main/CHANGELOG.md">chromatic's
changelog</a>.</em></p>
<blockquote>
<h1>v11.25.1 (Wed Jan 22 2025)</h1>
<h4>🐛 Bug Fix</h4>
<ul>
<li>Don't normalize package.json fields <a
href="https://redirect.github.com/chromaui/chromatic-cli/pull/1143">#1143</a>
(<a href="https://github.com/codykaup"><code>@​codykaup</code></a>)</li>
</ul>
<h4>Authors: 1</h4>
<ul>
<li>Cody Kaup (<a
href="https://github.com/codykaup"><code>@​codykaup</code></a>)</li>
</ul>
<hr />
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="035a1bf6ef"><code>035a1bf</code></a>
Bump version to: 11.25.1 [skip ci]</li>
<li><a
href="f9af9e6ca4"><code>f9af9e6</code></a>
Update CHANGELOG.md [skip ci]</li>
<li><a
href="e69d42b2f6"><code>e69d42b</code></a>
Merge pull request <a
href="https://redirect.github.com/chromaui/chromatic-cli/issues/1143">#1143</a>
from chromaui/cody/cap-2409-cli-exits-silently-when-...</li>
<li><a
href="e9d7fa8a68"><code>e9d7fa8</code></a>
Don't normalize package.json fields</li>
<li>See full diff in <a
href="https://github.com/chromaui/chromatic-cli/compare/v11.25.0...v11.25.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `eslint-config-next` from 15.1.5 to 15.1.6
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/vercel/next.js/releases">eslint-config-next's
releases</a>.</em></p>
<blockquote>
<h2>v15.1.6</h2>
<blockquote>
<p>[!NOTE]<br />
This release is backporting bug fixes. It does <strong>not</strong>
include all pending features/changes on canary.</p>
</blockquote>
<h3>Core Changes</h3>
<ul>
<li>fix: don't memory-leak promises passed to waitUntil (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/75041">#75041</a>)</li>
<li>backport: fix prerender issue with intercepting routes +
generateStaticParams (<a
href="https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next/issues/75170">#75170</a>)</li>
</ul>
<h3>Credits</h3>
<p>Huge thanks to <a
href="https://github.com/lubieowoce"><code>@​lubieowoce</code></a> and
<a href="https://github.com/ztanner"><code>@​ztanner</code></a> for
helping!</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="38a6d0177e"><code>38a6d01</code></a>
v15.1.6</li>
<li>See full diff in <a
href="https://github.com/vercel/next.js/commits/v15.1.6/packages/eslint-config-next">compare
view</a></li>
</ul>
</details>
<br />

Updates `prettier-plugin-tailwindcss` from 0.6.10 to 0.6.11
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/tailwindlabs/prettier-plugin-tailwindcss/releases">prettier-plugin-tailwindcss's
releases</a>.</em></p>
<blockquote>
<h2>v0.6.11</h2>
<ul>
<li>Support TypeScript configs and plugins when using v4 (<a
href="https://redirect.github.com/tailwindlabs/prettier-plugin-tailwindcss/pull/342">#342</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/tailwindlabs/prettier-plugin-tailwindcss/blob/main/CHANGELOG.md">prettier-plugin-tailwindcss's
changelog</a>.</em></p>
<blockquote>
<h2>[0.6.11] - 2025-01-23</h2>
<ul>
<li>Support TypeScript configs and plugins when using v4 (<a
href="https://redirect.github.com/tailwindlabs/prettier-plugin-tailwindcss/pull/342">#342</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b5fb12c15b"><code>b5fb12c</code></a>
0.6.11</li>
<li><a
href="be269bd5e4"><code>be269bd</code></a>
Fix import</li>
<li><a
href="253033556b"><code>2530335</code></a>
Support loading TypeScript configs in v4 (<a
href="https://redirect.github.com/tailwindlabs/prettier-plugin-tailwindcss/issues/342">#342</a>)</li>
<li><a
href="6d3fa0786f"><code>6d3fa07</code></a>
Update changelog</li>
<li><a
href="95fce37f64"><code>95fce37</code></a>
Update changelog</li>
<li><a
href="f8a2244735"><code>f8a2244</code></a>
Update changelog</li>
<li>See full diff in <a
href="https://github.com/tailwindlabs/prettier-plugin-tailwindcss/compare/v0.6.10...v0.6.11">compare
view</a></li>
</ul>
</details>
<br />

Updates `storybook` from 8.5.0 to 8.5.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases">storybook's
releases</a>.</em></p>
<blockquote>
<h2>v8.5.2</h2>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>v8.5.1</h2>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md">storybook's
changelog</a>.</em></p>
<blockquote>
<h2>8.5.2</h2>
<ul>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>CLI: Corrected Next.js createScript for pnpm. - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30304">#30304</a>,
thanks <a
href="https://github.com/zhyd1997"><code>@​zhyd1997</code></a>!</li>
</ul>
<h2>8.5.1</h2>
<ul>
<li>Addon Test: Replace <code>interaction test</code> -&gt;
<code>component test</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30333">#30333</a>,
thanks <a
href="https://github.com/kylegach"><code>@​kylegach</code></a>!</li>
<li>Addon Test: Support Vitest 3 browser.test.instances field - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30309">#30309</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Manager: Fix escaping of single quotes in dynamic import paths - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30278">#30278</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>RNW-Vite: Support requires for images/fonts - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30305">#30305</a>,
thanks <a
href="https://github.com/dannyhw"><code>@​dannyhw</code></a>!</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7dac855e80"><code>7dac855</code></a>
Bump version from &quot;8.5.1&quot; to &quot;8.5.2&quot; [skip ci]</li>
<li><a
href="600af05703"><code>600af05</code></a>
Bump version from &quot;8.5.0&quot; to &quot;8.5.1&quot; [skip ci]</li>
<li>See full diff in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.2/code/lib/cli">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-28 13:21:24 +00:00
Abhimanyu Yadav
56dc5610e4 fix(frontend) : add default support in oneOf field (#9352)
- Adds support for a default value in the oneOf field in the
`node-input-components.tsx` file
- I have also tested it manually, and it works.
2025-01-28 12:48:53 +00:00
Mario Sacaj
8685df587b CodeExecutionBlock split into InstantiationBlock & StepExecutionBlock (#9158)
Especially in the case of heavy sandbox customization, we don't want to spawn
a new VM every time we need to execute code, particularly when the time of
inactivity between executions is negligible.

### Changes 🏗️

Added two new blocks inside
`autogpt_platform/backend/backend/blocks/code_executor.py` which split
the logic of the CodeExecutionBlock into two parts: InstantiationBlock and
StepExecutionBlock.

The overall setup is done in the InstantiationBlock, which then passes the
`sandbox_id` identifying the sandbox instance as its output.

At a later stage, any line of code can be executed through the
StepExecutionBlock inside the same instance using the e2b library
function `sandbox = Sandbox.connect(sandbox_id=sandbox_id,
api_key=api_key)`. Variables and memory from earlier executions or from
the instantiation are persisted.

Beware: this approach does not make use of the beta APIs to pause and
resume an instance. With the approach above, the VM will likely be
killed after the inactivity timeout set at instantiation time.
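
A minimal sketch of the flow the two blocks enable, assuming the
`e2b_code_interpreter` Python SDK; the setup code and variable names here are
illustrative, not the exact contents of `code_executor.py`:

```python
from e2b_code_interpreter import Sandbox

API_KEY = "e2b_..."  # placeholder

# InstantiationBlock side: create the sandbox once, run setup code,
# and hand out its ID for later steps.
sandbox = Sandbox(api_key=API_KEY, timeout=300)  # inactivity timeout (seconds)
sandbox.run_code("state = {'calls': 0}")         # setup / warm-up code
sandbox_id = sandbox.sandbox_id

# StepExecutionBlock side: reconnect to the same instance and keep executing.
step = Sandbox.connect(sandbox_id=sandbox_id, api_key=API_KEY)
result = step.run_code("state['calls'] += 1\nstate['calls']")
print(result.text)  # variables from the earlier setup are still in memory
```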


### New Features:

* **InstantiationBlock Class**:
- Added a new class to instantiate an isolated sandbox environment with
internet access for code execution. This class includes input and output
schemas, methods for running and executing code, and handles sandbox
setup commands and code execution.

* **StepExecutionBlock Class**:
- Introduced a new class to execute code in a previously instantiated
sandbox environment. This class also includes input and output schemas,
methods for running and executing step code, and handles connecting to
an existing sandbox for code execution.

### Minor Changes:

* **ProgrammingLanguage Enum**:
- Added a blank line for better readability within the
`ProgrammingLanguage` enum definition.

---------

Co-authored-by: Toran Bruce Richards <toran.richards@gmail.com>
Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2025-01-28 12:45:18 +00:00
Nicholas Tindle
7609969169 feat(blocks): Add blocks for GitHub checks & statuses (#9271)
<!-- Clearly explain the need for these changes: -->
We really want our agents to be able to automate behaviors that involve
blocking or unblocking PRs.

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->
- Adds status blocks for GitHub (see the sketch below)
- Modifies the pull requests block so that "include PR changes" is no
longer hidden under advanced, because that's a wild place to hide it
- Adds some disabled checks blocks that require a GitHub App
- Adds an IfMatches block to make these easier to use
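
A hypothetical sketch of what a GitHub status block boils down to: setting a
commit status via the REST API (`POST /repos/{owner}/{repo}/statuses/{sha}`).
The helper name and parameters are illustrative, not the blocks' actual code:

```python
import requests


def set_commit_status(
    token: str,
    owner: str,
    repo: str,
    sha: str,
    state: str,  # "error", "failure", "pending", or "success"
    context: str = "autogpt/agent-check",
    description: str = "",
    target_url: str | None = None,
) -> dict:
    """Create a commit status, e.g. to block or unblock a PR check."""
    url = f"https://api.github.com/repos/{owner}/{repo}/statuses/{sha}"
    payload = {"state": state, "context": context, "description": description}
    if target_url:
        payload["target_url"] = target_url
    resp = requests.post(
        url,
        json=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```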

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Built agent using

---------

Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2025-01-28 12:34:09 +00:00
Zamil Majdy
e0d153b284 fix(backend): Make spend credit failure as part of block execution failure (#9340)
<img width="1427" alt="image"
src="https://github.com/user-attachments/assets/de48ceb7-2a90-44e6-a687-d6759a1aa434"
/>

### Changes 🏗️

Block executions that cost credits can fail; the scope of this fix is to
treat such an error as part of the node execution instead of as a system
error.
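
A minimal sketch of the idea, not the actual executor code; the error type,
the `spend_credits` callable, and the result shape are assumptions for
illustration:

```python
class InsufficientBalanceError(Exception):
    """Raised when a user's credit balance cannot cover a block execution."""


async def execute_node(node, user_id: str, run_block, spend_credits) -> dict:
    # Charge credits as part of the node execution, so a charge failure is
    # reported as a failed node output rather than surfacing as a system error.
    try:
        await spend_credits(user_id, node.cost)
        output = await run_block(node)
        return {"status": "COMPLETED", "output": output}
    except InsufficientBalanceError as e:
        return {"status": "FAILED", "error": str(e)}
```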

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-28 12:21:31 +00:00
Reinier van der Leer
0811e8a990 feat(backend): Enable executing store agents without agent ownership (#9267)
This re-introduces PR #9179 with some fixes.

This PR enables the execution of store agents even if they are not owned
by the user. Key changes include handling store-listed agents in the
`get_graph` logic, improving execution flow, and ensuring
version-specific handling. These updates support more flexible agent
execution.

### Changes 🏗️
(copied from #9179)

- **Graph Retrieval:** Updated `get_graph` to check store listings for
agents not owned by the user (see the sketch below).
- **Version Handling:** Added `graph_version` to execution methods for
consistent version-specific execution.
- **Execution Flow:** Refactored `scheduler.py`, `rest_api.py`, and
other modules for clearer logic and better maintainability.
- **Testing:** Updated `test_manager.py` and other test cases to
validate execution of store-listed agents; added a test for accessing the graph.

Out-of-scope changes:
- Add logic to pretty-print Pydantic validation error responses to
backend API client in frontend
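
A rough sketch of the `get_graph` fallback described above; the callables and
the `is_store_listed` check are placeholders, not the actual database API:

```python
async def get_graph(graph_id: str, version: int | None, user_id: str,
                    fetch_graph, is_store_listed):
    # First, try to fetch the graph as one the user owns.
    graph = await fetch_graph(graph_id, version, owner_id=user_id)
    if graph is None and await is_store_listed(graph_id, version):
        # Fall back: the graph version is listed in the store, so allow
        # execution even though the user does not own it.
        graph = await fetch_graph(graph_id, version, owner_id=None)
    return graph
```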

---------

Co-authored-by: Nicholas Tindle <nicktindle@outlook.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Swifty <craigswift13@gmail.com>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-01-28 11:00:32 +00:00
Zamil Majdy
a7a59a26b9 feat(platform): Implement Auto-Top-Up credits capability (#9278)
<img width="1157" alt="image"
src="https://github.com/user-attachments/assets/5b70aa1e-876f-4163-b69e-c22bd21ea8d3"
/>

### Changes 🏗️

Added auto-top-up credits capability:
* Added autoTopUpConfig column on user
* Added autoTopUp form UI
* Added payment charge logic for top_up_credits 
* Added auto-top-up logic on spend_credits (see the sketch below)
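
A rough sketch of the auto-top-up check inside the spend path; the helper
callables and `AutoTopUpConfig` fields are assumptions for illustration, not
the actual credit model API:

```python
from dataclasses import dataclass


@dataclass
class AutoTopUpConfig:
    threshold: int  # top up when the balance would drop below this
    amount: int     # how many credits to add per top-up


async def spend_credits(user_id: str, cost: int, get_balance,
                        get_auto_top_up_config, top_up_credits) -> int:
    balance = await get_balance(user_id)
    config: AutoTopUpConfig | None = await get_auto_top_up_config(user_id)

    # If spending would drop the balance below the configured threshold,
    # charge the user's saved payment method and add credits first.
    if config and balance - cost < config.threshold:
        await top_up_credits(user_id, config.amount)
        balance += config.amount

    if balance < cost:
        raise ValueError("Insufficient balance, even after auto-top-up")

    return balance - cost  # caller persists the new balance
```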

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Krzysztof Czerwinski <kpczerwinski@gmail.com>
2025-01-28 10:18:29 +00:00
Bently
40897ce874 fix(blocks): add ollama_host to AIConversationBlock (#9351)
<!-- Clearly explain the need for these changes: -->

### Changes 🏗️

The AI Conversation block uses
``AIStructuredResponseGeneratorBlock.Input`` for its input, but it is
missing ollama_host. This means that if I try to use Ollama with this
block, ollama_host is not passed. This PR fixes that by simply adding
``ollama_host=input_data.ollama_host`` to ``class
AIConversationBlock(AIBlockBase):``

```diff
AIStructuredResponseGeneratorBlock.Input(
    prompt="",
    credentials=input_data.credentials,
    model=input_data.model,
    conversation_history=input_data.messages,
    max_tokens=input_data.max_tokens,
    expected_format={},
++  ollama_host=input_data.ollama_host,
),
```

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- [x] Run the AI Conversation block with a normal provider like GPT-4 to
see if it works like normal
- [x] Run the AI Conversation block with Ollama to make sure the changes
made work
2025-01-28 08:23:05 +00:00
Nicholas Tindle
ac8a466cda feat(platform): Add username+password credentials type; fix email and reddit blocks (#9113)
<!-- Clearly explain the need for these changes: -->
Updates and adds a basic credential field for use in integrations like
Reddit.


### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->
- Reddit
	- Drops the Username and Password for reddit from the .env
	- Updates Reddit block with modern provider and credential system
- moves the client ID and secret to reading from `Settings().secrets` rather
than taking them as input on the block
	- moves user agent to `Settings().config`

- SMTP
  - update the block to support user password and modern credentials

- Add `UserPasswordCredentials`
	- Default API key expiry to None explicitly to help type cohesion
- add `UserPasswordCredentials` with a weird form of `bearer`, which we
should ideally replace because `basic` is a more appropriate name (see the
sketch below). This is dependent on `Webhook _base` allowing a subset of `Credentials`
	- Update `Credentials` and `CredentialsType`
- Fix various `OAuth2Credentials | APIKeyCredentials` -> `Credentials`
mismatches between base and derived classes
- Replace `router/@post(create_api_key_credentials)` with
`create_credentials` which now takes a credential and is discriminated
by `type` provided by the credential

- UI/Frontend
- Updated various pages to have saved credential types, icons, and text
for User Pass Credentials
- Update credential input to have an input/modals/selects for user/pass
combos
- Update the types to support having user/pass credentials too (we
should make this more centralized)
	- Update Credential Providers to support user_password
- Update `client.ts` to support the new streamlined credential creation
method and endpoint

- DX
	- Sort the provider names **again**
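
A hypothetical sketch of what a username+password credentials model could look
like, assuming Pydantic; the field names and the Basic-auth helper are
illustrative, not the repo's actual definitions (the PR notes the current
implementation still builds a bearer-style header):

```python
import base64
from typing import Literal, Optional

from pydantic import BaseModel, SecretStr


class UserPasswordCredentials(BaseModel):
    id: str
    provider: str
    type: Literal["user_password"] = "user_password"
    title: Optional[str] = None
    username: SecretStr
    password: SecretStr

    def auth_header(self) -> str:
        # HTTP Basic auth: base64("username:password").
        raw = (
            f"{self.username.get_secret_value()}:"
            f"{self.password.get_secret_value()}"
        )
        return "Basic " + base64.b64encode(raw.encode()).decode()
```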


TODO:
- [x] Reactivate Conditionally Disabling Reddit
~~- [ ] Look into moving Webhooks base to allow subset of `Credentials`
rather than requiring all webhooks to support the input of all valid
`Credentials` types~~ Out of scope
- [x] Figure out the `singleCredential` calculator in
`credentials-input.tsx` so that it also respects User Pass credentials
and isn't a logic mess

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] Test with agents

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-01-28 08:07:50 +00:00
Nicholas Tindle
97ecaf5639 fix(test): timeout is causing intermittent failures (#9341)
<!-- Clearly explain the need for these changes: -->
Tests keep failing
### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->
- increases a timeout value from 20 to 30
2025-01-27 21:09:59 +00:00
Nicholas Tindle
536bf8cdaf ci(repo): Update repo-close-stale-issues.yml (#9339)
<!-- Clearly explain the need for these changes: -->
We are marking issues stale that aren't actually stale. With the addition of
more careful project management, this has become a hindrance rather than a
help.

<!-- Concisely describe all of the changes made in this pull request:
-->
- increases days until stale from 50 to 100
2025-01-27 08:55:35 +00:00
Zamil Majdy
bb3be444de feat(platform): Add multimedia file support & add basic Video blocks (#9320)
Currently, there is no support for passing files in the platform; each
generated file has to be hosted somewhere.
This PR adds support for passing files temporarily during execution, to
open up more blocks that do multimedia operations.

<img width="583" alt="image"
src="https://github.com/user-attachments/assets/c285de5a-c2a9-41a0-9be1-305a316879d6"
/>

<img width="1291" alt="image"
src="https://github.com/user-attachments/assets/d7bcaf38-80fa-4b51-91da-b4eed80a02c1"
/>


### Changes 🏗️

* Add media support for passing files (local files, base64, URL) and
`FileStoreBlock` (a file version of `StoreValueBlock`); see the sketch below
* Add initial multimedia blocks: `LoopVideoBlock` &
`AddAudioToVideoBlock`.
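
A minimal sketch of how a file reference might be normalized for use inside an
execution; the helper name and storage layout are assumptions for
illustration, not the platform's actual file-store code:

```python
import base64
import os
import uuid

import requests


def store_media_file(source: str, exec_dir: str) -> str:
    """Normalize a file reference (URL, base64 data URI, or local path)
    into a file stored under the execution's temporary directory."""
    os.makedirs(exec_dir, exist_ok=True)
    target = os.path.join(exec_dir, uuid.uuid4().hex)

    if source.startswith(("http://", "https://")):
        # Remote file: download it into the execution's directory.
        with open(target, "wb") as f:
            f.write(requests.get(source, timeout=30).content)
    elif source.startswith("data:"):
        # Base64 data URI: decode the payload after the comma.
        _, b64 = source.split(",", 1)
        with open(target, "wb") as f:
            f.write(base64.b64decode(b64))
    else:
        # Already a local path within the execution's file store.
        target = source

    return target
```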

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-01-26 16:08:05 +00:00
Nicholas Tindle
996efaf4fc fix: check if linear is enabled before showing blocks (#9338)
<!-- Clearly explain the need for these changes: -->
Linear blocks show up even if OAuth isn't configured

### Changes 🏗️
Disables the Linear blocks if OAuth isn't configured.
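
A hypothetical sketch of the gating logic; the environment variable names are
assumptions for illustration, not the platform's actual settings keys:

```python
import os


def linear_oauth_is_configured() -> bool:
    # Only enable the Linear blocks when both OAuth client credentials
    # are present in the environment/settings.
    return bool(os.getenv("LINEAR_CLIENT_ID")) and bool(
        os.getenv("LINEAR_CLIENT_SECRET")
    )


LINEAR_BLOCKS_DISABLED = not linear_oauth_is_configured()
```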

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [x] checked block list with and without oauth configured
2025-01-26 15:06:03 +00:00
Zamil Majdy
479a8470fe fix(block): Remove TwitterUnblockUserBlock & TwitterBlockUserBlock (#9337)
Twitter API v2 Client update:
```
    .. versionadded:: 4.0

    .. versionchanged:: 4.15
        Removed ``block`` and ``unblock`` methods, as the endpoints they use
        have been removed
```

### Changes 🏗️

Remove TwitterUnblockUserBlock & TwitterBlockUserBlock.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-26 14:06:39 +00:00
Azraf Nahian
12eb495b2c Added autocomplete (#9333)
**Clearly explain the need for these changes:**
The changes were made to enhance the usability and accessibility of the
login and signup forms. By adding appropriate `autocomplete` attributes,
the forms are now more compatible with browsers and password managers,
improving the user experience by supporting autofill, password
generation, and accessibility.

Additionally, the `yarn.lock` file was updated to reflect dependency
changes made during this work, ensuring consistent dependency resolution
across all environments.

This PR also resolves **issue #9312**, which addresses missing
`autocomplete` attributes on login and signup forms.

---

### **Changes 🏗️**
- Added `autocomplete="username"` to the email input field in both login
and signup forms.
- Added `autocomplete="current-password"` to the password input field in
the login form.
- Added `autocomplete="new-password"` to the password and confirm
password input fields in the signup form.
- Updated `yarn.lock` to reflect dependency updates.
- Improved browser compatibility and user accessibility.
- Enhanced support for autofill and password generation for better form
usability.

---

### **Checklist 📋**

#### For code changes:
- [x] I have clearly listed my changes in the PR description.
- [x] I have made a test plan.
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
- [x] Verified the `autocomplete` attributes are present in the rendered
HTML using browser DevTools.
- [x] Tested the login form for autofill compatibility with email and
password.
- [x] Tested the signup form with password generation using a password
manager.
  - [x] Ensured no functionality or styling was affected by the changes.

<details>
  <summary>Example test plan</summary>
  
- [x] Open the login form and verify that email
(`autocomplete="username"`) and password
(`autocomplete="current-password"`) inputs work with browser autofill.
- [x] Open the signup form and confirm that the password manager
suggests generating a strong password for `autocomplete="new-password"`.
  - [x] Verify that form submissions work as expected with no errors.
</details>

---

#### For configuration changes:
- [x] `.env.example` is updated or already compatible with my changes.
- [x] `docker-compose.yml` is updated or already compatible with my
changes.
- [x] I have included a list of my configuration changes in the PR
description (under **Changes**).

<details>
  <summary>Examples of configuration changes</summary>

No configuration changes were required for this PR.

</details> 

---

### **Issue Resolved:**
This pull request resolves **issue #9312** by adding the necessary
`autocomplete` attributes to the login and signup forms to improve
accessibility and compatibility with password managers.

---------

Co-authored-by: Nicholas Tindle <nicktindle@outlook.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-01-26 04:19:41 +00:00
Zamil Majdy
d74e4ef1a8 feat(block): Add LLM prompt as the output pin (#9330)
### Changes 🏗️

To ease debugging, this exposes the prompt sent to the LLM provider as an
additional output.

<img width="418" alt="image"
src="https://github.com/user-attachments/assets/0c8d7502-4f87-4002-a498-331f341859bd"
/>
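
A self-contained sketch of the idea: the block's `run` generator yields the
prompt it built alongside the response, so the prompt shows up as an output
pin. The function names here are illustrative, not the repo's block classes:

```python
import asyncio
from typing import Any, AsyncGenerator, Tuple


async def call_provider(messages: list[dict]) -> str:
    # Stand-in for the real LLM API call.
    return f"(stub) response to {len(messages)} message(s)"


async def run_llm_block(user_prompt: str) -> AsyncGenerator[Tuple[str, Any], None]:
    prompt = [{"role": "user", "content": user_prompt}]
    response = await call_provider(prompt)
    yield "response", response
    # New: also yield the exact prompt so it is visible as an output pin.
    yield "prompt", prompt


async def main() -> None:
    async for pin, value in run_llm_block("Hello"):
        print(pin, value)


asyncio.run(main())
```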


### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-25 12:33:41 +00:00
Ayush Mittal
5b7a491a36 Resolve for issue #9082: Marketplace - "Select All" doesn't work in Agent Dashboard (#9247)
### Changes 🏗️

I have made changes to the AgentTable and AgentTableRow components in the
autogpt_platform/frontend directory. The selectedAgents state is now passed
to the AgentTableRow component, so that when Select All is checked, the
individual checkboxes also get checked.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description


#### For configuration changes:
- [ ] `.env.example` was already compatible with my changes
- [ ] `docker-compose.yml` was already compatible with my changes

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-01-24 17:16:51 +00:00
dependabot[bot]
016b4a6313 chore(frontend/deps-dev): bump the development-dependencies group across 1 directory with 17 updates (#9298)
Bumps the development-dependencies group with 15 updates in the
/autogpt_platform/frontend directory:

| Package | From | To |
| --- | --- | --- |
|
[@chromatic-com/storybook](https://github.com/chromaui/addon-visual-tests)
| `3.2.3` | `3.2.4` |
|
[@storybook/addon-a11y](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/a11y)
| `8.4.7` | `8.5.0` |
|
[@storybook/addon-essentials](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/essentials)
| `8.4.7` | `8.5.0` |
|
[@storybook/addon-interactions](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/interactions)
| `8.4.7` | `8.5.0` |
|
[@storybook/addon-links](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/links)
| `8.4.7` | `8.5.0` |
|
[@storybook/addon-onboarding](https://github.com/storybookjs/storybook/tree/HEAD/code/addons/onboarding)
| `8.4.7` | `8.5.0` |
|
[@storybook/blocks](https://github.com/storybookjs/storybook/tree/HEAD/code/lib/blocks)
| `8.4.7` | `8.5.0` |
|
[@storybook/nextjs](https://github.com/storybookjs/storybook/tree/HEAD/code/frameworks/nextjs)
| `8.4.7` | `8.5.0` |
|
[@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node)
| `22.10.5` | `22.10.7` |
| [chromatic](https://github.com/chromaui/chromatic-cli) | `11.22.0` |
`11.25.0` |
|
[eslint-config-next](https://github.com/vercel/next.js/tree/HEAD/packages/eslint-config-next)
| `15.1.3` | `15.1.5` |
| [postcss](https://github.com/postcss/postcss) | `8.4.49` | `8.5.1` |
|
[prettier-plugin-tailwindcss](https://github.com/tailwindlabs/prettier-plugin-tailwindcss)
| `0.6.9` | `0.6.10` |
|
[storybook](https://github.com/storybookjs/storybook/tree/HEAD/code/lib/cli)
| `8.4.7` | `8.5.0` |
| [typescript](https://github.com/microsoft/TypeScript) | `5.7.2` |
`5.7.3` |


Updates `@chromatic-com/storybook` from 3.2.3 to 3.2.4
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/chromaui/addon-visual-tests/blob/v3.2.4/CHANGELOG.md"><code>@​chromatic-com/storybook</code>'s
changelog</a>.</em></p>
<blockquote>
<h1>v3.2.4 (Fri Jan 17 2025)</h1>
<h4>🐛 Bug Fix</h4>
<ul>
<li>Remove the connection timeout notification <a
href="https://redirect.github.com/chromaui/addon-visual-tests/pull/351">#351</a>
(<a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>)</li>
<li>Set up Codecov <a
href="https://redirect.github.com/chromaui/addon-visual-tests/pull/350">#350</a>
(<a
href="https://github.com/paulelliott"><code>@​paulelliott</code></a>)</li>
</ul>
<h4>Authors: 2</h4>
<ul>
<li>Paul Elliott (<a
href="https://github.com/paulelliott"><code>@​paulelliott</code></a>)</li>
<li>Valentin Palkovic (<a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>)</li>
</ul>
<hr />
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="a490548df4"><code>a490548</code></a>
Bump version to: 3.2.4 [skip ci]</li>
<li><a
href="d86ecff42c"><code>d86ecff</code></a>
Update CHANGELOG.md [skip ci]</li>
<li><a
href="dcee16503c"><code>dcee165</code></a>
Merge pull request <a
href="https://redirect.github.com/chromaui/addon-visual-tests/issues/351">#351</a>
from chromaui/valentin/remove-connection-timeout-noti...</li>
<li><a
href="127a468c9e"><code>127a468</code></a>
Remove the connection timeout notification</li>
<li><a
href="a7a43dae6d"><code>a7a43da</code></a>
Merge pull request <a
href="https://redirect.github.com/chromaui/addon-visual-tests/issues/350">#350</a>
from chromaui/set-up-codecov</li>
<li><a
href="8499c6a425"><code>8499c6a</code></a>
Upload coverage reports to Codecov</li>
<li><a
href="476a673686"><code>476a673</code></a>
Set up vitest coverage reporting</li>
<li><a
href="74d3d8d575"><code>74d3d8d</code></a>
Update vitest to 2.1.8</li>
<li>See full diff in <a
href="https://github.com/chromaui/addon-visual-tests/compare/v3.2.3...v3.2.4">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-a11y` from 8.4.7 to 8.5.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/releases"><code>@​storybook/addon-a11y</code>'s
releases</a>.</em></p>
<blockquote>
<h2>v8.5.0</h2>
<h2>8.5.0</h2>
<p>Storybook 8.5 is packed with powerful features to enhance your
development workflow. This release makes it easier than ever to build
accessible, well-tested UIs. Here’s what’s new:</p>
<ul>
<li>🦾 Realtime accessibility tests to help build UIs for everybody</li>
<li>🛡️ Project code coverage to measure the completeness of your
tests</li>
<li>🎯 Focused tests for faster test feedback</li>
<li>⚛️ React Native Web Vite framework (experimental) for testing mobile
UI</li>
<li>⚛️ React 19 support</li>
<li>🎁 Storybook test early access program to level up your testing
game</li>
<li>💯 Hundreds more improvements</li>
</ul>
<!-- raw HTML omitted -->
<ul>
<li>Addon A11y: Add conditional rendering for a11y violation number in
Testing Module - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30073">#30073</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Add typesVersions support for TypeScript definitions in
a11y package - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30005">#30005</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Adjust default behaviour when using with
experimental-addon-test - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30162">#30162</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Change default element selector - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30253">#30253</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Create a11y test provider and revamp a11y addon - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29643">#29643</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Don't set a11y tag as comment in automigrations - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30257">#30257</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Fix skipped status handling in Testing Module - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30077">#30077</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Refactor environment variable handling for Vitest
integration - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30022">#30022</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Remove warnings API - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30049">#30049</a>,
thanks <a
href="https://github.com/kasperpeulen"><code>@​kasperpeulen</code></a>!</li>
<li>Addon A11y: Run the a11y automigration on postInstall - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30004">#30004</a>,
thanks <a
href="https://github.com/kasperpeulen"><code>@​kasperpeulen</code></a>!</li>
<li>Addon A11y: Show errors of axe properly - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30050">#30050</a>,
thanks <a
href="https://github.com/kasperpeulen"><code>@​kasperpeulen</code></a>!</li>
<li>Addon A11y: Update accessibility status handling in
TestProviderRender - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30027">#30027</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon Docs: Dynamically import rehype - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29544">#29544</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon Docs: Make new code panel opt in - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30248">#30248</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
<li>Addon Onboarding: Prebundle react-confetti - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29996">#29996</a>,
thanks <a
href="https://github.com/yannbf"><code>@​yannbf</code></a>!</li>
<li>Addon Test: Add <code>@vitest/coverage-v8</code> during postinstall
if no coverage reporter is installed - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29993">#29993</a>,
thanks <a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>!</li>
<li>Addon Test: Add prerequisite check for MSW - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30193">#30193</a>,
thanks <a
href="https://github.com/yannbf"><code>@​yannbf</code></a>!</li>
<li>Addon Test: Add support for previewHead - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29808">#29808</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Addon Test: Add Vitest 3 support - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30181">#30181</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon Test: Always run Vitest in watch mode internally - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29749">#29749</a>,
thanks <a
href="https://github.com/JReinhold"><code>@​JReinhold</code></a>!</li>
<li>Addon Test: Always use installed version of vitest - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30134">#30134</a>,
thanks <a
href="https://github.com/kasperpeulen"><code>@​kasperpeulen</code></a>!</li>
<li>Addon Test: Clarify message when <code>vitest</code> detects missing
deps - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29763">#29763</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Addon Test: Clear coverage data when starting or watching - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30072">#30072</a>,
thanks <a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>!</li>
<li>Addon Test: Context menu UI - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29727">#29727</a>,
thanks <a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>!</li>
<li>Addon Test: Context menu updates - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30107">#30107</a>,
thanks <a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>!</li>
<li>Addon Test: Correctly stop Storybook when Vitest closes - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30012">#30012</a>,
thanks <a
href="https://github.com/JReinhold"><code>@​JReinhold</code></a>!</li>
<li>Addon Test: Filter out falsy test results in TestProviderRender - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30001">#30001</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon Test: Fix documentation links - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30128">#30128</a>,
thanks <a
href="https://github.com/yannbf"><code>@​yannbf</code></a>!</li>
<li>Addon Test: Fix duplicate <code>test.include</code> patterns - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30029">#30029</a>,
thanks <a
href="https://github.com/JReinhold"><code>@​JReinhold</code></a>!</li>
<li>Addon Test: Fix environment variable for Vitest Storybook
integration - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30054">#30054</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon Test: Fix error reporting for <code>vitest</code> crashes - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29751">#29751</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Addon Test: Fix generated path to <code>vitest.setup.js</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30233">#30233</a>,
thanks <a
href="https://github.com/JReinhold"><code>@​JReinhold</code></a>!</li>
<li>Addon Test: Fix indexing behavior - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29836">#29836</a>,
thanks <a
href="https://github.com/yannbf"><code>@​yannbf</code></a>!</li>
<li>Addon Test: Fix printing null% for coverage - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30061">#30061</a>,
thanks <a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>!</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/storybookjs/storybook/blob/next/CHANGELOG.md"><code>@​storybook/addon-a11y</code>'s
changelog</a>.</em></p>
<blockquote>
<h2>8.5.0</h2>
<p>Storybook 8.5 is packed with powerful features to enhance your
development workflow. This release makes it easier than ever to build
accessible, well-tested UIs. Here’s what’s new:</p>
<ul>
<li>🦾 Realtime accessibility tests to help build UIs for everybody</li>
<li>🛡️ Project code coverage to measure the completeness of your
tests</li>
<li>🎯 Focused tests for faster test feedback</li>
<li>⚛️ React Native Web Vite framework (experimental) for testing mobile
UI⚛️</li>
<li>🎁 Storybook test early access program to level up your testing
game</li>
<li>💯 Hundreds more improvements</li>
</ul>
<!-- raw HTML omitted -->
<ul>
<li>Addon A11y: Add conditional rendering for a11y violation number in
Testing Module - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30073">#30073</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Add typesVersions support for TypeScript definitions in
a11y package - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30005">#30005</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Adjust default behaviour when using with
experimental-addon-test - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30162">#30162</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Change default element selector - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30253">#30253</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Create a11y test provider and revamp a11y addon - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29643">#29643</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Don't set a11y tag as comment in automigrations - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30257">#30257</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Fix skipped status handling in Testing Module - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30077">#30077</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Refactor environment variable handling for Vitest
integration - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30022">#30022</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon A11y: Remove warnings API - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30049">#30049</a>,
thanks <a
href="https://github.com/kasperpeulen"><code>@​kasperpeulen</code></a>!</li>
<li>Addon A11y: Run the a11y automigration on postInstall - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30004">#30004</a>,
thanks <a
href="https://github.com/kasperpeulen"><code>@​kasperpeulen</code></a>!</li>
<li>Addon A11y: Show errors of axe properly - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30050">#30050</a>,
thanks <a
href="https://github.com/kasperpeulen"><code>@​kasperpeulen</code></a>!</li>
<li>Addon A11y: Update accessibility status handling in
TestProviderRender - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30027">#30027</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon Docs: Dynamically import rehype - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29544">#29544</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon Docs: Make new code panel opt in - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30248">#30248</a>,
thanks <a
href="https://github.com/shilman"><code>@​shilman</code></a>!</li>
<li>Addon Onboarding: Prebundle react-confetti - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29996">#29996</a>,
thanks <a
href="https://github.com/yannbf"><code>@​yannbf</code></a>!</li>
<li>Addon Test: Add <code>@vitest/coverage-v8</code> during postinstall
if no coverage reporter is installed - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29993">#29993</a>,
thanks <a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>!</li>
<li>Addon Test: Add prerequisite check for MSW - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30193">#30193</a>,
thanks <a
href="https://github.com/yannbf"><code>@​yannbf</code></a>!</li>
<li>Addon Test: Add support for previewHead - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29808">#29808</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Addon Test: Add Vitest 3 support - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30181">#30181</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon Test: Always run Vitest in watch mode internally - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29749">#29749</a>,
thanks <a
href="https://github.com/JReinhold"><code>@​JReinhold</code></a>!</li>
<li>Addon Test: Always use installed version of vitest - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30134">#30134</a>,
thanks <a
href="https://github.com/kasperpeulen"><code>@​kasperpeulen</code></a>!</li>
<li>Addon Test: Clarify message when <code>vitest</code> detects missing
deps - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29763">#29763</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Addon Test: Clear coverage data when starting or watching - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30072">#30072</a>,
thanks <a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>!</li>
<li>Addon Test: Context menu UI - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29727">#29727</a>,
thanks <a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>!</li>
<li>Addon Test: Context menu updates - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30107">#30107</a>,
thanks <a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>!</li>
<li>Addon Test: Correctly stop Storybook when Vitest closes - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30012">#30012</a>,
thanks <a
href="https://github.com/JReinhold"><code>@​JReinhold</code></a>!</li>
<li>Addon Test: Filter out falsy test results in TestProviderRender - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30001">#30001</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon Test: Fix documentation links - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30128">#30128</a>,
thanks <a
href="https://github.com/yannbf"><code>@​yannbf</code></a>!</li>
<li>Addon Test: Fix duplicate <code>test.include</code> patterns - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30029">#30029</a>,
thanks <a
href="https://github.com/JReinhold"><code>@​JReinhold</code></a>!</li>
<li>Addon Test: Fix environment variable for Vitest Storybook
integration - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30054">#30054</a>,
thanks <a
href="https://github.com/valentinpalkovic"><code>@​valentinpalkovic</code></a>!</li>
<li>Addon Test: Fix error reporting for <code>vitest</code> crashes - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29751">#29751</a>,
thanks <a
href="https://github.com/ndelangen"><code>@​ndelangen</code></a>!</li>
<li>Addon Test: Fix generated path to <code>vitest.setup.js</code> - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30233">#30233</a>,
thanks <a
href="https://github.com/JReinhold"><code>@​JReinhold</code></a>!</li>
<li>Addon Test: Fix indexing behavior - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29836">#29836</a>,
thanks <a
href="https://github.com/yannbf"><code>@​yannbf</code></a>!</li>
<li>Addon Test: Fix printing null% for coverage - <a
href="https://redirect.github.com/storybookjs/storybook/pull/30061">#30061</a>,
thanks <a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>!</li>
<li>Addon Test: Fix run request while booting or restarting Vitest - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29829">#29829</a>,
thanks <a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>!</li>
<li>Addon Test: Handle undefined storyId - <a
href="https://redirect.github.com/storybookjs/storybook/pull/29998">#29998</a>,
thanks <a
href="https://github.com/ghengeveld"><code>@​ghengeveld</code></a>!</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="92770672e5"><code>9277067</code></a>
Bump version from &quot;8.5.0-beta.11&quot; to &quot;8.5.0&quot; [skip
ci]</li>
<li><a
href="d8fe93ac1b"><code>d8fe93a</code></a>
Bump version from &quot;8.5.0-beta.10&quot; to &quot;8.5.0-beta.11&quot;
[skip ci]</li>
<li><a
href="426586d37a"><code>426586d</code></a>
Bump version from &quot;8.5.0-beta.9&quot; to &quot;8.5.0-beta.10&quot;
[skip ci]</li>
<li><a
href="6b337506e7"><code>6b33750</code></a>
Addon A11y: Change default element selector</li>
<li><a
href="a6a633a61e"><code>a6a633a</code></a>
Merge branch 'next-release' into next</li>
<li><a
href="b607dbe575"><code>b607dbe</code></a>
Bump version from &quot;8.5.0-beta.8&quot; to &quot;8.5.0-beta.9&quot;
[skip ci]</li>
<li><a
href="7e75f73809"><code>7e75f73</code></a>
Merge remote-tracking branch 'origin/next' into
valentin/a11y-refactorings</li>
<li><a
href="3b979ee412"><code>3b979ee</code></a>
Bump version from &quot;8.5.0-beta.7&quot; to &quot;8.5.0-beta.8&quot;
[skip ci]</li>
<li><a
href="21b452ac9e"><code>21b452a</code></a>
Rename a11ytest tag to a11y-test</li>
<li><a
href="8de05db935"><code>8de05db</code></a>
Enhance post-install script to inherit stdio for better logging during
addon ...</li>
<li>Additional commits viewable in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.0/code/addons/a11y">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-essentials` from 8.4.7 to 8.5.0
<details>
<summary>Commits</summary>
<ul>
<li><a
href="92770672e5"><code>9277067</code></a>
Bump version from &quot;8.5.0-beta.11&quot; to &quot;8.5.0&quot; [skip
ci]</li>
<li><a
href="d8fe93ac1b"><code>d8fe93a</code></a>
Bump version from &quot;8.5.0-beta.10&quot; to &quot;8.5.0-beta.11&quot;
[skip ci]</li>
<li><a
href="426586d37a"><code>426586d</code></a>
Bump version from &quot;8.5.0-beta.9&quot; to &quot;8.5.0-beta.10&quot;
[skip ci]</li>
<li><a
href="b607dbe575"><code>b607dbe</code></a>
Bump version from &quot;8.5.0-beta.8&quot; to &quot;8.5.0-beta.9&quot;
[skip ci]</li>
<li><a
href="3b979ee412"><code>3b979ee</code></a>
Bump version from &quot;8.5.0-beta.7&quot; to &quot;8.5.0-beta.8&quot;
[skip ci]</li>
<li><a
href="2b9f1cfc16"><code>2b9f1cf</code></a>
Bump version from &quot;8.5.0-beta.6&quot; to &quot;8.5.0-beta.7&quot;
[skip ci]</li>
<li><a
href="91f53fdf55"><code>91f53fd</code></a>
Bump version from &quot;8.5.0-beta.5&quot; to &quot;8.5.0-beta.6&quot;
[skip ci]</li>
<li><a
href="ef9ee273d6"><code>ef9ee27</code></a>
Bump version from &quot;8.5.0-beta.4&quot; to &quot;8.5.0-beta.5&quot;
[skip ci]</li>
<li><a
href="cf555266f5"><code>cf55526</code></a>
Bump version from &quot;8.5.0-beta.3&quot; to &quot;8.5.0-beta.4&quot;
[skip ci]</li>
<li><a
href="df37d0880e"><code>df37d08</code></a>
Bump version from &quot;8.5.0-beta.2&quot; to &quot;8.5.0-beta.3&quot;
[skip ci]</li>
<li>Additional commits viewable in <a
href="https://github.com/storybookjs/storybook/commits/v8.5.0/code/addons/essentials">compare
view</a></li>
</ul>
</details>
<br />

Updates `@storybook/addon-interactions` from 8.4.7 to 8.5.0
<details>
<summary>Commits</summary>
<ul>
<li><a
href="92770672e5"><code>9277067</code></a>
Bump version from &quot;8.5.0-beta.11&quot; to &quot;8.5.0&quot; [skip
ci]</li>
<li><a
href="e447db6537"><code>e447db6</code></a>
Merge branch 'next' into jeppe/fix-interactions-removal</li>
<li><a
href="1a0d0eaa34"><code>1a0d0ea</code></a>
move addon order check from preset.js to dist/preset.js</li>
<li><a
href="d8fe93ac1b"><code>d8fe93a</code></a>
Bump version from &quot;8.5.0-beta.10&quot; to &quot;8.5.0-beta.11&quot;
[skip ci]</li>
<li><a
href="426586d37a"><code>426586d</code></a>
Bump version from &quot;8.5.0-beta.9&quot; to &quot;8.5.0-beta.10&quot;
[skip ci]</li>
<li><a
href="b607dbe575"><code>b607dbe</code></a>
Bump version from &quot;8.5.0-beta.8&quot; to &quot;8.5.0-beta.9&quot;
[skip ci]</li>
<li><a
href="3b979ee412"><code>3b979ee</code></a>
Bump version from &quot;8.5.0-beta.7&quot; to &quot;8.5.0-beta.8&quot;
[skip ci]</li>
<li><a
href="2b9f1cfc16"><code>2b9f1cf</code></a>
Bump version from &quot;8.5.0-beta.6&quot; to &quot;8.5.0-beta.7&quot;
[skip ci]</li>
<li><a
href="91f53fdf55"><code>91f53fd</code></a>
Bump version from &quot;8.5.0-beta.5&quot; to &quot;8.5.0-beta.6&quot;
[skip ci]</li>
<li><a
href="ef9ee273d6"><code>ef9ee27</code></a>
Bump version from &quot;8.5.0-beta.4&quot; to &quot;8.5.0-beta.5&quot;
[skip ci]</li>
<li>Additional ...

_Description has been truncated_

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-01-24 16:21:09 +00:00
dependabot[bot]
21337e6bc4 chore(libs/deps): bump the production-dependencies group across 1 directory with 4 updates (#9237)
Bumps the production-dependencies group with 4 updates in the
/autogpt_platform/autogpt_libs directory:
[pydantic](https://github.com/pydantic/pydantic),
[pydantic-settings](https://github.com/pydantic/pydantic-settings),
[pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) and
[supabase](https://github.com/supabase/supabase-py).

Updates `pydantic` from 2.10.3 to 2.10.5
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/releases">pydantic's
releases</a>.</em></p>
<blockquote>
<h2>v2.10.4 2024-12-18</h2>
<h2>What's Changed</h2>
<h3>Packaging</h3>
<ul>
<li>Bump <code>pydantic-core</code> to v2.27.2 by <a
href="https://github.com/davidhewitt"><code>@​davidhewitt</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/11138">#11138</a></li>
</ul>
<h3>Fixes</h3>
<ul>
<li>Fix for comparison of <code>AnyUrl</code> objects by <a
href="https://github.com/alexprabhat99"><code>@​alexprabhat99</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11082">#11082</a></li>
<li>Properly fetch PEP 695 type params for functions, do not fetch
annotations from signature by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11093">#11093</a></li>
<li>Include JSON Schema input core schema in function schemas by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11085">#11085</a></li>
<li>Add <code>len</code> to <code>_BaseUrl</code> to avoid TypeError by
<a href="https://github.com/Kharianne"><code>@​Kharianne</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/11111">#11111</a></li>
<li>Make sure the type reference is removed from the seen references by
<a href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11143">#11143</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/alexprabhat99"><code>@​alexprabhat99</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11082">#11082</a></li>
<li><a href="https://github.com/Kharianne"><code>@​Kharianne</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11111">#11111</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic/compare/v2.10.3...v2.10.4">https://github.com/pydantic/pydantic/compare/v2.10.3...v2.10.4</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic/blob/main/HISTORY.md">pydantic's
changelog</a>.</em></p>
<blockquote>
<h2>v2.10.5 (2025-01-08)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.10.5">GitHub
release</a></p>
<h3>What's Changed</h3>
<ul>
<li>Remove custom MRO implementation of Pydantic models by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11184">#11184</a></li>
<li>Fix URL serialization for unions by <a
href="https://github.com/sydney-runkle"><code>@​sydney-runkle</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11233">#11233</a></li>
</ul>
<h2>v2.10.4 (2024-12-18)</h2>
<p><a
href="https://github.com/pydantic/pydantic/releases/tag/v2.10.4">GitHub
release</a></p>
<h3>What's Changed</h3>
<h4>Packaging</h4>
<ul>
<li>Bump <code>pydantic-core</code> to v2.27.2 by <a
href="https://github.com/davidhewitt"><code>@​davidhewitt</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/11138">#11138</a></li>
</ul>
<h4>Fixes</h4>
<ul>
<li>Fix for comparison of <code>AnyUrl</code> objects by <a
href="https://github.com/alexprabhat99"><code>@​alexprabhat99</code></a>
in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11082">#11082</a></li>
<li>Properly fetch PEP 695 type params for functions, do not fetch
annotations from signature by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11093">#11093</a></li>
<li>Include JSON Schema input core schema in function schemas by <a
href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11085">#11085</a></li>
<li>Add <code>len</code> to <code>_BaseUrl</code> to avoid TypeError by
<a href="https://github.com/Kharianne"><code>@​Kharianne</code></a> in
<a
href="https://redirect.github.com/pydantic/pydantic/pull/11111">#11111</a></li>
<li>Make sure the type reference is removed from the seen references by
<a href="https://github.com/Viicos"><code>@​Viicos</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11143">#11143</a></li>
</ul>
<h3>New Contributors</h3>
<ul>
<li><a href="https://github.com/FyZzyss"><code>@​FyZzyss</code></a> made
their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10789">#10789</a></li>
<li><a href="https://github.com/tamird"><code>@​tamird</code></a> made
their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic/pull/10948">#10948</a></li>
<li><a href="https://github.com/felixxm"><code>@​felixxm</code></a> made
their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11077">#11077</a></li>
<li><a
href="https://github.com/alexprabhat99"><code>@​alexprabhat99</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11082">#11082</a></li>
<li><a href="https://github.com/Kharianne"><code>@​Kharianne</code></a>
made their first contribution in <a
href="https://redirect.github.com/pydantic/pydantic/pull/11111">#11111</a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="5d34efda82"><code>5d34efd</code></a>
Prepare release v2.10.5 (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11237">#11237</a>)</li>
<li><a
href="6e585f925e"><code>6e585f9</code></a>
Fix url serialization for unions (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11233">#11233</a>)</li>
<li><a
href="5a22e02608"><code>5a22e02</code></a>
Remove custom MRO implementation of Pydantic models (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11195">#11195</a>)</li>
<li><a
href="5bd3a6507b"><code>5bd3a65</code></a>
fix history.md</li>
<li><a
href="46f094569a"><code>46f0945</code></a>
Prepare for v2.10.4 (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11144">#11144</a>)</li>
<li><a
href="ea69e695f2"><code>ea69e69</code></a>
Make sure the type reference is removed from the seen references (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11145">#11145</a>)</li>
<li><a
href="a07c31e4a4"><code>a07c31e</code></a>
Include JSON Schema input core schema in function schemas (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11142">#11142</a>)</li>
<li><a
href="9166d55163"><code>9166d55</code></a>
Update <code>WithJsonSchema</code> documentation, add usage
documentation for `json_sche...</li>
<li><a
href="572f57de01"><code>572f57d</code></a>
Rewrite validators documentation (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11060">#11060</a>)</li>
<li><a
href="9faa8d9cbd"><code>9faa8d9</code></a>
Fix for comparison of AnyUrl objects (<a
href="https://redirect.github.com/pydantic/pydantic/issues/11082">#11082</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pydantic/pydantic/compare/v2.10.3...v2.10.5">compare
view</a></li>
</ul>
</details>
<br />
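
The two URL-related fixes called out above (#11082, comparison of `AnyUrl` objects, and #11111, adding `len` support) are easiest to see in a short sketch. This is not part of the upstream notes; it is a minimal illustration that assumes pydantic >= 2.10.4 is installed and uses a made-up URL.

```python
from pydantic import AnyUrl, TypeAdapter

# Validate the same (made-up) URL twice through a TypeAdapter.
adapter = TypeAdapter(AnyUrl)
u1 = adapter.validate_python("https://example.com/docs")
u2 = adapter.validate_python("https://example.com/docs")

# #11082: equal AnyUrl instances compare equal.
assert u1 == u2

# #11111: URL objects gained __len__, so len() no longer raises TypeError.
assert len(u1) > 0
```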

Updates `pydantic-settings` from 2.7.0 to 2.7.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pydantic/pydantic-settings/releases">pydantic-settings's
releases</a>.</em></p>
<blockquote>
<h2>v2.7.1</h2>
<h2>What's Changed</h2>
<ul>
<li>Move preferred alias resolution to private method by <a
href="https://github.com/kschwab"><code>@​kschwab</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/507">pydantic/pydantic-settings#507</a></li>
<li>Prepare release 2.7.1 by <a
href="https://github.com/hramezani"><code>@​hramezani</code></a> in <a
href="https://redirect.github.com/pydantic/pydantic-settings/pull/511">pydantic/pydantic-settings#511</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/pydantic/pydantic-settings/compare/v2.7.0...v2.7.1">https://github.com/pydantic/pydantic-settings/compare/v2.7.0...v2.7.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="c989335d26"><code>c989335</code></a>
Prepare release 2.7.1 (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/511">#511</a>)</li>
<li><a
href="66ecc3adec"><code>66ecc3a</code></a>
Move preferred alias resolution to private method (<a
href="https://redirect.github.com/pydantic/pydantic-settings/issues/507">#507</a>)</li>
<li>See full diff in <a
href="https://github.com/pydantic/pydantic-settings/compare/v2.7.0...v2.7.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `pytest-asyncio` from 0.25.0 to 0.25.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/pytest-dev/pytest-asyncio/releases">pytest-asyncio's
releases</a>.</em></p>
<blockquote>
<h2>pytest-asyncio 0.25.2</h2>
<ul>
<li>Call <code>loop.shutdown_asyncgens()</code> before closing the event
loop to ensure async generators are closed in the same manner as
<code>asyncio.run</code> does <a
href="https://redirect.github.com/pytest-dev/pytest-asyncio/pull/1034">#1034</a></li>
</ul>
<h2>pytest-asyncio 0.25.1</h2>
<ul>
<li>Fixes an issue that caused a broken event loop when a
function-scoped test was executed in between two tests with wider loop
scope <a
href="https://redirect.github.com/pytest-dev/pytest-asyncio/issues/950">#950</a></li>
<li>Improves test collection speed in auto mode <a
href="https://redirect.github.com/pytest-dev/pytest-asyncio/pull/1020">#1020</a></li>
<li>Corrects the warning that is emitted upon redefining the event_loop
fixture</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="2188cdbbf7"><code>2188cdb</code></a>
build: Prepare release of v0.25.2.</li>
<li><a
href="c3ad6340b8"><code>c3ad634</code></a>
fix: Shutdown generators before closing event loops.</li>
<li><a
href="e8ffb10528"><code>e8ffb10</code></a>
[pre-commit.ci] pre-commit autoupdate</li>
<li><a
href="aae43d4f76"><code>aae43d4</code></a>
Build(deps): Bump hypothesis in /dependencies/default</li>
<li><a
href="941e8b5104"><code>941e8b5</code></a>
Build(deps): Bump pygments from 2.18.0 to 2.19.1 in
/dependencies/docs</li>
<li><a
href="623ab74b80"><code>623ab74</code></a>
docs: Prepare release of v0.25.1.</li>
<li><a
href="c236550e73"><code>c236550</code></a>
docs: Fix broken link to the pytest.mark.asyncio reference.</li>
<li><a
href="41c645b3b7"><code>41c645b</code></a>
fix: Correct warning message when redefining the event_loop
fixture.</li>
<li><a
href="2fd10f8243"><code>2fd10f8</code></a>
docs: Clarify deprecation of event_loop fixture.</li>
<li><a
href="a4e82ab25b"><code>a4e82ab</code></a>
docs: Added changelog entry for <a
href="https://redirect.github.com/pytest-dev/pytest-asyncio/issues/1020">#1020</a>.</li>
<li>Additional commits viewable in <a
href="https://github.com/pytest-dev/pytest-asyncio/compare/v0.25.0...v0.25.2">compare
view</a></li>
</ul>
</details>
<br />
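
The 0.25.2 change above (calling `loop.shutdown_asyncgens()` before closing the event loop) mirrors what `asyncio.run` already does. The sketch below is not from the PR; it is a small, self-contained illustration of why that call matters for async generators that rely on `finally` cleanup.

```python
import asyncio


async def ticker():
    # An async generator whose cleanup only runs if it is properly closed.
    try:
        while True:
            yield 1
            await asyncio.sleep(0)
    finally:
        print("async generator closed")


async def main():
    gen = ticker()
    await gen.__anext__()  # start the generator, then abandon it


# asyncio.run() shuts down async generators before closing its loop,
# so the generator's finally block runs. pytest-asyncio 0.25.2 now does
# the same for the event loops it manages.
asyncio.run(main())

# Rough equivalent with a manually managed loop:
loop = asyncio.new_event_loop()
try:
    loop.run_until_complete(main())
    loop.run_until_complete(loop.shutdown_asyncgens())
finally:
    loop.close()
```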

Updates `supabase` from 2.10.0 to 2.11.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-py/releases">supabase's
releases</a>.</em></p>
<blockquote>
<h2>v2.11.0</h2>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.10.0...v2.11.0">2.11.0</a>
(2024-12-30)</h2>
<h3>Features</h3>
<ul>
<li><strong>auth:</strong> bump gotrue from 2.11.0 to 2.11.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1021">#1021</a>)
(<a
href="17e8a662de">17e8a66</a>)</li>
<li><strong>functions:</strong> bump supafunc from 0.8.0 to 0.9.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1008">#1008</a>)
(<a
href="19ab5df525">19ab5df</a>)</li>
<li><strong>postgrest:</strong> bump postgrest from 0.19.0 to 0.19.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1022">#1022</a>)
(<a
href="9a8b72fe3d">9a8b72f</a>)</li>
<li><strong>realtime:</strong> bump realtime from 2.0.6 to 2.1.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1019">#1019</a>)
(<a
href="f251d520af">f251d52</a>)</li>
<li><strong>storage:</strong> bump storage3 from 0.10.0 to 0.11.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1020">#1020</a>)
(<a
href="fb7a7e1257">fb7a7e1</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>remove project reference (<a
href="https://redirect.github.com/supabase/supabase-py/issues/999">#999</a>)
(<a
href="e126d04d26">e126d04</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/supabase/supabase-py/blob/main/CHANGELOG.md">supabase's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/supabase/supabase-py/compare/v2.10.0...v2.11.0">2.11.0</a>
(2024-12-30)</h2>
<h3>Features</h3>
<ul>
<li><strong>auth:</strong> bump gotrue from 2.10.0 to 2.11.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1005">#1005</a>)
(<a
href="721de30e5b">721de30</a>)</li>
<li><strong>auth:</strong> bump gotrue from 2.11.0 to 2.11.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1021">#1021</a>)
(<a
href="17e8a662de">17e8a66</a>)</li>
<li><strong>functions:</strong> bump supafunc from 0.7.0 to 0.8.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1006">#1006</a>)
(<a
href="bfc4a5c1e3">bfc4a5c</a>)</li>
<li><strong>functions:</strong> bump supafunc from 0.8.0 to 0.9.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1008">#1008</a>)
(<a
href="19ab5df525">19ab5df</a>)</li>
<li><strong>postgrest:</strong> bump postgrest from 0.18.0 to 0.19.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1004">#1004</a>)
(<a
href="d7861b0c4d">d7861b0</a>)</li>
<li><strong>postgrest:</strong> bump postgrest from 0.19.0 to 0.19.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1022">#1022</a>)
(<a
href="9a8b72fe3d">9a8b72f</a>)</li>
<li><strong>realtime:</strong> bump realtime from 2.0.6 to 2.1.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1019">#1019</a>)
(<a
href="f251d520af">f251d52</a>)</li>
<li><strong>storage:</strong> bump storage3 from 0.10.0 to 0.11.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1020">#1020</a>)
(<a
href="fb7a7e1257">fb7a7e1</a>)</li>
<li><strong>storage:</strong> bump storage3 from 0.9.0 to 0.10.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1003">#1003</a>)
(<a
href="718edc3892">718edc3</a>)</li>
</ul>
<h3>Bug Fixes</h3>
<ul>
<li>remove project reference (<a
href="https://redirect.github.com/supabase/supabase-py/issues/999">#999</a>)
(<a
href="e126d04d26">e126d04</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="6fed6d64a1"><code>6fed6d6</code></a>
chore(main): release 2.11.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1000">#1000</a>)</li>
<li><a
href="9a8b72fe3d"><code>9a8b72f</code></a>
feat(postgrest): bump postgrest from 0.19.0 to 0.19.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1022">#1022</a>)</li>
<li><a
href="17e8a662de"><code>17e8a66</code></a>
feat(auth): bump gotrue from 2.11.0 to 2.11.1 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1021">#1021</a>)</li>
<li><a
href="fb7a7e1257"><code>fb7a7e1</code></a>
feat(storage): bump storage3 from 0.10.0 to 0.11.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1020">#1020</a>)</li>
<li><a
href="f251d520af"><code>f251d52</code></a>
feat(realtime): bump realtime from 2.0.6 to 2.1.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1019">#1019</a>)</li>
<li><a
href="616bc21629"><code>616bc21</code></a>
chore(deps-dev): bump jinja2 from 3.1.4 to 3.1.5 in the pip group across
1 di...</li>
<li><a
href="4f68eee25b"><code>4f68eee</code></a>
chore: Add Coveralls to CI (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1018">#1018</a>)</li>
<li><a
href="7ecf6b62c7"><code>7ecf6b6</code></a>
chore(deps-dev): bump pytest-asyncio from 0.24.0 to 0.25.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1014">#1014</a>)</li>
<li><a
href="3fa4814937"><code>3fa4814</code></a>
chore: update httpx (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1013">#1013</a>)</li>
<li><a
href="c81ffbaac5"><code>c81ffba</code></a>
chore(deps-dev): bump commitizen from 4.0.0 to 4.1.0 (<a
href="https://redirect.github.com/supabase/supabase-py/issues/1012">#1012</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/supabase/supabase-py/compare/v2.10.0...v2.11.0">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-01-24 15:54:29 +00:00
Zamil Majdy
88f711e486 fix(backend): Make monthly top-up adjust the target balance instead of top-up the target amount (#9295)
To maintain the integrity of the transaction running balance, the monthly
top-up has to adjust its amount so that the user's balance reaches at least a
target amount X, instead of topping the balance up by a fixed amount X.

### Changes 🏗️

* Add an additional query to sum the remaining un-snapshotted balance
(if any).
* The monthly top-up transaction now sets its amount to `target_amount -
current_balance` instead of a fixed `target_amount` (see the sketch below).
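
A minimal, self-contained sketch of the adjusted top-up calculation; the function name and the example numbers are illustrative, not taken from the codebase:

```python
# Minimal sketch of the adjusted monthly top-up: the refill only brings the
# user up to the target balance instead of granting a fixed amount.

def monthly_top_up_amount(current_balance: int, target_amount: int) -> int:
    """Return how many credits to grant so the balance ends at >= target_amount."""
    return max(target_amount - current_balance, 0)

# Example: a user holding 120 credits with a 500-credit target gets 380, not 500.
assert monthly_top_up_amount(current_balance=120, target_amount=500) == 380
# A user already above the target gets nothing.
assert monthly_top_up_amount(current_balance=650, target_amount=500) == 0
```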

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-24 14:43:43 +00:00
dependabot[bot]
56330b1dd3 chore(backend/deps-dev): bump the development-dependencies group across 1 directory with 5 updates (#9300)
Bumps the development-dependencies group with 5 updates in the
/autogpt_platform/backend directory:

| Package | From | To |
| --- | --- | --- |
| [poethepoet](https://github.com/nat-n/poethepoet) | `0.31.0` |
`0.32.1` |
| [ruff](https://github.com/astral-sh/ruff) | `0.8.3` | `0.9.2` |
| [pyright](https://github.com/RobertCraigie/pyright-python) | `1.1.389`
| `1.1.392.post0` |
| [aiohappyeyeballs](https://github.com/aio-libs/aiohappyeyeballs) |
`2.4.3` | `2.4.4` |
| [faker](https://github.com/joke2k/faker) | `33.1.0` | `33.3.1` |


Updates `poethepoet` from 0.31.0 to 0.32.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/nat-n/poethepoet/releases">poethepoet's
releases</a>.</em></p>
<blockquote>
<h2>v0.32.1</h2>
<h2>Enhancements</h2>
<ul>
<li>feat: Upgrade poetry dependency to make the poetry plugin work with
poetry 2.0 by <a
href="https://github.com/nat-n"><code>@​nat-n</code></a> in <a
href="https://redirect.github.com/nat-n/poethepoet/pull/269">nat-n/poethepoet#269</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/nat-n/poethepoet/compare/v0.32.0...v0.32.1">https://github.com/nat-n/poethepoet/compare/v0.32.0...v0.32.1</a></p>
<h2>0.32.0</h2>
<h2>Enhancements</h2>
<ul>
<li>
<p>Make command parsing support <em>default value</em> and <em>alternate
value</em> operations on param expansions by <a
href="https://github.com/nat-n"><code>@​nat-n</code></a> in <a
href="https://redirect.github.com/nat-n/poethepoet/pull/266">nat-n/poethepoet#266</a></p>
<ul>
<li>See feature <a
href="https://poethepoet.natn.io/tasks/task_types/cmd.html#parameter-expansion-operators">📖
documentation</a> for more details</li>
</ul>
</li>
<li>
<p>Explicitly disallow <code>capture_stdout</code> option on sequence
tasks by <a href="https://github.com/nat-n"><code>@​nat-n</code></a> in
<a
href="https://redirect.github.com/nat-n/poethepoet/pull/265">nat-n/poethepoet#265</a></p>
</li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/nat-n/poethepoet/compare/v0.31.1...v0.32.0">https://github.com/nat-n/poethepoet/compare/v0.31.1...v0.32.0</a></p>
<h2>0.31.1</h2>
<h2>Fixes</h2>
<ul>
<li>fix: Explicitly disallow capture_stdout option on sequence tasks by
<a href="https://github.com/nat-n"><code>@​nat-n</code></a> in <a
href="https://redirect.github.com/nat-n/poethepoet/pull/261">nat-n/poethepoet#261</a></li>
<li>fix: Allow env var defaults in included task files by <a
href="https://github.com/nat-n"><code>@​nat-n</code></a> in <a
href="https://redirect.github.com/nat-n/poethepoet/pull/263">nat-n/poethepoet#263</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/nat-n/poethepoet/compare/v0.31.0...v0.31.1">https://github.com/nat-n/poethepoet/compare/v0.31.0...v0.31.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9997e878d4"><code>9997e87</code></a>
Bump version to 0.32.1</li>
<li><a
href="d2fa221f57"><code>d2fa221</code></a>
feat: Upgrade poetry dependency to work with 2.0 (<a
href="https://redirect.github.com/nat-n/poethepoet/issues/269">#269</a>)</li>
<li><a
href="0133c42752"><code>0133c42</code></a>
Fix docs publishing workflow</li>
<li><a
href="f199b1135d"><code>f199b11</code></a>
Bump version to 0.32.0</li>
<li><a
href="a72a19378f"><code>a72a193</code></a>
Remove dev dependency on bpython</li>
<li><a
href="68e9e9ba1a"><code>68e9e9b</code></a>
Make command parsing support operations on param expansions (<a
href="https://redirect.github.com/nat-n/poethepoet/issues/266">#266</a>)</li>
<li><a
href="eecbb96e09"><code>eecbb96</code></a>
feat: Explicitly disallow capture_stdout option on sequence tasks (<a
href="https://redirect.github.com/nat-n/poethepoet/issues/265">#265</a>)</li>
<li><a
href="3884fcd240"><code>3884fcd</code></a>
fix: Run CI on pushing a version tag</li>
<li><a
href="525a1ebcac"><code>525a1eb</code></a>
docs: Document compatibility with uv projects</li>
<li><a
href="03546b7af7"><code>03546b7</code></a>
Bump version to 0.31.1</li>
<li>Additional commits viewable in <a
href="https://github.com/nat-n/poethepoet/compare/v0.31.0...v0.32.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `ruff` from 0.8.3 to 0.9.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.9.2</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Fix typo &quot;security_managr&quot; to
&quot;security_manager&quot; (<code>AIR303</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15463">#15463</a>)</li>
<li>[<code>airflow</code>] extend and fix AIR302 rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15525">#15525</a>)</li>
<li>[<code>fastapi</code>] Handle parameters with <code>Depends</code>
correctly (<code>FAST003</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15364">#15364</a>)</li>
<li>[<code>flake8-pytest-style</code>] Implement pytest.warns
diagnostics (<code>PT029</code>, <code>PT030</code>, <code>PT031</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/15444">#15444</a>)</li>
<li>[<code>flake8-pytest-style</code>] Test function parameters with
default arguments (<code>PT028</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15449">#15449</a>)</li>
<li>[<code>flake8-type-checking</code>] Avoid false positives for
<code>|</code> in <code>TC008</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15201">#15201</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-todos</code>] Allow VSCode GitHub PR extension style
links in <code>missing-todo-link</code> (<code>TD003</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15519">#15519</a>)</li>
<li>[<code>pyflakes</code>] Show syntax error message for
<code>F722</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15523">#15523</a>)</li>
</ul>
<h3>Formatter</h3>
<ul>
<li>Fix curly bracket spacing around f-string expressions containing
curly braces (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15471">#15471</a>)</li>
<li>Fix joining of f-strings with different quotes when using quote
style <code>Preserve</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15524">#15524</a>)</li>
</ul>
<h3>Server</h3>
<ul>
<li>Avoid indexing the same workspace multiple times (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15495">#15495</a>)</li>
<li>Display context for <code>ruff.configuration</code> errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15452">#15452</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>Remove <code>flatten</code> to improve deserialization error
messages (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15414">#15414</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Parse triple-quoted string annotations as if parenthesized (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15387">#15387</a>)</li>
<li>[<code>fastapi</code>] Update <code>Annotated</code> fixes
(<code>FAST002</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15462">#15462</a>)</li>
<li>[<code>flake8-bandit</code>] Check for <code>builtins</code> instead
of <code>builtin</code> (<code>S102</code>, <code>PTH123</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15443">#15443</a>)</li>
<li>[<code>flake8-pathlib</code>] Fix <code>--select</code> for
<code>os-path-dirname</code> (<code>PTH120</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15446">#15446</a>)</li>
<li>[<code>ruff</code>] Fix false positive on global keyword
(<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15235">#15235</a>)</li>
</ul>
<h2>Contributors</h2>
<ul>
<li><a
href="https://github.com/AlexWaygood"><code>@​AlexWaygood</code></a></li>
<li><a
href="https://github.com/BurntSushi"><code>@​BurntSushi</code></a></li>
<li><a
href="https://github.com/Daverball"><code>@​Daverball</code></a></li>
<li><a
href="https://github.com/Garrett-R"><code>@​Garrett-R</code></a></li>
<li><a
href="https://github.com/Glyphack"><code>@​Glyphack</code></a></li>
<li><a
href="https://github.com/InSyncWithFoo"><code>@​InSyncWithFoo</code></a></li>
<li><a href="https://github.com/Lee-W"><code>@​Lee-W</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a
href="https://github.com/cake-monotone"><code>@​cake-monotone</code></a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.9.2</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Fix typo &quot;security_managr&quot; to
&quot;security_manager&quot; (<code>AIR303</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15463">#15463</a>)</li>
<li>[<code>airflow</code>] extend and fix AIR302 rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15525">#15525</a>)</li>
<li>[<code>fastapi</code>] Handle parameters with <code>Depends</code>
correctly (<code>FAST003</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15364">#15364</a>)</li>
<li>[<code>flake8-pytest-style</code>] Implement pytest.warns
diagnostics (<code>PT029</code>, <code>PT030</code>, <code>PT031</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/15444">#15444</a>)</li>
<li>[<code>flake8-pytest-style</code>] Test function parameters with
default arguments (<code>PT028</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15449">#15449</a>)</li>
<li>[<code>flake8-type-checking</code>] Avoid false positives for
<code>|</code> in <code>TC008</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15201">#15201</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-todos</code>] Allow VSCode GitHub PR extension style
links in <code>missing-todo-link</code> (<code>TD003</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15519">#15519</a>)</li>
<li>[<code>pyflakes</code>] Show syntax error message for
<code>F722</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15523">#15523</a>)</li>
</ul>
<h3>Formatter</h3>
<ul>
<li>Fix curly bracket spacing around f-string expressions containing
curly braces (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15471">#15471</a>)</li>
<li>Fix joining of f-strings with different quotes when using quote
style <code>Preserve</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15524">#15524</a>)</li>
</ul>
<h3>Server</h3>
<ul>
<li>Avoid indexing the same workspace multiple times (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15495">#15495</a>)</li>
<li>Display context for <code>ruff.configuration</code> errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15452">#15452</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>Remove <code>flatten</code> to improve deserialization error
messages (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15414">#15414</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Parse triple-quoted string annotations as if parenthesized (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15387">#15387</a>)</li>
<li>[<code>fastapi</code>] Update <code>Annotated</code> fixes
(<code>FAST002</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15462">#15462</a>)</li>
<li>[<code>flake8-bandit</code>] Check for <code>builtins</code> instead
of <code>builtin</code> (<code>S102</code>, <code>PTH123</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15443">#15443</a>)</li>
<li>[<code>flake8-pathlib</code>] Fix <code>--select</code> for
<code>os-path-dirname</code> (<code>PTH120</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15446">#15446</a>)</li>
<li>[<code>ruff</code>] Fix false positive on global keyword
(<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15235">#15235</a>)</li>
</ul>
<h2>0.9.1</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>pycodestyle</code>] Run
<code>too-many-newlines-at-end-of-file</code> on each cell in notebooks
(<code>W391</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15308">#15308</a>)</li>
<li>[<code>ruff</code>] Omit diagnostic for shadowed private function
parameters in <code>used-dummy-variable</code> (<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15376">#15376</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-bugbear</code>] Improve
<code>assert-raises-exception</code> message (<code>B017</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15389">#15389</a>)</li>
</ul>
<h3>Formatter</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="0a39348381"><code>0a39348</code></a>
Include build binaries</li>
<li><a
href="027f8009e5"><code>027f800</code></a>
Comment out non-npm-publish jobs</li>
<li><a
href="425870df76"><code>425870d</code></a>
Upload npm publish logs when failed</li>
<li><a
href="c20255abe4"><code>c20255a</code></a>
Bump version to 0.9.2 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15529">#15529</a>)</li>
<li><a
href="420365811f"><code>4203658</code></a>
Fix joining of f-strings with different quotes when using quote style
`Preser...</li>
<li><a
href="fc9dd63d64"><code>fc9dd63</code></a>
[airflow] extend and fix AIR302 rules (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15525">#15525</a>)</li>
<li><a
href="79e52c7fdf"><code>79e52c7</code></a>
[<code>pyflakes</code>] Show syntax error message for <code>F722</code>
(<a
href="https://redirect.github.com/astral-sh/ruff/issues/15523">#15523</a>)</li>
<li><a
href="cf4ab7cba1"><code>cf4ab7c</code></a>
Parse triple quoted string annotations as if parenthesized (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15387">#15387</a>)</li>
<li><a
href="d2656e88a3"><code>d2656e8</code></a>
[<code>flake8-todos</code>] Allow VSCode GitHub PR extension style links
in `missing-tod...</li>
<li><a
href="c53ee608a1"><code>c53ee60</code></a>
Typeshed-sync workflow: add appropriate labels, link directly to failing
run ...</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.8.3...0.9.2">compare
view</a></li>
</ul>
</details>
<br />

Updates `pyright` from 1.1.389 to 1.1.392.post0
<details>
<summary>Commits</summary>
<ul>
<li><a
href="33dece9ee3"><code>33dece9</code></a>
chore: release v1.1.392.post0 (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/331">#331</a>)</li>
<li><a
href="f15e56f25d"><code>f15e56f</code></a>
feat: bundle pyright inside wheel (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/300">#300</a>)</li>
<li><a
href="f5c77313ff"><code>f5c7731</code></a>
[pyright updated to 1.1.392] Update Version (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/329">#329</a>)</li>
<li><a
href="08b251cffc"><code>08b251c</code></a>
CI: lower ubunutu version + bump macos + node 18 (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/330">#330</a>)</li>
<li><a
href="3356df1d40"><code>3356df1</code></a>
[pyright updated to 1.1.391] Update Version (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/327">#327</a>)</li>
<li><a
href="ee025bc694"><code>ee025bc</code></a>
Pyright NPM Package update to 1.1.390 (<a
href="https://redirect.github.com/RobertCraigie/pyright-python/issues/325">#325</a>)</li>
<li>See full diff in <a
href="https://github.com/RobertCraigie/pyright-python/compare/v1.1.389...v1.1.392.post0">compare
view</a></li>
</ul>
</details>
<br />

Updates `aiohappyeyeballs` from 2.4.3 to 2.4.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/aio-libs/aiohappyeyeballs/releases">aiohappyeyeballs's
releases</a>.</em></p>
<blockquote>
<h1>v2.4.4 (2024-11-30)</h1>
<h2>Fix</h2>
<ul>
<li>fix: handle OSError on failure to close socket instead of raising
IndexError (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/114">#114</a>)</li>
</ul>
<p>Co-authored-by: pre-commit-ci[bot] &lt;66853113+pre-commit-ci[bot]<a
href="https://github.com/users"><code>@​users</code></a>.noreply.github.com&gt;
Co-authored-by: J. Nick Koston &lt;<a
href="mailto:nick@koston.org">nick@koston.org</a>&gt; (<a
href="c542f684d3"><code>c542f68</code></a>)</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/aio-libs/aiohappyeyeballs/blob/main/CHANGELOG.md">aiohappyeyeballs's
changelog</a>.</em></p>
<blockquote>
<h2>v2.4.4 (2024-11-30)</h2>
<h3>Fix</h3>
<ul>
<li>Handle oserror on failure to close socket instead of raising
indexerror (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/114">#114</a>)
(<a
href="c542f684d3"><code>c542f68</code></a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3c4f2a6892"><code>3c4f2a6</code></a>
2.4.4</li>
<li><a
href="c542f684d3"><code>c542f68</code></a>
fix: handle OSError on failure to close socket instead of raising
IndexError ...</li>
<li><a
href="fd90f564d5"><code>fd90f56</code></a>
chore(pre-commit.ci): pre-commit autoupdate (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/116">#116</a>)</li>
<li><a
href="0653807446"><code>0653807</code></a>
chore: bump codecov-action to 5.0.3 (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/115">#115</a>)</li>
<li><a
href="90e01edddd"><code>90e01ed</code></a>
chore(pre-commit.ci): pre-commit autoupdate (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/113">#113</a>)</li>
<li><a
href="31825f2a3c"><code>31825f2</code></a>
chore(pre-commit.ci): pre-commit autoupdate (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/111">#111</a>)</li>
<li><a
href="4c23bcad40"><code>4c23bca</code></a>
chore: add missing FUNDING.yml (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/110">#110</a>)</li>
<li><a
href="b5dfff592e"><code>b5dfff5</code></a>
chore(pre-commit.ci): pre-commit autoupdate (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/108">#108</a>)</li>
<li><a
href="5a3b4cb871"><code>5a3b4cb</code></a>
chore: fix docs (<a
href="https://redirect.github.com/aio-libs/aiohappyeyeballs/issues/106">#106</a>)</li>
<li>See full diff in <a
href="https://github.com/aio-libs/aiohappyeyeballs/compare/v2.4.3...v2.4.4">compare
view</a></li>
</ul>
</details>
<br />

Updates `faker` from 33.1.0 to 33.3.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/joke2k/faker/releases">faker's
releases</a>.</em></p>
<blockquote>
<h2>Release v33.3.1</h2>
<p>See <a
href="https://github.com/joke2k/faker/blob/refs/tags/v33.3.1/CHANGELOG.md">CHANGELOG.md</a>.</p>
<h2>Release v33.3.0</h2>
<p>See <a
href="https://github.com/joke2k/faker/blob/refs/tags/v33.3.0/CHANGELOG.md">CHANGELOG.md</a>.</p>
<h2>Release v33.2.0</h2>
<p>See <a
href="https://github.com/joke2k/faker/blob/refs/tags/v33.2.0/CHANGELOG.md">CHANGELOG.md</a>.</p>
<h2>Release v33.1.3</h2>
<p>See <a
href="https://github.com/joke2k/faker/blob/refs/tags/v33.1.3/CHANGELOG.md">CHANGELOG.md</a>.</p>
<h2>Release v33.1.2</h2>
<p>See <a
href="https://github.com/joke2k/faker/blob/refs/tags/v33.1.2/CHANGELOG.md">CHANGELOG.md</a>.</p>
<h2>Release v33.1.1</h2>
<p>See <a
href="https://github.com/joke2k/faker/blob/refs/tags/v33.1.1/CHANGELOG.md">CHANGELOG.md</a>.</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/joke2k/faker/blob/master/CHANGELOG.md">faker's
changelog</a>.</em></p>
<blockquote>
<h3><a
href="https://github.com/joke2k/faker/compare/v33.3.0...v33.3.1">v33.3.1
- 2025-01-10</a></h3>
<ul>
<li>Fix <code>nl_BE</code> Bank Provider (BBAN, IBAN, SWIFT). Thanks <a
href="https://github.com/AliYmn"><code>@​AliYmn</code></a>.</li>
</ul>
<h3><a
href="https://github.com/joke2k/faker/compare/v33.2.3...v33.3.0">v33.3.0
- 2025-01-03</a></h3>
<ul>
<li>Add support for Zulu (<code>zu_ZA</code>) address provider and
corresponding tests. Thanks <a
href="https://github.com/AliYmn"><code>@​AliYmn</code></a>.</li>
</ul>
<h3><a
href="https://github.com/joke2k/faker/compare/v33.1.3...v33.2.0">v33.2.0
- 2025-01-03</a></h3>
<ul>
<li>Add currency provider for <code>uk_UA</code>. Thanks <a
href="https://github.com/SaulTigh"><code>@​SaulTigh</code></a>.</li>
</ul>
<h3><a
href="https://github.com/joke2k/faker/compare/v33.1.2...v33.1.3">v33.1.3
- 2025-01-03</a></h3>
<ul>
<li>Fix type annotation on Python 3.8.</li>
</ul>
<h3><a
href="https://github.com/joke2k/faker/compare/v33.1.1...v33.1.2">v33.1.2
- 2025-01-03</a></h3>
<ul>
<li>Fix <code>ru_RU</code> passport provider. Thanks <a
href="https://github.com/denisSurkov"><code>@​denisSurkov</code></a>.</li>
</ul>
<h3><a
href="https://github.com/joke2k/faker/compare/v33.1.0...v33.1.1">v33.1.1
- 2025-01-03</a></h3>
<ul>
<li>Fix address number output issue in <code>ko_KR</code> address
provider. Thanks <a
href="https://github.com/semi-yu"><code>@​semi-yu</code></a>.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="bbcab85add"><code>bbcab85</code></a>
Bump version: 33.3.0 → 33.3.1</li>
<li><a
href="9130687455"><code>9130687</code></a>
📝 Update CHANGELOG.md</li>
<li><a
href="f4ad7754ea"><code>f4ad775</code></a>
Fix <code>nl_BE</code> Bank Provider (BBAN, IBAN, SWIFT) (<a
href="https://redirect.github.com/joke2k/faker/issues/2142">#2142</a>)</li>
<li><a
href="a21084461e"><code>a210844</code></a>
Bump version: 33.2.0 → 33.3.0</li>
<li><a
href="8ec1609428"><code>8ec1609</code></a>
📝 Update CHANGELOG.md</li>
<li><a
href="4e2839e93b"><code>4e2839e</code></a>
fix administrative units un <code>zu_ZA</code> address provider</li>
<li><a
href="d9d70c4602"><code>d9d70c4</code></a>
Add support for Zulu (<code>zu_ZA</code>) address provider and
corresponding tests (<a
href="https://redirect.github.com/joke2k/faker/issues/2143">#2143</a>)</li>
<li><a
href="858e31f94e"><code>858e31f</code></a>
Bump version: 33.1.3 → 33.2.0</li>
<li><a
href="27a7005ef4"><code>27a7005</code></a>
📝 Update CHANGELOG.md</li>
<li><a
href="d72db5473c"><code>d72db54</code></a>
feat: add currency provider for <code>uk_UA</code> (<a
href="https://redirect.github.com/joke2k/faker/issues/2141">#2141</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/joke2k/faker/compare/v33.1.0...v33.3.1">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
2025-01-24 14:28:43 +00:00
Zamil Majdy
a584f1fdd2 feat(backend): Minimize the non caught error logic between graph exec creation and queueing (#9306)
There should be no code that can fail between graph execution creation and
the actual queueing; otherwise, the graph execution risks getting stuck in the
QUEUED status.

### Changes 🏗️

Moved the node execution creation and status update inside the graph
execution logic (see the sketch below).
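
A hedged sketch of the restructured flow, using hypothetical class and function names rather than the actual backend types: the only step between creating the execution record and queueing it is the enqueue call itself, and node execution creation now lives inside the guarded run logic.

```python
# Hypothetical sketch: nothing that can fail runs between creating the graph
# execution record and enqueueing it; node execution creation and status
# updates happen inside the guarded execution logic instead.
from dataclasses import dataclass, field
from queue import Queue


@dataclass
class GraphExecution:
    graph_id: str
    status: str = "QUEUED"
    node_executions: list = field(default_factory=list)


execution_queue: Queue = Queue()


def create_and_queue_execution(graph_id: str) -> GraphExecution:
    graph_exec = GraphExecution(graph_id=graph_id)  # create the execution record
    execution_queue.put(graph_exec)                 # enqueue immediately; nothing can fail in between
    return graph_exec


def run_execution(graph_exec: GraphExecution, node_ids: list[str]) -> None:
    # A failure here marks the run FAILED instead of leaving it stuck in QUEUED.
    try:
        graph_exec.status = "RUNNING"
        graph_exec.node_executions = [f"{graph_exec.graph_id}:{n}" for n in node_ids]
        graph_exec.status = "COMPLETED"
    except Exception:
        graph_exec.status = "FAILED"
        raise
```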

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-24 13:52:42 +00:00
Reinier van der Leer
da7aead361 fix(frontend): Fix page layouts (sizing + padding) (#9311)
- Resolves #9310

### Changes 🏗️

- Make base layout full width and fix its sizing behavior
  - Fix navbar overflowing on the right
- Set padding on `/monitoring`
- Make `/login` and `/signup` layouts self-center

### Checklist 📋

- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - Check layouts of all pages
    - `/signup`
    - `/login`
    - `/build`
    - `/monitoring`
    - `/store`
    - `/store/profile`
    - `/store/dashboard`
    - `/store/settings`

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-01-24 13:17:46 +00:00
dependabot[bot]
5383e8ba27 chore(libs/deps-dev): bump ruff from 0.8.6 to 0.9.2 in /autogpt_platform/autogpt_libs in the development-dependencies group across 1 directory (#9299)
Bumps the development-dependencies group with 1 update in the
/autogpt_platform/autogpt_libs directory:
[ruff](https://github.com/astral-sh/ruff).

Updates `ruff` from 0.8.6 to 0.9.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.9.2</h2>
<h2>Release Notes</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Fix typo &quot;security_managr&quot; to
&quot;security_manager&quot; (<code>AIR303</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15463">#15463</a>)</li>
<li>[<code>airflow</code>] extend and fix AIR302 rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15525">#15525</a>)</li>
<li>[<code>fastapi</code>] Handle parameters with <code>Depends</code>
correctly (<code>FAST003</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15364">#15364</a>)</li>
<li>[<code>flake8-pytest-style</code>] Implement pytest.warns
diagnostics (<code>PT029</code>, <code>PT030</code>, <code>PT031</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/15444">#15444</a>)</li>
<li>[<code>flake8-pytest-style</code>] Test function parameters with
default arguments (<code>PT028</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15449">#15449</a>)</li>
<li>[<code>flake8-type-checking</code>] Avoid false positives for
<code>|</code> in <code>TC008</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15201">#15201</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-todos</code>] Allow VSCode GitHub PR extension style
links in <code>missing-todo-link</code> (<code>TD003</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15519">#15519</a>)</li>
<li>[<code>pyflakes</code>] Show syntax error message for
<code>F722</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15523">#15523</a>)</li>
</ul>
<h3>Formatter</h3>
<ul>
<li>Fix curly bracket spacing around f-string expressions containing
curly braces (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15471">#15471</a>)</li>
<li>Fix joining of f-strings with different quotes when using quote
style <code>Preserve</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15524">#15524</a>)</li>
</ul>
<h3>Server</h3>
<ul>
<li>Avoid indexing the same workspace multiple times (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15495">#15495</a>)</li>
<li>Display context for <code>ruff.configuration</code> errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15452">#15452</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>Remove <code>flatten</code> to improve deserialization error
messages (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15414">#15414</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Parse triple-quoted string annotations as if parenthesized (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15387">#15387</a>)</li>
<li>[<code>fastapi</code>] Update <code>Annotated</code> fixes
(<code>FAST002</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15462">#15462</a>)</li>
<li>[<code>flake8-bandit</code>] Check for <code>builtins</code> instead
of <code>builtin</code> (<code>S102</code>, <code>PTH123</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15443">#15443</a>)</li>
<li>[<code>flake8-pathlib</code>] Fix <code>--select</code> for
<code>os-path-dirname</code> (<code>PTH120</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15446">#15446</a>)</li>
<li>[<code>ruff</code>] Fix false positive on global keyword
(<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15235">#15235</a>)</li>
</ul>
<h2>Contributors</h2>
<ul>
<li><a
href="https://github.com/AlexWaygood"><code>@​AlexWaygood</code></a></li>
<li><a
href="https://github.com/BurntSushi"><code>@​BurntSushi</code></a></li>
<li><a
href="https://github.com/Daverball"><code>@​Daverball</code></a></li>
<li><a
href="https://github.com/Garrett-R"><code>@​Garrett-R</code></a></li>
<li><a
href="https://github.com/Glyphack"><code>@​Glyphack</code></a></li>
<li><a
href="https://github.com/InSyncWithFoo"><code>@​InSyncWithFoo</code></a></li>
<li><a href="https://github.com/Lee-W"><code>@​Lee-W</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a
href="https://github.com/cake-monotone"><code>@​cake-monotone</code></a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.9.2</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>airflow</code>] Fix typo &quot;security_managr&quot; to
&quot;security_manager&quot; (<code>AIR303</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15463">#15463</a>)</li>
<li>[<code>airflow</code>] extend and fix AIR302 rules (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15525">#15525</a>)</li>
<li>[<code>fastapi</code>] Handle parameters with <code>Depends</code>
correctly (<code>FAST003</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15364">#15364</a>)</li>
<li>[<code>flake8-pytest-style</code>] Implement pytest.warns
diagnostics (<code>PT029</code>, <code>PT030</code>, <code>PT031</code>)
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/15444">#15444</a>)</li>
<li>[<code>flake8-pytest-style</code>] Test function parameters with
default arguments (<code>PT028</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15449">#15449</a>)</li>
<li>[<code>flake8-type-checking</code>] Avoid false positives for
<code>|</code> in <code>TC008</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15201">#15201</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-todos</code>] Allow VSCode GitHub PR extension style
links in <code>missing-todo-link</code> (<code>TD003</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15519">#15519</a>)</li>
<li>[<code>pyflakes</code>] Show syntax error message for
<code>F722</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15523">#15523</a>)</li>
</ul>
<h3>Formatter</h3>
<ul>
<li>Fix curly bracket spacing around f-string expressions containing
curly braces (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15471">#15471</a>)</li>
<li>Fix joining of f-strings with different quotes when using quote
style <code>Preserve</code> (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15524">#15524</a>)</li>
</ul>
<h3>Server</h3>
<ul>
<li>Avoid indexing the same workspace multiple times (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15495">#15495</a>)</li>
<li>Display context for <code>ruff.configuration</code> errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15452">#15452</a>)</li>
</ul>
<h3>Configuration</h3>
<ul>
<li>Remove <code>flatten</code> to improve deserialization error
messages (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15414">#15414</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Parse triple-quoted string annotations as if parenthesized (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15387">#15387</a>)</li>
<li>[<code>fastapi</code>] Update <code>Annotated</code> fixes
(<code>FAST002</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15462">#15462</a>)</li>
<li>[<code>flake8-bandit</code>] Check for <code>builtins</code> instead
of <code>builtin</code> (<code>S102</code>, <code>PTH123</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15443">#15443</a>)</li>
<li>[<code>flake8-pathlib</code>] Fix <code>--select</code> for
<code>os-path-dirname</code> (<code>PTH120</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15446">#15446</a>)</li>
<li>[<code>ruff</code>] Fix false positive on global keyword
(<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15235">#15235</a>)</li>
</ul>
<h2>0.9.1</h2>
<h3>Preview features</h3>
<ul>
<li>[<code>pycodestyle</code>] Run
<code>too-many-newlines-at-end-of-file</code> on each cell in notebooks
(<code>W391</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15308">#15308</a>)</li>
<li>[<code>ruff</code>] Omit diagnostic for shadowed private function
parameters in <code>used-dummy-variable</code> (<code>RUF052</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15376">#15376</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>flake8-bugbear</code>] Improve
<code>assert-raises-exception</code> message (<code>B017</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/15389">#15389</a>)</li>
</ul>
<h3>Formatter</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="0a39348381"><code>0a39348</code></a>
Include build binaries</li>
<li><a
href="027f8009e5"><code>027f800</code></a>
Comment out non-npm-publish jobs</li>
<li><a
href="425870df76"><code>425870d</code></a>
Upload npm publish logs when failed</li>
<li><a
href="c20255abe4"><code>c20255a</code></a>
Bump version to 0.9.2 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15529">#15529</a>)</li>
<li><a
href="420365811f"><code>4203658</code></a>
Fix joining of f-strings with different quotes when using quote style
`Preser...</li>
<li><a
href="fc9dd63d64"><code>fc9dd63</code></a>
[airflow] extend and fix AIR302 rules (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15525">#15525</a>)</li>
<li><a
href="79e52c7fdf"><code>79e52c7</code></a>
[<code>pyflakes</code>] Show syntax error message for <code>F722</code>
(<a
href="https://redirect.github.com/astral-sh/ruff/issues/15523">#15523</a>)</li>
<li><a
href="cf4ab7cba1"><code>cf4ab7c</code></a>
Parse triple quoted string annotations as if parenthesized (<a
href="https://redirect.github.com/astral-sh/ruff/issues/15387">#15387</a>)</li>
<li><a
href="d2656e88a3"><code>d2656e8</code></a>
[<code>flake8-todos</code>] Allow VSCode GitHub PR extension style links
in `missing-tod...</li>
<li><a
href="c53ee608a1"><code>c53ee60</code></a>
Typeshed-sync workflow: add appropriate labels, link directly to failing
run ...</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.8.6...0.9.2">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=pip&previous-version=0.8.6&new-version=0.9.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-22 09:17:48 +00:00
Krzysztof Czerwinski
800625c952 fix(frontend): Change /store* url to /marketplace* (#9119)
We have branded it as "Marketplace", so the URL shouldn't be "store".

### Changes 🏗️

- Change frontend URLs from `/store*` to `/marketplace*`
- No API route changes to minimize bugs (follow up:
https://github.com/Significant-Gravitas/AutoGPT/issues/9118)
2025-01-18 17:49:41 +01:00
Nicholas Tindle
56612f16cf feat(platform): Linear integration (#9269)
<!-- Clearly explain the need for these changes: -->

I want to be able to do things in Linear automatically

### Changes 🏗️
- Adds all the backing details to add Linear auth and API access with OAuth
(and prep for API key); a rough sketch of the flow follows below
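
A rough sketch of the OAuth authorization-code handshake this prepares; the endpoint URLs, scope string, and helper names are assumptions for illustration, not taken from this PR:

```python
# Illustrative OAuth 2.0 authorization-code sketch; endpoints, scopes, and
# credential names are assumptions for illustration, not from this PR.
from urllib.parse import urlencode

import requests

AUTHORIZE_URL = "https://linear.app/oauth/authorize"  # assumed authorize endpoint
TOKEN_URL = "https://api.linear.app/oauth/token"       # assumed token endpoint


def build_authorize_url(client_id: str, redirect_uri: str, state: str) -> str:
    """URL the user is sent to so they can grant the integration access."""
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "scope": "read,write",  # assumed scope string
        "state": state,
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}"


def exchange_code_for_token(
    client_id: str, client_secret: str, redirect_uri: str, code: str
) -> dict:
    """Swap the callback's authorization code for an access token."""
    response = requests.post(
        TOKEN_URL,
        data={
            "client_id": client_id,
            "client_secret": client_secret,
            "redirect_uri": redirect_uri,
            "grant_type": "authorization_code",
            "code": code,
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()  # contains the access token used for API calls
```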

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Aarushi <50577581+aarushik93@users.noreply.github.com>
2025-01-17 13:35:58 +00:00
Reinier van der Leer
0d2bb46786 fix(frontend): Unbreak save button after save error (#9290)
- Resolves #9253

### Changes 🏗️

- Update state when an error occurs on save, to re-enable the save
button

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
- Try to save an agent with missing required fields -> should give an
error
  - Fill out the required fields and try saving again -> should work
2025-01-17 13:29:43 +00:00
Aarushi
c61317e448 feat(platform): Create external API (#9272)
We want to allow external API calls against our platform.
We also want to keep them separate from internal platform calls for developer
experience, security, and scaling separation of concerns.

### Changes 🏗️

This PR adds the required external routes and mounts them on the same app
(see the sketch below).

An infra PR will separate routing and domains.
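
A minimal sketch of mounting external routes on the same app, assuming a FastAPI backend; the router names, prefixes, and endpoints are illustrative:

```python
# Minimal FastAPI sketch (illustrative names): external routes live on their
# own router/prefix but are mounted on the same app as the internal routes.
from fastapi import APIRouter, FastAPI

app = FastAPI()

internal_router = APIRouter(prefix="/api", tags=["internal"])
external_router = APIRouter(prefix="/external-api/v1", tags=["external"])


@internal_router.get("/health")
def internal_health() -> dict:
    return {"status": "ok", "audience": "internal"}


@external_router.get("/health")
def external_health() -> dict:
    return {"status": "ok", "audience": "external"}


# Same app, separate routers; routing/domain separation can later move to infra.
app.include_router(internal_router)
app.include_router(external_router)
```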

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-17 11:44:04 +00:00
Bently
3c30783b14 docs(Ollama): Remove steps about adding ollama credentials (#9288)
Since ["feat: no longer require ollama key
#9287"](https://github.com/Significant-Gravitas/AutoGPT/pull/9287) we no
longer need the steps for adding ollama credentials so this removes them
from the docs
2025-01-16 22:35:05 +00:00
Zamil Majdy
56b33327ab feat(platform): Add billing portal entry point (#9264)
<img width="1445" alt="image"
src="https://github.com/user-attachments/assets/5aeb7ee2-4d06-4a64-889b-599ad68c6dae"
/>


### Changes 🏗️

Added an entry point to open the Stripe billing portal.
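
A hedged sketch of what such an entry point calls on the backend; the customer ID and return URL are placeholders, while the `stripe.billing_portal.Session.create` call is the standard stripe-python API:

```python
# Hedged sketch of opening the Stripe billing portal for a user; the customer
# lookup and return URL are illustrative placeholders.
import stripe

stripe.api_key = "sk_test_..."  # placeholder secret key


def create_billing_portal_url(stripe_customer_id: str, return_url: str) -> str:
    """Return a one-time URL that sends the user to their Stripe billing portal."""
    session = stripe.billing_portal.Session.create(
        customer=stripe_customer_id,
        return_url=return_url,  # where Stripe sends the user back afterwards
    )
    return session.url
```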

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Krzysztof Czerwinski <kpczerwinski@gmail.com>
2025-01-16 21:00:15 +00:00
Zamil Majdy
c36c239dd5 feat(backend): Add graph/node id & execution id on CreditTransaction table (#9217)
We need to be able to determine the cost of graph/node execution.

### Changes 🏗️

* Add these columns into CreditTransaction `metadata` column:
    - graph_id
    - node_id
    - graph_exec_id
    - node_exec_id
    - block_id
* Drop the `blockId` column and backfill the dropped value into
`metadata->>block_id` (see the sketch below).
* Frequent queries on these values will require an index, created on demand
through a migration depending on the use case.
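
A hedged sketch of how the execution identifiers can be attached to a transaction's `metadata`, plus the described backfill expressed as raw SQL; the helper name is hypothetical and the SQL assumes a Postgres JSONB column:

```python
# Illustrative sketch: execution identifiers are stored inside the
# CreditTransaction `metadata` JSON instead of dedicated columns.

def build_transaction_metadata(
    graph_id: str,
    node_id: str,
    graph_exec_id: str,
    node_exec_id: str,
    block_id: str,
) -> dict:
    return {
        "graph_id": graph_id,
        "node_id": node_id,
        "graph_exec_id": graph_exec_id,
        "node_exec_id": node_exec_id,
        "block_id": block_id,  # replaces the dropped `blockId` column
    }


# The backfill described above, expressed as raw SQL (assumes Postgres JSONB):
BACKFILL_SQL = """
UPDATE "CreditTransaction"
SET "metadata" = COALESCE("metadata", '{}'::jsonb)
                 || jsonb_build_object('block_id', "blockId")
WHERE "blockId" IS NOT NULL;
"""
```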

---------

Co-authored-by: Krzysztof Czerwinski <kpczerwinski@gmail.com>
Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-01-16 20:59:49 +00:00
Nicholas Tindle
e53f1eaf80 feat: no longer require ollama key (#9287)
<!-- Clearly explain the need for these changes: -->

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request:
-->

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-16 12:25:08 +00:00
Krzysztof Czerwinski
04915f2db0 feat(platform): Implement top-up flow for PAYG System (#9050)
This PR adds Stripe integration and payment processing for topping-up
user accounts with credits.

### Changes 🏗️

Includes:
- https://github.com/Significant-Gravitas/AutoGPT/pull/9176

#### Top-up flow

1. To top up, a user visits their settings and clicks the `Credits` button
(it's unavailable if `NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY` isn't present).
2. The user inputs a top-up amount (min $5, in $1 increments) and clicks the
button to confirm.
3. The backend receives the top-up request, creates a database entry, and asks
Stripe to provide a URL for this specific checkout.
4. The user gets redirected to the externally hosted Stripe checkout page;
after payment (or cancelling) they get redirected back to the Credits page.
5. In the meantime, Stripe processes the payment and sends a webhook
confirmation to the backend, which updates the database to activate the bought
credits.
6. The Credits page shows success (or failure) information (via the URL param
`topup=success|cancel`). The credit counter won't update without refreshing the
page, unless the payment was confirmed before the user got back to the Credits
page, which is the case when testing checkout locally.

<img width="804" alt="Screenshot 2025-01-01 at 2 55 35 PM"
src="https://github.com/user-attachments/assets/22fb518d-b30b-4154-bb4b-edea1d57b6c2"
/>

#### Backend
- Add `stripe` package
- Add environment variables:
  - `STRIPE_API_KEY`
  - `STRIPE_WEBHOOK_SECRET`
- Add routes:
  - `POST /credits`: top-up request, returns Stripe checkout url.
- `POST /credits/stripe_webhook`: Stripe webhook endpoint to notify of
successful payment.
- `PATCH /credits`: prompts the backend to check the payment status. It's an
additional failsafe in case the webhook fails.
- Update `credit.py` and related files to handle the top-up request and
payment confirmation (see the sketch below)
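
A hedged sketch of the checkout and webhook halves of the backend flow described above, assuming FastAPI; the route paths match the list, but handler bodies, URLs, and keys are simplified placeholders:

```python
# Hedged FastAPI + stripe-python sketch of the top-up flow described above.
# Database handling (pending entry, credit activation) is intentionally omitted.
import stripe
from fastapi import APIRouter, Header, Request

stripe.api_key = "sk_test_..."  # STRIPE_API_KEY
WEBHOOK_SECRET = "whsec_..."    # STRIPE_WEBHOOK_SECRET

router = APIRouter()


@router.post("/credits")
async def request_top_up(user_id: str, amount_usd: int) -> dict:
    # Create the checkout session and hand its URL back to the frontend.
    session = stripe.checkout.Session.create(
        mode="payment",
        line_items=[{
            "price_data": {
                "currency": "usd",
                "product_data": {"name": "AutoGPT credits"},
                "unit_amount": amount_usd * 100,  # Stripe amounts are in cents
            },
            "quantity": 1,
        }],
        success_url="https://example.com/store/credits?topup=success",
        cancel_url="https://example.com/store/credits?topup=cancel",
    )
    return {"checkout_url": session.url}


@router.post("/credits/stripe_webhook")
async def stripe_webhook(request: Request, stripe_signature: str = Header(None)) -> dict:
    # Verify the event really came from Stripe, then activate the bought credits.
    payload = await request.body()
    event = stripe.Webhook.construct_event(payload, stripe_signature, WEBHOOK_SECRET)
    if event["type"] == "checkout.session.completed":
        pass  # look up the pending top-up entry and activate the credits here
    return {"received": True}
```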

#### Frontend
- Add `stripe-js` package
- Add `NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY` environment variable
- Modify user settings sidebar to show `Credits` if
`NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY` is available
- Add a `store/credits` page where the user can top up their account; it shows
confirmation (or failure) after completing checkout.
- Add a `useCredits` hook that returns the user's credits and allows requesting
a top-up.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [x] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>

---------

Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-01-15 23:46:52 +00:00
Nicholas Tindle
9d79bfadea [Snyk] Security upgrade next from 14.2.20 to 14.2.21 (#9243)
![snyk-top-banner](https://redirect.github.com/andygongea/OWASP-Benchmark/assets/818805/c518c423-16fe-447e-b67f-ad5a49b5d123)

### Snyk has created this PR to fix 1 vulnerability in the yarn
dependencies of this project.

#### Snyk changed the following file(s):

- `autogpt_platform/frontend/package.json`
- `autogpt_platform/frontend/yarn.lock`


#### Note for
[zero-installs](https://yarnpkg.com/features/zero-installs) users

If you are using the Yarn feature
[zero-installs](https://yarnpkg.com/features/zero-installs) that was
introduced in Yarn V2, note that this PR does not update the
`.yarn/cache/` directory, meaning this code cannot be pulled and
immediately developed on as one would expect for a zero-install project;
you will need to run `yarn` to update the contents of the
`.yarn/cache/` directory.
If you are not using zero-install you can ignore this as your flow
should likely be unchanged.




#### Vulnerabilities that will be fixed with an upgrade:

|  | Issue |
|:-------------------------:|:-------------------------|
| ![medium severity](https://res.cloudinary.com/snyk/image/upload/w_20,h_20/v1561977819/icon/m.png 'medium severity') | Allocation of Resources Without Limits or Throttling<br/>[SNYK-JS-NEXT-8602067](https://snyk.io/vuln/SNYK-JS-NEXT-8602067) |




---

> [!IMPORTANT]
>
> - Check the changes in this PR to ensure they won't cause issues with
your project.
> - Max score is 1000. Note that the real score may have changed since
the PR was raised.
> - This PR was automatically created by Snyk using the credentials of a
real user.

---

**Note:** _You are seeing this because you or someone else with access
to this repository has authorized Snyk to open fix PRs._

For more information:
🧐 [View latest project
report](https://app.snyk.io/org/significant-gravitas/project/3d924968-0cf3-4767-9609-501fa4962856?utm_source&#x3D;github&amp;utm_medium&#x3D;referral&amp;page&#x3D;fix-pr)
📜 [Customise PR
templates](https://docs.snyk.io/scan-using-snyk/pull-requests/snyk-fix-pull-or-merge-requests/customize-pr-templates?utm_source=github&utm_content=fix-pr-template)
🛠 [Adjust project
settings](https://app.snyk.io/org/significant-gravitas/project/3d924968-0cf3-4767-9609-501fa4962856?utm_source&#x3D;github&amp;utm_medium&#x3D;referral&amp;page&#x3D;fix-pr/settings)
📚 [Read about Snyk's upgrade
logic](https://docs.snyk.io/scan-with-snyk/snyk-open-source/manage-vulnerabilities/upgrade-package-versions-to-fix-vulnerabilities?utm_source=github&utm_content=fix-pr-template)

---

**Learn how to fix vulnerabilities with free interactive lessons:**

🦉 [Allocation of Resources Without Limits or
Throttling](https://learn.snyk.io/lesson/no-rate-limiting/?loc&#x3D;fix-pr)


Co-authored-by: snyk-bot <snyk-bot@snyk.io>
2025-01-15 18:11:13 +00:00
Nicholas Tindle
5f50c4863d test(frontend): Re-enable the tests in monitor.spec.ts and then ensure they pass (#9248)
Enable the tests in `monitor.spec.ts`.

* Remove `test.describe.skip` to enable the tests.
* Ensure the tests are now running and passing successfully.

---

For more details, open the [Copilot Workspace
session](https://copilot-workspace.githubnext.com/Significant-Gravitas/AutoGPT/pull/9248?shareId=edbd64cc-ea19-477b-be06-5eea84c28665).
2025-01-15 09:41:41 +00:00
Aarushi
fe84cbe566 Revert "feature(backend): Add ability to execute store agents without agent ownership" (#9263)
Reverts Significant-Gravitas/AutoGPT#9179

That PR was preventing agents from running in dev.
2025-01-13 18:34:17 +00:00
Aarushi
5618072375 fix(blocks/Exa): Fix exa contents block advanced toggle (#9255)
Toggling the advanced option on Exa Contents Block isn't working. It
throws a frontend error.

### Changes 🏗️

Remove `Optional` from `ContentRetrievalSettings` in `exa/contents.py`.

### Checklist 📋

#### For code changes:
- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - Add an ExaContentsBlock
  - Hit advanced


#### For configuration changes:
N/A
2025-01-13 16:35:14 +00:00
Reinier van der Leer
95b79abcfe Revert broken Library v2 DB stuff of #9218, #9211 (#9256)
- **Revert "feature(platform): Implement library add, update, remove,
archive functionality (#9218)"**
- **Revert "feat(backend): Add Support for Managing Agent Presets with
Pagination and Soft Delete (#9211)"**

These PRs contain untested changes to DB functions and cause issues in
production.
2025-01-13 16:08:58 +01:00
Swifty
fd6f28fa57 feature(platform): Implement library add, update, remove, archive functionality (#9218)
### Changes 🏗️

1. **Core Features**:
   - Add agents to the user's library.
   - Update library agents (auto-update, favorite, archive, delete).
   - Paginate library agents and presets.
   - Execute graphs using presets.

2. **Refactoring**:
   - Replaced `UserAgent` with `LibraryAgent`.
   - Separated routes for agents and presets.

3. **Schema Changes**:
- Added `LibraryAgent` table with fields like `isArchived`, `isDeleted`,
etc.
   - Soft delete functionality for `AgentPreset`.

4. **Testing**:
   - Updated tests for `LibraryAgent` operations.
   - Added edge case tests for deletion, archiving, and pagination.

5. **Database Migrations**:
   - Migration to drop `UserAgent` and add `LibraryAgent`.
   - Added fields for soft deletion and auto-update.


Note: this includes the changes from the following PRs to avoid merge
conflicts with them:

#9179 
#9211

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-01-10 13:02:53 +01:00
Swifty
4b17cc9963 feat(backend): Add Support for Managing Agent Presets with Pagination and Soft Delete (#9211)
#### Summary
- **New Models**: Added `LibraryAgentPreset`,
`LibraryAgentPresetResponse`, `Pagination`, and
`CreateLibraryAgentPresetRequest`.
- **Database Changes**:
  - Added `isDeleted` column in `AgentPreset` for soft delete.
  - CRUD operations for `AgentPreset`:
    - `get_presets` with pagination.
    - `get_preset` by ID.
    - `create_or_update_preset` for upsert.
    - `delete_preset` to soft delete.
- **API Routes**:
  - `GET /presets`: Fetch paginated presets.
  - `GET /presets/{preset_id}`: Fetch a single preset.
  - `POST /presets`: Create a preset.
  - `PUT /presets/{preset_id}`: Update a preset.
  - `DELETE /presets/{preset_id}`: Soft delete a preset.
- **Tests**:
  - Coverage for models, CRUD operations, and pagination.
- **Migration**:
  - Added `isDeleted` field to support soft delete.
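
A generic illustration of the pagination and soft-delete semantics summarized
above; the model and field names mirror the PR, but this is not the
repository's actual database layer.

```python
from dataclasses import dataclass


@dataclass
class AgentPreset:
    id: str
    user_id: str
    is_deleted: bool = False  # soft delete: rows are flagged, never removed


def get_presets(presets: list[AgentPreset], user_id: str,
                page: int, page_size: int) -> list[AgentPreset]:
    # Only presets that belong to the user and are not soft-deleted are visible.
    visible = [p for p in presets if p.user_id == user_id and not p.is_deleted]
    start = (page - 1) * page_size
    return visible[start:start + page_size]


def delete_preset(presets: list[AgentPreset], preset_id: str) -> None:
    # DELETE /presets/{preset_id} only flips the flag instead of removing the row.
    for p in presets:
        if p.id == preset_id:
            p.is_deleted = True
```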

#### Review Notes
- Validate migration scripts and test coverage.
- Ensure API aligns with project standards.

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
2025-01-10 12:57:35 +01:00
Swifty
00bb7c67b3 feature(backend): Add ability to execute store agents without agent ownership (#9179)
### Description

This PR enables the execution of store agents even if they are not owned
by the user. Key changes include handling store-listed agents in the
`get_graph` logic, improving execution flow, and ensuring
version-specific handling. These updates support more flexible agent
execution.

### Changes 🏗️

- **Graph Retrieval:** Updated `get_graph` to check store listings for
agents not owned by the user.
- **Version Handling:** Added `graph_version` to execution methods for
consistent version-specific execution.
- **Execution Flow:** Refactored `scheduler.py`, `rest_api.py`, and
other modules for clearer logic and better maintainability.
- **Testing:** Updated `test_manager.py` and other test cases to validate
execution of store-listed agents, and added a test for accessing the graph.
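
A hypothetical sketch of the graph-retrieval fallback described above; the
function shape and lookup callables are illustrative only, not the backend's
actual `get_graph` implementation.

```python
from typing import Awaitable, Callable, Optional


async def get_graph(
    graph_id: str,
    graph_version: int,
    user_id: str,
    fetch_owned: Callable[[str, int, str], Awaitable[Optional[dict]]],
    fetch_store_listed: Callable[[str, int], Awaitable[Optional[dict]]],
) -> dict:
    # First try the graphs owned by the requesting user.
    graph = await fetch_owned(graph_id, graph_version, user_id)
    if graph is None:
        # Fall back to a store-listed version of the same graph, so store
        # agents can be executed without ownership.
        graph = await fetch_store_listed(graph_id, graph_version)
    if graph is None:
        raise LookupError(f"Graph {graph_id} v{graph_version} is not accessible")
    return graph
```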

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
Co-authored-by: Zamil Majdy <zamil.majdy@agpt.co>
2025-01-10 12:39:06 +01:00
Zamil Majdy
9d1bc25ffa hotfix(backend): Increase statement timeout for the double brace migration (#9245)
### Changes 🏗️


https://github.com/Significant-Gravitas/AutoGPT/actions/runs/12696734339/job/35391431786

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-09 21:59:22 +00:00
Zamil Majdy
3a3ee994c2 hotfix(backend): Increase statement timeout for the double brace migration (#9244)
### Changes 🏗️


https://github.com/Significant-Gravitas/AutoGPT/actions/runs/12696734339/job/35391431786

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-09 19:44:37 +00:00
Aarushi
0d44f5be13 feat(backend/blocks/nvidia): Provide Nvidia by default (#9235)
We want to allow users to use Nvidia without their own keys

### Changes 🏗️

Added the Nvidia API key to the credentials store.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [x] `.env.example` is updated or already compatible with my changes

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-09 18:12:05 +00:00
Zamil Majdy
1670579a61 fix(block): Remove Python.format & Jinja templating format backward compatibility (#9229)
Python format uses `{Variable}` as the variable placeholder, while Jinja
uses `{{Variable}}` as its default.
Jinja is used as the main templating engine on the system, but the
Python format version is still maintained for backward compatibility.

However, the backward-compatibility support can cause a side effect when
passing a JSON string value into a block that uses it:
https://github.com/Significant-Gravitas/AutoGPT/issues/9194
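
A generic illustration of why the two placeholder styles clash with JSON
content (this is not the platform's block code, and may not be the exact
failure mode of the linked issue):

```python
from jinja2 import Template

# Legacy Python-format style: every literal "{" in the template text starts a
# placeholder, so embedded JSON blows up the formatter.
legacy = 'Reply with JSON like {"status": "ok"} and include {name}'
try:
    legacy.format(name="AutoGPT")
except KeyError as e:
    print("str.format chokes on the literal JSON braces:", e)

# Jinja only treats {{ ... }} (and {% ... %}) as special, so single braces
# such as JSON pass through untouched.
template = Template('Reply with JSON like {"status": "ok"} and include {{ name }}')
print(template.render(name="AutoGPT"))
```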

### Changes 🏗️

* Use the `{{Variable}}` placeholder format and remove `{Variable}`
support in these blocks:
 - '363ae599-353e-4804-937e-b2ee3cef3da4', -- AgentOutputBlock
 - 'db7d8f02-2f44-4c55-ab7a-eae0941f0c30', -- FillTextTemplateBlock
 - '1f292d4a-41a4-4977-9684-7c8d560b9f91', -- AITextGeneratorBlock
- 'ed55ac19-356e-4243-a6cb-bc599e9b716f' --
AIStructuredResponseGeneratorBlock
* Add Jinja templating support on `AITextGeneratorBlock` &
`AIStructuredResponseGeneratorBlock`
* Migrated the existing database content to prevent breaking changes.

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my
changes
- [ ] I have included a list of my configuration changes in the PR
description (under **Changes**)

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-09 16:29:16 +00:00
Bently
a1889e6212 docs(Ollama): Update Ollama docs (#9234)
The Ollama docs were very out of date and needed updating, so I have
updated them and added some screenshots so it's easier to follow.

I have also added a new Ollama model to the platform, "llama3.2", as that
is what I based the tutorial on, and its name is easy to find in the
list of models.

I also added a new folder in the "imgs" dir to store the Ollama-related
photos, just to keep things tidy.
2025-01-09 15:19:37 +00:00
Abhimanyu Yadav
9c702516fd feat(platform): fix carousel on store page (#9230)
- resolves #8973 

Adds smooth scrolling and fixes some weird interactions on the carousel.

### Changes

- Update `CarouselPrevious`, `CarouselPrevious` and add
`CarouselIndicator` in `carousel.tsx`
- Add `CarouselPrevious`, `CarouselPrevious` and `CarouselIndicator`
support in `FeaturedSection.tsx`

### Demo 


https://github.com/user-attachments/assets/ba9a22fa-ddf2-469f-ba8a-aee1a7fc5f78
2025-01-09 13:48:53 +00:00
Aarushi
32c908ae13 fix(backend): Add default credentials for Fal, Exa, E2B (#9233)
We want to provide certain providers by default on our platform. These
three were not added previously, so this fixes that.

### Changes 🏗️

If API keys for Fal, Exa, or E2B exist in environment variables, load them
by default as credentials usable by our users (a hedged sketch follows below).
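
A sketch of what loading provider keys from the environment as default
credentials can look like; the environment-variable names and the credential
model here are assumptions, not the platform's actual credentials store.

```python
import os
from uuid import uuid4

from pydantic import BaseModel, SecretStr


class DefaultAPIKeyCredential(BaseModel):  # illustrative stand-in model
    id: str
    provider: str
    api_key: SecretStr


def load_default_credentials() -> list[DefaultAPIKeyCredential]:
    creds = []
    env_vars = [("fal", "FAL_API_KEY"), ("exa", "EXA_API_KEY"), ("e2b", "E2B_API_KEY")]
    for provider, env_var in env_vars:
        key = os.getenv(env_var)
        if key:  # only expose a default credential when the key is configured
            creds.append(
                DefaultAPIKeyCredential(id=str(uuid4()), provider=provider,
                                        api_key=SecretStr(key))
            )
    return creds
```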

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
  <summary>Example test plan</summary>
  
  - [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes
correctly
  - [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
  - [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [x] `.env.example` is updated or already compatible with my changes

<details>
  <summary>Examples of configuration changes</summary>

  - Changing ports
  - Adding new services that need to communicate with each other
  - Secrets or environment variable changes
  - New or infrastructure changes such as databases
</details>
2025-01-09 13:47:54 +00:00
Abhimanyu Yadav
b4a0100c22 feat(platform): Add Twitter integration (#8754)
- Resolves #8326  

Create a Twitter integration with some small frontend changes.

### Changes
1. Add Twitter OAuth 2.0 with PKCE support for authentication.
2. Add a way to multi-select from a list of enums by creating a
multi-select on the frontend.
3. Add blocks for Twitter integration.
4. `_types.py` for repetitive enums and input types.
5. `_builders.py` for creating parameters without repeating the same
logic.
6. `_serializer.py` to serialize the Tweepy enums into dictionaries so
they can travel easily from Pyro5.
7. `_mappers.py` to map the frontend values to the correct request
values.

> I have added a new multi-select feature because my list contains many
items, and selecting all of them makes the block cluttered. This new
block displays only the first two items and then shows something like
"2 more". It works only for lists of enums.


### Blocks

Block Name | What It Does | Error Reason | Manual Testing
-- | -- | -- | --
`TwitterBookmarkTweetBlock` | Bookmark a tweet on Twitter | No error | 
`TwitterGetBookmarkedTweetsBlock` | Get all your bookmarked tweets from
Twitter | No error | 
`TwitterRemoveBookmarkTweetBlock` | Remove a bookmark for a tweet on
Twitter | No error | 
`TwitterHideReplyBlock` | Hides a reply of one of your tweets | No error
| 
`TwitterUnhideReplyBlock` | Unhides a reply to a tweet | No error | 
`TwitterLikeTweetBlock` | Likes a tweet | No error | 
`TwitterGetLikingUsersBlock` | Gets information about users who liked
one of your tweets | No error | 
`TwitterGetLikedTweetsBlock` | Gets information about tweets liked by
you | No error | 
`TwitterUnlikeTweetBlock` | Unlikes a tweet that was previously liked |
No error | 
`TwitterPostTweetBlock` | Create a tweet on Twitter with the option to
include one additional element such as media, quote, or deep link. | No
error | 
`TwitterDeleteTweetBlock` | Deletes a tweet on Twitter using Twitter ID
| No error | 
`TwitterSearchRecentTweetsBlock` | Searches all public Tweets in Twitter
history | No error | 
`TwitterGetQuoteTweetsBlock` | Gets quote tweets for a specified tweet
ID | No error | 
`TwitterRetweetBlock` | Retweets a tweet on Twitter | No error | 
`TwitterRemoveRetweetBlock` | Removes a retweet on Twitter | No error |

`TwitterGetRetweetersBlock` | Gets information about who has retweeted a
tweet | No error | 
`TwitterGetUserMentionsBlock` | Returns Tweets where a single user is
mentioned, just put that user ID | No error | 
`TwitterGetHomeTimelineBlock` | Returns a collection of the most recent
Tweets and Retweets posted by you and users you follow | No error | 
`TwitterGetUserTweetsBlock` | Returns Tweets composed by a single user,
specified by the requested user ID | No error | 
`TwitterGetTweetBlock` | Returns information about a single Tweet
specified by the requested ID | No error | 
`TwitterGetTweetsBlock` | Returns information about multiple Tweets
specified by the requested IDs | No error | 
`TwitterUnblockUserBlock` | Unblock a specific user on Twitter | No
error | 
`TwitterGetBlockedUsersBlock` | Get a list of users who are blocked by
the authenticating user | No error | 
`TwitterBlockUserBlock` | Block a specific user on Twitter | No error |

`TwitterUnfollowUserBlock` | Allows a user to unfollow another user
specified by target user ID | No error | 
`TwitterFollowUserBlock` | Allows a user to follow another user
specified by target user ID | No error | 
`TwitterGetFollowersBlock` | Retrieves a list of followers for a
specified Twitter user ID | Need Enterprise level access | 
`TwitterGetFollowingBlock` | Retrieves a list of users that a specified
Twitter user ID is following | Need Enterprise level access | 
`TwitterUnmuteUserBlock` | Allows a user to unmute another user
specified by target user ID | No error | 
`TwitterGetMutedUsersBlock` | Returns a list of users who are muted by
the authenticating user | No error | 
`TwitterMuteUserBlock` | Allows a user to mute another user specified by
target user ID | No error | 
`TwitterGetUserBlock` | Gets information about a single Twitter user
specified by ID or username | No error | 
`TwitterGetUsersBlock` | Gets information about multiple Twitter users
specified by IDs or usernames | No error | 
`TwitterSearchSpacesBlock` | Returns live or scheduled Spaces matching
specified search terms [for a week only] | No error | 
`TwitterGetSpacesBlock` | Gets information about multiple Twitter Spaces
specified by Space IDs or creator user IDs | No error | 
`TwitterGetSpaceByIdBlock` | Gets information about a single Twitter
Space specified by Space ID | No error | 
`TwitterGetSpaceBuyersBlock` | Gets list of users who purchased a ticket
to the requested Space | I do not have a monetized account for this | 
`TwitterGetSpaceTweetsBlock` | Gets list of Tweets shared in the
requested Space | No error | 
`TwitterUnfollowListBlock` | Unfollows a Twitter list for the
authenticated user | No error | 
`TwitterFollowListBlock` | Follows a Twitter list for the authenticated
user | No error | 
`TwitterListGetFollowersBlock` | Gets followers of a specified Twitter
list | Enterprise level access | 
`TwitterGetFollowedListsBlock` | Gets lists followed by a specified
Twitter user | Enterprise level access | 
`TwitterGetListBlock` | Gets information about a Twitter List specified
by ID | No error | 
`TwitterGetOwnedListsBlock` | Gets all Lists owned by the specified user
| No error | 
`TwitterRemoveListMemberBlock` | Removes a member from a Twitter List
that the authenticated user owns | No error | 
`TwitterAddListMemberBlock` | Adds a member to a Twitter List that the
authenticated user owns | No error | 
`TwitterGetListMembersBlock` | Gets the members of a specified Twitter
List | No error | 
`TwitterGetListMembershipsBlock` | Gets all Lists that a specified user
is a member of | No error | 
`TwitterGetListTweetsBlock` | Gets tweets from a specified Twitter list
| No error | 
`TwitterDeleteListBlock` | Deletes a Twitter List owned by the
authenticated user | No error | 
`TwitterUpdateListBlock` | Updates a Twitter List owned by the
authenticated user | No error | 
`TwitterCreateListBlock` | Creates a Twitter List owned by the
authenticated user | No error | 
`TwitterUnpinListBlock` | Enables the authenticated user to unpin a
List. | No error | 
`TwitterPinListBlock` | Enables the authenticated user to pin a List. |
No error | 
`TwitterGetPinnedListsBlock` | Returns the Lists pinned by the
authenticated user. | No error | 
`TwitterGetDMEventsBlock` | Gets a list of Direct Message events for the
authenticated user | Need Enterprise level access | 
`TwitterSendDirectMessageBlock` | Sends a direct message to a Twitter
user | Need Enterprise level access | 
`TwitterCreateDMConversationBlock` | Creates a new group direct message
| Need Enterprise level access | 

### Need to add more stuff
1. A normal input to select date and time.
2. Some more enterprise-level blocks, especially webhook triggers.

Supported triggers 


Event Name | Description
-- | --
Posts (by user) | User creates a new post.
Post deletes (by user) | User deletes an existing post.
@mentions (of user) | User is mentioned in a post.
Replies (to or from user) | User replies to a post or receives a reply
from another user.
Retweets (by user or of user) | User retweets a post or someone retweets
the user's post.
Quote Tweets (by user or of user) | User quote tweets a post or someone
quote tweets the user's post.
Retweets of Quoted Tweets (by user or of user) | Retweets of quote
tweets by the user or of the user.
Likes (by user or of user) | User likes a post or someone likes the
user's post.
Follows (by user or of user) | User follows another user or another user
follows the user.
Unfollows (by user) | User unfollows another user.
Blocks (by user) | User blocks another user.
Unblocks (by user) | User unblocks a previously blocked user.
Mutes (by user) | User mutes another user.
Unmutes (by user) | User unmutes a previously muted user.
Direct Messages sent (by user) | User sends direct messages to other
users.
Direct Messages received (by user) | User receives direct messages from
other users.
Typing indicators (to user) | Indicators showing when someone is typing
a message to the user.
Read receipts (to user) | Indicators showing when the user has read a
message.
Subscription revokes (by user) | User revokes a subscription to a
service or content.

---------

Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co>
Co-authored-by: Nicholas Tindle <nicktindle@outlook.com>
2025-01-08 19:47:00 +00:00
267 changed files with 29409 additions and 5403 deletions

18
.deepsource.toml Normal file
View File

@@ -0,0 +1,18 @@
version = 1
test_patterns = ["**/*.spec.ts","**/*_test.py","**/*_tests.py","**/test_*.py"]
exclude_patterns = ["classic/**"]
[[analyzers]]
name = "javascript"
[analyzers.meta]
plugins = ["react"]
environment = ["nodejs"]
[[analyzers]]
name = "python"
[analyzers.meta]
runtime_version = "3.x.x"

View File

@@ -37,6 +37,25 @@ jobs:
        run: |
          yarn lint

  type-check:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "21"

      - name: Install dependencies
        run: |
          yarn install --frozen-lockfile

      - name: Run tsc check
        run: |
          yarn type-check

  test:
    runs-on: ubuntu-latest
    strategy:

View File

@@ -25,7 +25,7 @@ jobs:
close-issue-message: >
This issue was closed automatically because it has been stale for 10 days
with no activity.
days-before-stale: 50
days-before-stale: 100
days-before-close: 10
# Do not touch meta issues:
exempt-issue-labels: meta,fridge,project management

View File

@@ -22,7 +22,7 @@ To run the AutoGPT Platform, follow these steps:
2. Run the following command:
```
git submodule update --init --recursive
git submodule update --init --recursive --progress
```
This command will initialize and update the submodules in the repository. The `supabase` folder will be cloned to the root directory.

View File

@@ -18,7 +18,7 @@ ERROR_LOG_FILE = "error.log"
SIMPLE_LOG_FORMAT = "%(asctime)s %(levelname)s %(title)s%(message)s"
DEBUG_LOG_FORMAT = (
"%(asctime)s %(levelname)s %(filename)s:%(lineno)d" " %(title)s%(message)s"
"%(asctime)s %(levelname)s %(filename)s:%(lineno)d %(title)s%(message)s"
)

View File

@@ -0,0 +1,76 @@
from typing import Annotated, Any, Literal, Optional, TypedDict
from uuid import uuid4

from pydantic import BaseModel, Field, SecretStr, field_serializer


class _BaseCredentials(BaseModel):
    id: str = Field(default_factory=lambda: str(uuid4()))
    provider: str
    title: Optional[str]

    @field_serializer("*")
    def dump_secret_strings(value: Any, _info):
        if isinstance(value, SecretStr):
            return value.get_secret_value()
        return value


class OAuth2Credentials(_BaseCredentials):
    type: Literal["oauth2"] = "oauth2"
    username: Optional[str]
    """Username of the third-party service user that these credentials belong to"""
    access_token: SecretStr
    access_token_expires_at: Optional[int]
    """Unix timestamp (seconds) indicating when the access token expires (if at all)"""
    refresh_token: Optional[SecretStr]
    refresh_token_expires_at: Optional[int]
    """Unix timestamp (seconds) indicating when the refresh token expires (if at all)"""
    scopes: list[str]
    metadata: dict[str, Any] = Field(default_factory=dict)

    def bearer(self) -> str:
        return f"Bearer {self.access_token.get_secret_value()}"


class APIKeyCredentials(_BaseCredentials):
    type: Literal["api_key"] = "api_key"
    api_key: SecretStr
    expires_at: Optional[int]
    """Unix timestamp (seconds) indicating when the API key expires (if at all)"""

    def bearer(self) -> str:
        return f"Bearer {self.api_key.get_secret_value()}"


Credentials = Annotated[
    OAuth2Credentials | APIKeyCredentials,
    Field(discriminator="type"),
]
CredentialsType = Literal["api_key", "oauth2"]


class OAuthState(BaseModel):
    token: str
    provider: str
    expires_at: int
    code_verifier: Optional[str] = None
    scopes: list[str]
    """Unix timestamp (seconds) indicating when this OAuth state expires"""


class UserMetadata(BaseModel):
    integration_credentials: list[Credentials] = Field(default_factory=list)
    integration_oauth_states: list[OAuthState] = Field(default_factory=list)


class UserMetadataRaw(TypedDict, total=False):
    integration_credentials: list[dict]
    integration_oauth_states: list[dict]


class UserIntegrations(BaseModel):
    credentials: list[Credentials] = Field(default_factory=list)
    oauth_states: list[OAuthState] = Field(default_factory=list)

View File

@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 1.8.5 and should not be changed by hand.
# This file is automatically @generated by Poetry 2.0.1 and should not be changed by hand.
[[package]]
name = "aiohappyeyeballs"
@@ -6,6 +6,7 @@ version = "2.4.0"
description = "Happy Eyeballs for asyncio"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "aiohappyeyeballs-2.4.0-py3-none-any.whl", hash = "sha256:7ce92076e249169a13c2f49320d1967425eaf1f407522d707d59cac7628d62bd"},
{file = "aiohappyeyeballs-2.4.0.tar.gz", hash = "sha256:55a1714f084e63d49639800f95716da97a1f173d46a16dfcfda0016abb93b6b2"},
@@ -17,6 +18,7 @@ version = "3.10.5"
description = "Async http client/server framework (asyncio)"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "aiohttp-3.10.5-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:18a01eba2574fb9edd5f6e5fb25f66e6ce061da5dab5db75e13fe1558142e0a3"},
{file = "aiohttp-3.10.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:94fac7c6e77ccb1ca91e9eb4cb0ac0270b9fb9b289738654120ba8cebb1189c6"},
@@ -129,6 +131,7 @@ version = "1.3.1"
description = "aiosignal: a list of registered asynchronous callbacks"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "aiosignal-1.3.1-py3-none-any.whl", hash = "sha256:f8376fb07dd1e86a584e4fcdec80b36b7f81aac666ebc724e2c090300dd83b17"},
{file = "aiosignal-1.3.1.tar.gz", hash = "sha256:54cd96e15e1649b75d6c87526a6ff0b6c1b0dd3459f43d9ca11d48c339b68cfc"},
@@ -143,6 +146,7 @@ version = "0.7.0"
description = "Reusable constraint types to use with typing.Annotated"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
{file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
@@ -154,6 +158,7 @@ version = "4.4.0"
description = "High level compatibility layer for multiple asynchronous event loop implementations"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "anyio-4.4.0-py3-none-any.whl", hash = "sha256:c1b2d8f46a8a812513012e1107cb0e68c17159a7a594208005a57dc776e1bdc7"},
{file = "anyio-4.4.0.tar.gz", hash = "sha256:5aadc6a1bbb7cdb0bede386cac5e2940f5e2ff3aa20277e991cf028e0585ce94"},
@@ -176,10 +181,12 @@ version = "4.0.3"
description = "Timeout context manager for asyncio programs"
optional = false
python-versions = ">=3.7"
groups = ["main", "dev"]
files = [
{file = "async-timeout-4.0.3.tar.gz", hash = "sha256:4640d96be84d82d02ed59ea2b7105a0f7b33abe8703703cd0ab0bf87c427522f"},
{file = "async_timeout-4.0.3-py3-none-any.whl", hash = "sha256:7405140ff1230c310e51dc27b3145b9092d659ce68ff733fb0cefe3ee42be028"},
]
markers = {main = "python_version < \"3.11\"", dev = "python_full_version < \"3.11.3\""}
[[package]]
name = "attrs"
@@ -187,6 +194,7 @@ version = "24.2.0"
description = "Classes Without Boilerplate"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "attrs-24.2.0-py3-none-any.whl", hash = "sha256:81921eb96de3191c8258c199618104dd27ac608d9366f5e35d011eae1867ede2"},
{file = "attrs-24.2.0.tar.gz", hash = "sha256:5cfb1b9148b5b086569baec03f20d7b6bf3bcacc9a42bebf87ffaaca362f6346"},
@@ -206,6 +214,7 @@ version = "5.5.0"
description = "Extensible memoizing collections and decorators"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "cachetools-5.5.0-py3-none-any.whl", hash = "sha256:02134e8439cdc2ffb62023ce1debca2944c3f289d66bb17ead3ab3dede74b292"},
{file = "cachetools-5.5.0.tar.gz", hash = "sha256:2cc24fb4cbe39633fb7badd9db9ca6295d766d9c2995f245725a46715d050f2a"},
@@ -217,6 +226,7 @@ version = "2024.8.30"
description = "Python package for providing Mozilla's CA Bundle."
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "certifi-2024.8.30-py3-none-any.whl", hash = "sha256:922820b53db7a7257ffbda3f597266d435245903d80737e34f8a45ff3e3230d8"},
{file = "certifi-2024.8.30.tar.gz", hash = "sha256:bec941d2aa8195e248a60b31ff9f0558284cf01a52591ceda73ea9afffd69fd9"},
@@ -228,6 +238,7 @@ version = "3.3.2"
description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
optional = false
python-versions = ">=3.7.0"
groups = ["main"]
files = [
{file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
{file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
@@ -327,6 +338,7 @@ version = "0.4.6"
description = "Cross-platform colored terminal text."
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
groups = ["main"]
files = [
{file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"},
{file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
@@ -338,6 +350,7 @@ version = "1.2.14"
description = "Python @deprecated decorator to deprecate old python classes, functions or methods."
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
groups = ["main"]
files = [
{file = "Deprecated-1.2.14-py2.py3-none-any.whl", hash = "sha256:6fac8b097794a90302bdbb17b9b815e732d3c4720583ff1b198499d78470466c"},
{file = "Deprecated-1.2.14.tar.gz", hash = "sha256:e5323eb936458dccc2582dc6f9c322c852a775a27065ff2b0c4970b9d53d01b3"},
@@ -355,6 +368,7 @@ version = "2.1.0"
description = "A library to handle automated deprecations"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "deprecation-2.1.0-py2.py3-none-any.whl", hash = "sha256:a10811591210e1fb0e768a8c25517cabeabcba6f0bf96564f8ff45189f90b14a"},
{file = "deprecation-2.1.0.tar.gz", hash = "sha256:72b3bde64e5d778694b0cf68178aed03d15e15477116add3fb773e581f9518ff"},
@@ -369,6 +383,8 @@ version = "1.2.2"
description = "Backport of PEP 654 (exception groups)"
optional = false
python-versions = ">=3.7"
groups = ["main"]
markers = "python_version < \"3.11\""
files = [
{file = "exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b"},
{file = "exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc"},
@@ -383,6 +399,7 @@ version = "1.2.2"
description = "Dictionary with auto-expiring values for caching purposes"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "expiringdict-1.2.2-py3-none-any.whl", hash = "sha256:09a5d20bc361163e6432a874edd3179676e935eb81b925eccef48d409a8a45e8"},
{file = "expiringdict-1.2.2.tar.gz", hash = "sha256:300fb92a7e98f15b05cf9a856c1415b3bc4f2e132be07daa326da6414c23ee09"},
@@ -397,6 +414,7 @@ version = "1.4.1"
description = "A list-like structure which implements collections.abc.MutableSequence"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "frozenlist-1.4.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f9aa1878d1083b276b0196f2dfbe00c9b7e752475ed3b682025ff20c1c1f51ac"},
{file = "frozenlist-1.4.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:29acab3f66f0f24674b7dc4736477bcd4bc3ad4b896f5f45379a67bce8b96868"},
@@ -483,6 +501,7 @@ version = "2.19.2"
description = "Google API client core library"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "google_api_core-2.19.2-py3-none-any.whl", hash = "sha256:53ec0258f2837dd53bbd3d3df50f5359281b3cc13f800c941dd15a9b5a415af4"},
{file = "google_api_core-2.19.2.tar.gz", hash = "sha256:ca07de7e8aa1c98a8bfca9321890ad2340ef7f2eb136e558cee68f24b94b0a8f"},
@@ -514,6 +533,7 @@ version = "2.34.0"
description = "Google Authentication Library"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "google_auth-2.34.0-py2.py3-none-any.whl", hash = "sha256:72fd4733b80b6d777dcde515628a9eb4a577339437012874ea286bca7261ee65"},
{file = "google_auth-2.34.0.tar.gz", hash = "sha256:8eb87396435c19b20d32abd2f984e31c191a15284af72eb922f10e5bde9c04cc"},
@@ -537,6 +557,7 @@ version = "1.4.5"
description = "Google Cloud Appengine Logging API client library"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "google_cloud_appengine_logging-1.4.5-py2.py3-none-any.whl", hash = "sha256:344e0244404049b42164e4d6dc718ca2c81b393d066956e7cb85fd9407ed9c48"},
{file = "google_cloud_appengine_logging-1.4.5.tar.gz", hash = "sha256:de7d766e5d67b19fc5833974b505b32d2a5bbdfb283fd941e320e7cfdae4cb83"},
@@ -554,6 +575,7 @@ version = "0.3.0"
description = "Google Cloud Audit Protos"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "google_cloud_audit_log-0.3.0-py2.py3-none-any.whl", hash = "sha256:8340793120a1d5aa143605def8704ecdcead15106f754ef1381ae3bab533722f"},
{file = "google_cloud_audit_log-0.3.0.tar.gz", hash = "sha256:901428b257020d8c1d1133e0fa004164a555e5a395c7ca3cdbb8486513df3a65"},
@@ -569,6 +591,7 @@ version = "2.4.1"
description = "Google Cloud API client core library"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "google-cloud-core-2.4.1.tar.gz", hash = "sha256:9b7749272a812bde58fff28868d0c5e2f585b82f37e09a1f6ed2d4d10f134073"},
{file = "google_cloud_core-2.4.1-py2.py3-none-any.whl", hash = "sha256:a9e6a4422b9ac5c29f79a0ede9485473338e2ce78d91f2370c01e730eab22e61"},
@@ -583,13 +606,14 @@ grpc = ["grpcio (>=1.38.0,<2.0dev)", "grpcio-status (>=1.38.0,<2.0.dev0)"]
[[package]]
name = "google-cloud-logging"
version = "3.11.3"
version = "3.11.4"
description = "Stackdriver Logging API client library"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "google_cloud_logging-3.11.3-py2.py3-none-any.whl", hash = "sha256:b8ec23f2998f76a58f8492db26a0f4151dd500425c3f08448586b85972f3c494"},
{file = "google_cloud_logging-3.11.3.tar.gz", hash = "sha256:0a73cd94118875387d4535371d9e9426861edef8e44fba1261e86782d5b8d54f"},
{file = "google_cloud_logging-3.11.4-py2.py3-none-any.whl", hash = "sha256:1d465ac62df29fb94bba4d6b4891035e57d573d84541dd8a40eebbc74422b2f0"},
{file = "google_cloud_logging-3.11.4.tar.gz", hash = "sha256:32305d989323f3c58603044e2ac5d9cf23e9465ede511bbe90b4309270d3195c"},
]
[package.dependencies]
@@ -601,7 +625,8 @@ google-cloud-core = ">=2.0.0,<3.0.0dev"
grpc-google-iam-v1 = ">=0.12.4,<1.0.0dev"
opentelemetry-api = ">=1.9.0"
proto-plus = [
{version = ">=1.22.2,<2.0.0dev", markers = "python_version >= \"3.11\""},
{version = ">=1.25.0,<2.0.0dev", markers = "python_version >= \"3.13\""},
{version = ">=1.22.2,<2.0.0dev", markers = "python_version >= \"3.11\" and python_version < \"3.13\""},
{version = ">=1.22.0,<2.0.0dev", markers = "python_version < \"3.11\""},
]
protobuf = ">=3.20.2,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<6.0.0dev"
@@ -612,6 +637,7 @@ version = "1.65.0"
description = "Common protobufs used in Google APIs"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "googleapis_common_protos-1.65.0-py2.py3-none-any.whl", hash = "sha256:2972e6c496f435b92590fd54045060867f3fe9be2c82ab148fc8885035479a63"},
{file = "googleapis_common_protos-1.65.0.tar.gz", hash = "sha256:334a29d07cddc3aa01dee4988f9afd9b2916ee2ff49d6b757155dc0d197852c0"},
@@ -626,17 +652,18 @@ grpc = ["grpcio (>=1.44.0,<2.0.0.dev0)"]
[[package]]
name = "gotrue"
version = "2.10.0"
version = "2.11.1"
description = "Python Client Library for Supabase Auth"
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "gotrue-2.10.0-py3-none-any.whl", hash = "sha256:768e58207488e5184ffbdc4351b7280d913daf97962f4e9f2cca05c80004b042"},
{file = "gotrue-2.10.0.tar.gz", hash = "sha256:4edf4c251da3535f2b044e23deba221e848ca1210c17d0c7a9b19f79a1e3f3c0"},
{file = "gotrue-2.11.1-py3-none-any.whl", hash = "sha256:1b2d915bdc65fd0ad608532759ce9c72fa2e910145c1e6901f2188519e7bcd2d"},
{file = "gotrue-2.11.1.tar.gz", hash = "sha256:5594ceee60bd873e5f4fdd028b08dece3906f6013b6ed08e7786b71c0092fed0"},
]
[package.dependencies]
httpx = {version = ">=0.26,<0.28", extras = ["http2"]}
httpx = {version = ">=0.26,<0.29", extras = ["http2"]}
pydantic = ">=1.10,<3"
[[package]]
@@ -645,6 +672,7 @@ version = "0.13.1"
description = "IAM API client library"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "grpc-google-iam-v1-0.13.1.tar.gz", hash = "sha256:3ff4b2fd9d990965e410965253c0da6f66205d5a8291c4c31c6ebecca18a9001"},
{file = "grpc_google_iam_v1-0.13.1-py2.py3-none-any.whl", hash = "sha256:c3e86151a981811f30d5e7330f271cee53e73bb87755e88cc3b6f0c7b5fe374e"},
@@ -661,6 +689,7 @@ version = "1.66.1"
description = "HTTP/2-based RPC framework"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "grpcio-1.66.1-cp310-cp310-linux_armv7l.whl", hash = "sha256:4877ba180591acdf127afe21ec1c7ff8a5ecf0fe2600f0d3c50e8c4a1cbc6492"},
{file = "grpcio-1.66.1-cp310-cp310-macosx_12_0_universal2.whl", hash = "sha256:3750c5a00bd644c75f4507f77a804d0189d97a107eb1481945a0cf3af3e7a5ac"},
@@ -719,6 +748,7 @@ version = "1.66.1"
description = "Status proto mapping for gRPC"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "grpcio_status-1.66.1-py3-none-any.whl", hash = "sha256:cf9ed0b4a83adbe9297211c95cb5488b0cd065707e812145b842c85c4782ff02"},
{file = "grpcio_status-1.66.1.tar.gz", hash = "sha256:b3f7d34ccc46d83fea5261eea3786174459f763c31f6e34f1d24eba6d515d024"},
@@ -735,6 +765,7 @@ version = "0.14.0"
description = "A pure-Python, bring-your-own-I/O implementation of HTTP/1.1"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761"},
{file = "h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d"},
@@ -746,6 +777,7 @@ version = "4.1.0"
description = "HTTP/2 State-Machine based protocol implementation"
optional = false
python-versions = ">=3.6.1"
groups = ["main"]
files = [
{file = "h2-4.1.0-py3-none-any.whl", hash = "sha256:03a46bcf682256c95b5fd9e9a99c1323584c3eec6440d379b9903d709476bc6d"},
{file = "h2-4.1.0.tar.gz", hash = "sha256:a83aca08fbe7aacb79fec788c9c0bac936343560ed9ec18b82a13a12c28d2abb"},
@@ -761,6 +793,7 @@ version = "4.0.0"
description = "Pure-Python HPACK header compression"
optional = false
python-versions = ">=3.6.1"
groups = ["main"]
files = [
{file = "hpack-4.0.0-py3-none-any.whl", hash = "sha256:84a076fad3dc9a9f8063ccb8041ef100867b1878b25ef0ee63847a5d53818a6c"},
{file = "hpack-4.0.0.tar.gz", hash = "sha256:fc41de0c63e687ebffde81187a948221294896f6bdc0ae2312708df339430095"},
@@ -772,6 +805,7 @@ version = "1.0.5"
description = "A minimal low-level HTTP client."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "httpcore-1.0.5-py3-none-any.whl", hash = "sha256:421f18bac248b25d310f3cacd198d55b8e6125c107797b609ff9b7a6ba7991b5"},
{file = "httpcore-1.0.5.tar.gz", hash = "sha256:34a38e2f9291467ee3b44e89dd52615370e152954ba21721378a87b2960f7a61"},
@@ -793,6 +827,7 @@ version = "0.27.2"
description = "The next generation HTTP client."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "httpx-0.27.2-py3-none-any.whl", hash = "sha256:7bb2708e112d8fdd7829cd4243970f0c223274051cb35ee80c03301ee29a3df0"},
{file = "httpx-0.27.2.tar.gz", hash = "sha256:f7c2be1d2f3c3c3160d441802406b206c2b76f5947b11115e6df10c6c65e66c2"},
@@ -819,6 +854,7 @@ version = "6.0.1"
description = "HTTP/2 framing layer for Python"
optional = false
python-versions = ">=3.6.1"
groups = ["main"]
files = [
{file = "hyperframe-6.0.1-py3-none-any.whl", hash = "sha256:0ec6bafd80d8ad2195c4f03aacba3a8265e57bc4cff261e802bf39970ed02a15"},
{file = "hyperframe-6.0.1.tar.gz", hash = "sha256:ae510046231dc8e9ecb1a6586f63d2347bf4c8905914aa84ba585ae85f28a914"},
@@ -830,6 +866,7 @@ version = "3.8"
description = "Internationalized Domain Names in Applications (IDNA)"
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "idna-3.8-py3-none-any.whl", hash = "sha256:050b4e5baadcd44d760cedbd2b8e639f2ff89bbc7a5730fcc662954303377aac"},
{file = "idna-3.8.tar.gz", hash = "sha256:d838c2c0ed6fced7693d5e8ab8e734d5f8fda53a039c0164afb0b82e771e3603"},
@@ -841,6 +878,7 @@ version = "8.4.0"
description = "Read metadata from Python packages"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "importlib_metadata-8.4.0-py3-none-any.whl", hash = "sha256:66f342cc6ac9818fc6ff340576acd24d65ba0b3efabb2b4ac08b598965a4a2f1"},
{file = "importlib_metadata-8.4.0.tar.gz", hash = "sha256:9a547d3bc3608b025f93d403fdd1aae741c24fbb8314df4b155675742ce303c5"},
@@ -860,6 +898,7 @@ version = "2.0.0"
description = "brain-dead simple config-ini parsing"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374"},
{file = "iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3"},
@@ -871,6 +910,7 @@ version = "6.1.0"
description = "multidict implementation"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "multidict-6.1.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:3380252550e372e8511d49481bd836264c009adb826b23fefcc5dd3c69692f60"},
{file = "multidict-6.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:99f826cbf970077383d7de805c0681799491cb939c25450b9b5b3ced03ca99f1"},
@@ -975,6 +1015,7 @@ version = "1.27.0"
description = "OpenTelemetry Python API"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "opentelemetry_api-1.27.0-py3-none-any.whl", hash = "sha256:953d5871815e7c30c81b56d910c707588000fff7a3ca1c73e6531911d53065e7"},
{file = "opentelemetry_api-1.27.0.tar.gz", hash = "sha256:ed673583eaa5f81b5ce5e86ef7cdaf622f88ef65f0b9aab40b843dcae5bef342"},
@@ -990,6 +1031,7 @@ version = "24.1"
description = "Core utilities for Python packages"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "packaging-24.1-py3-none-any.whl", hash = "sha256:5b8f2217dbdbd2f7f384c41c628544e6d52f2d0f53c6d0c3ea61aa5d1d7ff124"},
{file = "packaging-24.1.tar.gz", hash = "sha256:026ed72c8ed3fcce5bf8950572258698927fd1dbda10a5e981cdf0ac37f4f002"},
@@ -1001,6 +1043,7 @@ version = "1.5.0"
description = "plugin and hook calling mechanisms for python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669"},
{file = "pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1"},
@@ -1012,30 +1055,32 @@ testing = ["pytest", "pytest-benchmark"]
[[package]]
name = "postgrest"
version = "0.18.0"
version = "0.19.1"
description = "PostgREST client for Python. This library provides an ORM interface to PostgREST."
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "postgrest-0.18.0-py3-none-any.whl", hash = "sha256:200baad0d23fee986b3a0ffd3e07bfe0cdd40e09760f11e8e13a6c0c2376d5fa"},
{file = "postgrest-0.18.0.tar.gz", hash = "sha256:29c1a94801a17eb9ad590189993fe5a7a6d8c1bfc11a3c9d0ce7ba146454ebb3"},
{file = "postgrest-0.19.1-py3-none-any.whl", hash = "sha256:a8e7be4e1abc69fd8eee5a49d7dc3a76dfbffbd778beed0b2bd7accb3f4f3a2a"},
{file = "postgrest-0.19.1.tar.gz", hash = "sha256:d8fa88953cced4f45efa0f412056c364f64ece8a35b5b35f458a7e58c133fbca"},
]
[package.dependencies]
deprecation = ">=2.1.0,<3.0.0"
httpx = {version = ">=0.26,<0.28", extras = ["http2"]}
httpx = {version = ">=0.26,<0.29", extras = ["http2"]}
pydantic = ">=1.9,<3.0"
strenum = {version = ">=0.4.9,<0.5.0", markers = "python_version < \"3.11\""}
[[package]]
name = "proto-plus"
version = "1.24.0"
description = "Beautiful, Pythonic protocol buffers."
version = "1.26.0"
description = "Beautiful, Pythonic protocol buffers"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "proto-plus-1.24.0.tar.gz", hash = "sha256:30b72a5ecafe4406b0d339db35b56c4059064e69227b8c3bda7462397f966445"},
{file = "proto_plus-1.24.0-py3-none-any.whl", hash = "sha256:402576830425e5f6ce4c2a6702400ac79897dab0b4343821aa5188b0fab81a12"},
{file = "proto_plus-1.26.0-py3-none-any.whl", hash = "sha256:bf2dfaa3da281fc3187d12d224c707cb57214fb2c22ba854eb0c105a3fb2d4d7"},
{file = "proto_plus-1.26.0.tar.gz", hash = "sha256:6e93d5f5ca267b54300880fff156b6a3386b3fa3f43b1da62e680fc0c586ef22"},
]
[package.dependencies]
@@ -1050,6 +1095,7 @@ version = "5.28.0"
description = ""
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "protobuf-5.28.0-cp310-abi3-win32.whl", hash = "sha256:66c3edeedb774a3508ae70d87b3a19786445fe9a068dd3585e0cefa8a77b83d0"},
{file = "protobuf-5.28.0-cp310-abi3-win_amd64.whl", hash = "sha256:6d7cc9e60f976cf3e873acb9a40fed04afb5d224608ed5c1a105db4a3f09c5b6"},
@@ -1070,6 +1116,7 @@ version = "0.6.1"
description = "Pure-Python implementation of ASN.1 types and DER/BER/CER codecs (X.208)"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pyasn1-0.6.1-py3-none-any.whl", hash = "sha256:0d632f46f2ba09143da3a8afe9e33fb6f92fa2320ab7e886e2d0f7672af84629"},
{file = "pyasn1-0.6.1.tar.gz", hash = "sha256:6f580d2bdd84365380830acf45550f2511469f673cb4a5ae3857a3170128b034"},
@@ -1081,6 +1128,7 @@ version = "0.4.1"
description = "A collection of ASN.1-based protocols modules"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pyasn1_modules-0.4.1-py3-none-any.whl", hash = "sha256:49bfa96b45a292b711e986f222502c1c9a5e1f4e568fc30e2574a6c7d07838fd"},
{file = "pyasn1_modules-0.4.1.tar.gz", hash = "sha256:c28e2dbf9c06ad61c71a075c7e0f9fd0f1b0bb2d2ad4377f240d33ac2ab60a7c"},
@@ -1091,18 +1139,19 @@ pyasn1 = ">=0.4.6,<0.7.0"
[[package]]
name = "pydantic"
version = "2.10.3"
version = "2.10.6"
description = "Data validation using Python type hints"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pydantic-2.10.3-py3-none-any.whl", hash = "sha256:be04d85bbc7b65651c5f8e6b9976ed9c6f41782a55524cef079a34a0bb82144d"},
{file = "pydantic-2.10.3.tar.gz", hash = "sha256:cb5ac360ce894ceacd69c403187900a02c4b20b693a9dd1d643e1effab9eadf9"},
{file = "pydantic-2.10.6-py3-none-any.whl", hash = "sha256:427d664bf0b8a2b34ff5dd0f5a18df00591adcee7198fbd71981054cef37b584"},
{file = "pydantic-2.10.6.tar.gz", hash = "sha256:ca5daa827cce33de7a42be142548b0096bf05a7e7b365aebfa5f8eeec7128236"},
]
[package.dependencies]
annotated-types = ">=0.6.0"
pydantic-core = "2.27.1"
pydantic-core = "2.27.2"
typing-extensions = ">=4.12.2"
[package.extras]
@@ -1111,111 +1160,112 @@ timezone = ["tzdata"]
[[package]]
name = "pydantic-core"
version = "2.27.1"
version = "2.27.2"
description = "Core functionality for Pydantic validation and serialization"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pydantic_core-2.27.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:71a5e35c75c021aaf400ac048dacc855f000bdfed91614b4a726f7432f1f3d6a"},
{file = "pydantic_core-2.27.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f82d068a2d6ecfc6e054726080af69a6764a10015467d7d7b9f66d6ed5afa23b"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:121ceb0e822f79163dd4699e4c54f5ad38b157084d97b34de8b232bcaad70278"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4603137322c18eaf2e06a4495f426aa8d8388940f3c457e7548145011bb68e05"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a33cd6ad9017bbeaa9ed78a2e0752c5e250eafb9534f308e7a5f7849b0b1bfb4"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:15cc53a3179ba0fcefe1e3ae50beb2784dede4003ad2dfd24f81bba4b23a454f"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45d9c5eb9273aa50999ad6adc6be5e0ecea7e09dbd0d31bd0c65a55a2592ca08"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8bf7b66ce12a2ac52d16f776b31d16d91033150266eb796967a7e4621707e4f6"},
{file = "pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:655d7dd86f26cb15ce8a431036f66ce0318648f8853d709b4167786ec2fa4807"},
{file = "pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:5556470f1a2157031e676f776c2bc20acd34c1990ca5f7e56f1ebf938b9ab57c"},
{file = "pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f69ed81ab24d5a3bd93861c8c4436f54afdf8e8cc421562b0c7504cf3be58206"},
{file = "pydantic_core-2.27.1-cp310-none-win32.whl", hash = "sha256:f5a823165e6d04ccea61a9f0576f345f8ce40ed533013580e087bd4d7442b52c"},
{file = "pydantic_core-2.27.1-cp310-none-win_amd64.whl", hash = "sha256:57866a76e0b3823e0b56692d1a0bf722bffb324839bb5b7226a7dbd6c9a40b17"},
{file = "pydantic_core-2.27.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:ac3b20653bdbe160febbea8aa6c079d3df19310d50ac314911ed8cc4eb7f8cb8"},
{file = "pydantic_core-2.27.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a5a8e19d7c707c4cadb8c18f5f60c843052ae83c20fa7d44f41594c644a1d330"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7f7059ca8d64fea7f238994c97d91f75965216bcbe5f695bb44f354893f11d52"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bed0f8a0eeea9fb72937ba118f9db0cb7e90773462af7962d382445f3005e5a4"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a3cb37038123447cf0f3ea4c74751f6a9d7afef0eb71aa07bf5f652b5e6a132c"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:84286494f6c5d05243456e04223d5a9417d7f443c3b76065e75001beb26f88de"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:acc07b2cfc5b835444b44a9956846b578d27beeacd4b52e45489e93276241025"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4fefee876e07a6e9aad7a8c8c9f85b0cdbe7df52b8a9552307b09050f7512c7e"},
{file = "pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:258c57abf1188926c774a4c94dd29237e77eda19462e5bb901d88adcab6af919"},
{file = "pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:35c14ac45fcfdf7167ca76cc80b2001205a8d5d16d80524e13508371fb8cdd9c"},
{file = "pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d1b26e1dff225c31897696cab7d4f0a315d4c0d9e8666dbffdb28216f3b17fdc"},
{file = "pydantic_core-2.27.1-cp311-none-win32.whl", hash = "sha256:2cdf7d86886bc6982354862204ae3b2f7f96f21a3eb0ba5ca0ac42c7b38598b9"},
{file = "pydantic_core-2.27.1-cp311-none-win_amd64.whl", hash = "sha256:3af385b0cee8df3746c3f406f38bcbfdc9041b5c2d5ce3e5fc6637256e60bbc5"},
{file = "pydantic_core-2.27.1-cp311-none-win_arm64.whl", hash = "sha256:81f2ec23ddc1b476ff96563f2e8d723830b06dceae348ce02914a37cb4e74b89"},
{file = "pydantic_core-2.27.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9cbd94fc661d2bab2bc702cddd2d3370bbdcc4cd0f8f57488a81bcce90c7a54f"},
{file = "pydantic_core-2.27.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5f8c4718cd44ec1580e180cb739713ecda2bdee1341084c1467802a417fe0f02"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15aae984e46de8d376df515f00450d1522077254ef6b7ce189b38ecee7c9677c"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1ba5e3963344ff25fc8c40da90f44b0afca8cfd89d12964feb79ac1411a260ac"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:992cea5f4f3b29d6b4f7f1726ed8ee46c8331c6b4eed6db5b40134c6fe1768bb"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0325336f348dbee6550d129b1627cb8f5351a9dc91aad141ffb96d4937bd9529"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7597c07fbd11515f654d6ece3d0e4e5093edc30a436c63142d9a4b8e22f19c35"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:3bbd5d8cc692616d5ef6fbbbd50dbec142c7e6ad9beb66b78a96e9c16729b089"},
{file = "pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:dc61505e73298a84a2f317255fcc72b710b72980f3a1f670447a21efc88f8381"},
{file = "pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:e1f735dc43da318cad19b4173dd1ffce1d84aafd6c9b782b3abc04a0d5a6f5bb"},
{file = "pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:f4e5658dbffe8843a0f12366a4c2d1c316dbe09bb4dfbdc9d2d9cd6031de8aae"},
{file = "pydantic_core-2.27.1-cp312-none-win32.whl", hash = "sha256:672ebbe820bb37988c4d136eca2652ee114992d5d41c7e4858cdd90ea94ffe5c"},
{file = "pydantic_core-2.27.1-cp312-none-win_amd64.whl", hash = "sha256:66ff044fd0bb1768688aecbe28b6190f6e799349221fb0de0e6f4048eca14c16"},
{file = "pydantic_core-2.27.1-cp312-none-win_arm64.whl", hash = "sha256:9a3b0793b1bbfd4146304e23d90045f2a9b5fd5823aa682665fbdaf2a6c28f3e"},
{file = "pydantic_core-2.27.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f216dbce0e60e4d03e0c4353c7023b202d95cbaeff12e5fd2e82ea0a66905073"},
{file = "pydantic_core-2.27.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a2e02889071850bbfd36b56fd6bc98945e23670773bc7a76657e90e6b6603c08"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42b0e23f119b2b456d07ca91b307ae167cc3f6c846a7b169fca5326e32fdc6cf"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:764be71193f87d460a03f1f7385a82e226639732214b402f9aa61f0d025f0737"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1c00666a3bd2f84920a4e94434f5974d7bbc57e461318d6bb34ce9cdbbc1f6b2"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3ccaa88b24eebc0f849ce0a4d09e8a408ec5a94afff395eb69baf868f5183107"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c65af9088ac534313e1963443d0ec360bb2b9cba6c2909478d22c2e363d98a51"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:206b5cf6f0c513baffaeae7bd817717140770c74528f3e4c3e1cec7871ddd61a"},
{file = "pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:062f60e512fc7fff8b8a9d680ff0ddaaef0193dba9fa83e679c0c5f5fbd018bc"},
{file = "pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:a0697803ed7d4af5e4c1adf1670af078f8fcab7a86350e969f454daf598c4960"},
{file = "pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:58ca98a950171f3151c603aeea9303ef6c235f692fe555e883591103da709b23"},
{file = "pydantic_core-2.27.1-cp313-none-win32.whl", hash = "sha256:8065914ff79f7eab1599bd80406681f0ad08f8e47c880f17b416c9f8f7a26d05"},
{file = "pydantic_core-2.27.1-cp313-none-win_amd64.whl", hash = "sha256:ba630d5e3db74c79300d9a5bdaaf6200172b107f263c98a0539eeecb857b2337"},
{file = "pydantic_core-2.27.1-cp313-none-win_arm64.whl", hash = "sha256:45cf8588c066860b623cd11c4ba687f8d7175d5f7ef65f7129df8a394c502de5"},
{file = "pydantic_core-2.27.1-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:5897bec80a09b4084aee23f9b73a9477a46c3304ad1d2d07acca19723fb1de62"},
{file = "pydantic_core-2.27.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d0165ab2914379bd56908c02294ed8405c252250668ebcb438a55494c69f44ab"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6b9af86e1d8e4cfc82c2022bfaa6f459381a50b94a29e95dcdda8442d6d83864"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f6c8a66741c5f5447e047ab0ba7a1c61d1e95580d64bce852e3df1f895c4067"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9a42d6a8156ff78981f8aa56eb6394114e0dedb217cf8b729f438f643608cbcd"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:64c65f40b4cd8b0e049a8edde07e38b476da7e3aaebe63287c899d2cff253fa5"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdcf339322a3fae5cbd504edcefddd5a50d9ee00d968696846f089b4432cf78"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf99c8404f008750c846cb4ac4667b798a9f7de673ff719d705d9b2d6de49c5f"},
{file = "pydantic_core-2.27.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:8f1edcea27918d748c7e5e4d917297b2a0ab80cad10f86631e488b7cddf76a36"},
{file = "pydantic_core-2.27.1-cp38-cp38-musllinux_1_1_armv7l.whl", hash = "sha256:159cac0a3d096f79ab6a44d77a961917219707e2a130739c64d4dd46281f5c2a"},
{file = "pydantic_core-2.27.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:029d9757eb621cc6e1848fa0b0310310de7301057f623985698ed7ebb014391b"},
{file = "pydantic_core-2.27.1-cp38-none-win32.whl", hash = "sha256:a28af0695a45f7060e6f9b7092558a928a28553366519f64083c63a44f70e618"},
{file = "pydantic_core-2.27.1-cp38-none-win_amd64.whl", hash = "sha256:2d4567c850905d5eaaed2f7a404e61012a51caf288292e016360aa2b96ff38d4"},
{file = "pydantic_core-2.27.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:e9386266798d64eeb19dd3677051f5705bf873e98e15897ddb7d76f477131967"},
{file = "pydantic_core-2.27.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:4228b5b646caa73f119b1ae756216b59cc6e2267201c27d3912b592c5e323b60"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0b3dfe500de26c52abe0477dde16192ac39c98f05bf2d80e76102d394bd13854"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:aee66be87825cdf72ac64cb03ad4c15ffef4143dbf5c113f64a5ff4f81477bf9"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3b748c44bb9f53031c8cbc99a8a061bc181c1000c60a30f55393b6e9c45cc5bd"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ca038c7f6a0afd0b2448941b6ef9d5e1949e999f9e5517692eb6da58e9d44be"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6e0bd57539da59a3e4671b90a502da9a28c72322a4f17866ba3ac63a82c4498e"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ac6c2c45c847bbf8f91930d88716a0fb924b51e0c6dad329b793d670ec5db792"},
{file = "pydantic_core-2.27.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b94d4ba43739bbe8b0ce4262bcc3b7b9f31459ad120fb595627eaeb7f9b9ca01"},
{file = "pydantic_core-2.27.1-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:00e6424f4b26fe82d44577b4c842d7df97c20be6439e8e685d0d715feceb9fb9"},
{file = "pydantic_core-2.27.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:38de0a70160dd97540335b7ad3a74571b24f1dc3ed33f815f0880682e6880131"},
{file = "pydantic_core-2.27.1-cp39-none-win32.whl", hash = "sha256:7ccebf51efc61634f6c2344da73e366c75e735960b5654b63d7e6f69a5885fa3"},
{file = "pydantic_core-2.27.1-cp39-none-win_amd64.whl", hash = "sha256:a57847b090d7892f123726202b7daa20df6694cbd583b67a592e856bff603d6c"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3fa80ac2bd5856580e242dbc202db873c60a01b20309c8319b5c5986fbe53ce6"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d950caa237bb1954f1b8c9227b5065ba6875ac9771bb8ec790d956a699b78676"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e4216e64d203e39c62df627aa882f02a2438d18a5f21d7f721621f7a5d3611d"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02a3d637bd387c41d46b002f0e49c52642281edacd2740e5a42f7017feea3f2c"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:161c27ccce13b6b0c8689418da3885d3220ed2eae2ea5e9b2f7f3d48f1d52c27"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:19910754e4cc9c63bc1c7f6d73aa1cfee82f42007e407c0f413695c2f7ed777f"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:e173486019cc283dc9778315fa29a363579372fe67045e971e89b6365cc035ed"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:af52d26579b308921b73b956153066481f064875140ccd1dfd4e77db89dbb12f"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:981fb88516bd1ae8b0cbbd2034678a39dedc98752f264ac9bc5839d3923fa04c"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5fde892e6c697ce3e30c61b239330fc5d569a71fefd4eb6512fc6caec9dd9e2f"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:816f5aa087094099fff7edabb5e01cc370eb21aa1a1d44fe2d2aefdfb5599b31"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9c10c309e18e443ddb108f0ef64e8729363adbfd92d6d57beec680f6261556f3"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98476c98b02c8e9b2eec76ac4156fd006628b1b2d0ef27e548ffa978393fd154"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c3027001c28434e7ca5a6e1e527487051136aa81803ac812be51802150d880dd"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:7699b1df36a48169cdebda7ab5a2bac265204003f153b4bd17276153d997670a"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1c39b07d90be6b48968ddc8c19e7585052088fd7ec8d568bb31ff64c70ae3c97"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:46ccfe3032b3915586e469d4972973f893c0a2bb65669194a5bdea9bacc088c2"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:62ba45e21cf6571d7f716d903b5b7b6d2617e2d5d67c0923dc47b9d41369f840"},
{file = "pydantic_core-2.27.1.tar.gz", hash = "sha256:62a763352879b84aa31058fc931884055fd75089cccbd9d58bb6afd01141b235"},
{file = "pydantic_core-2.27.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2d367ca20b2f14095a8f4fa1210f5a7b78b8a20009ecced6b12818f455b1e9fa"},
{file = "pydantic_core-2.27.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:491a2b73db93fab69731eaee494f320faa4e093dbed776be1a829c2eb222c34c"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7969e133a6f183be60e9f6f56bfae753585680f3b7307a8e555a948d443cc05a"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3de9961f2a346257caf0aa508a4da705467f53778e9ef6fe744c038119737ef5"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e2bb4d3e5873c37bb3dd58714d4cd0b0e6238cebc4177ac8fe878f8b3aa8e74c"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:280d219beebb0752699480fe8f1dc61ab6615c2046d76b7ab7ee38858de0a4e7"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47956ae78b6422cbd46f772f1746799cbb862de838fd8d1fbd34a82e05b0983a"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:14d4a5c49d2f009d62a2a7140d3064f686d17a5d1a268bc641954ba181880236"},
{file = "pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:337b443af21d488716f8d0b6164de833e788aa6bd7e3a39c005febc1284f4962"},
{file = "pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:03d0f86ea3184a12f41a2d23f7ccb79cdb5a18e06993f8a45baa8dfec746f0e9"},
{file = "pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7041c36f5680c6e0f08d922aed302e98b3745d97fe1589db0a3eebf6624523af"},
{file = "pydantic_core-2.27.2-cp310-cp310-win32.whl", hash = "sha256:50a68f3e3819077be2c98110c1f9dcb3817e93f267ba80a2c05bb4f8799e2ff4"},
{file = "pydantic_core-2.27.2-cp310-cp310-win_amd64.whl", hash = "sha256:e0fd26b16394ead34a424eecf8a31a1f5137094cabe84a1bcb10fa6ba39d3d31"},
{file = "pydantic_core-2.27.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:8e10c99ef58cfdf2a66fc15d66b16c4a04f62bca39db589ae8cba08bc55331bc"},
{file = "pydantic_core-2.27.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:26f32e0adf166a84d0cb63be85c562ca8a6fa8de28e5f0d92250c6b7e9e2aff7"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c19d1ea0673cd13cc2f872f6c9ab42acc4e4f492a7ca9d3795ce2b112dd7e15"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5e68c4446fe0810e959cdff46ab0a41ce2f2c86d227d96dc3847af0ba7def306"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d9640b0059ff4f14d1f37321b94061c6db164fbe49b334b31643e0528d100d99"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:40d02e7d45c9f8af700f3452f329ead92da4c5f4317ca9b896de7ce7199ea459"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1c1fd185014191700554795c99b347d64f2bb637966c4cfc16998a0ca700d048"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d81d2068e1c1228a565af076598f9e7451712700b673de8f502f0334f281387d"},
{file = "pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:1a4207639fb02ec2dbb76227d7c751a20b1a6b4bc52850568e52260cae64ca3b"},
{file = "pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:3de3ce3c9ddc8bbd88f6e0e304dea0e66d843ec9de1b0042b0911c1663ffd474"},
{file = "pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:30c5f68ded0c36466acede341551106821043e9afaad516adfb6e8fa80a4e6a6"},
{file = "pydantic_core-2.27.2-cp311-cp311-win32.whl", hash = "sha256:c70c26d2c99f78b125a3459f8afe1aed4d9687c24fd677c6a4436bc042e50d6c"},
{file = "pydantic_core-2.27.2-cp311-cp311-win_amd64.whl", hash = "sha256:08e125dbdc505fa69ca7d9c499639ab6407cfa909214d500897d02afb816e7cc"},
{file = "pydantic_core-2.27.2-cp311-cp311-win_arm64.whl", hash = "sha256:26f0d68d4b235a2bae0c3fc585c585b4ecc51382db0e3ba402a22cbc440915e4"},
{file = "pydantic_core-2.27.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9e0c8cfefa0ef83b4da9588448b6d8d2a2bf1a53c3f1ae5fca39eb3061e2f0b0"},
{file = "pydantic_core-2.27.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:83097677b8e3bd7eaa6775720ec8e0405f1575015a463285a92bfdfe254529ef"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:172fce187655fece0c90d90a678424b013f8fbb0ca8b036ac266749c09438cb7"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:519f29f5213271eeeeb3093f662ba2fd512b91c5f188f3bb7b27bc5973816934"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:05e3a55d124407fffba0dd6b0c0cd056d10e983ceb4e5dbd10dda135c31071d6"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9c3ed807c7b91de05e63930188f19e921d1fe90de6b4f5cd43ee7fcc3525cb8c"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6fb4aadc0b9a0c063206846d603b92030eb6f03069151a625667f982887153e2"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:28ccb213807e037460326424ceb8b5245acb88f32f3d2777427476e1b32c48c4"},
{file = "pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:de3cd1899e2c279b140adde9357c4495ed9d47131b4a4eaff9052f23398076b3"},
{file = "pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:220f892729375e2d736b97d0e51466252ad84c51857d4d15f5e9692f9ef12be4"},
{file = "pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a0fcd29cd6b4e74fe8ddd2c90330fd8edf2e30cb52acda47f06dd615ae72da57"},
{file = "pydantic_core-2.27.2-cp312-cp312-win32.whl", hash = "sha256:1e2cb691ed9834cd6a8be61228471d0a503731abfb42f82458ff27be7b2186fc"},
{file = "pydantic_core-2.27.2-cp312-cp312-win_amd64.whl", hash = "sha256:cc3f1a99a4f4f9dd1de4fe0312c114e740b5ddead65bb4102884b384c15d8bc9"},
{file = "pydantic_core-2.27.2-cp312-cp312-win_arm64.whl", hash = "sha256:3911ac9284cd8a1792d3cb26a2da18f3ca26c6908cc434a18f730dc0db7bfa3b"},
{file = "pydantic_core-2.27.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7d14bd329640e63852364c306f4d23eb744e0f8193148d4044dd3dacdaacbd8b"},
{file = "pydantic_core-2.27.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:82f91663004eb8ed30ff478d77c4d1179b3563df6cdb15c0817cd1cdaf34d154"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71b24c7d61131bb83df10cc7e687433609963a944ccf45190cfc21e0887b08c9"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fa8e459d4954f608fa26116118bb67f56b93b209c39b008277ace29937453dc9"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ce8918cbebc8da707ba805b7fd0b382816858728ae7fe19a942080c24e5b7cd1"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:eda3f5c2a021bbc5d976107bb302e0131351c2ba54343f8a496dc8783d3d3a6a"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd8086fa684c4775c27f03f062cbb9eaa6e17f064307e86b21b9e0abc9c0f02e"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8d9b3388db186ba0c099a6d20f0604a44eabdeef1777ddd94786cdae158729e4"},
{file = "pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7a66efda2387de898c8f38c0cf7f14fca0b51a8ef0b24bfea5849f1b3c95af27"},
{file = "pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:18a101c168e4e092ab40dbc2503bdc0f62010e95d292b27827871dc85450d7ee"},
{file = "pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ba5dd002f88b78a4215ed2f8ddbdf85e8513382820ba15ad5ad8955ce0ca19a1"},
{file = "pydantic_core-2.27.2-cp313-cp313-win32.whl", hash = "sha256:1ebaf1d0481914d004a573394f4be3a7616334be70261007e47c2a6fe7e50130"},
{file = "pydantic_core-2.27.2-cp313-cp313-win_amd64.whl", hash = "sha256:953101387ecf2f5652883208769a79e48db18c6df442568a0b5ccd8c2723abee"},
{file = "pydantic_core-2.27.2-cp313-cp313-win_arm64.whl", hash = "sha256:ac4dbfd1691affb8f48c2c13241a2e3b60ff23247cbcf981759c768b6633cf8b"},
{file = "pydantic_core-2.27.2-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d3e8d504bdd3f10835468f29008d72fc8359d95c9c415ce6e767203db6127506"},
{file = "pydantic_core-2.27.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:521eb9b7f036c9b6187f0b47318ab0d7ca14bd87f776240b90b21c1f4f149320"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:85210c4d99a0114f5a9481b44560d7d1e35e32cc5634c656bc48e590b669b145"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d716e2e30c6f140d7560ef1538953a5cd1a87264c737643d481f2779fc247fe1"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f66d89ba397d92f840f8654756196d93804278457b5fbede59598a1f9f90b228"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:669e193c1c576a58f132e3158f9dfa9662969edb1a250c54d8fa52590045f046"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdbe7629b996647b99c01b37f11170a57ae675375b14b8c13b8518b8320ced5"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d262606bf386a5ba0b0af3b97f37c83d7011439e3dc1a9298f21efb292e42f1a"},
{file = "pydantic_core-2.27.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:cabb9bcb7e0d97f74df8646f34fc76fbf793b7f6dc2438517d7a9e50eee4f14d"},
{file = "pydantic_core-2.27.2-cp38-cp38-musllinux_1_1_armv7l.whl", hash = "sha256:d2d63f1215638d28221f664596b1ccb3944f6e25dd18cd3b86b0a4c408d5ebb9"},
{file = "pydantic_core-2.27.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:bca101c00bff0adb45a833f8451b9105d9df18accb8743b08107d7ada14bd7da"},
{file = "pydantic_core-2.27.2-cp38-cp38-win32.whl", hash = "sha256:f6f8e111843bbb0dee4cb6594cdc73e79b3329b526037ec242a3e49012495b3b"},
{file = "pydantic_core-2.27.2-cp38-cp38-win_amd64.whl", hash = "sha256:fd1aea04935a508f62e0d0ef1f5ae968774a32afc306fb8545e06f5ff5cdf3ad"},
{file = "pydantic_core-2.27.2-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:c10eb4f1659290b523af58fa7cffb452a61ad6ae5613404519aee4bfbf1df993"},
{file = "pydantic_core-2.27.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:ef592d4bad47296fb11f96cd7dc898b92e795032b4894dfb4076cfccd43a9308"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c61709a844acc6bf0b7dce7daae75195a10aac96a596ea1b776996414791ede4"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c5f762659e47fdb7b16956c71598292f60a03aa92f8b6351504359dbdba6cf"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4c9775e339e42e79ec99c441d9730fccf07414af63eac2f0e48e08fd38a64d76"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:57762139821c31847cfb2df63c12f725788bd9f04bc2fb392790959b8f70f118"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0d1e85068e818c73e048fe28cfc769040bb1f475524f4745a5dc621f75ac7630"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:097830ed52fd9e427942ff3b9bc17fab52913b2f50f2880dc4a5611446606a54"},
{file = "pydantic_core-2.27.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:044a50963a614ecfae59bb1eaf7ea7efc4bc62f49ed594e18fa1e5d953c40e9f"},
{file = "pydantic_core-2.27.2-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:4e0b4220ba5b40d727c7f879eac379b822eee5d8fff418e9d3381ee45b3b0362"},
{file = "pydantic_core-2.27.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5e4f4bb20d75e9325cc9696c6802657b58bc1dbbe3022f32cc2b2b632c3fbb96"},
{file = "pydantic_core-2.27.2-cp39-cp39-win32.whl", hash = "sha256:cca63613e90d001b9f2f9a9ceb276c308bfa2a43fafb75c8031c4f66039e8c6e"},
{file = "pydantic_core-2.27.2-cp39-cp39-win_amd64.whl", hash = "sha256:77d1bca19b0f7021b3a982e6f903dcd5b2b06076def36a652e3907f596e29f67"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:2bf14caea37e91198329b828eae1618c068dfb8ef17bb33287a7ad4b61ac314e"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:b0cb791f5b45307caae8810c2023a184c74605ec3bcbb67d13846c28ff731ff8"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:688d3fd9fcb71f41c4c015c023d12a79d1c4c0732ec9eb35d96e3388a120dcf3"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d591580c34f4d731592f0e9fe40f9cc1b430d297eecc70b962e93c5c668f15f"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:82f986faf4e644ffc189a7f1aafc86e46ef70372bb153e7001e8afccc6e54133"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:bec317a27290e2537f922639cafd54990551725fc844249e64c523301d0822fc"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:0296abcb83a797db256b773f45773da397da75a08f5fcaef41f2044adec05f50"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:0d75070718e369e452075a6017fbf187f788e17ed67a3abd47fa934d001863d9"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:7e17b560be3c98a8e3aa66ce828bdebb9e9ac6ad5466fba92eb74c4c95cb1151"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:c33939a82924da9ed65dab5a65d427205a73181d8098e79b6b426bdf8ad4e656"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:00bad2484fa6bda1e216e7345a798bd37c68fb2d97558edd584942aa41b7d278"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c817e2b40aba42bac6f457498dacabc568c3b7a986fc9ba7c8d9d260b71485fb"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:251136cdad0cb722e93732cb45ca5299fb56e1344a833640bf93b2803f8d1bfd"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d2088237af596f0a524d3afc39ab3b036e8adb054ee57cbb1dcf8e09da5b29cc"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d4041c0b966a84b4ae7a09832eb691a35aec90910cd2dbe7a208de59be77965b"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:8083d4e875ebe0b864ffef72a4304827015cff328a1be6e22cc850753bfb122b"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f141ee28a0ad2123b6611b6ceff018039df17f32ada8b534e6aa039545a3efb2"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:7d0c8399fcc1848491f00e0314bd59fb34a9c008761bcb422a057670c3f65e35"},
{file = "pydantic_core-2.27.2.tar.gz", hash = "sha256:eb026e5a4c1fee05726072337ff51d1efb6f59090b7da90d30ea58625b1ffb39"},
]
[package.dependencies]
@@ -1223,13 +1273,14 @@ typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
[[package]]
name = "pydantic-settings"
version = "2.7.0"
version = "2.7.1"
description = "Settings management using Pydantic"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pydantic_settings-2.7.0-py3-none-any.whl", hash = "sha256:e00c05d5fa6cbbb227c84bd7487c5c1065084119b750df7c8c1a554aed236eb5"},
{file = "pydantic_settings-2.7.0.tar.gz", hash = "sha256:ac4bfd4a36831a48dbf8b2d9325425b549a0a6f18cea118436d728eb4f1c4d66"},
{file = "pydantic_settings-2.7.1-py3-none-any.whl", hash = "sha256:590be9e6e24d06db33a4262829edef682500ef008565a969c73d39d5f8bfb3fd"},
{file = "pydantic_settings-2.7.1.tar.gz", hash = "sha256:10c9caad35e64bfb3c2fbf70a078c0e25cc92499782e5200747f942a065dec93"},
]
[package.dependencies]
@@ -1247,6 +1298,7 @@ version = "2.10.1"
description = "JSON Web Token implementation in Python"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "PyJWT-2.10.1-py3-none-any.whl", hash = "sha256:dcdd193e30abefd5debf142f9adfcdd2b58004e644f25406ffaebd50bd98dacb"},
{file = "pyjwt-2.10.1.tar.gz", hash = "sha256:3cc5772eb20009233caf06e9d8a0577824723b44e6648ee0a2aedb6cf9381953"},
@@ -1264,6 +1316,7 @@ version = "8.3.3"
description = "pytest: simple powerful testing with Python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pytest-8.3.3-py3-none-any.whl", hash = "sha256:a6853c7375b2663155079443d2e45de913a911a11d669df02a50814944db57b2"},
{file = "pytest-8.3.3.tar.gz", hash = "sha256:70b98107bd648308a7952b06e6ca9a50bc660be218d53c257cc1fc94fda10181"},
@@ -1282,13 +1335,14 @@ dev = ["argcomplete", "attrs (>=19.2)", "hypothesis (>=3.56)", "mock", "pygments
[[package]]
name = "pytest-asyncio"
version = "0.25.0"
version = "0.25.3"
description = "Pytest support for asyncio"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pytest_asyncio-0.25.0-py3-none-any.whl", hash = "sha256:db5432d18eac6b7e28b46dcd9b69921b55c3b1086e85febfe04e70b18d9e81b3"},
{file = "pytest_asyncio-0.25.0.tar.gz", hash = "sha256:8c0610303c9e0442a5db8604505fc0f545456ba1528824842b37b4a626cbf609"},
{file = "pytest_asyncio-0.25.3-py3-none-any.whl", hash = "sha256:9e89518e0f9bd08928f97a3482fdc4e244df17529460bc038291ccaf8f85c7c3"},
{file = "pytest_asyncio-0.25.3.tar.gz", hash = "sha256:fc1da2cf9f125ada7e710b4ddad05518d4cee187ae9412e9ac9271003497f07a"},
]
[package.dependencies]
@@ -1304,6 +1358,7 @@ version = "3.14.0"
description = "Thin-wrapper around the mock package for easier use with pytest"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pytest-mock-3.14.0.tar.gz", hash = "sha256:2719255a1efeceadbc056d6bf3df3d1c5015530fb40cf347c0f9afac88410bd0"},
{file = "pytest_mock-3.14.0-py3-none-any.whl", hash = "sha256:0b72c38033392a5f4621342fe11e9219ac11ec9d375f8e2a0c164539e0d70f6f"},
@@ -1321,6 +1376,7 @@ version = "2.9.0.post0"
description = "Extensions to the standard Python datetime module"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
groups = ["main"]
files = [
{file = "python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3"},
{file = "python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"},
@@ -1335,6 +1391,7 @@ version = "1.0.1"
description = "Read key-value pairs from a .env file and set them as environment variables"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "python-dotenv-1.0.1.tar.gz", hash = "sha256:e324ee90a023d808f1959c46bcbc04446a10ced277783dc6ee09987c37ec10ca"},
{file = "python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a"},
@@ -1349,6 +1406,7 @@ version = "2.0.2"
description = ""
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "realtime-2.0.2-py3-none-any.whl", hash = "sha256:2634c915bc38807f2013f21e8bcc4d2f79870dfd81460ddb9393883d0489928a"},
{file = "realtime-2.0.2.tar.gz", hash = "sha256:519da9325b3b8102139d51785013d592f6b2403d81fa21d838a0b0234723ed7d"},
@@ -1366,6 +1424,7 @@ version = "5.2.1"
description = "Python client for Redis database and key-value store"
optional = false
python-versions = ">=3.8"
groups = ["dev"]
files = [
{file = "redis-5.2.1-py3-none-any.whl", hash = "sha256:ee7e1056b9aea0f04c6c2ed59452947f34c4940ee025f5dd83e6a6418b6989e4"},
{file = "redis-5.2.1.tar.gz", hash = "sha256:16f2e22dff21d5125e8481515e386711a34cbec50f0e44413dd7d9c060a54e0f"},
@@ -1384,6 +1443,7 @@ version = "2.32.3"
description = "Python HTTP for Humans."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6"},
{file = "requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760"},
@@ -1405,6 +1465,7 @@ version = "4.9"
description = "Pure-Python RSA implementation"
optional = false
python-versions = ">=3.6,<4"
groups = ["main"]
files = [
{file = "rsa-4.9-py3-none-any.whl", hash = "sha256:90260d9058e514786967344d0ef75fa8727eed8a7d2e43ce9f4bcf1b536174f7"},
{file = "rsa-4.9.tar.gz", hash = "sha256:e38464a49c6c85d7f1351b0126661487a7e0a14a50f1675ec50eb34d4f20ef21"},
@@ -1415,29 +1476,30 @@ pyasn1 = ">=0.1.3"
[[package]]
name = "ruff"
version = "0.8.6"
version = "0.9.3"
description = "An extremely fast Python linter and code formatter, written in Rust."
optional = false
python-versions = ">=3.7"
groups = ["dev"]
files = [
{file = "ruff-0.8.6-py3-none-linux_armv6l.whl", hash = "sha256:defed167955d42c68b407e8f2e6f56ba52520e790aba4ca707a9c88619e580e3"},
{file = "ruff-0.8.6-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:54799ca3d67ae5e0b7a7ac234baa657a9c1784b48ec954a094da7c206e0365b1"},
{file = "ruff-0.8.6-py3-none-macosx_11_0_arm64.whl", hash = "sha256:e88b8f6d901477c41559ba540beeb5a671e14cd29ebd5683903572f4b40a9807"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0509e8da430228236a18a677fcdb0c1f102dd26d5520f71f79b094963322ed25"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:91a7ddb221779871cf226100e677b5ea38c2d54e9e2c8ed847450ebbdf99b32d"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:248b1fb3f739d01d528cc50b35ee9c4812aa58cc5935998e776bf8ed5b251e75"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:bc3c083c50390cf69e7e1b5a5a7303898966be973664ec0c4a4acea82c1d4315"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:52d587092ab8df308635762386f45f4638badb0866355b2b86760f6d3c076188"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:61323159cf21bc3897674e5adb27cd9e7700bab6b84de40d7be28c3d46dc67cf"},
{file = "ruff-0.8.6-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7ae4478b1471fc0c44ed52a6fb787e641a2ac58b1c1f91763bafbc2faddc5117"},
{file = "ruff-0.8.6-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:0c000a471d519b3e6cfc9c6680025d923b4ca140ce3e4612d1a2ef58e11f11fe"},
{file = "ruff-0.8.6-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:9257aa841e9e8d9b727423086f0fa9a86b6b420fbf4bf9e1465d1250ce8e4d8d"},
{file = "ruff-0.8.6-py3-none-musllinux_1_2_i686.whl", hash = "sha256:45a56f61b24682f6f6709636949ae8cc82ae229d8d773b4c76c09ec83964a95a"},
{file = "ruff-0.8.6-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:496dd38a53aa173481a7d8866bcd6451bd934d06976a2505028a50583e001b76"},
{file = "ruff-0.8.6-py3-none-win32.whl", hash = "sha256:e169ea1b9eae61c99b257dc83b9ee6c76f89042752cb2d83486a7d6e48e8f764"},
{file = "ruff-0.8.6-py3-none-win_amd64.whl", hash = "sha256:f1d70bef3d16fdc897ee290d7d20da3cbe4e26349f62e8a0274e7a3f4ce7a905"},
{file = "ruff-0.8.6-py3-none-win_arm64.whl", hash = "sha256:7d7fc2377a04b6e04ffe588caad613d0c460eb2ecba4c0ccbbfe2bc973cbc162"},
{file = "ruff-0.8.6.tar.gz", hash = "sha256:dcad24b81b62650b0eb8814f576fc65cfee8674772a6e24c9b747911801eeaa5"},
{file = "ruff-0.9.3-py3-none-linux_armv6l.whl", hash = "sha256:7f39b879064c7d9670197d91124a75d118d00b0990586549949aae80cdc16624"},
{file = "ruff-0.9.3-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:a187171e7c09efa4b4cc30ee5d0d55a8d6c5311b3e1b74ac5cb96cc89bafc43c"},
{file = "ruff-0.9.3-py3-none-macosx_11_0_arm64.whl", hash = "sha256:c59ab92f8e92d6725b7ded9d4a31be3ef42688a115c6d3da9457a5bda140e2b4"},
{file = "ruff-0.9.3-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2dc153c25e715be41bb228bc651c1e9b1a88d5c6e5ed0194fa0dfea02b026439"},
{file = "ruff-0.9.3-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:646909a1e25e0dc28fbc529eab8eb7bb583079628e8cbe738192853dbbe43af5"},
{file = "ruff-0.9.3-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5a5a46e09355695fbdbb30ed9889d6cf1c61b77b700a9fafc21b41f097bfbba4"},
{file = "ruff-0.9.3-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:c4bb09d2bbb394e3730d0918c00276e79b2de70ec2a5231cd4ebb51a57df9ba1"},
{file = "ruff-0.9.3-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:96a87ec31dc1044d8c2da2ebbed1c456d9b561e7d087734336518181b26b3aa5"},
{file = "ruff-0.9.3-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9bb7554aca6f842645022fe2d301c264e6925baa708b392867b7a62645304df4"},
{file = "ruff-0.9.3-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cabc332b7075a914ecea912cd1f3d4370489c8018f2c945a30bcc934e3bc06a6"},
{file = "ruff-0.9.3-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:33866c3cc2a575cbd546f2cd02bdd466fed65118e4365ee538a3deffd6fcb730"},
{file = "ruff-0.9.3-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:006e5de2621304c8810bcd2ee101587712fa93b4f955ed0985907a36c427e0c2"},
{file = "ruff-0.9.3-py3-none-musllinux_1_2_i686.whl", hash = "sha256:ba6eea4459dbd6b1be4e6bfc766079fb9b8dd2e5a35aff6baee4d9b1514ea519"},
{file = "ruff-0.9.3-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:90230a6b8055ad47d3325e9ee8f8a9ae7e273078a66401ac66df68943ced029b"},
{file = "ruff-0.9.3-py3-none-win32.whl", hash = "sha256:eabe5eb2c19a42f4808c03b82bd313fc84d4e395133fb3fc1b1516170a31213c"},
{file = "ruff-0.9.3-py3-none-win_amd64.whl", hash = "sha256:040ceb7f20791dfa0e78b4230ee9dce23da3b64dd5848e40e3bf3ab76468dcf4"},
{file = "ruff-0.9.3-py3-none-win_arm64.whl", hash = "sha256:800d773f6d4d33b0a3c60e2c6ae8f4c202ea2de056365acfa519aa48acf28e0b"},
{file = "ruff-0.9.3.tar.gz", hash = "sha256:8293f89985a090ebc3ed1064df31f3b4b56320cdfcec8b60d3295bddb955c22a"},
]
[[package]]
@@ -1446,6 +1508,7 @@ version = "1.16.0"
description = "Python 2 and 3 compatibility utilities"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
groups = ["main"]
files = [
{file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
{file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
@@ -1457,6 +1520,7 @@ version = "1.3.1"
description = "Sniff out which async library your code is running under"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2"},
{file = "sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc"},
@@ -1464,17 +1528,18 @@ files = [
[[package]]
name = "storage3"
version = "0.9.0"
version = "0.11.0"
description = "Supabase Storage client for Python."
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "storage3-0.9.0-py3-none-any.whl", hash = "sha256:8b2fb91f0c61583a2f4eac74a8bae67e00d41ff38095c8a6cd3f2ce5e0ab76e7"},
{file = "storage3-0.9.0.tar.gz", hash = "sha256:e16697f60894c94e1d9df0d2e4af783c1b3f7dd08c9013d61978825c624188c4"},
{file = "storage3-0.11.0-py3-none-any.whl", hash = "sha256:de2d8f9c9103ca91a9a9d0d69d80b07a3ab6f647b93e023e6a1a97d3607b9728"},
{file = "storage3-0.11.0.tar.gz", hash = "sha256:243583f2180686c0f0a19e6117d8a9796fd60c0ca72ec567d62b75a5af0d57a1"},
]
[package.dependencies]
httpx = {version = ">=0.26,<0.28", extras = ["http2"]}
httpx = {version = ">=0.26,<0.29", extras = ["http2"]}
python-dateutil = ">=2.8.2,<3.0.0"
[[package]]
@@ -1483,6 +1548,7 @@ version = "0.4.15"
description = "An Enum that inherits from str."
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "StrEnum-0.4.15-py3-none-any.whl", hash = "sha256:a30cda4af7cc6b5bf52c8055bc4bf4b2b6b14a93b574626da33df53cf7740659"},
{file = "StrEnum-0.4.15.tar.gz", hash = "sha256:878fb5ab705442070e4dd1929bb5e2249511c0bcf2b0eeacf3bcd80875c82eff"},
@@ -1495,36 +1561,39 @@ test = ["pylint", "pytest", "pytest-black", "pytest-cov", "pytest-pylint"]
[[package]]
name = "supabase"
version = "2.10.0"
version = "2.13.0"
description = "Supabase client for Python."
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "supabase-2.10.0-py3-none-any.whl", hash = "sha256:183fb23c04528593f8f81c24ceb8178f3a56bff40fec7ed873b6c55ebc2e420a"},
{file = "supabase-2.10.0.tar.gz", hash = "sha256:9ac095f8947bf60780e67c0edcbab53e2db3f6f3f022329397b093500bf2607c"},
{file = "supabase-2.13.0-py3-none-any.whl", hash = "sha256:6cfccc055be21dab311afc5e9d5b37f3a4966f8394703763fbc8f8e86f36eaa6"},
{file = "supabase-2.13.0.tar.gz", hash = "sha256:452574d34bd978c8d11b5f02b0182b48e8854e511c969483c83875ec01495f11"},
]
[package.dependencies]
gotrue = ">=2.10.0,<3.0.0"
httpx = ">=0.26,<0.28"
postgrest = ">=0.18,<0.19"
gotrue = ">=2.11.0,<3.0.0"
httpx = ">=0.26,<0.29"
postgrest = ">=0.19,<0.20"
realtime = ">=2.0.0,<3.0.0"
storage3 = ">=0.9.0,<0.10.0"
supafunc = ">=0.7.0,<0.8.0"
storage3 = ">=0.10,<0.12"
supafunc = ">=0.9,<0.10"
[[package]]
name = "supafunc"
version = "0.7.0"
version = "0.9.2"
description = "Library for Supabase Functions"
optional = false
python-versions = "<4.0,>=3.9"
groups = ["main"]
files = [
{file = "supafunc-0.7.0-py3-none-any.whl", hash = "sha256:4160260dc02bdd906be1e2ffd7cb3ae8b74ae437c892bb475352b6a99d9ff8eb"},
{file = "supafunc-0.7.0.tar.gz", hash = "sha256:5b1c415fba1395740b2b4eedd1d786384bd58b98f6333a11ba7889820a48b6a7"},
{file = "supafunc-0.9.2-py3-none-any.whl", hash = "sha256:be5ee9f53842c4b0ba5f4abfb5bddf9f9e37e69e755ec0526852bb15af9d2ff5"},
{file = "supafunc-0.9.2.tar.gz", hash = "sha256:f5164114a3e65e7e552539f3f1050aa3d4970885abdd7405555c17fd216e2da1"},
]
[package.dependencies]
httpx = {version = ">=0.26,<0.28", extras = ["http2"]}
httpx = {version = ">=0.26,<0.29", extras = ["http2"]}
strenum = ">=0.4.15,<0.5.0"
[[package]]
name = "tomli"
@@ -1532,6 +1601,8 @@ version = "2.1.0"
description = "A lil' TOML parser"
optional = false
python-versions = ">=3.8"
groups = ["main"]
markers = "python_version < \"3.11\""
files = [
{file = "tomli-2.1.0-py3-none-any.whl", hash = "sha256:a5c57c3d1c56f5ccdf89f6523458f60ef716e210fc47c4cfb188c5ba473e0391"},
{file = "tomli-2.1.0.tar.gz", hash = "sha256:3f646cae2aec94e17d04973e4249548320197cfabdf130015d023de4b74d8ab8"},
@@ -1543,6 +1614,7 @@ version = "4.12.2"
description = "Backported and Experimental Type Hints for Python 3.8+"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"},
{file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
@@ -1554,6 +1626,7 @@ version = "2.2.2"
description = "HTTP library with thread-safe connection pooling, file post, and more."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "urllib3-2.2.2-py3-none-any.whl", hash = "sha256:a448b2f64d686155468037e1ace9f2d2199776e17f0a46610480d311f73e3472"},
{file = "urllib3-2.2.2.tar.gz", hash = "sha256:dd505485549a7a552833da5e6063639d0d177c04f23bc3864e41e5dc5f612168"},
@@ -1571,6 +1644,7 @@ version = "12.0"
description = "An implementation of the WebSocket Protocol (RFC 6455 & 7692)"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "websockets-12.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d554236b2a2006e0ce16315c16eaa0d628dab009c33b63ea03f41c6107958374"},
{file = "websockets-12.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:2d225bb6886591b1746b17c0573e29804619c8f755b5598d875bb4235ea639be"},
@@ -1652,6 +1726,7 @@ version = "1.16.0"
description = "Module for decorators, wrappers and monkey patching."
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "wrapt-1.16.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ffa565331890b90056c01db69c0fe634a776f8019c143a5ae265f9c6bc4bd6d4"},
{file = "wrapt-1.16.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e4fdb9275308292e880dcbeb12546df7f3e0f96c6b41197e0cf37d2826359020"},
@@ -1731,6 +1806,7 @@ version = "1.11.1"
description = "Yet another URL library"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "yarl-1.11.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:400cd42185f92de559d29eeb529e71d80dfbd2f45c36844914a4a34297ca6f00"},
{file = "yarl-1.11.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:8258c86f47e080a258993eed877d579c71da7bda26af86ce6c2d2d072c11320d"},
@@ -1836,6 +1912,7 @@ version = "3.20.1"
description = "Backport of pathlib-compatible object wrapper for zip files"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "zipp-3.20.1-py3-none-any.whl", hash = "sha256:9960cd8967c8f85a56f920d5d507274e74f9ff813a0ab8889a5b5be2daf44064"},
{file = "zipp-3.20.1.tar.gz", hash = "sha256:c22b14cc4763c5a5b04134207736c107db42e9d3ef2d9779d465f5f1bcba572b"},
@@ -1850,6 +1927,6 @@ test = ["big-O", "importlib-resources", "jaraco.functools", "jaraco.itertools",
type = ["pytest-mypy"]
[metadata]
lock-version = "2.0"
lock-version = "2.1"
python-versions = ">=3.10,<4.0"
content-hash = "bf1b0125759dadb1369fff05ffba64fea3e82b9b7a43d0068e1c80974a4ebc1c"
content-hash = "a4d81b3b55a67036ca7a441793e13e8fbe20af973fcf1623f36cdee7bc82999f"

View File

@@ -9,19 +9,19 @@ packages = [{ include = "autogpt_libs" }]
[tool.poetry.dependencies]
colorama = "^0.4.6"
expiringdict = "^1.2.2"
google-cloud-logging = "^3.11.3"
pydantic = "^2.10.3"
pydantic-settings = "^2.7.0"
google-cloud-logging = "^3.11.4"
pydantic = "^2.10.6"
pydantic-settings = "^2.7.1"
pyjwt = "^2.10.1"
pytest-asyncio = "^0.25.0"
pytest-asyncio = "^0.25.3"
pytest-mock = "^3.14.0"
python = ">=3.10,<4.0"
python-dotenv = "^1.0.1"
supabase = "^2.10.0"
supabase = "^2.13.0"
[tool.poetry.group.dev.dependencies]
redis = "^5.2.1"
ruff = "^0.8.6"
ruff = "^0.9.3"
[build-system]
requires = ["poetry-core"]

View File

@@ -15,6 +15,9 @@ REDIS_PORT=6379
REDIS_PASSWORD=password
ENABLE_CREDIT=false
STRIPE_API_KEY=
STRIPE_WEBHOOK_SECRET=
# What environment things should be logged under: local dev or prod
APP_ENV=local
# What environment to behave as: "local" or "cloud"
@@ -28,6 +31,12 @@ SUPABASE_URL=http://localhost:8000
SUPABASE_SERVICE_ROLE_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyAgCiAgICAicm9sZSI6ICJzZXJ2aWNlX3JvbGUiLAogICAgImlzcyI6ICJzdXBhYmFzZS1kZW1vIiwKICAgICJpYXQiOiAxNjQxNzY5MjAwLAogICAgImV4cCI6IDE3OTk1MzU2MDAKfQ.DaYlNEoUrrEn2Ig7tqibS-PHK5vgusbcbo7X36XVt4Q
SUPABASE_JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
# RabbitMQ credentials -- Used for communication between services
RABBITMQ_HOST=localhost
RABBITMQ_PORT=5672
RABBITMQ_DEFAULT_USER=rabbitmq_user_default
RABBITMQ_DEFAULT_PASS=k0VMxyIJF9S35f3x2uaw5IWAl6Y536O7
## For local development, you may need to set FRONTEND_BASE_URL for the OAuth flow
## for integrations to work. Defaults to the value of PLATFORM_BASE_URL if not set.
# FRONTEND_BASE_URL=http://localhost:3000
@@ -36,7 +45,7 @@ SUPABASE_JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
## to use the platform's webhook-related functionality.
## If you are developing locally, you can use something like ngrok to get a public URL
## and tunnel it to your locally running backend.
PLATFORM_BASE_URL=https://your-public-url-here
PLATFORM_BASE_URL=http://localhost:3000
## == INTEGRATION CREDENTIALS == ##
# Each set of server side credentials is required for the corresponding 3rd party
@@ -58,6 +67,35 @@ GITHUB_CLIENT_SECRET=
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
# Twitter (X) OAuth 2.0 with PKCE Configuration
# 1. Create a Twitter Developer Account:
# - Visit https://developer.x.com/en and sign up
# 2. Set up your application:
# - Navigate to Developer Portal > Projects > Create Project
# - Add a new app to your project
# 3. Configure app settings:
# - App Permissions: Read + Write + Direct Messages
# - App Type: Web App, Automated App or Bot
# - OAuth 2.0 Callback URL: http://localhost:3000/auth/integrations/oauth_callback
# - Save your Client ID and Client Secret below
TWITTER_CLIENT_ID=
TWITTER_CLIENT_SECRET=
# Linear App
# Make a new workspace for your OAuth APP -- trust me
# https://linear.app/settings/api/applications/new
# Callback URL: http://localhost:3000/auth/integrations/oauth_callback
LINEAR_CLIENT_ID=
LINEAR_CLIENT_SECRET=
# To obtain Todoist API credentials:
# 1. Create a Todoist account at todoist.com
# 2. Visit the Developer Console: https://developer.todoist.com/appconsole.html
# 3. Click "Create new app"
# 4. Once created, copy your Client ID and Client Secret below
TODOIST_CLIENT_ID=
TODOIST_CLIENT_SECRET=
## ===== OPTIONAL API KEYS ===== ##
# LLM
@@ -67,10 +105,12 @@ GROQ_API_KEY=
OPEN_ROUTER_API_KEY=
# Reddit
# Go to https://www.reddit.com/prefs/apps and create a new app
# Choose "script" for the type
# Fill in the redirect uri as <your_frontend_url>/auth/integrations/oauth_callback, e.g. http://localhost:3000/auth/integrations/oauth_callback
REDDIT_CLIENT_ID=
REDDIT_CLIENT_SECRET=
REDDIT_USERNAME=
REDDIT_PASSWORD=
REDDIT_USER_AGENT="AutoGPT:1.0 (by /u/autogpt)"
# Discord
DISCORD_BOT_TOKEN=
@@ -106,6 +146,21 @@ REPLICATE_API_KEY=
# Ideogram
IDEOGRAM_API_KEY=
# Fal
FAL_API_KEY=
# Exa
EXA_API_KEY=
# E2B
E2B_API_KEY=
# Mem0
MEM0_API_KEY=
# Nvidia
NVIDIA_API_KEY=
# Logging Configuration
LOG_LEVEL=INFO
ENABLE_CLOUD_LOGGING=false

View File

@@ -66,10 +66,17 @@ We use the Poetry to manage the dependencies. To set up the project, follow thes
### Starting the server without Docker
To run the server locally, start in the autogpt_platform folder:
```sh
cd ..
```
Run the following commands to run the database in Docker while running the application locally:
```sh
docker compose --profile local up deps --build --detach
cd backend
poetry run app
```

View File

@@ -1,13 +1,52 @@
import enum
from typing import Any, List
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema, BlockType
from backend.data.model import SchemaField
from backend.util.file import MediaFile, store_media_file
from backend.util.mock import MockObject
from backend.util.text import TextFormatter
from backend.util.type import convert
formatter = TextFormatter()
class FileStoreBlock(Block):
class Input(BlockSchema):
file_in: MediaFile = SchemaField(
description="The file to store in the temporary directory, it can be a URL, data URI, or local path."
)
class Output(BlockSchema):
file_out: MediaFile = SchemaField(
description="The relative path to the stored file in the temporary directory."
)
def __init__(self):
super().__init__(
id="cbb50872-625b-42f0-8203-a2ae78242d8a",
description="Stores the input file in the temporary directory.",
categories={BlockCategory.BASIC, BlockCategory.MULTIMEDIA},
input_schema=FileStoreBlock.Input,
output_schema=FileStoreBlock.Output,
static_output=True,
)
def run(
self,
input_data: Input,
*,
graph_exec_id: str,
**kwargs,
) -> BlockOutput:
file_path = store_media_file(
graph_exec_id=graph_exec_id,
file=input_data.file_in,
return_content=False,
)
yield "file_out", file_path
class StoreValueBlock(Block):
"""
This block allows you to provide a constant value as a block, in a stateless manner.
@@ -241,7 +280,7 @@ class AgentOutputBlock(Block):
advanced=True,
)
format: str = SchemaField(
description="The format string to be used to format the recorded_value.",
description="The format string to be used to format the recorded_value. Use Jinja2 syntax.",
default="",
advanced=True,
)
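The `format` field's new hint about Jinja2 syntax can be illustrated with a standalone sketch; which variable name the block binds inside the template is an assumption here, not something this diff specifies:

```python
# Standalone Jinja2 sketch; the variable name `value` is assumed for illustration.
from jinja2 import Template

print(Template("Recorded: {{ value | upper }}").render(value="hello"))
# -> Recorded: HELLO
```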
@@ -258,6 +297,7 @@ class AgentOutputBlock(Block):
class Output(BlockSchema):
output: Any = SchemaField(description="The value recorded as output.")
name: Any = SchemaField(description="The name of the value recorded as output.")
def __init__(self):
super().__init__(
@@ -309,6 +349,7 @@ class AgentOutputBlock(Block):
yield "output", f"Error: {e}, {input_data.value}"
else:
yield "output", input_data.value
yield "name", input_data.name
class AddToDictionaryBlock(Block):
@@ -469,6 +510,48 @@ class AddToListBlock(Block):
yield "updated_list", updated_list
class FindInListBlock(Block):
class Input(BlockSchema):
list: List[Any] = SchemaField(description="The list to search in.")
value: Any = SchemaField(description="The value to search for.")
class Output(BlockSchema):
index: int = SchemaField(description="The index of the value in the list.")
found: bool = SchemaField(
description="Whether the value was found in the list."
)
not_found_value: Any = SchemaField(
description="The value that was not found in the list."
)
def __init__(self):
super().__init__(
id="5e2c6d0a-1e37-489f-b1d0-8e1812b23333",
description="Finds the index of the value in the list.",
categories={BlockCategory.BASIC},
input_schema=FindInListBlock.Input,
output_schema=FindInListBlock.Output,
test_input=[
{"list": [1, 2, 3, 4, 5], "value": 3},
{"list": [1, 2, 3, 4, 5], "value": 6},
],
test_output=[
("index", 2),
("found", True),
("found", False),
("not_found_value", 6),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
try:
yield "index", input_data.list.index(input_data.value)
yield "found", True
except ValueError:
yield "found", False
yield "not_found_value", input_data.value
class NoteBlock(Block):
class Input(BlockSchema):
text: str = SchemaField(description="The text to display in the sticky note.")
@@ -590,3 +673,47 @@ class CreateListBlock(Block):
yield "list", input_data.values
except Exception as e:
yield "error", f"Failed to create list: {str(e)}"
class TypeOptions(enum.Enum):
STRING = "string"
NUMBER = "number"
BOOLEAN = "boolean"
LIST = "list"
DICTIONARY = "dictionary"
class UniversalTypeConverterBlock(Block):
class Input(BlockSchema):
value: Any = SchemaField(
description="The value to convert to a universal type."
)
type: TypeOptions = SchemaField(description="The type to convert the value to.")
class Output(BlockSchema):
value: Any = SchemaField(description="The converted value.")
def __init__(self):
super().__init__(
id="95d1b990-ce13-4d88-9737-ba5c2070c97b",
description="This block is used to convert a value to a universal type.",
categories={BlockCategory.BASIC},
input_schema=UniversalTypeConverterBlock.Input,
output_schema=UniversalTypeConverterBlock.Output,
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
try:
converted_value = convert(
input_data.value,
{
TypeOptions.STRING: str,
TypeOptions.NUMBER: float,
TypeOptions.BOOLEAN: bool,
TypeOptions.LIST: list,
TypeOptions.DICTIONARY: dict,
}[input_data.type],
)
yield "value", converted_value
except Exception as e:
yield "error", f"Failed to convert value: {str(e)}"

View File

@@ -107,3 +107,83 @@ class ConditionBlock(Block):
yield "yes_output", yes_value
else:
yield "no_output", no_value
class IfInputMatchesBlock(Block):
class Input(BlockSchema):
input: Any = SchemaField(
description="The input to match against",
placeholder="For example: 10 or 'hello' or True",
)
value: Any = SchemaField(
description="The value to output if the input matches",
placeholder="For example: 'Greater' or 20 or False",
)
yes_value: Any = SchemaField(
description="The value to output if the input matches",
placeholder="For example: 'Greater' or 20 or False",
default=None,
)
no_value: Any = SchemaField(
description="The value to output if the input does not match",
placeholder="For example: 'Greater' or 20 or False",
default=None,
)
class Output(BlockSchema):
result: bool = SchemaField(
description="The result of the condition evaluation (True or False)"
)
yes_output: Any = SchemaField(
description="The output value if the condition is true"
)
no_output: Any = SchemaField(
description="The output value if the condition is false"
)
def __init__(self):
super().__init__(
id="6dbbc4b3-ca6c-42b6-b508-da52d23e13f2",
input_schema=IfInputMatchesBlock.Input,
output_schema=IfInputMatchesBlock.Output,
description="Handles conditional logic based on comparison operators",
categories={BlockCategory.LOGIC},
test_input=[
{
"input": 10,
"value": 10,
"yes_value": "Greater",
"no_value": "Not greater",
},
{
"input": 10,
"value": 20,
"yes_value": "Greater",
"no_value": "Not greater",
},
{
"input": 10,
"value": None,
"yes_value": "Yes",
"no_value": "No",
},
],
test_output=[
("result", True),
("yes_output", "Greater"),
("result", False),
("no_output", "Not greater"),
("result", False),
("no_output", "No"),
# ("result", True),
# ("yes_output", "Yes"),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
if input_data.input == input_data.value or input_data.input is input_data.value:
yield "result", True
yield "yes_output", input_data.yes_value
else:
yield "result", False
yield "no_output", input_data.no_value


@@ -188,3 +188,270 @@ class CodeExecutionBlock(Block):
yield "stderr_logs", stderr_logs
except Exception as e:
yield "error", str(e)
class InstantiationBlock(Block):
class Input(BlockSchema):
credentials: CredentialsMetaInput[
Literal[ProviderName.E2B], Literal["api_key"]
] = CredentialsField(
description="Enter your api key for the E2B Sandbox. You can get it in here - https://e2b.dev/docs",
)
# TODO: Option to run command in the background
setup_commands: list[str] = SchemaField(
description=(
"Shell commands to set up the sandbox before running the code. "
"You can use `curl` or `git` to install your desired Debian based "
"package manager. `pip` and `npm` are pre-installed.\n\n"
"These commands are executed with `sh`, in the foreground."
),
placeholder="pip install cowsay",
default=[],
advanced=False,
)
setup_code: str = SchemaField(
description="Code to execute in the sandbox",
placeholder="print('Hello, World!')",
default="",
advanced=False,
)
language: ProgrammingLanguage = SchemaField(
description="Programming language to execute",
default=ProgrammingLanguage.PYTHON,
advanced=False,
)
timeout: int = SchemaField(
description="Execution timeout in seconds", default=300
)
template_id: str = SchemaField(
description=(
"You can use an E2B sandbox template by entering its ID here. "
"Check out the E2B docs for more details: "
"[E2B - Sandbox template](https://e2b.dev/docs/sandbox-template)"
),
default="",
advanced=True,
)
class Output(BlockSchema):
sandbox_id: str = SchemaField(description="ID of the sandbox instance")
response: str = SchemaField(description="Response from code execution")
stdout_logs: str = SchemaField(
description="Standard output logs from execution"
)
stderr_logs: str = SchemaField(description="Standard error logs from execution")
error: str = SchemaField(description="Error message if execution failed")
def __init__(self):
super().__init__(
id="ff0861c9-1726-4aec-9e5b-bf53f3622112",
description="Instantiate an isolated sandbox environment with internet access where to execute code in.",
categories={BlockCategory.DEVELOPER_TOOLS},
input_schema=InstantiationBlock.Input,
output_schema=InstantiationBlock.Output,
test_credentials=TEST_CREDENTIALS,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"setup_code": "print('Hello World')",
"language": ProgrammingLanguage.PYTHON.value,
"setup_commands": [],
"timeout": 300,
"template_id": "",
},
test_output=[
("sandbox_id", str),
("response", "Hello World"),
("stdout_logs", "Hello World\n"),
],
test_mock={
"execute_code": lambda setup_code, language, setup_commands, timeout, api_key, template_id: (
"sandbox_id",
"Hello World",
"Hello World\n",
"",
),
},
)
def run(
self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
) -> BlockOutput:
try:
sandbox_id, response, stdout_logs, stderr_logs = self.execute_code(
input_data.setup_code,
input_data.language,
input_data.setup_commands,
input_data.timeout,
credentials.api_key.get_secret_value(),
input_data.template_id,
)
if sandbox_id:
yield "sandbox_id", sandbox_id
else:
yield "error", "Sandbox ID not found"
if response:
yield "response", response
if stdout_logs:
yield "stdout_logs", stdout_logs
if stderr_logs:
yield "stderr_logs", stderr_logs
except Exception as e:
yield "error", str(e)
def execute_code(
self,
code: str,
language: ProgrammingLanguage,
setup_commands: list[str],
timeout: int,
api_key: str,
template_id: str,
):
try:
sandbox = None
if template_id:
sandbox = Sandbox(
template=template_id, api_key=api_key, timeout=timeout
)
else:
sandbox = Sandbox(api_key=api_key, timeout=timeout)
if not sandbox:
raise Exception("Sandbox not created")
# Running setup commands
for cmd in setup_commands:
sandbox.commands.run(cmd)
# Executing the code
execution = sandbox.run_code(
code,
language=language.value,
on_error=lambda e: sandbox.kill(), # Kill the sandbox if there is an error
)
if execution.error:
raise Exception(execution.error)
response = execution.text
stdout_logs = "".join(execution.logs.stdout)
stderr_logs = "".join(execution.logs.stderr)
return sandbox.sandbox_id, response, stdout_logs, stderr_logs
except Exception as e:
raise e
class StepExecutionBlock(Block):
class Input(BlockSchema):
credentials: CredentialsMetaInput[
Literal[ProviderName.E2B], Literal["api_key"]
] = CredentialsField(
description="Enter your api key for the E2B Sandbox. You can get it in here - https://e2b.dev/docs",
)
sandbox_id: str = SchemaField(
description="ID of the sandbox instance to execute the code in",
advanced=False,
)
step_code: str = SchemaField(
description="Code to execute in the sandbox",
placeholder="print('Hello, World!')",
default="",
advanced=False,
)
language: ProgrammingLanguage = SchemaField(
description="Programming language to execute",
default=ProgrammingLanguage.PYTHON,
advanced=False,
)
class Output(BlockSchema):
response: str = SchemaField(description="Response from code execution")
stdout_logs: str = SchemaField(
description="Standard output logs from execution"
)
stderr_logs: str = SchemaField(description="Standard error logs from execution")
error: str = SchemaField(description="Error message if execution failed")
def __init__(self):
super().__init__(
id="82b59b8e-ea10-4d57-9161-8b169b0adba6",
description="Execute code in a previously instantiated sandbox environment.",
categories={BlockCategory.DEVELOPER_TOOLS},
input_schema=StepExecutionBlock.Input,
output_schema=StepExecutionBlock.Output,
test_credentials=TEST_CREDENTIALS,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"sandbox_id": "sandbox_id",
"step_code": "print('Hello World')",
"language": ProgrammingLanguage.PYTHON.value,
},
test_output=[
("response", "Hello World"),
("stdout_logs", "Hello World\n"),
],
test_mock={
"execute_step_code": lambda sandbox_id, step_code, language, api_key: (
"Hello World",
"Hello World\n",
"",
),
},
)
def execute_step_code(
self,
sandbox_id: str,
code: str,
language: ProgrammingLanguage,
api_key: str,
):
try:
sandbox = Sandbox.connect(sandbox_id=sandbox_id, api_key=api_key)
if not sandbox:
raise Exception("Sandbox not found")
# Executing the code
execution = sandbox.run_code(code, language=language.value)
if execution.error:
raise Exception(execution.error)
response = execution.text
stdout_logs = "".join(execution.logs.stdout)
stderr_logs = "".join(execution.logs.stderr)
return response, stdout_logs, stderr_logs
except Exception as e:
raise e
def run(
self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
) -> BlockOutput:
try:
response, stdout_logs, stderr_logs = self.execute_step_code(
input_data.sandbox_id,
input_data.step_code,
input_data.language,
credentials.api_key.get_secret_value(),
)
if response:
yield "response", response
if stdout_logs:
yield "stdout_logs", stdout_logs
if stderr_logs:
yield "stderr_logs", stderr_logs
except Exception as e:
yield "error", str(e)


@@ -1,22 +1,53 @@
import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from typing import Literal
from pydantic import BaseModel, ConfigDict
from pydantic import BaseModel, ConfigDict, SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import BlockSecret, SchemaField, SecretField
from backend.data.model import (
CredentialsField,
CredentialsMetaInput,
SchemaField,
UserPasswordCredentials,
)
from backend.integrations.providers import ProviderName
TEST_CREDENTIALS = UserPasswordCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="smtp",
username=SecretStr("mock-smtp-username"),
password=SecretStr("mock-smtp-password"),
title="Mock SMTP credentials",
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.title,
}
SMTPCredentials = UserPasswordCredentials
SMTPCredentialsInput = CredentialsMetaInput[
Literal[ProviderName.SMTP],
Literal["user_password"],
]
class EmailCredentials(BaseModel):
def SMTPCredentialsField() -> SMTPCredentialsInput:
return CredentialsField(
description="The SMTP integration requires a username and password.",
)
class SMTPConfig(BaseModel):
smtp_server: str = SchemaField(
default="smtp.gmail.com", description="SMTP server address"
default="smtp.example.com", description="SMTP server address"
)
smtp_port: int = SchemaField(default=25, description="SMTP port number")
smtp_username: BlockSecret = SecretField(key="smtp_username")
smtp_password: BlockSecret = SecretField(key="smtp_password")
model_config = ConfigDict(title="Email Credentials")
model_config = ConfigDict(title="SMTP Config")
class SendEmailBlock(Block):
@@ -30,10 +61,11 @@ class SendEmailBlock(Block):
body: str = SchemaField(
description="Body of the email", placeholder="Enter the email body"
)
creds: EmailCredentials = SchemaField(
description="SMTP credentials",
default=EmailCredentials(),
config: SMTPConfig = SchemaField(
description="SMTP Config",
default=SMTPConfig(),
)
credentials: SMTPCredentialsInput = SMTPCredentialsField()
class Output(BlockSchema):
status: str = SchemaField(description="Status of the email sending operation")
@@ -43,7 +75,6 @@ class SendEmailBlock(Block):
def __init__(self):
super().__init__(
disabled=True,
id="4335878a-394e-4e67-adf2-919877ff49ae",
description="This block sends an email using the provided SMTP credentials.",
categories={BlockCategory.OUTPUT},
@@ -53,25 +84,29 @@ class SendEmailBlock(Block):
"to_email": "recipient@example.com",
"subject": "Test Email",
"body": "This is a test email.",
"creds": {
"config": {
"smtp_server": "smtp.gmail.com",
"smtp_port": 25,
"smtp_username": "your-email@gmail.com",
"smtp_password": "your-gmail-password",
},
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[("status", "Email sent successfully")],
test_mock={"send_email": lambda *args, **kwargs: "Email sent successfully"},
)
@staticmethod
def send_email(
creds: EmailCredentials, to_email: str, subject: str, body: str
config: SMTPConfig,
to_email: str,
subject: str,
body: str,
credentials: SMTPCredentials,
) -> str:
smtp_server = creds.smtp_server
smtp_port = creds.smtp_port
smtp_username = creds.smtp_username.get_secret_value()
smtp_password = creds.smtp_password.get_secret_value()
smtp_server = config.smtp_server
smtp_port = config.smtp_port
smtp_username = credentials.username.get_secret_value()
smtp_password = credentials.password.get_secret_value()
msg = MIMEMultipart()
msg["From"] = smtp_username
@@ -86,10 +121,13 @@ class SendEmailBlock(Block):
return "Email sent successfully"
def run(self, input_data: Input, **kwargs) -> BlockOutput:
def run(
self, input_data: Input, *, credentials: SMTPCredentials, **kwargs
) -> BlockOutput:
yield "status", self.send_email(
input_data.creds,
input_data.to_email,
input_data.subject,
input_data.body,
config=input_data.config,
to_email=input_data.to_email,
subject=input_data.subject,
body=input_data.body,
credentials=credentials,
)
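
The body of send_email between the MIME headers and the success return is elided in this hunk; a minimal standard-library sketch of what the config/credentials split typically maps to (STARTTLS is an assumption, not taken from the diff):

import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def send_email_sketch(server: str, port: int, username: str, password: str,
                      to_email: str, subject: str, body: str) -> str:
    msg = MIMEMultipart()
    msg["From"] = username
    msg["To"] = to_email
    msg["Subject"] = subject
    msg.attach(MIMEText(body, "plain"))

    with smtplib.SMTP(server, port) as smtp:
        smtp.starttls()                 # assumption: the server supports STARTTLS
        smtp.login(username, password)  # credentials now come from UserPasswordCredentials
        smtp.send_message(msg)
    return "Email sent successfully"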


@@ -1,4 +1,4 @@
from typing import List, Optional
from typing import List
from pydantic import BaseModel
@@ -13,12 +13,12 @@ from backend.util.request import requests
class ContentRetrievalSettings(BaseModel):
text: Optional[dict] = SchemaField(
text: dict = SchemaField(
description="Text content settings",
default={"maxCharacters": 1000, "includeHtmlTags": False},
advanced=True,
)
highlights: Optional[dict] = SchemaField(
highlights: dict = SchemaField(
description="Highlight settings",
default={
"numSentences": 3,
@@ -27,7 +27,7 @@ class ContentRetrievalSettings(BaseModel):
},
advanced=True,
)
summary: Optional[dict] = SchemaField(
summary: dict = SchemaField(
description="Summary settings",
default={"query": ""},
advanced=True,


@@ -1,6 +1,9 @@
from urllib.parse import urlparse
from backend.blocks.github._auth import GithubCredentials
from backend.blocks.github._auth import (
GithubCredentials,
GithubFineGrainedAPICredentials,
)
from backend.util.request import Requests
@@ -30,12 +33,15 @@ def _convert_to_api_url(url: str) -> str:
def _get_headers(credentials: GithubCredentials) -> dict[str, str]:
return {
"Authorization": credentials.bearer(),
"Authorization": credentials.auth_header(),
"Accept": "application/vnd.github.v3+json",
}
def get_api(credentials: GithubCredentials, convert_urls: bool = True) -> Requests:
def get_api(
credentials: GithubCredentials | GithubFineGrainedAPICredentials,
convert_urls: bool = True,
) -> Requests:
return Requests(
trusted_origins=["https://api.github.com", "https://github.com"],
extra_url_validator=_convert_to_api_url if convert_urls else None,


@@ -22,6 +22,11 @@ GithubCredentialsInput = CredentialsMetaInput[
Literal["api_key", "oauth2"] if GITHUB_OAUTH_IS_CONFIGURED else Literal["api_key"],
]
GithubFineGrainedAPICredentials = APIKeyCredentials
GithubFineGrainedAPICredentialsInput = CredentialsMetaInput[
Literal[ProviderName.GITHUB], Literal["api_key"]
]
def GithubCredentialsField(scope: str) -> GithubCredentialsInput:
"""
@@ -37,6 +42,16 @@ def GithubCredentialsField(scope: str) -> GithubCredentialsInput:
)
def GithubFineGrainedAPICredentialsField(
scope: str,
) -> GithubFineGrainedAPICredentialsInput:
return CredentialsField(
required_scopes={scope},
description="The GitHub integration can be used with OAuth, "
"or any API key with sufficient permissions for the blocks it is used on.",
)
TEST_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="github",
@@ -50,3 +65,18 @@ TEST_CREDENTIALS_INPUT = {
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.type,
}
TEST_FINE_GRAINED_CREDENTIALS = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="github",
api_key=SecretStr("mock-github-api-key"),
title="Mock GitHub API key",
expires_at=None,
)
TEST_FINE_GRAINED_CREDENTIALS_INPUT = {
"provider": TEST_FINE_GRAINED_CREDENTIALS.provider,
"id": TEST_FINE_GRAINED_CREDENTIALS.id,
"type": TEST_FINE_GRAINED_CREDENTIALS.type,
"title": TEST_FINE_GRAINED_CREDENTIALS.type,
}


@@ -0,0 +1,356 @@
from enum import Enum
from typing import Optional
from pydantic import BaseModel
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from ._api import get_api
from ._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
GithubCredentials,
GithubCredentialsField,
GithubCredentialsInput,
)
# queued, in_progress, completed, waiting, requested, pending
class ChecksStatus(Enum):
QUEUED = "queued"
IN_PROGRESS = "in_progress"
COMPLETED = "completed"
WAITING = "waiting"
REQUESTED = "requested"
PENDING = "pending"
class ChecksConclusion(Enum):
SUCCESS = "success"
FAILURE = "failure"
NEUTRAL = "neutral"
CANCELLED = "cancelled"
TIMED_OUT = "timed_out"
ACTION_REQUIRED = "action_required"
SKIPPED = "skipped"
class GithubCreateCheckRunBlock(Block):
"""Block for creating a new check run on a GitHub repository."""
class Input(BlockSchema):
credentials: GithubCredentialsInput = GithubCredentialsField("repo:status")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
placeholder="https://github.com/owner/repo",
)
name: str = SchemaField(
description="The name of the check run (e.g., 'code-coverage')",
)
head_sha: str = SchemaField(
description="The SHA of the commit to check",
)
status: ChecksStatus = SchemaField(
description="Current status of the check run",
default=ChecksStatus.QUEUED,
)
conclusion: Optional[ChecksConclusion] = SchemaField(
description="The final conclusion of the check (required if status is completed)",
default=None,
)
details_url: str = SchemaField(
description="The URL for the full details of the check",
default="",
)
output_title: str = SchemaField(
description="Title of the check run output",
default="",
)
output_summary: str = SchemaField(
description="Summary of the check run output",
default="",
)
output_text: str = SchemaField(
description="Detailed text of the check run output",
default="",
)
class Output(BlockSchema):
class CheckRunResult(BaseModel):
id: int
html_url: str
status: str
check_run: CheckRunResult = SchemaField(
description="Details of the created check run"
)
error: str = SchemaField(
description="Error message if check run creation failed"
)
def __init__(self):
super().__init__(
id="2f45e89a-3b7d-4f22-b89e-6c4f5c7e1234",
description="Creates a new check run for a specific commit in a GitHub repository",
categories={BlockCategory.DEVELOPER_TOOLS},
input_schema=GithubCreateCheckRunBlock.Input,
output_schema=GithubCreateCheckRunBlock.Output,
test_input={
"repo_url": "https://github.com/owner/repo",
"name": "test-check",
"head_sha": "ce587453ced02b1526dfb4cb910479d431683101",
"status": ChecksStatus.COMPLETED.value,
"conclusion": ChecksConclusion.SUCCESS.value,
"output_title": "Test Results",
"output_summary": "All tests passed",
"credentials": TEST_CREDENTIALS_INPUT,
},
# Requires a GitHub App, which is not available via OAuth in our current system
disabled=True,
test_credentials=TEST_CREDENTIALS,
test_output=[
(
"check_run",
{
"id": 4,
"html_url": "https://github.com/owner/repo/runs/4",
"status": "completed",
},
),
],
test_mock={
"create_check_run": lambda *args, **kwargs: {
"id": 4,
"html_url": "https://github.com/owner/repo/runs/4",
"status": "completed",
}
},
)
@staticmethod
def create_check_run(
credentials: GithubCredentials,
repo_url: str,
name: str,
head_sha: str,
status: ChecksStatus,
conclusion: Optional[ChecksConclusion] = None,
details_url: Optional[str] = None,
output_title: Optional[str] = None,
output_summary: Optional[str] = None,
output_text: Optional[str] = None,
) -> dict:
api = get_api(credentials)
class CheckRunData(BaseModel):
name: str
head_sha: str
status: str
conclusion: Optional[str] = None
details_url: Optional[str] = None
output: Optional[dict[str, str]] = None
data = CheckRunData(
name=name,
head_sha=head_sha,
status=status.value,
)
if conclusion:
data.conclusion = conclusion.value
if details_url:
data.details_url = details_url
if output_title or output_summary or output_text:
output_data = {
"title": output_title or "",
"summary": output_summary or "",
"text": output_text or "",
}
data.output = output_data
check_runs_url = f"{repo_url}/check-runs"
response = api.post(check_runs_url, json=data)
result = response.json()
return {
"id": result["id"],
"html_url": result["html_url"],
"status": result["status"],
}
def run(
self,
input_data: Input,
*,
credentials: GithubCredentials,
**kwargs,
) -> BlockOutput:
try:
result = self.create_check_run(
credentials=credentials,
repo_url=input_data.repo_url,
name=input_data.name,
head_sha=input_data.head_sha,
status=input_data.status,
conclusion=input_data.conclusion,
details_url=input_data.details_url,
output_title=input_data.output_title,
output_summary=input_data.output_summary,
output_text=input_data.output_text,
)
yield "check_run", result
except Exception as e:
yield "error", str(e)
class GithubUpdateCheckRunBlock(Block):
"""Block for updating an existing check run on a GitHub repository."""
class Input(BlockSchema):
credentials: GithubCredentialsInput = GithubCredentialsField("repo:status")
repo_url: str = SchemaField(
description="URL of the GitHub repository",
placeholder="https://github.com/owner/repo",
)
check_run_id: int = SchemaField(
description="The ID of the check run to update",
)
status: ChecksStatus = SchemaField(
description="New status of the check run",
)
conclusion: ChecksConclusion = SchemaField(
description="The final conclusion of the check (required if status is completed)",
)
output_title: Optional[str] = SchemaField(
description="New title of the check run output",
default=None,
)
output_summary: Optional[str] = SchemaField(
description="New summary of the check run output",
default=None,
)
output_text: Optional[str] = SchemaField(
description="New detailed text of the check run output",
default=None,
)
class Output(BlockSchema):
class CheckRunResult(BaseModel):
id: int
html_url: str
status: str
conclusion: Optional[str]
check_run: CheckRunResult = SchemaField(
description="Details of the updated check run"
)
error: str = SchemaField(description="Error message if check run update failed")
def __init__(self):
super().__init__(
id="8a23c567-9d01-4e56-b789-0c12d3e45678", # Generated UUID
description="Updates an existing check run in a GitHub repository",
categories={BlockCategory.DEVELOPER_TOOLS},
input_schema=GithubUpdateCheckRunBlock.Input,
output_schema=GithubUpdateCheckRunBlock.Output,
# Requires a GitHub App, which is not available via OAuth in our current system
disabled=True,
test_input={
"repo_url": "https://github.com/owner/repo",
"check_run_id": 4,
"status": ChecksStatus.COMPLETED.value,
"conclusion": ChecksConclusion.SUCCESS.value,
"output_title": "Updated Results",
"output_summary": "All tests passed after retry",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
(
"check_run",
{
"id": 4,
"html_url": "https://github.com/owner/repo/runs/4",
"status": "completed",
"conclusion": "success",
},
),
],
test_mock={
"update_check_run": lambda *args, **kwargs: {
"id": 4,
"html_url": "https://github.com/owner/repo/runs/4",
"status": "completed",
"conclusion": "success",
}
},
)
@staticmethod
def update_check_run(
credentials: GithubCredentials,
repo_url: str,
check_run_id: int,
status: ChecksStatus,
conclusion: Optional[ChecksConclusion] = None,
output_title: Optional[str] = None,
output_summary: Optional[str] = None,
output_text: Optional[str] = None,
) -> dict:
api = get_api(credentials)
class UpdateCheckRunData(BaseModel):
status: str
conclusion: Optional[str] = None
output: Optional[dict[str, str]] = None
data = UpdateCheckRunData(
status=status.value,
)
if conclusion:
data.conclusion = conclusion.value
if output_title or output_summary or output_text:
output_data = {
"title": output_title or "",
"summary": output_summary or "",
"text": output_text or "",
}
data.output = output_data
check_run_url = f"{repo_url}/check-runs/{check_run_id}"
response = api.patch(check_run_url, json=data)
result = response.json()
return {
"id": result["id"],
"html_url": result["html_url"],
"status": result["status"],
"conclusion": result.get("conclusion"),
}
def run(
self,
input_data: Input,
*,
credentials: GithubCredentials,
**kwargs,
) -> BlockOutput:
try:
result = self.update_check_run(
credentials=credentials,
repo_url=input_data.repo_url,
check_run_id=input_data.check_run_id,
status=input_data.status,
conclusion=input_data.conclusion,
output_title=input_data.output_title,
output_summary=input_data.output_summary,
output_text=input_data.output_text,
)
yield "check_run", result
except Exception as e:
yield "error", str(e)


@@ -200,6 +200,7 @@ class GithubReadPullRequestBlock(Block):
include_pr_changes: bool = SchemaField(
description="Whether to include the changes made in the pull request",
default=False,
advanced=False,
)
class Output(BlockSchema):


@@ -0,0 +1,180 @@
from enum import Enum
from typing import Optional
from pydantic import BaseModel
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from ._api import get_api
from ._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
GithubFineGrainedAPICredentials,
GithubFineGrainedAPICredentialsField,
GithubFineGrainedAPICredentialsInput,
)
class StatusState(Enum):
ERROR = "error"
FAILURE = "failure"
PENDING = "pending"
SUCCESS = "success"
class GithubCreateStatusBlock(Block):
"""Block for creating a commit status on a GitHub repository."""
class Input(BlockSchema):
credentials: GithubFineGrainedAPICredentialsInput = (
GithubFineGrainedAPICredentialsField("repo:status")
)
repo_url: str = SchemaField(
description="URL of the GitHub repository",
placeholder="https://github.com/owner/repo",
)
sha: str = SchemaField(
description="The SHA of the commit to set status for",
)
state: StatusState = SchemaField(
description="The state of the status (error, failure, pending, success)",
)
target_url: Optional[str] = SchemaField(
description="URL with additional details about this status",
default=None,
)
description: Optional[str] = SchemaField(
description="Short description of the status",
default=None,
)
check_name: Optional[str] = SchemaField(
description="Label to differentiate this status from others",
default="AutoGPT Platform Checks",
advanced=False,
)
class Output(BlockSchema):
class StatusResult(BaseModel):
id: int
url: str
state: str
context: str
description: Optional[str]
target_url: Optional[str]
created_at: str
updated_at: str
status: StatusResult = SchemaField(description="Details of the created status")
error: str = SchemaField(description="Error message if status creation failed")
def __init__(self):
super().__init__(
id="3d67f123-a4b5-4c89-9d01-2e34f5c67890", # Generated UUID
description="Creates a new commit status in a GitHub repository",
categories={BlockCategory.DEVELOPER_TOOLS},
input_schema=GithubCreateStatusBlock.Input,
output_schema=GithubCreateStatusBlock.Output,
test_input={
"repo_url": "https://github.com/owner/repo",
"sha": "ce587453ced02b1526dfb4cb910479d431683101",
"state": StatusState.SUCCESS.value,
"target_url": "https://example.com/build/status",
"description": "The build succeeded!",
"check_name": "continuous-integration/jenkins",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
(
"status",
{
"id": 1234567890,
"url": "https://api.github.com/repos/owner/repo/statuses/ce587453ced02b1526dfb4cb910479d431683101",
"state": "success",
"context": "continuous-integration/jenkins",
"description": "The build succeeded!",
"target_url": "https://example.com/build/status",
"created_at": "2024-01-21T10:00:00Z",
"updated_at": "2024-01-21T10:00:00Z",
},
),
],
test_mock={
"create_status": lambda *args, **kwargs: {
"id": 1234567890,
"url": "https://api.github.com/repos/owner/repo/statuses/ce587453ced02b1526dfb4cb910479d431683101",
"state": "success",
"context": "continuous-integration/jenkins",
"description": "The build succeeded!",
"target_url": "https://example.com/build/status",
"created_at": "2024-01-21T10:00:00Z",
"updated_at": "2024-01-21T10:00:00Z",
}
},
)
@staticmethod
def create_status(
credentials: GithubFineGrainedAPICredentials,
repo_url: str,
sha: str,
state: StatusState,
target_url: Optional[str] = None,
description: Optional[str] = None,
context: str = "default",
) -> dict:
api = get_api(credentials)
class StatusData(BaseModel):
state: str
target_url: Optional[str] = None
description: Optional[str] = None
context: str
data = StatusData(
state=state.value,
context=context,
)
if target_url:
data.target_url = target_url
if description:
data.description = description
status_url = f"{repo_url}/statuses/{sha}"
response = api.post(status_url, json=data)
result = response.json()
return {
"id": result["id"],
"url": result["url"],
"state": result["state"],
"context": result["context"],
"description": result.get("description"),
"target_url": result.get("target_url"),
"created_at": result["created_at"],
"updated_at": result["updated_at"],
}
def run(
self,
input_data: Input,
*,
credentials: GithubFineGrainedAPICredentials,
**kwargs,
) -> BlockOutput:
try:
result = self.create_status(
credentials=credentials,
repo_url=input_data.repo_url,
sha=input_data.sha,
state=input_data.state,
target_url=input_data.target_url,
description=input_data.description,
context=input_data.check_name or "AutoGPT Platform Checks",
)
yield "status", result
except Exception as e:
yield "error", str(e)


@@ -151,7 +151,7 @@ class IdeogramModelBlock(Block):
super().__init__(
id="6ab085e2-20b3-4055-bc3e-08036e01eca6",
description="This block runs Ideogram models with both simple and advanced settings.",
categories={BlockCategory.AI},
categories={BlockCategory.AI, BlockCategory.MULTIMEDIA},
input_schema=IdeogramModelBlock.Input,
output_schema=IdeogramModelBlock.Output,
test_input={


@@ -0,0 +1,272 @@
from __future__ import annotations
import json
from typing import Any, Dict, Optional
from backend.blocks.linear._auth import LinearCredentials
from backend.blocks.linear.models import (
CreateCommentResponse,
CreateIssueResponse,
Issue,
Project,
)
from backend.util.request import Requests
class LinearAPIException(Exception):
def __init__(self, message: str, status_code: int):
super().__init__(message)
self.status_code = status_code
class LinearClient:
"""Client for the Linear API
If you're looking for the schema: https://studio.apollographql.com/public/Linear-API/variant/current/schema
"""
API_URL = "https://api.linear.app/graphql"
def __init__(
self,
credentials: LinearCredentials | None = None,
custom_requests: Optional[Requests] = None,
):
if custom_requests:
self._requests = custom_requests
else:
headers: Dict[str, str] = {
"Content-Type": "application/json",
}
if credentials:
headers["Authorization"] = credentials.auth_header()
self._requests = Requests(
extra_headers=headers,
trusted_origins=["https://api.linear.app"],
raise_for_status=False,
)
def _execute_graphql_request(
self, query: str, variables: dict | None = None
) -> Any:
"""
Executes a GraphQL request against the Linear API and returns the response data.
Args:
query: The GraphQL query string.
variables (optional): Any GraphQL query variables
Returns:
The parsed JSON response data, or raises a LinearAPIException on error.
"""
payload: Dict[str, Any] = {"query": query}
if variables:
payload["variables"] = variables
response = self._requests.post(self.API_URL, json=payload)
if not response.ok:
try:
error_data = response.json()
error_message = error_data.get("errors", [{}])[0].get("message", "")
except json.JSONDecodeError:
error_message = response.text
raise LinearAPIException(
f"Linear API request failed ({response.status_code}): {error_message}",
response.status_code,
)
response_data = response.json()
if "errors" in response_data:
error_messages = [
error.get("message", "") for error in response_data["errors"]
]
raise LinearAPIException(
f"Linear API returned errors: {', '.join(error_messages)}",
response.status_code,
)
return response_data["data"]
def query(self, query: str, variables: Optional[dict] = None) -> dict:
"""Executes a GraphQL query.
Args:
query: The GraphQL query string.
variables: Query variables, if any.
Returns:
The response data.
"""
return self._execute_graphql_request(query, variables)
def mutate(self, mutation: str, variables: Optional[dict] = None) -> dict:
"""Executes a GraphQL mutation.
Args:
mutation: The GraphQL mutation string.
variables: Query variables, if any.
Returns:
The response data.
"""
return self._execute_graphql_request(mutation, variables)
def try_create_comment(self, issue_id: str, comment: str) -> CreateCommentResponse:
try:
mutation = """
mutation CommentCreate($input: CommentCreateInput!) {
commentCreate(input: $input) {
success
comment {
id
body
}
}
}
"""
variables = {
"input": {
"body": comment,
"issueId": issue_id,
}
}
added_comment = self.mutate(mutation, variables)
# Select the commentCreate field from the mutation response
return CreateCommentResponse(**added_comment["commentCreate"])
except LinearAPIException as e:
raise e
def try_get_team_by_name(self, team_name: str) -> str:
try:
query = """
query GetTeamId($searchTerm: String!) {
teams(filter: {
or: [
{ name: { eqIgnoreCase: $searchTerm } },
{ key: { eqIgnoreCase: $searchTerm } }
]
}) {
nodes {
id
name
key
}
}
}
"""
variables: dict[str, Any] = {
"searchTerm": team_name,
}
team_id = self.query(query, variables)
return team_id["teams"]["nodes"][0]["id"]
except LinearAPIException as e:
raise e
def try_create_issue(
self,
team_id: str,
title: str,
description: str | None = None,
priority: int | None = None,
project_id: str | None = None,
) -> CreateIssueResponse:
try:
mutation = """
mutation IssueCreate($input: IssueCreateInput!) {
issueCreate(input: $input) {
issue {
title
description
id
identifier
priority
}
}
}
"""
variables: dict[str, Any] = {
"input": {
"teamId": team_id,
"title": title,
}
}
if project_id:
variables["input"]["projectId"] = project_id
if description:
variables["input"]["description"] = description
if priority:
variables["input"]["priority"] = priority
added_issue = self.mutate(mutation, variables)
return CreateIssueResponse(**added_issue["issueCreate"])
except LinearAPIException as e:
raise e
def try_search_projects(self, term: str) -> list[Project]:
try:
query = """
query SearchProjects($term: String!, $includeComments: Boolean!) {
searchProjects(term: $term, includeComments: $includeComments) {
nodes {
id
name
description
priority
progress
content
}
}
}
"""
variables: dict[str, Any] = {
"term": term,
"includeComments": True,
}
projects = self.query(query, variables)
return [
Project(**project) for project in projects["searchProjects"]["nodes"]
]
except LinearAPIException as e:
raise e
def try_search_issues(self, term: str) -> list[Issue]:
try:
query = """
query SearchIssues($term: String!, $includeComments: Boolean!) {
searchIssues(term: $term, includeComments: $includeComments) {
nodes {
id
identifier
title
description
priority
}
}
}
"""
variables: dict[str, Any] = {
"term": term,
"includeComments": True,
}
issues = self.query(query, variables)
return [Issue(**issue) for issue in issues["searchIssues"]["nodes"]]
except LinearAPIException as e:
raise e
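
A minimal usage sketch for the client above. The credentials object is assumed to come from the Linear auth helpers; the viewer query follows Linear's public GraphQL schema and is not part of this diff:

from backend.blocks.linear._api import LinearAPIException, LinearClient

def whoami(creds) -> str:
    client = LinearClient(credentials=creds)
    try:
        # `viewer { id name }` is a standard Linear query; adjust fields as needed.
        data = client.query("query { viewer { id name } }")
        return data["viewer"]["name"]
    except LinearAPIException as e:
        # Non-2xx responses and GraphQL-level errors both surface here with a status code.
        return f"Linear API error ({e.status_code}): {e}"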


@@ -0,0 +1,101 @@
from enum import Enum
from typing import Literal
from pydantic import SecretStr
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
OAuth2Credentials,
)
from backend.integrations.providers import ProviderName
from backend.util.settings import Secrets
secrets = Secrets()
LINEAR_OAUTH_IS_CONFIGURED = bool(
secrets.linear_client_id and secrets.linear_client_secret
)
LinearCredentials = OAuth2Credentials | APIKeyCredentials
# LinearCredentialsInput = CredentialsMetaInput[
# Literal[ProviderName.LINEAR],
# Literal["oauth2", "api_key"] if LINEAR_OAUTH_IS_CONFIGURED else Literal["oauth2"],
# ]
LinearCredentialsInput = CredentialsMetaInput[
Literal[ProviderName.LINEAR], Literal["oauth2"]
]
# (required) Comma separated list of scopes:
# read - (Default) Read access for the user's account. This scope will always be present.
# write - Write access for the user's account. If your application only needs to create comments, use a more targeted scope
# issues:create - Allows creating new issues and their attachments
# comments:create - Allows creating new issue comments
# timeSchedule:write - Allows creating and modifying time schedules
# admin - Full access to admin level endpoints. You should never ask for this permission unless it's absolutely needed
class LinearScope(str, Enum):
READ = "read"
WRITE = "write"
ISSUES_CREATE = "issues:create"
COMMENTS_CREATE = "comments:create"
TIME_SCHEDULE_WRITE = "timeSchedule:write"
ADMIN = "admin"
def LinearCredentialsField(scopes: list[LinearScope]) -> LinearCredentialsInput:
"""
Creates a Linear credentials input on a block.
Params:
scopes: The authorization scopes needed for the block to work
(see the Linear scope list above).
"""
return CredentialsField(
required_scopes=set([LinearScope.READ.value]).union(
set([scope.value for scope in scopes])
),
description="The Linear integration can be used with OAuth, "
"or any API key with sufficient permissions for the blocks it is used on.",
)
TEST_CREDENTIALS_OAUTH = OAuth2Credentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="linear",
title="Mock Linear API key",
username="mock-linear-username",
access_token=SecretStr("mock-linear-access-token"),
access_token_expires_at=None,
refresh_token=SecretStr("mock-linear-refresh-token"),
refresh_token_expires_at=None,
scopes=["mock-linear-scopes"],
)
TEST_CREDENTIALS_API_KEY = APIKeyCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="linear",
title="Mock Linear API key",
api_key=SecretStr("mock-linear-api-key"),
expires_at=None,
)
TEST_CREDENTIALS_INPUT_OAUTH = {
"provider": TEST_CREDENTIALS_OAUTH.provider,
"id": TEST_CREDENTIALS_OAUTH.id,
"type": TEST_CREDENTIALS_OAUTH.type,
"title": TEST_CREDENTIALS_OAUTH.type,
}
TEST_CREDENTIALS_INPUT_API_KEY = {
"provider": TEST_CREDENTIALS_API_KEY.provider,
"id": TEST_CREDENTIALS_API_KEY.id,
"type": TEST_CREDENTIALS_API_KEY.type,
"title": TEST_CREDENTIALS_API_KEY.type,
}
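
To illustrate how LinearCredentialsField composes scopes (a sketch; the read scope is always included on top of whatever the block requests):

from backend.blocks.linear._auth import LinearCredentialsField, LinearScope

# Hypothetical block that needs to create both issues and comments.
credentials_field = LinearCredentialsField(
    scopes=[LinearScope.ISSUES_CREATE, LinearScope.COMMENTS_CREATE],
)
# Expected required_scopes on the resulting field:
# {"read", "issues:create", "comments:create"}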


@@ -0,0 +1,83 @@
from backend.blocks.linear._api import LinearAPIException, LinearClient
from backend.blocks.linear._auth import (
LINEAR_OAUTH_IS_CONFIGURED,
TEST_CREDENTIALS_INPUT_OAUTH,
TEST_CREDENTIALS_OAUTH,
LinearCredentials,
LinearCredentialsField,
LinearCredentialsInput,
LinearScope,
)
from backend.blocks.linear.models import CreateCommentResponse
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class LinearCreateCommentBlock(Block):
"""Block for creating comments on Linear issues"""
class Input(BlockSchema):
credentials: LinearCredentialsInput = LinearCredentialsField(
scopes=[LinearScope.COMMENTS_CREATE],
)
issue_id: str = SchemaField(description="ID of the issue to comment on")
comment: str = SchemaField(description="Comment text to add to the issue")
class Output(BlockSchema):
comment_id: str = SchemaField(description="ID of the created comment")
comment_body: str = SchemaField(
description="Text content of the created comment"
)
error: str = SchemaField(description="Error message if comment creation failed")
def __init__(self):
super().__init__(
id="8f7d3a2e-9b5c-4c6a-8f1d-7c8b3e4a5d6c",
description="Creates a new comment on a Linear issue",
input_schema=self.Input,
output_schema=self.Output,
categories={BlockCategory.PRODUCTIVITY, BlockCategory.ISSUE_TRACKING},
test_input={
"issue_id": "TEST-123",
"comment": "Test comment",
"credentials": TEST_CREDENTIALS_INPUT_OAUTH,
},
disabled=not LINEAR_OAUTH_IS_CONFIGURED,
test_credentials=TEST_CREDENTIALS_OAUTH,
test_output=[("comment_id", "abc123"), ("comment_body", "Test comment")],
test_mock={
"create_comment": lambda *args, **kwargs: (
"abc123",
"Test comment",
)
},
)
@staticmethod
def create_comment(
credentials: LinearCredentials, issue_id: str, comment: str
) -> tuple[str, str]:
client = LinearClient(credentials=credentials)
response: CreateCommentResponse = client.try_create_comment(
issue_id=issue_id, comment=comment
)
return response.comment.id, response.comment.body
def run(
self, input_data: Input, *, credentials: LinearCredentials, **kwargs
) -> BlockOutput:
"""Execute the comment creation"""
try:
comment_id, comment_body = self.create_comment(
credentials=credentials,
issue_id=input_data.issue_id,
comment=input_data.comment,
)
yield "comment_id", comment_id
yield "comment_body", comment_body
except LinearAPIException as e:
yield "error", str(e)
except Exception as e:
yield "error", f"Unexpected error: {str(e)}"


@@ -0,0 +1,189 @@
from backend.blocks.linear._api import LinearAPIException, LinearClient
from backend.blocks.linear._auth import (
LINEAR_OAUTH_IS_CONFIGURED,
TEST_CREDENTIALS_INPUT_OAUTH,
TEST_CREDENTIALS_OAUTH,
LinearCredentials,
LinearCredentialsField,
LinearCredentialsInput,
LinearScope,
)
from backend.blocks.linear.models import CreateIssueResponse, Issue
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class LinearCreateIssueBlock(Block):
"""Block for creating issues on Linear"""
class Input(BlockSchema):
credentials: LinearCredentialsInput = LinearCredentialsField(
scopes=[LinearScope.ISSUES_CREATE],
)
title: str = SchemaField(description="Title of the issue")
description: str | None = SchemaField(description="Description of the issue")
team_name: str = SchemaField(
description="Name of the team to create the issue on"
)
priority: int | None = SchemaField(
description="Priority of the issue",
default=None,
minimum=0,
maximum=4,
)
project_name: str | None = SchemaField(
description="Name of the project to create the issue on",
default=None,
)
class Output(BlockSchema):
issue_id: str = SchemaField(description="ID of the created issue")
issue_title: str = SchemaField(description="Title of the created issue")
error: str = SchemaField(description="Error message if issue creation failed")
def __init__(self):
super().__init__(
id="f9c68f55-dcca-40a8-8771-abf9601680aa",
description="Creates a new issue on Linear",
disabled=not LINEAR_OAUTH_IS_CONFIGURED,
input_schema=self.Input,
output_schema=self.Output,
categories={BlockCategory.PRODUCTIVITY, BlockCategory.ISSUE_TRACKING},
test_input={
"title": "Test issue",
"description": "Test description",
"team_name": "Test team",
"project_name": "Test project",
"credentials": TEST_CREDENTIALS_INPUT_OAUTH,
},
test_credentials=TEST_CREDENTIALS_OAUTH,
test_output=[("issue_id", "abc123"), ("issue_title", "Test issue")],
test_mock={
"create_issue": lambda *args, **kwargs: (
"abc123",
"Test issue",
)
},
)
@staticmethod
def create_issue(
credentials: LinearCredentials,
team_name: str,
title: str,
description: str | None = None,
priority: int | None = None,
project_name: str | None = None,
) -> tuple[str, str]:
client = LinearClient(credentials=credentials)
team_id = client.try_get_team_by_name(team_name=team_name)
project_id: str | None = None
if project_name:
projects = client.try_search_projects(term=project_name)
if projects:
project_id = projects[0].id
else:
raise LinearAPIException("Project not found", status_code=404)
response: CreateIssueResponse = client.try_create_issue(
team_id=team_id,
title=title,
description=description,
priority=priority,
project_id=project_id,
)
return response.issue.identifier, response.issue.title
def run(
self, input_data: Input, *, credentials: LinearCredentials, **kwargs
) -> BlockOutput:
"""Execute the issue creation"""
try:
issue_id, issue_title = self.create_issue(
credentials=credentials,
team_name=input_data.team_name,
title=input_data.title,
description=input_data.description,
priority=input_data.priority,
project_name=input_data.project_name,
)
yield "issue_id", issue_id
yield "issue_title", issue_title
except LinearAPIException as e:
yield "error", str(e)
except Exception as e:
yield "error", f"Unexpected error: {str(e)}"
class LinearSearchIssuesBlock(Block):
"""Block for searching issues on Linear"""
class Input(BlockSchema):
term: str = SchemaField(description="Term to search for issues")
credentials: LinearCredentialsInput = LinearCredentialsField(
scopes=[LinearScope.READ],
)
class Output(BlockSchema):
issues: list[Issue] = SchemaField(description="List of issues")
def __init__(self):
super().__init__(
id="b5a2a0e6-26b4-4c5b-8a42-bc79e9cb65c2",
description="Searches for issues on Linear",
input_schema=self.Input,
output_schema=self.Output,
disabled=not LINEAR_OAUTH_IS_CONFIGURED,
test_input={
"term": "Test issue",
"credentials": TEST_CREDENTIALS_INPUT_OAUTH,
},
test_credentials=TEST_CREDENTIALS_OAUTH,
test_output=[
(
"issues",
[
Issue(
id="abc123",
identifier="abc123",
title="Test issue",
description="Test description",
priority=1,
)
],
)
],
test_mock={
"search_issues": lambda *args, **kwargs: [
Issue(
id="abc123",
identifier="abc123",
title="Test issue",
description="Test description",
priority=1,
)
]
},
)
@staticmethod
def search_issues(
credentials: LinearCredentials,
term: str,
) -> list[Issue]:
client = LinearClient(credentials=credentials)
response: list[Issue] = client.try_search_issues(term=term)
return response
def run(
self, input_data: Input, *, credentials: LinearCredentials, **kwargs
) -> BlockOutput:
"""Execute the issue search"""
try:
issues = self.search_issues(credentials=credentials, term=input_data.term)
yield "issues", issues
except LinearAPIException as e:
yield "error", str(e)
except Exception as e:
yield "error", f"Unexpected error: {str(e)}"


@@ -0,0 +1,41 @@
from pydantic import BaseModel
class Comment(BaseModel):
id: str
body: str
class CreateCommentInput(BaseModel):
body: str
issueId: str
class CreateCommentResponse(BaseModel):
success: bool
comment: Comment
class CreateCommentResponseWrapper(BaseModel):
commentCreate: CreateCommentResponse
class Issue(BaseModel):
id: str
identifier: str
title: str
description: str | None
priority: int
class CreateIssueResponse(BaseModel):
issue: Issue
class Project(BaseModel):
id: str
name: str
description: str
priority: int
progress: int
content: str


@@ -0,0 +1,95 @@
from backend.blocks.linear._api import LinearAPIException, LinearClient
from backend.blocks.linear._auth import (
LINEAR_OAUTH_IS_CONFIGURED,
TEST_CREDENTIALS_INPUT_OAUTH,
TEST_CREDENTIALS_OAUTH,
LinearCredentials,
LinearCredentialsField,
LinearCredentialsInput,
LinearScope,
)
from backend.blocks.linear.models import Project
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class LinearSearchProjectsBlock(Block):
"""Block for searching projects on Linear"""
class Input(BlockSchema):
credentials: LinearCredentialsInput = LinearCredentialsField(
scopes=[LinearScope.READ],
)
term: str = SchemaField(description="Term to search for projects")
class Output(BlockSchema):
projects: list[Project] = SchemaField(description="List of projects")
error: str = SchemaField(description="Error message if issue creation failed")
def __init__(self):
super().__init__(
id="446a1d35-9d8f-4ac5-83ea-7684ec50e6af",
description="Searches for projects on Linear",
input_schema=self.Input,
output_schema=self.Output,
categories={BlockCategory.PRODUCTIVITY, BlockCategory.ISSUE_TRACKING},
test_input={
"term": "Test project",
"credentials": TEST_CREDENTIALS_INPUT_OAUTH,
},
disabled=not LINEAR_OAUTH_IS_CONFIGURED,
test_credentials=TEST_CREDENTIALS_OAUTH,
test_output=[
(
"projects",
[
Project(
id="abc123",
name="Test project",
description="Test description",
priority=1,
progress=1,
content="Test content",
)
],
)
],
test_mock={
"search_projects": lambda *args, **kwargs: [
Project(
id="abc123",
name="Test project",
description="Test description",
priority=1,
progress=1,
content="Test content",
)
]
},
)
@staticmethod
def search_projects(
credentials: LinearCredentials,
term: str,
) -> list[Project]:
client = LinearClient(credentials=credentials)
response: list[Project] = client.try_search_projects(term=term)
return response
def run(
self, input_data: Input, *, credentials: LinearCredentials, **kwargs
) -> BlockOutput:
"""Execute the project search"""
try:
projects = self.search_projects(
credentials=credentials,
term=input_data.term,
)
yield "projects", projects
except LinearAPIException as e:
yield "error", str(e)
except Exception as e:
yield "error", f"Unexpected error: {str(e)}"


@@ -1,5 +1,6 @@
import ast
import logging
from abc import ABC
from enum import Enum, EnumMeta
from json import JSONDecodeError
from types import MappingProxyType
@@ -26,8 +27,10 @@ from backend.data.model import (
)
from backend.util import json
from backend.util.settings import BehaveAs, Settings
from backend.util.text import TextFormatter
logger = logging.getLogger(__name__)
fmt = TextFormatter()
LLMProviderName = Literal[
ProviderName.ANTHROPIC,
@@ -66,6 +69,7 @@ def AICredentialsField() -> AICredentials:
class ModelMetadata(NamedTuple):
provider: str
context_window: int
max_output_tokens: int | None
class LlmModelMeta(EnumMeta):
@@ -89,6 +93,8 @@ class LlmModelMeta(EnumMeta):
class LlmModel(str, Enum, metaclass=LlmModelMeta):
# OpenAI models
O3_MINI = "o3-mini"
O1 = "o1"
O1_PREVIEW = "o1-preview"
O1_MINI = "o1-mini"
GPT4O_MINI = "gpt-4o-mini"
@@ -97,29 +103,31 @@ class LlmModel(str, Enum, metaclass=LlmModelMeta):
GPT3_5_TURBO = "gpt-3.5-turbo"
# Anthropic models
CLAUDE_3_5_SONNET = "claude-3-5-sonnet-latest"
CLAUDE_3_5_HAIKU = "claude-3-5-haiku-latest"
CLAUDE_3_HAIKU = "claude-3-haiku-20240307"
# Groq models
LLAMA3_8B = "llama3-8b-8192"
LLAMA3_70B = "llama3-70b-8192"
MIXTRAL_8X7B = "mixtral-8x7b-32768"
GEMMA_7B = "gemma-7b-it"
GEMMA2_9B = "gemma2-9b-it"
# New Groq models (Preview)
LLAMA3_1_405B = "llama-3.1-405b-reasoning"
LLAMA3_1_70B = "llama-3.1-70b-versatile"
LLAMA3_3_70B = "llama-3.3-70b-versatile"
LLAMA3_1_8B = "llama-3.1-8b-instant"
LLAMA3_70B = "llama3-70b-8192"
LLAMA3_8B = "llama3-8b-8192"
MIXTRAL_8X7B = "mixtral-8x7b-32768"
# Groq preview models
DEEPSEEK_LLAMA_70B = "deepseek-r1-distill-llama-70b"
# Ollama models
OLLAMA_LLAMA3_3 = "llama3.3"
OLLAMA_LLAMA3_2 = "llama3.2"
OLLAMA_LLAMA3_8B = "llama3"
OLLAMA_LLAMA3_405B = "llama3.1:405b"
OLLAMA_DOLPHIN = "dolphin-mistral:latest"
# OpenRouter models
GEMINI_FLASH_1_5_8B = "google/gemini-flash-1.5"
GEMINI_FLASH_1_5 = "google/gemini-flash-1.5"
GROK_BETA = "x-ai/grok-beta"
MISTRAL_NEMO = "mistralai/mistral-nemo"
COHERE_COMMAND_R_08_2024 = "cohere/command-r-08-2024"
COHERE_COMMAND_R_PLUS_08_2024 = "cohere/command-r-plus-08-2024"
EVA_QWEN_2_5_32B = "eva-unit-01/eva-qwen-2.5-32b"
DEEPSEEK_CHAT = "deepseek/deepseek-chat"
DEEPSEEK_CHAT = "deepseek/deepseek-chat" # Actually: DeepSeek V3
PERPLEXITY_LLAMA_3_1_SONAR_LARGE_128K_ONLINE = (
"perplexity/llama-3.1-sonar-large-128k-online"
)
@@ -144,46 +152,74 @@ class LlmModel(str, Enum, metaclass=LlmModelMeta):
def context_window(self) -> int:
return self.metadata.context_window
@property
def max_output_tokens(self) -> int | None:
return self.metadata.max_output_tokens
MODEL_METADATA = {
LlmModel.O1_PREVIEW: ModelMetadata("openai", 32000),
LlmModel.O1_MINI: ModelMetadata("openai", 62000),
LlmModel.GPT4O_MINI: ModelMetadata("openai", 128000),
LlmModel.GPT4O: ModelMetadata("openai", 128000),
LlmModel.GPT4_TURBO: ModelMetadata("openai", 128000),
LlmModel.GPT3_5_TURBO: ModelMetadata("openai", 16385),
LlmModel.CLAUDE_3_5_SONNET: ModelMetadata("anthropic", 200000),
LlmModel.CLAUDE_3_HAIKU: ModelMetadata("anthropic", 200000),
LlmModel.LLAMA3_8B: ModelMetadata("groq", 8192),
LlmModel.LLAMA3_70B: ModelMetadata("groq", 8192),
LlmModel.MIXTRAL_8X7B: ModelMetadata("groq", 32768),
LlmModel.GEMMA_7B: ModelMetadata("groq", 8192),
LlmModel.GEMMA2_9B: ModelMetadata("groq", 8192),
LlmModel.LLAMA3_1_405B: ModelMetadata("groq", 8192),
# Limited to 16k during preview
LlmModel.LLAMA3_1_70B: ModelMetadata("groq", 131072),
LlmModel.LLAMA3_1_8B: ModelMetadata("groq", 131072),
LlmModel.OLLAMA_LLAMA3_8B: ModelMetadata("ollama", 8192),
LlmModel.OLLAMA_LLAMA3_405B: ModelMetadata("ollama", 8192),
LlmModel.OLLAMA_DOLPHIN: ModelMetadata("ollama", 32768),
LlmModel.GEMINI_FLASH_1_5_8B: ModelMetadata("open_router", 8192),
LlmModel.GROK_BETA: ModelMetadata("open_router", 8192),
LlmModel.MISTRAL_NEMO: ModelMetadata("open_router", 4000),
LlmModel.COHERE_COMMAND_R_08_2024: ModelMetadata("open_router", 4000),
LlmModel.COHERE_COMMAND_R_PLUS_08_2024: ModelMetadata("open_router", 4000),
LlmModel.EVA_QWEN_2_5_32B: ModelMetadata("open_router", 4000),
LlmModel.DEEPSEEK_CHAT: ModelMetadata("open_router", 8192),
# https://platform.openai.com/docs/models
LlmModel.O3_MINI: ModelMetadata("openai", 200000, 100000), # o3-mini-2025-01-31
LlmModel.O1: ModelMetadata("openai", 200000, 100000), # o1-2024-12-17
LlmModel.O1_PREVIEW: ModelMetadata(
"openai", 128000, 32768
), # o1-preview-2024-09-12
LlmModel.O1_MINI: ModelMetadata("openai", 128000, 65536), # o1-mini-2024-09-12
LlmModel.GPT4O_MINI: ModelMetadata(
"openai", 128000, 16384
), # gpt-4o-mini-2024-07-18
LlmModel.GPT4O: ModelMetadata("openai", 128000, 16384), # gpt-4o-2024-08-06
LlmModel.GPT4_TURBO: ModelMetadata(
"openai", 128000, 4096
), # gpt-4-turbo-2024-04-09
LlmModel.GPT3_5_TURBO: ModelMetadata("openai", 16385, 4096), # gpt-3.5-turbo-0125
# https://docs.anthropic.com/en/docs/about-claude/models
LlmModel.CLAUDE_3_5_SONNET: ModelMetadata(
"anthropic", 200000, 8192
), # claude-3-5-sonnet-20241022
LlmModel.CLAUDE_3_5_HAIKU: ModelMetadata(
"anthropic", 200000, 8192
), # claude-3-5-haiku-20241022
LlmModel.CLAUDE_3_HAIKU: ModelMetadata(
"anthropic", 200000, 4096
), # claude-3-haiku-20240307
# https://console.groq.com/docs/models
LlmModel.GEMMA2_9B: ModelMetadata("groq", 8192, None),
LlmModel.LLAMA3_3_70B: ModelMetadata("groq", 128000, 32768),
LlmModel.LLAMA3_1_8B: ModelMetadata("groq", 128000, 8192),
LlmModel.LLAMA3_70B: ModelMetadata("groq", 8192, None),
LlmModel.LLAMA3_8B: ModelMetadata("groq", 8192, None),
LlmModel.MIXTRAL_8X7B: ModelMetadata("groq", 32768, None),
LlmModel.DEEPSEEK_LLAMA_70B: ModelMetadata("groq", 128000, None),
# https://ollama.com/library
LlmModel.OLLAMA_LLAMA3_3: ModelMetadata("ollama", 8192, None),
LlmModel.OLLAMA_LLAMA3_2: ModelMetadata("ollama", 8192, None),
LlmModel.OLLAMA_LLAMA3_8B: ModelMetadata("ollama", 8192, None),
LlmModel.OLLAMA_LLAMA3_405B: ModelMetadata("ollama", 8192, None),
LlmModel.OLLAMA_DOLPHIN: ModelMetadata("ollama", 32768, None),
# https://openrouter.ai/models
LlmModel.GEMINI_FLASH_1_5: ModelMetadata("open_router", 1000000, 8192),
LlmModel.GROK_BETA: ModelMetadata("open_router", 131072, 131072),
LlmModel.MISTRAL_NEMO: ModelMetadata("open_router", 128000, 4096),
LlmModel.COHERE_COMMAND_R_08_2024: ModelMetadata("open_router", 128000, 4096),
LlmModel.COHERE_COMMAND_R_PLUS_08_2024: ModelMetadata("open_router", 128000, 4096),
LlmModel.EVA_QWEN_2_5_32B: ModelMetadata("open_router", 16384, 4096),
LlmModel.DEEPSEEK_CHAT: ModelMetadata("open_router", 64000, 2048),
LlmModel.PERPLEXITY_LLAMA_3_1_SONAR_LARGE_128K_ONLINE: ModelMetadata(
"open_router", 8192
"open_router", 127072, 127072
),
LlmModel.QWEN_QWQ_32B_PREVIEW: ModelMetadata("open_router", 4000),
LlmModel.NOUSRESEARCH_HERMES_3_LLAMA_3_1_405B: ModelMetadata("open_router", 4000),
LlmModel.NOUSRESEARCH_HERMES_3_LLAMA_3_1_70B: ModelMetadata("open_router", 4000),
LlmModel.AMAZON_NOVA_LITE_V1: ModelMetadata("open_router", 4000),
LlmModel.AMAZON_NOVA_MICRO_V1: ModelMetadata("open_router", 4000),
LlmModel.AMAZON_NOVA_PRO_V1: ModelMetadata("open_router", 4000),
LlmModel.MICROSOFT_WIZARDLM_2_8X22B: ModelMetadata("open_router", 4000),
LlmModel.GRYPHE_MYTHOMAX_L2_13B: ModelMetadata("open_router", 4000),
LlmModel.QWEN_QWQ_32B_PREVIEW: ModelMetadata("open_router", 32768, 32768),
LlmModel.NOUSRESEARCH_HERMES_3_LLAMA_3_1_405B: ModelMetadata(
"open_router", 131000, 4096
),
LlmModel.NOUSRESEARCH_HERMES_3_LLAMA_3_1_70B: ModelMetadata(
"open_router", 12288, 12288
),
LlmModel.AMAZON_NOVA_LITE_V1: ModelMetadata("open_router", 300000, 5120),
LlmModel.AMAZON_NOVA_MICRO_V1: ModelMetadata("open_router", 128000, 5120),
LlmModel.AMAZON_NOVA_PRO_V1: ModelMetadata("open_router", 300000, 5120),
LlmModel.MICROSOFT_WIZARDLM_2_8X22B: ModelMetadata("open_router", 65536, 4096),
LlmModel.GRYPHE_MYTHOMAX_L2_13B: ModelMetadata("open_router", 4096, 4096),
}
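
With the new max_output_tokens column, the per-model output cap is readable straight off the enum via the properties defined above (a sketch; the module path is assumed from the repo layout):

from backend.blocks.llm import LlmModel  # assumed module path for this file

model = LlmModel.GPT4O
print(model.context_window)     # 128000
print(model.max_output_tokens)  # 16384

# llm_call's fallback chain for the effective completion limit:
requested = None
effective_max_tokens = requested or model.max_output_tokens or 4096  # -> 16384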
for model in LlmModel:
@@ -202,7 +238,17 @@ class Message(BlockSchema):
content: str
class AIStructuredResponseGeneratorBlock(Block):
class AIBlockBase(Block, ABC):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.prompt = ""
def merge_llm_stats(self, block: "AIBlockBase"):
self.merge_stats(block.execution_stats)
self.prompt = block.prompt
class AIStructuredResponseGeneratorBlock(AIBlockBase):
class Input(BlockSchema):
prompt: str = SchemaField(
description="The prompt to send to the language model.",
@@ -234,7 +280,9 @@ class AIStructuredResponseGeneratorBlock(Block):
description="Number of times to retry the LLM call if the response does not match the expected format.",
)
prompt_values: dict[str, str] = SchemaField(
advanced=False, default={}, description="Values used to fill in the prompt."
advanced=False,
default={},
description="Values used to fill in the prompt. The values can be used in the prompt by putting them in a double curly braces, e.g. {{variable_name}}.",
)
max_tokens: int | None = SchemaField(
advanced=True,
@@ -252,6 +300,7 @@ class AIStructuredResponseGeneratorBlock(Block):
response: dict[str, Any] = SchemaField(
description="The response object generated by the language model."
)
prompt: str = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
@@ -271,7 +320,10 @@ class AIStructuredResponseGeneratorBlock(Block):
"prompt": "User prompt",
},
test_credentials=TEST_CREDENTIALS,
test_output=("response", {"key1": "key1Value", "key2": "key2Value"}),
test_output=[
("response", {"key1": "key1Value", "key2": "key2Value"}),
("prompt", str),
],
test_mock={
"llm_call": lambda *args, **kwargs: (
json.dumps(
@@ -285,19 +337,20 @@ class AIStructuredResponseGeneratorBlock(Block):
)
},
)
self.prompt = ""
@staticmethod
def llm_call(
self,
credentials: APIKeyCredentials,
llm_model: LlmModel,
prompt: list[dict],
json_format: bool,
max_tokens: int | None = None,
max_tokens: int | None,
ollama_host: str = "localhost:11434",
) -> tuple[str, int, int]:
"""
Args:
api_key: API key for the LLM provider.
credentials: The API key credentials to use.
llm_model: The LLM model to use.
prompt: The prompt to send to the LLM.
json_format: Whether the response should be in JSON format.
@@ -310,6 +363,7 @@ class AIStructuredResponseGeneratorBlock(Block):
The number of tokens used in the completion.
"""
provider = llm_model.metadata.provider
max_tokens = max_tokens or llm_model.max_output_tokens or 4096
if provider == "openai":
oai_client = openai.OpenAI(api_key=credentials.api_key.get_secret_value())
@@ -331,6 +385,7 @@ class AIStructuredResponseGeneratorBlock(Block):
response_format=response_format, # type: ignore
max_completion_tokens=max_tokens,
)
self.prompt = json.dumps(prompt)
return (
response.choices[0].message.content or "",
@@ -358,8 +413,9 @@ class AIStructuredResponseGeneratorBlock(Block):
model=llm_model.value,
system=sysprompt,
messages=messages,
max_tokens=max_tokens or 8192,
max_tokens=max_tokens,
)
self.prompt = json.dumps(prompt)
if not resp.content:
raise ValueError("No content returned from Anthropic.")
@@ -386,6 +442,7 @@ class AIStructuredResponseGeneratorBlock(Block):
response_format=response_format, # type: ignore
max_tokens=max_tokens,
)
self.prompt = json.dumps(prompt)
return (
response.choices[0].message.content or "",
response.usage.prompt_tokens if response.usage else 0,
@@ -400,6 +457,7 @@ class AIStructuredResponseGeneratorBlock(Block):
prompt=f"{sys_messages}\n\n{usr_messages}",
stream=False,
)
self.prompt = json.dumps(prompt)
return (
response.get("response") or "",
response.get("prompt_eval_count") or 0,
@@ -420,6 +478,7 @@ class AIStructuredResponseGeneratorBlock(Block):
messages=prompt, # type: ignore
max_tokens=max_tokens,
)
self.prompt = json.dumps(prompt)
# If there's no response, raise an error
if not response.choices:
@@ -448,8 +507,8 @@ class AIStructuredResponseGeneratorBlock(Block):
values = input_data.prompt_values
if values:
input_data.prompt = input_data.prompt.format(**values)
input_data.sys_prompt = input_data.sys_prompt.format(**values)
input_data.prompt = fmt.format_string(input_data.prompt, values)
input_data.sys_prompt = fmt.format_string(input_data.sys_prompt, values)
if input_data.sys_prompt:
prompt.append({"role": "system", "content": input_data.sys_prompt})
@@ -519,9 +578,11 @@ class AIStructuredResponseGeneratorBlock(Block):
)
for k, v in parsed_dict.items()
}
yield "prompt", self.prompt
return
else:
yield "response", {"response": response_text}
yield "prompt", self.prompt
return
retry_prompt = trim_prompt(
@@ -552,7 +613,7 @@ class AIStructuredResponseGeneratorBlock(Block):
raise RuntimeError(retry_prompt)
class AITextGeneratorBlock(Block):
class AITextGeneratorBlock(AIBlockBase):
class Input(BlockSchema):
prompt: str = SchemaField(
description="The prompt to send to the language model. You can use any of the {keys} from Prompt Values to fill in the prompt with values from the prompt values dictionary by putting them in curly braces.",
@@ -576,7 +637,9 @@ class AITextGeneratorBlock(Block):
description="Number of times to retry the LLM call if the response does not match the expected format.",
)
prompt_values: dict[str, str] = SchemaField(
advanced=False, default={}, description="Values used to fill in the prompt."
advanced=False,
default={},
description="Values used to fill in the prompt. The values can be used in the prompt by putting them in a double curly braces, e.g. {{variable_name}}.",
)
ollama_host: str = SchemaField(
advanced=True,
@@ -593,6 +656,7 @@ class AITextGeneratorBlock(Block):
response: str = SchemaField(
description="The response generated by the language model."
)
prompt: str = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
@@ -607,7 +671,10 @@ class AITextGeneratorBlock(Block):
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=("response", "Response text"),
test_output=[
("response", "Response text"),
("prompt", str),
],
test_mock={"llm_call": lambda *args, **kwargs: "Response text"},
)
@@ -618,7 +685,7 @@ class AITextGeneratorBlock(Block):
) -> str:
block = AIStructuredResponseGeneratorBlock()
response = block.run_once(input_data, "response", credentials=credentials)
self.merge_stats(block.execution_stats)
self.merge_llm_stats(block)
return response["response"]
def run(
@@ -629,6 +696,7 @@ class AITextGeneratorBlock(Block):
expected_format={},
)
yield "response", self.llm_call(object_input_data, credentials)
yield "prompt", self.prompt
class SummaryStyle(Enum):
@@ -638,7 +706,7 @@ class SummaryStyle(Enum):
NUMBERED_LIST = "numbered list"
class AITextSummarizerBlock(Block):
class AITextSummarizerBlock(AIBlockBase):
class Input(BlockSchema):
text: str = SchemaField(
description="The text to summarize.",
@@ -681,6 +749,7 @@ class AITextSummarizerBlock(Block):
class Output(BlockSchema):
summary: str = SchemaField(description="The final summary of the text.")
prompt: str = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
@@ -695,7 +764,10 @@ class AITextSummarizerBlock(Block):
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=("summary", "Final summary of a long text"),
test_output=[
("summary", "Final summary of a long text"),
("prompt", str),
],
test_mock={
"llm_call": lambda input_data, credentials: (
{"final_summary": "Final summary of a long text"}
@@ -723,6 +795,7 @@ class AITextSummarizerBlock(Block):
final_summary = self._combine_summaries(summaries, input_data, credentials)
yield "summary", final_summary
yield "prompt", self.prompt
@staticmethod
def _split_text(text: str, max_tokens: int, overlap: int) -> list[str]:
@@ -743,7 +816,7 @@ class AITextSummarizerBlock(Block):
) -> dict:
block = AIStructuredResponseGeneratorBlock()
response = block.run_once(input_data, "response", credentials=credentials)
self.merge_stats(block.execution_stats)
self.merge_llm_stats(block)
return response
def _summarize_chunk(
@@ -800,7 +873,7 @@ class AITextSummarizerBlock(Block):
] # Get the first yielded value
class AIConversationBlock(Block):
class AIConversationBlock(AIBlockBase):
class Input(BlockSchema):
messages: List[Message] = SchemaField(
description="List of messages in the conversation.", min_length=1
@@ -826,6 +899,7 @@ class AIConversationBlock(Block):
response: str = SchemaField(
description="The model's response to the conversation."
)
prompt: str = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(description="Error message if the API call failed.")
def __init__(self):
@@ -849,10 +923,13 @@ class AIConversationBlock(Block):
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=(
"response",
"The 2020 World Series was played at Globe Life Field in Arlington, Texas.",
),
test_output=[
(
"response",
"The 2020 World Series was played at Globe Life Field in Arlington, Texas.",
),
("prompt", str),
],
test_mock={
"llm_call": lambda *args, **kwargs: "The 2020 World Series was played at Globe Life Field in Arlington, Texas."
},
@@ -865,7 +942,7 @@ class AIConversationBlock(Block):
) -> str:
block = AIStructuredResponseGeneratorBlock()
response = block.run_once(input_data, "response", credentials=credentials)
self.merge_stats(block.execution_stats)
self.merge_llm_stats(block)
return response["response"]
def run(
@@ -879,14 +956,16 @@ class AIConversationBlock(Block):
conversation_history=input_data.messages,
max_tokens=input_data.max_tokens,
expected_format={},
ollama_host=input_data.ollama_host,
),
credentials=credentials,
)
yield "response", response
yield "prompt", self.prompt
class AIListGeneratorBlock(Block):
class AIListGeneratorBlock(AIBlockBase):
class Input(BlockSchema):
focus: str | None = SchemaField(
description="The focus of the list to generate.",
@@ -929,6 +1008,7 @@ class AIListGeneratorBlock(Block):
list_item: str = SchemaField(
description="Each individual item in the list.",
)
prompt: str = SchemaField(description="The prompt sent to the language model.")
error: str = SchemaField(
description="Error message if the list generation failed."
)
@@ -960,6 +1040,7 @@ class AIListGeneratorBlock(Block):
"generated_list",
["Zylora Prime", "Kharon-9", "Vortexia", "Oceara", "Draknos"],
),
("prompt", str),
("list_item", "Zylora Prime"),
("list_item", "Kharon-9"),
("list_item", "Vortexia"),
@@ -973,13 +1054,14 @@ class AIListGeneratorBlock(Block):
},
)
@staticmethod
def llm_call(
self,
input_data: AIStructuredResponseGeneratorBlock.Input,
credentials: APIKeyCredentials,
) -> dict[str, str]:
llm_block = AIStructuredResponseGeneratorBlock()
response = llm_block.run_once(input_data, "response", credentials=credentials)
self.merge_llm_stats(llm_block)
return response
@staticmethod
@@ -1093,6 +1175,7 @@ class AIListGeneratorBlock(Block):
# If we reach here, we have a valid Python list
logger.debug("Successfully generated a valid Python list")
yield "generated_list", parsed_list
yield "prompt", self.prompt
# Yield each item in the list
for item in parsed_list:
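The prompt-values handling now goes through `fmt.format_string`, and the schema descriptions advertise `{{variable_name}}` placeholders. A rough standalone sketch, assuming `fmt.format_string` behaves like a Jinja2 render (consistent with the "Use Jinja2 syntax" wording added to `FillTextTemplateBlock` further down):

```python
# Rough stand-in for fmt.format_string; assumes Jinja2-style {{ placeholders }}.
from jinja2 import Environment

def format_string(template: str, values: dict[str, str]) -> str:
    return Environment().from_string(template).render(**values)

prompt = format_string(
    "Summarize {{topic}} in a {{style}} style.",
    {"topic": "the release notes", "style": "bullet-point"},
)
print(prompt)  # Summarize the release notes in a bullet-point style.
```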

View File

@@ -0,0 +1,245 @@
import os
import tempfile
from typing import Literal, Optional
from moviepy.audio.io.AudioFileClip import AudioFileClip
from moviepy.video.fx.Loop import Loop
from moviepy.video.io.VideoFileClip import VideoFileClip
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
from backend.util.file import MediaFile, get_exec_file_path, store_media_file
class MediaDurationBlock(Block):
class Input(BlockSchema):
media_in: MediaFile = SchemaField(
description="Media input (URL, data URI, or local path)."
)
is_video: bool = SchemaField(
description="Whether the media is a video (True) or audio (False).",
default=True,
)
class Output(BlockSchema):
duration: float = SchemaField(
description="Duration of the media file (in seconds)."
)
error: str = SchemaField(
description="Error message if something fails.", default=""
)
def __init__(self):
super().__init__(
id="d8b91fd4-da26-42d4-8ecb-8b196c6d84b6",
description="Block to get the duration of a media file.",
categories={BlockCategory.MULTIMEDIA},
input_schema=MediaDurationBlock.Input,
output_schema=MediaDurationBlock.Output,
)
def run(
self,
input_data: Input,
*,
graph_exec_id: str,
**kwargs,
) -> BlockOutput:
# 1) Store the input media locally
local_media_path = store_media_file(
graph_exec_id=graph_exec_id,
file=input_data.media_in,
return_content=False,
)
media_abspath = get_exec_file_path(graph_exec_id, local_media_path)
# 2) Load the clip
if input_data.is_video:
clip = VideoFileClip(media_abspath)
else:
clip = AudioFileClip(media_abspath)
yield "duration", clip.duration
class LoopVideoBlock(Block):
"""
Block for looping (repeating) a video clip until a given duration or number of loops.
"""
class Input(BlockSchema):
video_in: MediaFile = SchemaField(
description="The input video (can be a URL, data URI, or local path)."
)
# Provide `duration` and/or `n_loops`; if both are set, `duration` takes precedence.
duration: Optional[float] = SchemaField(
description="Target duration (in seconds) to loop the video to. If omitted, defaults to no looping.",
default=None,
ge=0.0,
)
n_loops: Optional[int] = SchemaField(
description="Number of times to repeat the video. If omitted, defaults to 1 (no repeat).",
default=None,
ge=1,
)
output_return_type: Literal["file_path", "data_uri"] = SchemaField(
description="How to return the output video. Either a relative path or base64 data URI.",
default="file_path",
)
class Output(BlockSchema):
video_out: str = SchemaField(
description="Looped video returned either as a relative path or a data URI."
)
error: str = SchemaField(
description="Error message if something fails.", default=""
)
def __init__(self):
super().__init__(
id="8bf9eef6-5451-4213-b265-25306446e94b",
description="Block to loop a video to a given duration or number of repeats.",
categories={BlockCategory.MULTIMEDIA},
input_schema=LoopVideoBlock.Input,
output_schema=LoopVideoBlock.Output,
)
def run(
self,
input_data: Input,
*,
node_exec_id: str,
graph_exec_id: str,
**kwargs,
) -> BlockOutput:
# 1) Store the input video locally
local_video_path = store_media_file(
graph_exec_id=graph_exec_id,
file=input_data.video_in,
return_content=False,
)
input_abspath = get_exec_file_path(graph_exec_id, local_video_path)
# 2) Load the clip
clip = VideoFileClip(input_abspath)
# 3) Apply the loop effect
looped_clip = clip
if input_data.duration:
# Loop until we reach the specified duration
looped_clip = looped_clip.with_effects([Loop(duration=input_data.duration)])
elif input_data.n_loops:
looped_clip = looped_clip.with_effects([Loop(n=input_data.n_loops)])
else:
raise ValueError("Either 'duration' or 'n_loops' must be provided.")
assert isinstance(looped_clip, VideoFileClip)
# 4) Save the looped output
output_filename = MediaFile(
f"{node_exec_id}_looped_{os.path.basename(local_video_path)}"
)
output_abspath = get_exec_file_path(graph_exec_id, output_filename)
looped_clip = looped_clip.with_audio(clip.audio)
looped_clip.write_videofile(output_abspath, codec="libx264", audio_codec="aac")
# Return as a relative path or a data URI, depending on output_return_type
video_out = store_media_file(
graph_exec_id=graph_exec_id,
file=output_filename,
return_content=input_data.output_return_type == "data_uri",
)
yield "video_out", video_out
class AddAudioToVideoBlock(Block):
"""
Block that adds (attaches) an audio track to an existing video.
Optionally scale the volume of the new track.
"""
class Input(BlockSchema):
video_in: MediaFile = SchemaField(
description="Video input (URL, data URI, or local path)."
)
audio_in: MediaFile = SchemaField(
description="Audio input (URL, data URI, or local path)."
)
volume: float = SchemaField(
description="Volume scale for the newly attached audio track (1.0 = original).",
default=1.0,
)
output_return_type: Literal["file_path", "data_uri"] = SchemaField(
description="Return the final output as a relative path or base64 data URI.",
default="file_path",
)
class Output(BlockSchema):
video_out: MediaFile = SchemaField(
description="Final video (with attached audio), as a path or data URI."
)
error: str = SchemaField(
description="Error message if something fails.", default=""
)
def __init__(self):
super().__init__(
id="3503748d-62b6-4425-91d6-725b064af509",
description="Block to attach an audio file to a video file using moviepy.",
categories={BlockCategory.MULTIMEDIA},
input_schema=AddAudioToVideoBlock.Input,
output_schema=AddAudioToVideoBlock.Output,
)
def run(
self,
input_data: Input,
*,
node_exec_id: str,
graph_exec_id: str,
**kwargs,
) -> BlockOutput:
# 1) Store the inputs locally
local_video_path = store_media_file(
graph_exec_id=graph_exec_id,
file=input_data.video_in,
return_content=False,
)
local_audio_path = store_media_file(
graph_exec_id=graph_exec_id,
file=input_data.audio_in,
return_content=False,
)
abs_temp_dir = os.path.join(tempfile.gettempdir(), "exec_file", graph_exec_id)
video_abspath = os.path.join(abs_temp_dir, local_video_path)
audio_abspath = os.path.join(abs_temp_dir, local_audio_path)
# 2) Load video + audio with moviepy
video_clip = VideoFileClip(video_abspath)
audio_clip = AudioFileClip(audio_abspath)
# Optionally scale volume
if input_data.volume != 1.0:
audio_clip = audio_clip.with_volume_scaled(input_data.volume)
# 3) Attach the new audio track
final_clip = video_clip.with_audio(audio_clip)
# 4) Write to output file
output_filename = MediaFile(
f"{node_exec_id}_audio_attached_{os.path.basename(local_video_path)}"
)
output_abspath = os.path.join(abs_temp_dir, output_filename)
final_clip.write_videofile(output_abspath, codec="libx264", audio_codec="aac")
# 5) Return either path or data URI
video_out = store_media_file(
graph_exec_id=graph_exec_id,
file=output_filename,
return_content=input_data.output_return_type == "data_uri",
)
yield "video_out", video_out

View File

@@ -0,0 +1,338 @@
from typing import Any, Literal, Optional, Union
from mem0 import MemoryClient
from pydantic import BaseModel, SecretStr
from backend.data.block import Block, BlockOutput, BlockSchema
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.integrations.providers import ProviderName
TEST_CREDENTIALS = APIKeyCredentials(
id="ed55ac19-356e-4243-a6cb-bc599e9b716f",
provider="mem0",
api_key=SecretStr("mock-mem0-api-key"),
title="Mock Mem0 API key",
expires_at=None,
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.title,
}
class Mem0Base:
"""Base class with shared utilities for Mem0 blocks"""
@staticmethod
def _get_client(credentials: APIKeyCredentials) -> MemoryClient:
"""Get initialized Mem0 client"""
return MemoryClient(api_key=credentials.api_key.get_secret_value())
Filter = dict[str, list[dict[str, str | dict[str, list[str]]]]]
class Conversation(BaseModel):
discriminator: Literal["conversation"]
messages: list[dict[str, str]]
class Content(BaseModel):
discriminator: Literal["content"]
content: str
class AddMemoryBlock(Block, Mem0Base):
"""Block for adding memories to Mem0
Always scoped to user_id, and optionally to graph_id and graph_exec_id"""
class Input(BlockSchema):
credentials: CredentialsMetaInput[
Literal[ProviderName.MEM0], Literal["api_key"]
] = CredentialsField(description="Mem0 API key credentials")
content: Union[Content, Conversation] = SchemaField(
discriminator="discriminator",
description="Content to add - either a string or list of message objects as output from an AI block",
default=Content(discriminator="content", content="I'm a vegetarian"),
)
metadata: dict[str, Any] = SchemaField(
description="Optional metadata for the memory", default={}
)
limit_memory_to_run: bool = SchemaField(
description="Limit the memory to the run", default=False
)
limit_memory_to_agent: bool = SchemaField(
description="Limit the memory to the agent", default=False
)
class Output(BlockSchema):
action: str = SchemaField(description="Action of the operation")
memory: str = SchemaField(description="Memory created")
error: str = SchemaField(description="Error message if operation fails")
def __init__(self):
super().__init__(
id="dce97578-86be-45a4-ae50-f6de33fc935a",
description="Add new memories to Mem0 with user segmentation",
input_schema=AddMemoryBlock.Input,
output_schema=AddMemoryBlock.Output,
test_input=[
{
"content": {
"discriminator": "conversation",
"messages": [{"role": "user", "content": "I'm a vegetarian"}],
},
"metadata": {"food": "vegetarian"},
"credentials": TEST_CREDENTIALS_INPUT,
},
{
"content": {
"discriminator": "content",
"content": "I am a vegetarian",
},
"metadata": {"food": "vegetarian"},
"credentials": TEST_CREDENTIALS_INPUT,
},
],
test_output=[("action", "NO_CHANGE"), ("action", "NO_CHANGE")],
test_credentials=TEST_CREDENTIALS,
test_mock={"_get_client": lambda credentials: MockMemoryClient()},
)
def run(
self,
input_data: Input,
*,
credentials: APIKeyCredentials,
user_id: str,
graph_id: str,
graph_exec_id: str,
**kwargs
) -> BlockOutput:
try:
client = self._get_client(credentials)
if isinstance(input_data.content, Conversation):
messages = input_data.content.messages
else:
messages = [{"role": "user", "content": input_data.content}]
params = {
"user_id": user_id,
"output_format": "v1.1",
"metadata": input_data.metadata,
}
if input_data.limit_memory_to_run:
params["run_id"] = graph_exec_id
if input_data.limit_memory_to_agent:
params["agent_id"] = graph_id
# Use the client to add memory
result = client.add(
messages,
**params,
)
if len(result.get("results", [])) > 0:
for result in result.get("results", []):
yield "action", result["event"]
yield "memory", result["memory"]
else:
yield "action", "NO_CHANGE"
except Exception as e:
yield "error", str(object=e)
class SearchMemoryBlock(Block, Mem0Base):
"""Block for searching memories in Mem0"""
class Input(BlockSchema):
credentials: CredentialsMetaInput[
Literal[ProviderName.MEM0], Literal["api_key"]
] = CredentialsField(description="Mem0 API key credentials")
query: str = SchemaField(
description="Search query",
advanced=False,
)
trigger: bool = SchemaField(
description="An unused field that is used to (re-)trigger the block when you have no other inputs",
default=False,
advanced=False,
)
categories_filter: list[str] = SchemaField(
description="Categories to filter by",
default=[],
advanced=True,
)
limit_memory_to_run: bool = SchemaField(
description="Limit the memory to the run", default=False
)
limit_memory_to_agent: bool = SchemaField(
description="Limit the memory to the agent", default=True
)
class Output(BlockSchema):
memories: Any = SchemaField(description="List of matching memories")
error: str = SchemaField(description="Error message if operation fails")
def __init__(self):
super().__init__(
id="bd7c84e3-e073-4b75-810c-600886ec8a5b",
description="Search memories in Mem0 by user",
input_schema=SearchMemoryBlock.Input,
output_schema=SearchMemoryBlock.Output,
test_input={
"query": "vegetarian preferences",
"credentials": TEST_CREDENTIALS_INPUT,
"top_k": 10,
"rerank": True,
},
test_output=[
("memories", [{"id": "test-memory", "content": "test content"}])
],
test_credentials=TEST_CREDENTIALS,
test_mock={"_get_client": lambda credentials: MockMemoryClient()},
)
def run(
self,
input_data: Input,
*,
credentials: APIKeyCredentials,
user_id: str,
graph_id: str,
graph_exec_id: str,
**kwargs
) -> BlockOutput:
try:
client = self._get_client(credentials)
filters: Filter = {
# This works with only one filter, so we can allow others to add on later
"AND": [
{"user_id": user_id},
]
}
if input_data.categories_filter:
filters["AND"].append(
{"categories": {"contains": input_data.categories_filter}}
)
if input_data.limit_memory_to_run:
filters["AND"].append({"run_id": graph_exec_id})
if input_data.limit_memory_to_agent:
filters["AND"].append({"agent_id": graph_id})
result: list[dict[str, Any]] = client.search(
input_data.query, version="v2", filters=filters
)
yield "memories", result
except Exception as e:
yield "error", str(e)
class GetAllMemoriesBlock(Block, Mem0Base):
"""Block for retrieving all memories from Mem0"""
class Input(BlockSchema):
credentials: CredentialsMetaInput[
Literal[ProviderName.MEM0], Literal["api_key"]
] = CredentialsField(description="Mem0 API key credentials")
trigger: bool = SchemaField(
description="An unused field that is used to trigger the block when you have no other inputs",
default=False,
advanced=False,
)
categories: Optional[list[str]] = SchemaField(
description="Filter by categories", default=None
)
limit_memory_to_run: bool = SchemaField(
description="Limit the memory to the run", default=False
)
limit_memory_to_agent: bool = SchemaField(
description="Limit the memory to the agent", default=False
)
class Output(BlockSchema):
memories: Any = SchemaField(description="List of memories")
error: str = SchemaField(description="Error message if operation fails")
def __init__(self):
super().__init__(
id="45aee5bf-4767-45d1-a28b-e01c5aae9fc1",
description="Retrieve all memories from Mem0 with pagination",
input_schema=GetAllMemoriesBlock.Input,
output_schema=GetAllMemoriesBlock.Output,
test_input={
"user_id": "test_user",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_output=[
("memories", [{"id": "test-memory", "content": "test content"}]),
],
test_credentials=TEST_CREDENTIALS,
test_mock={"_get_client": lambda credentials: MockMemoryClient()},
)
def run(
self,
input_data: Input,
*,
credentials: APIKeyCredentials,
user_id: str,
graph_id: str,
graph_exec_id: str,
**kwargs
) -> BlockOutput:
try:
client = self._get_client(credentials)
filters: Filter = {
"AND": [
{"user_id": user_id},
]
}
if input_data.limit_memory_to_run:
filters["AND"].append({"run_id": graph_exec_id})
if input_data.limit_memory_to_agent:
filters["AND"].append({"agent_id": graph_id})
if input_data.categories:
filters["AND"].append(
{"categories": {"contains": input_data.categories}}
)
memories: list[dict[str, Any]] = client.get_all(
filters=filters,
version="v2",
)
yield "memories", memories
except Exception as e:
yield "error", str(e)
# Mock client for testing
class MockMemoryClient:
"""Mock Mem0 client for testing"""
def add(self, *args, **kwargs):
return {"memory_id": "test-memory-id", "status": "success"}
def search(self, *args, **kwargs) -> list[dict[str, str]]:
return [{"id": "test-memory", "content": "test content"}]
def get_all(self, *args, **kwargs) -> list[dict[str, str]]:
return [{"id": "test-memory", "content": "test content"}]

View File

@@ -1,22 +1,48 @@
from datetime import datetime, timezone
from typing import Iterator
from typing import Iterator, Literal
import praw
from pydantic import BaseModel, ConfigDict
from pydantic import BaseModel, SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import BlockSecret, SchemaField, SecretField
from backend.data.model import (
CredentialsField,
CredentialsMetaInput,
SchemaField,
UserPasswordCredentials,
)
from backend.integrations.providers import ProviderName
from backend.util.mock import MockObject
from backend.util.settings import Settings
RedditCredentials = UserPasswordCredentials
RedditCredentialsInput = CredentialsMetaInput[
Literal[ProviderName.REDDIT],
Literal["user_password"],
]
class RedditCredentials(BaseModel):
client_id: BlockSecret = SecretField(key="reddit_client_id")
client_secret: BlockSecret = SecretField(key="reddit_client_secret")
username: BlockSecret = SecretField(key="reddit_username")
password: BlockSecret = SecretField(key="reddit_password")
user_agent: str = "AutoGPT:1.0 (by /u/autogpt)"
def RedditCredentialsField() -> RedditCredentialsInput:
"""Creates a Reddit credentials input on a block."""
return CredentialsField(
description="The Reddit integration requires a username and password.",
)
model_config = ConfigDict(title="Reddit Credentials")
TEST_CREDENTIALS = UserPasswordCredentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="reddit",
username=SecretStr("mock-reddit-username"),
password=SecretStr("mock-reddit-password"),
title="Mock Reddit credentials",
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.title,
}
class RedditPost(BaseModel):
@@ -31,13 +57,16 @@ class RedditComment(BaseModel):
comment: str
settings = Settings()
def get_praw(creds: RedditCredentials) -> praw.Reddit:
client = praw.Reddit(
client_id=creds.client_id.get_secret_value(),
client_secret=creds.client_secret.get_secret_value(),
client_id=settings.secrets.reddit_client_id,
client_secret=settings.secrets.reddit_client_secret,
username=creds.username.get_secret_value(),
password=creds.password.get_secret_value(),
user_agent=creds.user_agent,
user_agent=settings.config.reddit_user_agent,
)
me = client.user.me()
if not me:
@@ -48,11 +77,11 @@ def get_praw(creds: RedditCredentials) -> praw.Reddit:
class GetRedditPostsBlock(Block):
class Input(BlockSchema):
subreddit: str = SchemaField(description="Subreddit name")
creds: RedditCredentials = SchemaField(
description="Reddit credentials",
default=RedditCredentials(),
subreddit: str = SchemaField(
description="Subreddit name, excluding the /r/ prefix",
default="writingprompts",
)
credentials: RedditCredentialsInput = RedditCredentialsField()
last_minutes: int | None = SchemaField(
description="Post time to stop minutes ago while fetching posts",
default=None,
@@ -70,20 +99,18 @@ class GetRedditPostsBlock(Block):
def __init__(self):
super().__init__(
disabled=True,
id="c6731acb-4285-4ee1-bc9b-03d0766c370f",
description="This block fetches Reddit posts from a defined subreddit name.",
categories={BlockCategory.SOCIAL},
disabled=(
not settings.secrets.reddit_client_id
or not settings.secrets.reddit_client_secret
),
input_schema=GetRedditPostsBlock.Input,
output_schema=GetRedditPostsBlock.Output,
test_credentials=TEST_CREDENTIALS,
test_input={
"creds": {
"client_id": "client_id",
"client_secret": "client_secret",
"username": "username",
"password": "password",
"user_agent": "user_agent",
},
"credentials": TEST_CREDENTIALS_INPUT,
"subreddit": "subreddit",
"last_post": "id3",
"post_limit": 2,
@@ -103,7 +130,7 @@ class GetRedditPostsBlock(Block):
),
],
test_mock={
"get_posts": lambda _: [
"get_posts": lambda input_data, credentials: [
MockObject(id="id1", title="title1", selftext="body1"),
MockObject(id="id2", title="title2", selftext="body2"),
MockObject(id="id3", title="title2", selftext="body2"),
@@ -112,14 +139,18 @@ class GetRedditPostsBlock(Block):
)
@staticmethod
def get_posts(input_data: Input) -> Iterator[praw.reddit.Submission]:
client = get_praw(input_data.creds)
def get_posts(
input_data: Input, *, credentials: RedditCredentials
) -> Iterator[praw.reddit.Submission]:
client = get_praw(credentials)
subreddit = client.subreddit(input_data.subreddit)
return subreddit.new(limit=input_data.post_limit or 10)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
def run(
self, input_data: Input, *, credentials: RedditCredentials, **kwargs
) -> BlockOutput:
current_time = datetime.now(tz=timezone.utc)
for post in self.get_posts(input_data):
for post in self.get_posts(input_data=input_data, credentials=credentials):
if input_data.last_minutes:
post_datetime = datetime.fromtimestamp(
post.created_utc, tz=timezone.utc
@@ -141,9 +172,7 @@ class GetRedditPostsBlock(Block):
class PostRedditCommentBlock(Block):
class Input(BlockSchema):
creds: RedditCredentials = SchemaField(
description="Reddit credentials", default=RedditCredentials()
)
credentials: RedditCredentialsInput = RedditCredentialsField()
data: RedditComment = SchemaField(description="Reddit comment")
class Output(BlockSchema):
@@ -156,7 +185,15 @@ class PostRedditCommentBlock(Block):
categories={BlockCategory.SOCIAL},
input_schema=PostRedditCommentBlock.Input,
output_schema=PostRedditCommentBlock.Output,
test_input={"data": {"post_id": "id", "comment": "comment"}},
disabled=(
not settings.secrets.reddit_client_id
or not settings.secrets.reddit_client_secret
),
test_credentials=TEST_CREDENTIALS,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"data": {"post_id": "id", "comment": "comment"},
},
test_output=[("comment_id", "dummy_comment_id")],
test_mock={"reply_post": lambda creds, comment: "dummy_comment_id"},
)
@@ -170,5 +207,7 @@ class PostRedditCommentBlock(Block):
raise ValueError("Failed to post comment.")
return new_comment.id
def run(self, input_data: Input, **kwargs) -> BlockOutput:
yield "comment_id", self.reply_post(input_data.creds, input_data.data)
def run(
self, input_data: Input, *, credentials: RedditCredentials, **kwargs
) -> BlockOutput:
yield "comment_id", self.reply_post(credentials, input_data.data)

View File

@@ -131,7 +131,7 @@ class ReplicateFluxAdvancedModelBlock(Block):
super().__init__(
id="90f8c45e-e983-4644-aa0b-b4ebe2f531bc",
description="This block runs Flux models on Replicate with advanced settings.",
categories={BlockCategory.AI},
categories={BlockCategory.AI, BlockCategory.MULTIMEDIA},
input_schema=ReplicateFluxAdvancedModelBlock.Input,
output_schema=ReplicateFluxAdvancedModelBlock.Output,
test_input={

View File

@@ -0,0 +1,174 @@
from base64 import b64encode
from enum import Enum
from typing import Literal
from pydantic import SecretStr
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import (
APIKeyCredentials,
CredentialsField,
CredentialsMetaInput,
SchemaField,
)
from backend.integrations.providers import ProviderName
from backend.util.file import MediaFile, store_media_file
from backend.util.request import Requests
class Format(str, Enum):
PNG = "png"
JPEG = "jpeg"
WEBP = "webp"
class ScreenshotWebPageBlock(Block):
"""Block for taking screenshots using ScreenshotOne API"""
class Input(BlockSchema):
credentials: CredentialsMetaInput[
Literal[ProviderName.SCREENSHOTONE], Literal["api_key"]
] = CredentialsField(description="The ScreenshotOne API key")
url: str = SchemaField(
description="URL of the website to screenshot",
placeholder="https://example.com",
)
viewport_width: int = SchemaField(
description="Width of the viewport in pixels", default=1920
)
viewport_height: int = SchemaField(
description="Height of the viewport in pixels", default=1080
)
full_page: bool = SchemaField(
description="Whether to capture the full page length", default=False
)
format: Format = SchemaField(
description="Output format (png, jpeg, webp)", default=Format.PNG
)
block_ads: bool = SchemaField(description="Whether to block ads", default=True)
block_cookie_banners: bool = SchemaField(
description="Whether to block cookie banners", default=True
)
block_chats: bool = SchemaField(
description="Whether to block chat widgets", default=True
)
cache: bool = SchemaField(
description="Whether to enable caching", default=False
)
class Output(BlockSchema):
image: MediaFile = SchemaField(description="The screenshot image data")
error: str = SchemaField(description="Error message if the screenshot failed")
def __init__(self):
super().__init__(
id="3a7c4b8d-6e2f-4a5d-b9c1-f8d23c5a9b0e", # Generated UUID
description="Takes a screenshot of a specified website using ScreenshotOne API",
categories={BlockCategory.DATA},
input_schema=ScreenshotWebPageBlock.Input,
output_schema=ScreenshotWebPageBlock.Output,
test_input={
"url": "https://example.com",
"viewport_width": 1920,
"viewport_height": 1080,
"full_page": False,
"format": "png",
"block_ads": True,
"block_cookie_banners": True,
"block_chats": True,
"cache": False,
"credentials": {
"provider": "screenshotone",
"type": "api_key",
"id": "test-id",
"title": "Test API Key",
},
},
test_credentials=APIKeyCredentials(
id="test-id",
provider="screenshotone",
api_key=SecretStr("test-key"),
title="Test API Key",
expires_at=None,
),
test_output=[
(
"image",
"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAAAXNSR0IArs4c6QAAAB5JREFUOE9jZPjP8J+BAsA4agDDaBgwjIYBw7AIAwCV5B/xAsMbygAAAABJRU5ErkJggg==",
),
],
test_mock={
"take_screenshot": lambda *args, **kwargs: {
"image": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAAAXNSR0IArs4c6QAAAB5JREFUOE9jZPjP8J+BAsA4agDDaBgwjIYBw7AIAwCV5B/xAsMbygAAAABJRU5ErkJggg==",
}
},
)
@staticmethod
def take_screenshot(
credentials: APIKeyCredentials,
graph_exec_id: str,
url: str,
viewport_width: int,
viewport_height: int,
full_page: bool,
format: Format,
block_ads: bool,
block_cookie_banners: bool,
block_chats: bool,
cache: bool,
) -> dict:
"""
Takes a screenshot using the ScreenshotOne API
"""
api = Requests(trusted_origins=["https://api.screenshotone.com"])
# Build API URL with parameters
params = {
"access_key": credentials.api_key.get_secret_value(),
"url": url,
"viewport_width": viewport_width,
"viewport_height": viewport_height,
"full_page": str(full_page).lower(),
"format": format.value,
"block_ads": str(block_ads).lower(),
"block_cookie_banners": str(block_cookie_banners).lower(),
"block_chats": str(block_chats).lower(),
"cache": str(cache).lower(),
}
response = api.get("https://api.screenshotone.com/take", params=params)
return {
"image": store_media_file(
graph_exec_id=graph_exec_id,
file=f"data:image/{format.value};base64,{b64encode(response.content).decode('utf-8')}",
return_content=True,
)
}
def run(
self,
input_data: Input,
*,
credentials: APIKeyCredentials,
graph_exec_id: str,
**kwargs,
) -> BlockOutput:
try:
screenshot_data = self.take_screenshot(
credentials=credentials,
graph_exec_id=graph_exec_id,
url=input_data.url,
viewport_width=input_data.viewport_width,
viewport_height=input_data.viewport_height,
full_page=input_data.full_page,
format=input_data.format,
block_ads=input_data.block_ads,
block_cookie_banners=input_data.block_cookie_banners,
block_chats=input_data.block_chats,
cache=input_data.cache,
)
yield "image", screenshot_data["image"]
except Exception as e:
yield "error", str(e)

View File

@@ -78,7 +78,7 @@ class CreateTalkingAvatarVideoBlock(Block):
super().__init__(
id="98c6f503-8c47-4b1c-a96d-351fc7c87dab",
description="This block integrates with D-ID to create video clips and retrieve their URLs.",
categories={BlockCategory.AI},
categories={BlockCategory.AI, BlockCategory.MULTIMEDIA},
input_schema=CreateTalkingAvatarVideoBlock.Input,
output_schema=CreateTalkingAvatarVideoBlock.Output,
test_input={

View File

@@ -76,6 +76,8 @@ class ExtractTextInformationBlock(Block):
class Output(BlockSchema):
positive: str = SchemaField(description="Extracted text")
negative: str = SchemaField(description="Original text")
matched_results: list[str] = SchemaField(description="List of matched results")
matched_count: int = SchemaField(description="Number of matched results")
def __init__(self):
super().__init__(
@@ -103,13 +105,31 @@ class ExtractTextInformationBlock(Block):
},
],
test_output=[
# Test case 1
("positive", "World!"),
("matched_results", ["World!"]),
("matched_count", 1),
# Test case 2
("positive", "Hello, World!"),
("matched_results", ["Hello, World!"]),
("matched_count", 1),
# Test case 3
("negative", "Hello, World!"),
("matched_results", []),
("matched_count", 0),
# Test case 4
("positive", "Hello,"),
("matched_results", ["Hello,"]),
("matched_count", 1),
# Test case 5
("positive", "World!!"),
("matched_results", ["World!!"]),
("matched_count", 1),
# Test case 6
("positive", "World!!"),
("positive", "Earth!!"),
("matched_results", ["World!!", "Earth!!"]),
("matched_count", 2),
],
)
@@ -130,21 +150,24 @@ class ExtractTextInformationBlock(Block):
for match in re.finditer(input_data.pattern, txt, flags)
if input_data.group <= len(match.groups())
]
if not input_data.find_all:
matches = matches[:1]
for match in matches:
yield "positive", match
if not input_data.find_all:
return
if not matches:
yield "negative", input_data.text
yield "matched_results", matches
yield "matched_count", len(matches)
class FillTextTemplateBlock(Block):
class Input(BlockSchema):
values: dict[str, Any] = SchemaField(
description="Values (dict) to be used in format"
description="Values (dict) to be used in format. These values can be used by putting them in double curly braces in the format template. e.g. {{value_name}}.",
)
format: str = SchemaField(
description="Template to format the text using `values`"
description="Template to format the text using `values`. Use Jinja2 syntax."
)
class Output(BlockSchema):
@@ -160,7 +183,7 @@ class FillTextTemplateBlock(Block):
test_input=[
{
"values": {"name": "Alice", "hello": "Hello", "world": "World!"},
"format": "{hello}, {world} {{name}}",
"format": "{{hello}}, {{ world }} {{name}}",
},
{
"values": {"list": ["Hello", " World!"]},
@@ -212,3 +235,71 @@ class CombineTextsBlock(Block):
def run(self, input_data: Input, **kwargs) -> BlockOutput:
combined_text = input_data.delimiter.join(input_data.input)
yield "output", combined_text
class TextSplitBlock(Block):
class Input(BlockSchema):
text: str = SchemaField(description="The text to split.")
delimiter: str = SchemaField(description="The delimiter to split the text by.")
strip: bool = SchemaField(
description="Whether to strip the text.", default=True
)
class Output(BlockSchema):
texts: list[str] = SchemaField(
description="The text split into a list of strings."
)
def __init__(self):
super().__init__(
id="d5ea33c8-a575-477a-b42f-2fe3be5055ec",
description="This block is used to split a text into a list of strings.",
categories={BlockCategory.TEXT},
input_schema=TextSplitBlock.Input,
output_schema=TextSplitBlock.Output,
test_input=[
{"text": "Hello, World!", "delimiter": ","},
{"text": "Hello, World!", "delimiter": ",", "strip": False},
],
test_output=[
("texts", ["Hello", "World!"]),
("texts", ["Hello", " World!"]),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
if len(input_data.text) == 0:
yield "texts", []
else:
texts = input_data.text.split(input_data.delimiter)
if input_data.strip:
texts = [text.strip() for text in texts]
yield "texts", texts
class TextReplaceBlock(Block):
class Input(BlockSchema):
text: str = SchemaField(description="The text to replace.")
old: str = SchemaField(description="The old text to replace.")
new: str = SchemaField(description="The new text to replace with.")
class Output(BlockSchema):
output: str = SchemaField(description="The text with the replaced text.")
def __init__(self):
super().__init__(
id="7e7c87ab-3469-4bcc-9abe-67705091b713",
description="This block is used to replace a text with a new text.",
categories={BlockCategory.TEXT},
input_schema=TextReplaceBlock.Input,
output_schema=TextReplaceBlock.Output,
test_input=[
{"text": "Hello, World!", "old": "Hello", "new": "Hi"},
],
test_output=[
("output", "Hi, World!"),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
yield "output", input_data.text.replace(input_data.old, input_data.new)

View File

@@ -53,7 +53,7 @@ class UnrealTextToSpeechBlock(Block):
super().__init__(
id="4ff1ff6d-cc40-4caa-ae69-011daa20c378",
description="Converts text to speech using the Unreal Speech API",
categories={BlockCategory.AI, BlockCategory.TEXT},
categories={BlockCategory.AI, BlockCategory.TEXT, BlockCategory.MULTIMEDIA},
input_schema=UnrealTextToSpeechBlock.Input,
output_schema=UnrealTextToSpeechBlock.Output,
test_input={

View File

@@ -0,0 +1,61 @@
from typing import Literal
from pydantic import SecretStr
from backend.data.model import (
CredentialsField,
CredentialsMetaInput,
OAuth2Credentials,
ProviderName,
)
from backend.integrations.oauth.todoist import TodoistOAuthHandler
from backend.util.settings import Secrets
secrets = Secrets()
TODOIST_OAUTH_IS_CONFIGURED = bool(
secrets.todoist_client_id and secrets.todoist_client_secret
)
TodoistCredentials = OAuth2Credentials
TodoistCredentialsInput = CredentialsMetaInput[
Literal[ProviderName.TODOIST], Literal["oauth2"]
]
def TodoistCredentialsField(scopes: list[str]) -> TodoistCredentialsInput:
"""
Creates a Todoist credentials input on a block.
Params:
scopes: The authorization scopes needed for the block to work.
"""
return CredentialsField(
required_scopes=set(TodoistOAuthHandler.DEFAULT_SCOPES + scopes),
description="The Todoist integration requires OAuth2 authentication.",
)
TEST_CREDENTIALS = OAuth2Credentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="todoist",
access_token=SecretStr("mock-todoist-access-token"),
refresh_token=None,
access_token_expires_at=None,
scopes=[
"task:add",
"data:read",
"data:read_write",
"data:delete",
"project:delete",
],
title="Mock Todoist OAuth2 Credentials",
username="mock-todoist-username",
refresh_token_expires_at=None,
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.title,
}

View File

@@ -0,0 +1,24 @@
from enum import Enum
class Colors(Enum):
berry_red = "berry_red"
red = "red"
orange = "orange"
yellow = "yellow"
olive_green = "olive_green"
lime_green = "lime_green"
green = "green"
mint_green = "mint_green"
teal = "teal"
sky_blue = "sky_blue"
light_blue = "light_blue"
blue = "blue"
grape = "grape"
violet = "violet"
lavender = "lavender"
magenta = "magenta"
salmon = "salmon"
charcoal = "charcoal"
grey = "grey"
taupe = "taupe"

View File

@@ -0,0 +1,439 @@
from typing import Literal, Union
from pydantic import BaseModel
from todoist_api_python.api import TodoistAPI
from typing_extensions import Optional
from backend.blocks.todoist._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TodoistCredentials,
TodoistCredentialsField,
TodoistCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TaskId(BaseModel):
discriminator: Literal["task"]
task_id: str
class ProjectId(BaseModel):
discriminator: Literal["project"]
project_id: str
class TodoistCreateCommentBlock(Block):
"""Creates a new comment on a Todoist task or project"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
content: str = SchemaField(description="Comment content")
id_type: Union[TaskId, ProjectId] = SchemaField(
discriminator="discriminator",
description="Specify either task_id or project_id to comment on",
default=TaskId(discriminator="task", task_id=""),
advanced=False,
)
attachment: Optional[dict] = SchemaField(
description="Optional file attachment", default=None
)
class Output(BlockSchema):
id: str = SchemaField(description="ID of created comment")
content: str = SchemaField(description="Comment content")
posted_at: str = SchemaField(description="Comment timestamp")
task_id: Optional[str] = SchemaField(
description="Associated task ID", default=None
)
project_id: Optional[str] = SchemaField(
description="Associated project ID", default=None
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="1bba7e54-2310-4a31-8e6f-54d5f9ab7459",
description="Creates a new comment on a Todoist task or project",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistCreateCommentBlock.Input,
output_schema=TodoistCreateCommentBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"content": "Test comment",
"id_type": {"discriminator": "task", "task_id": "2995104339"},
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("id", "2992679862"),
("content", "Test comment"),
("posted_at", "2016-09-22T07:00:00.000000Z"),
("task_id", "2995104339"),
("project_id", None),
],
test_mock={
"create_comment": lambda content, credentials, task_id=None, project_id=None, attachment=None: {
"id": "2992679862",
"content": "Test comment",
"posted_at": "2016-09-22T07:00:00.000000Z",
"task_id": "2995104339",
"project_id": None,
}
},
)
@staticmethod
def create_comment(
credentials: TodoistCredentials,
content: str,
task_id: Optional[str] = None,
project_id: Optional[str] = None,
attachment: Optional[dict] = None,
):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
comment = api.add_comment(
content=content,
task_id=task_id,
project_id=project_id,
attachment=attachment,
)
return comment.__dict__
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
task_id = None
project_id = None
if isinstance(input_data.id_type, TaskId):
task_id = input_data.id_type.task_id
else:
project_id = input_data.id_type.project_id
comment_data = self.create_comment(
credentials,
input_data.content,
task_id=task_id,
project_id=project_id,
attachment=input_data.attachment,
)
if comment_data:
yield "id", comment_data["id"]
yield "content", comment_data["content"]
yield "posted_at", comment_data["posted_at"]
yield "task_id", comment_data["task_id"]
yield "project_id", comment_data["project_id"]
except Exception as e:
yield "error", str(e)
class TodoistGetCommentsBlock(Block):
"""Get all comments for a Todoist task or project"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
id_type: Union[TaskId, ProjectId] = SchemaField(
discriminator="discriminator",
description="Specify either task_id or project_id to get comments for",
default=TaskId(discriminator="task", task_id=""),
advanced=False,
)
class Output(BlockSchema):
comments: list = SchemaField(description="List of comments")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="9972d8ae-ddf2-11ef-a9b8-32d3674e8b7e",
description="Get all comments for a Todoist task or project",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetCommentsBlock.Input,
output_schema=TodoistGetCommentsBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"id_type": {"discriminator": "task", "task_id": "2995104339"},
},
test_credentials=TEST_CREDENTIALS,
test_output=[
(
"comments",
[
{
"id": "2992679862",
"content": "Test comment",
"posted_at": "2016-09-22T07:00:00.000000Z",
"task_id": "2995104339",
"project_id": None,
"attachment": None,
}
],
)
],
test_mock={
"get_comments": lambda credentials, task_id=None, project_id=None: [
{
"id": "2992679862",
"content": "Test comment",
"posted_at": "2016-09-22T07:00:00.000000Z",
"task_id": "2995104339",
"project_id": None,
"attachment": None,
}
]
},
)
@staticmethod
def get_comments(
credentials: TodoistCredentials,
task_id: Optional[str] = None,
project_id: Optional[str] = None,
):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
comments = api.get_comments(task_id=task_id, project_id=project_id)
return [comment.__dict__ for comment in comments]
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
task_id = None
project_id = None
if isinstance(input_data.id_type, TaskId):
task_id = input_data.id_type.task_id
else:
project_id = input_data.id_type.project_id
comments = self.get_comments(
credentials, task_id=task_id, project_id=project_id
)
yield "comments", comments
except Exception as e:
yield "error", str(e)
class TodoistGetCommentBlock(Block):
"""Get a single comment from Todoist using comment ID"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
comment_id: str = SchemaField(description="Comment ID to retrieve")
class Output(BlockSchema):
content: str = SchemaField(description="Comment content")
id: str = SchemaField(description="Comment ID")
posted_at: str = SchemaField(description="Comment timestamp")
project_id: Optional[str] = SchemaField(
description="Associated project ID", default=None
)
task_id: Optional[str] = SchemaField(
description="Associated task ID", default=None
)
attachment: Optional[dict] = SchemaField(
description="Optional file attachment", default=None
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="a809d264-ddf2-11ef-9764-32d3674e8b7e",
description="Get a single comment from Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetCommentBlock.Input,
output_schema=TodoistGetCommentBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"comment_id": "2992679862",
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("content", "Test comment"),
("id", "2992679862"),
("posted_at", "2016-09-22T07:00:00.000000Z"),
("project_id", None),
("task_id", "2995104339"),
("attachment", None),
],
test_mock={
"get_comment": lambda credentials, comment_id: {
"content": "Test comment",
"id": "2992679862",
"posted_at": "2016-09-22T07:00:00.000000Z",
"project_id": None,
"task_id": "2995104339",
"attachment": None,
}
},
)
@staticmethod
def get_comment(credentials: TodoistCredentials, comment_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
comment = api.get_comment(comment_id=comment_id)
return comment.__dict__
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
comment_data = self.get_comment(
credentials, comment_id=input_data.comment_id
)
if comment_data:
yield "content", comment_data["content"]
yield "id", comment_data["id"]
yield "posted_at", comment_data["posted_at"]
yield "project_id", comment_data["project_id"]
yield "task_id", comment_data["task_id"]
yield "attachment", comment_data["attachment"]
except Exception as e:
yield "error", str(e)
class TodoistUpdateCommentBlock(Block):
"""Updates a Todoist comment"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
comment_id: str = SchemaField(description="Comment ID to update")
content: str = SchemaField(description="New content for the comment")
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the update was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="b773c520-ddf2-11ef-9f34-32d3674e8b7e",
description="Updates a Todoist comment",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistUpdateCommentBlock.Input,
output_schema=TodoistUpdateCommentBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"comment_id": "2992679862",
"content": "Need one bottle of milk",
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"update_comment": lambda credentials, comment_id, content: True},
)
@staticmethod
def update_comment(credentials: TodoistCredentials, comment_id: str, content: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
api.update_comment(comment_id=comment_id, content=content)
return True
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.update_comment(
credentials,
comment_id=input_data.comment_id,
content=input_data.content,
)
yield "success", success
except Exception as e:
yield "error", str(e)
class TodoistDeleteCommentBlock(Block):
"""Deletes a Todoist comment"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
comment_id: str = SchemaField(description="Comment ID to delete")
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the deletion was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="bda4c020-ddf2-11ef-b114-32d3674e8b7e",
description="Deletes a Todoist comment",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistDeleteCommentBlock.Input,
output_schema=TodoistDeleteCommentBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"comment_id": "2992679862",
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"delete_comment": lambda credentials, comment_id: True},
)
@staticmethod
def delete_comment(credentials: TodoistCredentials, comment_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
success = api.delete_comment(comment_id=comment_id)
return success
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.delete_comment(credentials, comment_id=input_data.comment_id)
yield "success", success
except Exception as e:
yield "error", str(e)

View File

@@ -0,0 +1,557 @@
from todoist_api_python.api import TodoistAPI
from typing_extensions import Optional
from backend.blocks.todoist._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TodoistCredentials,
TodoistCredentialsField,
TodoistCredentialsInput,
)
from backend.blocks.todoist._types import Colors
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TodoistCreateLabelBlock(Block):
"""Creates a new label in Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
name: str = SchemaField(description="Name of the label")
order: Optional[int] = SchemaField(description="Label order", default=None)
color: Optional[Colors] = SchemaField(
description="The color of the label icon", default=Colors.charcoal
)
is_favorite: bool = SchemaField(
description="Whether the label is a favorite", default=False
)
class Output(BlockSchema):
id: str = SchemaField(description="ID of the created label")
name: str = SchemaField(description="Name of the label")
color: str = SchemaField(description="Color of the label")
order: int = SchemaField(description="Label order")
is_favorite: bool = SchemaField(description="Favorite status")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="7288a968-de14-11ef-8997-32d3674e8b7e",
description="Creates a new label in Todoist, It will not work if same name already exists",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistCreateLabelBlock.Input,
output_schema=TodoistCreateLabelBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"name": "Test Label",
"color": Colors.charcoal.value,
"order": 1,
"is_favorite": False,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("id", "2156154810"),
("name", "Test Label"),
("color", "charcoal"),
("order", 1),
("is_favorite", False),
],
test_mock={
"create_label": lambda *args, **kwargs: {
"id": "2156154810",
"name": "Test Label",
"color": "charcoal",
"order": 1,
"is_favorite": False,
}
},
)
@staticmethod
def create_label(credentials: TodoistCredentials, name: str, **kwargs):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
label = api.add_label(name=name, **kwargs)
return label.__dict__
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
label_args = {
"order": input_data.order,
"color": (
input_data.color.value if input_data.color is not None else None
),
"is_favorite": input_data.is_favorite,
}
label_data = self.create_label(
credentials,
input_data.name,
**{k: v for k, v in label_args.items() if v is not None},
)
if label_data:
yield "id", label_data["id"]
yield "name", label_data["name"]
yield "color", label_data["color"]
yield "order", label_data["order"]
yield "is_favorite", label_data["is_favorite"]
except Exception as e:
yield "error", str(e)
class TodoistListLabelsBlock(Block):
"""Gets all personal labels from Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
class Output(BlockSchema):
labels: list = SchemaField(description="List of complete label data")
label_ids: list = SchemaField(description="List of label IDs")
label_names: list = SchemaField(description="List of label names")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="776dd750-de14-11ef-b927-32d3674e8b7e",
description="Gets all personal labels from Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistListLabelsBlock.Input,
output_schema=TodoistListLabelsBlock.Output,
test_input={"credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[
(
"labels",
[
{
"id": "2156154810",
"name": "Test Label",
"color": "charcoal",
"order": 1,
"is_favorite": False,
}
],
),
("label_ids", ["2156154810"]),
("label_names", ["Test Label"]),
],
test_mock={
"get_labels": lambda *args, **kwargs: [
{
"id": "2156154810",
"name": "Test Label",
"color": "charcoal",
"order": 1,
"is_favorite": False,
}
]
},
)
@staticmethod
def get_labels(credentials: TodoistCredentials):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
labels = api.get_labels()
return [label.__dict__ for label in labels]
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
labels = self.get_labels(credentials)
yield "labels", labels
yield "label_ids", [label["id"] for label in labels]
yield "label_names", [label["name"] for label in labels]
except Exception as e:
yield "error", str(e)
class TodoistGetLabelBlock(Block):
"""Gets a personal label from Todoist by ID"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
label_id: str = SchemaField(description="ID of the label to retrieve")
class Output(BlockSchema):
id: str = SchemaField(description="ID of the label")
name: str = SchemaField(description="Name of the label")
color: str = SchemaField(description="Color of the label")
order: int = SchemaField(description="Label order")
is_favorite: bool = SchemaField(description="Favorite status")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="7f236514-de14-11ef-bd7a-32d3674e8b7e",
description="Gets a personal label from Todoist by ID",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetLabelBlock.Input,
output_schema=TodoistGetLabelBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"label_id": "2156154810",
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("id", "2156154810"),
("name", "Test Label"),
("color", "charcoal"),
("order", 1),
("is_favorite", False),
],
test_mock={
"get_label": lambda *args, **kwargs: {
"id": "2156154810",
"name": "Test Label",
"color": "charcoal",
"order": 1,
"is_favorite": False,
}
},
)
@staticmethod
def get_label(credentials: TodoistCredentials, label_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
label = api.get_label(label_id=label_id)
return label.__dict__
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
label_data = self.get_label(credentials, input_data.label_id)
if label_data:
yield "id", label_data["id"]
yield "name", label_data["name"]
yield "color", label_data["color"]
yield "order", label_data["order"]
yield "is_favorite", label_data["is_favorite"]
except Exception as e:
yield "error", str(e)
class TodoistUpdateLabelBlock(Block):
"""Updates a personal label in Todoist using ID"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
label_id: str = SchemaField(description="ID of the label to update")
name: Optional[str] = SchemaField(
description="New name of the label", default=None
)
order: Optional[int] = SchemaField(description="Label order", default=None)
color: Optional[Colors] = SchemaField(
description="The color of the label icon", default=None
)
is_favorite: bool = SchemaField(
description="Whether the label is a favorite (true/false)", default=False
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the update was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="8755614c-de14-11ef-9b56-32d3674e8b7e",
description="Updates a personal label in Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistUpdateLabelBlock.Input,
output_schema=TodoistUpdateLabelBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"label_id": "2156154810",
"name": "Updated Label",
"color": Colors.charcoal.value,
"order": 2,
"is_favorite": True,
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"update_label": lambda *args, **kwargs: True},
)
@staticmethod
def update_label(credentials: TodoistCredentials, label_id: str, **kwargs):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
api.update_label(label_id=label_id, **kwargs)
return True
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
label_args = {}
if input_data.name is not None:
label_args["name"] = input_data.name
if input_data.order is not None:
label_args["order"] = input_data.order
if input_data.color is not None:
label_args["color"] = input_data.color.value
if input_data.is_favorite is not None:
label_args["is_favorite"] = input_data.is_favorite
success = self.update_label(
credentials,
input_data.label_id,
**{k: v for k, v in label_args.items() if v is not None},
)
yield "success", success
except Exception as e:
yield "error", str(e)
class TodoistDeleteLabelBlock(Block):
"""Deletes a personal label in Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
label_id: str = SchemaField(description="ID of the label to delete")
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the deletion was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="901b8f86-de14-11ef-98b8-32d3674e8b7e",
description="Deletes a personal label in Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistDeleteLabelBlock.Input,
output_schema=TodoistDeleteLabelBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"label_id": "2156154810",
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"delete_label": lambda *args, **kwargs: True},
)
@staticmethod
def delete_label(credentials: TodoistCredentials, label_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
success = api.delete_label(label_id=label_id)
return success
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.delete_label(credentials, input_data.label_id)
yield "success", success
except Exception as e:
yield "error", str(e)
class TodoistGetSharedLabelsBlock(Block):
"""Gets all shared labels from Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
class Output(BlockSchema):
labels: list = SchemaField(description="List of shared label names")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="55fba510-de15-11ef-aed2-32d3674e8b7e",
description="Gets all shared labels from Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetSharedLabelsBlock.Input,
output_schema=TodoistGetSharedLabelsBlock.Output,
test_input={"credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[("labels", ["Label1", "Label2", "Label3"])],
test_mock={
"get_shared_labels": lambda *args, **kwargs: [
"Label1",
"Label2",
"Label3",
]
},
)
@staticmethod
def get_shared_labels(credentials: TodoistCredentials):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
labels = api.get_shared_labels()
return labels
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
labels = self.get_shared_labels(credentials)
yield "labels", labels
except Exception as e:
yield "error", str(e)
class TodoistRenameSharedLabelsBlock(Block):
"""Renames all instances of a shared label"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
name: str = SchemaField(description="The name of the existing label to rename")
new_name: str = SchemaField(description="The new name for the label")
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the rename was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="9d63ad9a-de14-11ef-ab3f-32d3674e8b7e",
description="Renames all instances of a shared label",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistRenameSharedLabelsBlock.Input,
output_schema=TodoistRenameSharedLabelsBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"name": "OldLabel",
"new_name": "NewLabel",
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"rename_shared_labels": lambda *args, **kwargs: True},
)
@staticmethod
def rename_shared_labels(credentials: TodoistCredentials, name: str, new_name: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
success = api.rename_shared_label(name=name, new_name=new_name)
return success
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.rename_shared_labels(
credentials, input_data.name, input_data.new_name
)
yield "success", success
except Exception as e:
yield "error", str(e)
class TodoistRemoveSharedLabelsBlock(Block):
"""Removes all instances of a shared label"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
name: str = SchemaField(description="The name of the label to remove")
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the removal was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="a6c5cbde-de14-11ef-8863-32d3674e8b7e",
description="Removes all instances of a shared label",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistRemoveSharedLabelsBlock.Input,
output_schema=TodoistRemoveSharedLabelsBlock.Output,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "name": "LabelToRemove"},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"remove_shared_label": lambda *args, **kwargs: True},
)
@staticmethod
def remove_shared_label(credentials: TodoistCredentials, name: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
success = api.remove_shared_label(name=name)
return success
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.remove_shared_label(credentials, input_data.name)
yield "success", success
except Exception as e:
yield "error", str(e)


@@ -0,0 +1,566 @@
from todoist_api_python.api import TodoistAPI
from typing_extensions import Optional
from backend.blocks.todoist._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TodoistCredentials,
TodoistCredentialsField,
TodoistCredentialsInput,
)
from backend.blocks.todoist._types import Colors
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TodoistListProjectsBlock(Block):
"""Gets all projects for a Todoist user"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
class Output(BlockSchema):
names_list: list[str] = SchemaField(description="List of project names")
ids_list: list[str] = SchemaField(description="List of project IDs")
url_list: list[str] = SchemaField(description="List of project URLs")
complete_data: list[dict] = SchemaField(
description="Complete project data including all fields"
)
error: str = SchemaField(description="Error message if request failed")
def __init__(self):
super().__init__(
id="5f3e1d5b-6bc5-40e3-97ee-1318b3f38813",
description="Gets all projects and their details from Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistListProjectsBlock.Input,
output_schema=TodoistListProjectsBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("names_list", ["Inbox"]),
("ids_list", ["220474322"]),
("url_list", ["https://todoist.com/showProject?id=220474322"]),
(
"complete_data",
[
{
"id": "220474322",
"name": "Inbox",
"url": "https://todoist.com/showProject?id=220474322",
}
],
),
],
test_mock={
"get_project_lists": lambda *args, **kwargs: (
["Inbox"],
["220474322"],
["https://todoist.com/showProject?id=220474322"],
[
{
"id": "220474322",
"name": "Inbox",
"url": "https://todoist.com/showProject?id=220474322",
}
],
None,
)
},
)
@staticmethod
def get_project_lists(credentials: TodoistCredentials):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
projects = api.get_projects()
names = []
ids = []
urls = []
complete_data = []
for project in projects:
names.append(project.name)
ids.append(project.id)
urls.append(project.url)
complete_data.append(project.__dict__)
return names, ids, urls, complete_data, None
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
names, ids, urls, data, error = self.get_project_lists(credentials)
if names:
yield "names_list", names
if ids:
yield "ids_list", ids
if urls:
yield "url_list", urls
if data:
yield "complete_data", data
except Exception as e:
yield "error", str(e)
class TodoistCreateProjectBlock(Block):
"""Creates a new project in Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
name: str = SchemaField(description="Name of the project", advanced=False)
parent_id: Optional[str] = SchemaField(
description="Parent project ID", default=None, advanced=True
)
color: Optional[Colors] = SchemaField(
description="Color of the project icon",
default=Colors.charcoal,
advanced=True,
)
is_favorite: bool = SchemaField(
description="Whether the project is a favorite",
default=False,
advanced=True,
)
view_style: Optional[str] = SchemaField(
description="Display style (list or board)", default=None, advanced=True
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the creation was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="ade60136-de14-11ef-b5e5-32d3674e8b7e",
description="Creates a new project in Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistCreateProjectBlock.Input,
output_schema=TodoistCreateProjectBlock.Output,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "name": "Test Project"},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"create_project": lambda *args, **kwargs: (True)},
)
@staticmethod
def create_project(
credentials: TodoistCredentials,
name: str,
parent_id: Optional[str],
color: Optional[Colors],
is_favorite: bool,
view_style: Optional[str],
):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
params = {"name": name, "is_favorite": is_favorite}
if parent_id is not None:
params["parent_id"] = parent_id
if color is not None:
params["color"] = color.value
if view_style is not None:
params["view_style"] = view_style
api.add_project(**params)
return True
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.create_project(
credentials=credentials,
name=input_data.name,
parent_id=input_data.parent_id,
color=input_data.color,
is_favorite=input_data.is_favorite,
view_style=input_data.view_style,
)
yield "success", success
except Exception as e:
yield "error", str(e)
class TodoistGetProjectBlock(Block):
"""Gets details for a specific Todoist project"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
project_id: str = SchemaField(
description="ID of the project to get details for", advanced=False
)
class Output(BlockSchema):
project_id: str = SchemaField(description="ID of project")
project_name: str = SchemaField(description="Name of project")
project_url: str = SchemaField(description="URL of project")
complete_data: dict = SchemaField(
description="Complete project data including all fields"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="b435b5ea-de14-11ef-8b51-32d3674e8b7e",
description="Gets details for a specific Todoist project",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetProjectBlock.Input,
output_schema=TodoistGetProjectBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"project_id": "2203306141",
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("project_id", "2203306141"),
("project_name", "Shopping List"),
("project_url", "https://todoist.com/showProject?id=2203306141"),
(
"complete_data",
{
"id": "2203306141",
"name": "Shopping List",
"url": "https://todoist.com/showProject?id=2203306141",
},
),
],
test_mock={
"get_project": lambda *args, **kwargs: (
"2203306141",
"Shopping List",
"https://todoist.com/showProject?id=2203306141",
{
"id": "2203306141",
"name": "Shopping List",
"url": "https://todoist.com/showProject?id=2203306141",
},
)
},
)
@staticmethod
def get_project(credentials: TodoistCredentials, project_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
project = api.get_project(project_id=project_id)
return project.id, project.name, project.url, project.__dict__
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
project_id, project_name, project_url, data = self.get_project(
credentials=credentials, project_id=input_data.project_id
)
if project_id:
yield "project_id", project_id
if project_name:
yield "project_name", project_name
if project_url:
yield "project_url", project_url
if data:
yield "complete_data", data
except Exception as e:
yield "error", str(e)
class TodoistUpdateProjectBlock(Block):
"""Updates an existing project in Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
project_id: str = SchemaField(
description="ID of project to update", advanced=False
)
name: Optional[str] = SchemaField(
description="New name for the project", default=None, advanced=False
)
color: Optional[Colors] = SchemaField(
description="New color for the project icon", default=None, advanced=True
)
is_favorite: Optional[bool] = SchemaField(
description="Whether the project should be a favorite",
default=None,
advanced=True,
)
view_style: Optional[str] = SchemaField(
description="Display style (list or board)", default=None, advanced=True
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the update was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="ba41a20a-de14-11ef-91d7-32d3674e8b7e",
description="Updates an existing project in Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistUpdateProjectBlock.Input,
output_schema=TodoistUpdateProjectBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"project_id": "2203306141",
"name": "Things To Buy",
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"update_project": lambda *args, **kwargs: (True)},
)
@staticmethod
def update_project(
credentials: TodoistCredentials,
project_id: str,
name: Optional[str],
color: Optional[Colors],
is_favorite: Optional[bool],
view_style: Optional[str],
):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
params = {}
if name is not None:
params["name"] = name
if color is not None:
params["color"] = color.value
if is_favorite is not None:
params["is_favorite"] = is_favorite
if view_style is not None:
params["view_style"] = view_style
api.update_project(project_id=project_id, **params)
return True
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.update_project(
credentials=credentials,
project_id=input_data.project_id,
name=input_data.name,
color=input_data.color,
is_favorite=input_data.is_favorite,
view_style=input_data.view_style,
)
yield "success", success
except Exception as e:
yield "error", str(e)
class TodoistDeleteProjectBlock(Block):
"""Deletes a project and all of its sections and tasks"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
project_id: str = SchemaField(
description="ID of project to delete", advanced=False
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the deletion was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="c2893acc-de14-11ef-a113-32d3674e8b7e",
description="Deletes a Todoist project and all its contents",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistDeleteProjectBlock.Input,
output_schema=TodoistDeleteProjectBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"project_id": "2203306141",
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"delete_project": lambda *args, **kwargs: (True)},
)
@staticmethod
def delete_project(credentials: TodoistCredentials, project_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
success = api.delete_project(project_id=project_id)
return success
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.delete_project(
credentials=credentials, project_id=input_data.project_id
)
yield "success", success
except Exception as e:
yield "error", str(e)
class TodoistListCollaboratorsBlock(Block):
"""Gets all collaborators for a Todoist project"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
project_id: str = SchemaField(
description="ID of the project to get collaborators for", advanced=False
)
class Output(BlockSchema):
collaborator_ids: list[str] = SchemaField(
description="List of collaborator IDs"
)
collaborator_names: list[str] = SchemaField(
description="List of collaborator names"
)
collaborator_emails: list[str] = SchemaField(
description="List of collaborator email addresses"
)
complete_data: list[dict] = SchemaField(
description="Complete collaborator data including all fields"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="c99c804e-de14-11ef-9f47-32d3674e8b7e",
description="Gets all collaborators for a specific Todoist project",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistListCollaboratorsBlock.Input,
output_schema=TodoistListCollaboratorsBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"project_id": "2203306141",
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("collaborator_ids", ["2671362", "2671366"]),
("collaborator_names", ["Alice", "Bob"]),
("collaborator_emails", ["alice@example.com", "bob@example.com"]),
(
"complete_data",
[
{
"id": "2671362",
"name": "Alice",
"email": "alice@example.com",
},
{"id": "2671366", "name": "Bob", "email": "bob@example.com"},
],
),
],
test_mock={
"get_collaborators": lambda *args, **kwargs: (
["2671362", "2671366"],
["Alice", "Bob"],
["alice@example.com", "bob@example.com"],
[
{
"id": "2671362",
"name": "Alice",
"email": "alice@example.com",
},
{"id": "2671366", "name": "Bob", "email": "bob@example.com"},
],
)
},
)
@staticmethod
def get_collaborators(credentials: TodoistCredentials, project_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
collaborators = api.get_collaborators(project_id=project_id)
ids = []
names = []
emails = []
complete_data = []
for collaborator in collaborators:
ids.append(collaborator.id)
names.append(collaborator.name)
emails.append(collaborator.email)
complete_data.append(collaborator.__dict__)
return ids, names, emails, complete_data
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, names, emails, data = self.get_collaborators(
credentials=credentials, project_id=input_data.project_id
)
if ids:
yield "collaborator_ids", ids
if names:
yield "collaborator_names", names
if emails:
yield "collaborator_emails", emails
if data:
yield "complete_data", data
except Exception as e:
yield "error", str(e)


@@ -0,0 +1,306 @@
from todoist_api_python.api import TodoistAPI
from typing_extensions import Optional
from backend.blocks.todoist._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TodoistCredentials,
TodoistCredentialsField,
TodoistCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TodoistListSectionsBlock(Block):
"""Gets all sections for a Todoist project"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
project_id: Optional[str] = SchemaField(
description="Optional project ID to filter sections"
)
class Output(BlockSchema):
names_list: list[str] = SchemaField(description="List of section names")
ids_list: list[str] = SchemaField(description="List of section IDs")
complete_data: list[dict] = SchemaField(
description="Complete section data including all fields"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="d6a116d8-de14-11ef-a94c-32d3674e8b7e",
description="Gets all sections and their details from Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistListSectionsBlock.Input,
output_schema=TodoistListSectionsBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"project_id": "2203306141",
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("names_list", ["Groceries"]),
("ids_list", ["7025"]),
(
"complete_data",
[
{
"id": "7025",
"project_id": "2203306141",
"order": 1,
"name": "Groceries",
}
],
),
],
test_mock={
"get_section_lists": lambda *args, **kwargs: (
["Groceries"],
["7025"],
[
{
"id": "7025",
"project_id": "2203306141",
"order": 1,
"name": "Groceries",
}
],
)
},
)
@staticmethod
def get_section_lists(
credentials: TodoistCredentials, project_id: Optional[str] = None
):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
sections = api.get_sections(project_id=project_id)
names = []
ids = []
complete_data = []
for section in sections:
names.append(section.name)
ids.append(section.id)
complete_data.append(section.__dict__)
return names, ids, complete_data
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
names, ids, data = self.get_section_lists(
credentials, input_data.project_id
)
if names:
yield "names_list", names
if ids:
yield "ids_list", ids
if data:
yield "complete_data", data
except Exception as e:
yield "error", str(e)
# Error in the official Todoist SDK. Will add this block using the Sync API later.
# (An illustrative REST workaround is sketched after the commented-out block below.)
# class TodoistCreateSectionBlock(Block):
# """Creates a new section in a Todoist project"""
# class Input(BlockSchema):
# credentials: TodoistCredentialsInput = TodoistCredentialsField([])
# name: str = SchemaField(description="Section name")
# project_id: str = SchemaField(description="Project ID this section should belong to")
# order: Optional[int] = SchemaField(description="Optional order among other sections", default=None)
# class Output(BlockSchema):
# success: bool = SchemaField(description="Whether section was successfully created")
# error: str = SchemaField(description="Error message if the request failed")
# def __init__(self):
# super().__init__(
# id="e3025cfc-de14-11ef-b9f2-32d3674e8b7e",
# description="Creates a new section in a Todoist project",
# categories={BlockCategory.PRODUCTIVITY},
# input_schema=TodoistCreateSectionBlock.Input,
# output_schema=TodoistCreateSectionBlock.Output,
# test_input={
# "credentials": TEST_CREDENTIALS_INPUT,
# "name": "Groceries",
# "project_id": "2203306141"
# },
# test_credentials=TEST_CREDENTIALS,
# test_output=[
# ("success", True)
# ],
# test_mock={
# "create_section": lambda *args, **kwargs: (
# {"id": "7025", "project_id": "2203306141", "order": 1, "name": "Groceries"},
# )
# },
# )
# @staticmethod
# def create_section(credentials: TodoistCredentials, name: str, project_id: str, order: Optional[int] = None):
# try:
# api = TodoistAPI(credentials.access_token.get_secret_value())
# section = api.add_section(name=name, project_id=project_id, order=order)
# return section.__dict__
# except Exception as e:
# raise e
# def run(
# self,
# input_data: Input,
# *,
# credentials: TodoistCredentials,
# **kwargs,
# ) -> BlockOutput:
# try:
# section_data = self.create_section(
# credentials,
# input_data.name,
# input_data.project_id,
# input_data.order
# )
# if section_data:
# yield "success", True
# except Exception as e:
# yield "error", str(e)
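# Until then, a hypothetical workaround (not part of this PR) would be to call Todoist's
# REST sections endpoint directly, e.g. with `requests`; the helper name below is illustrative:
#
#   import requests
#
#   def create_section_via_rest(credentials: TodoistCredentials, name: str, project_id: str) -> dict:
#       response = requests.post(
#           "https://api.todoist.com/rest/v2/sections",
#           headers={"Authorization": f"Bearer {credentials.access_token.get_secret_value()}"},
#           json={"name": name, "project_id": project_id},
#       )
#       response.raise_for_status()  # surface HTTP errors as exceptions
#       return response.json()  # the created section as a dict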
class TodoistGetSectionBlock(Block):
"""Gets a single section from Todoist by ID"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
section_id: str = SchemaField(description="ID of section to fetch")
class Output(BlockSchema):
id: str = SchemaField(description="ID of section")
project_id: str = SchemaField(description="Project ID the section belongs to")
order: int = SchemaField(description="Order of the section")
name: str = SchemaField(description="Name of the section")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="ea5580e2-de14-11ef-a5d3-32d3674e8b7e",
description="Gets a single section by ID from Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetSectionBlock.Input,
output_schema=TodoistGetSectionBlock.Output,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "section_id": "7025"},
test_credentials=TEST_CREDENTIALS,
test_output=[
("id", "7025"),
("project_id", "2203306141"),
("order", 1),
("name", "Groceries"),
],
test_mock={
"get_section": lambda *args, **kwargs: {
"id": "7025",
"project_id": "2203306141",
"order": 1,
"name": "Groceries",
}
},
)
@staticmethod
def get_section(credentials: TodoistCredentials, section_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
section = api.get_section(section_id=section_id)
return section.__dict__
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
section_data = self.get_section(credentials, input_data.section_id)
if section_data:
yield "id", section_data["id"]
yield "project_id", section_data["project_id"]
yield "order", section_data["order"]
yield "name", section_data["name"]
except Exception as e:
yield "error", str(e)
class TodoistDeleteSectionBlock(Block):
"""Deletes a section and all its tasks from Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
section_id: str = SchemaField(description="ID of section to delete")
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether section was successfully deleted"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="f0e52eee-de14-11ef-9b12-32d3674e8b7e",
description="Deletes a section and all its tasks from Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistDeleteSectionBlock.Input,
output_schema=TodoistDeleteSectionBlock.Output,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "section_id": "7025"},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"delete_section": lambda *args, **kwargs: (True)},
)
@staticmethod
def delete_section(credentials: TodoistCredentials, section_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
success = api.delete_section(section_id=section_id)
return success
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.delete_section(credentials, input_data.section_id)
yield "success", success
except Exception as e:
yield "error", str(e)


@@ -0,0 +1,660 @@
from datetime import datetime
from todoist_api_python.api import TodoistAPI
from todoist_api_python.models import Task
from typing_extensions import Optional
from backend.blocks.todoist._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TodoistCredentials,
TodoistCredentialsField,
TodoistCredentialsInput,
)
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TodoistCreateTaskBlock(Block):
"""Creates a new task in a Todoist project"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
content: str = SchemaField(description="Task content", advanced=False)
description: Optional[str] = SchemaField(
description="Task description", default=None, advanced=False
)
project_id: Optional[str] = SchemaField(
description="Project ID this task should belong to",
default=None,
advanced=False,
)
section_id: Optional[str] = SchemaField(
description="Section ID this task should belong to",
default=None,
advanced=False,
)
parent_id: Optional[str] = SchemaField(
description="Parent task ID", default=None, advanced=True
)
order: Optional[int] = SchemaField(
description="Optional order among other tasks,[Non-zero integer value used by clients to sort tasks under the same parent]",
default=None,
advanced=True,
)
labels: Optional[list[str]] = SchemaField(
description="Task labels", default=None, advanced=True
)
priority: Optional[int] = SchemaField(
description="Task priority from 1 (normal) to 4 (urgent)",
default=None,
advanced=True,
)
due_date: Optional[datetime] = SchemaField(
description="Due date in YYYY-MM-DD format", advanced=True, default=None
)
deadline_date: Optional[datetime] = SchemaField(
description="Specific date in YYYY-MM-DD format relative to user's timezone",
default=None,
advanced=True,
)
assignee_id: Optional[str] = SchemaField(
description="Responsible user ID", default=None, advanced=True
)
duration_unit: Optional[str] = SchemaField(
description="Task duration unit (minute/day)", default=None, advanced=True
)
duration: Optional[int] = SchemaField(
description="Task duration amount, You need to selecct the duration unit first",
depends_on=["duration_unit"],
default=None,
advanced=True,
)
class Output(BlockSchema):
id: str = SchemaField(description="Task ID")
url: str = SchemaField(description="Task URL")
complete_data: dict = SchemaField(
description="Complete task data as dictionary"
)
error: str = SchemaField(description="Error message if request failed")
def __init__(self):
super().__init__(
id="fde4f458-de14-11ef-bf0c-32d3674e8b7e",
description="Creates a new task in a Todoist project",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistCreateTaskBlock.Input,
output_schema=TodoistCreateTaskBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"content": "Buy groceries",
"project_id": "2203306141",
"priority": 4,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("id", "2995104339"),
("url", "https://todoist.com/showTask?id=2995104339"),
(
"complete_data",
{
"id": "2995104339",
"project_id": "2203306141",
"url": "https://todoist.com/showTask?id=2995104339",
},
),
],
test_mock={
"create_task": lambda *args, **kwargs: (
"2995104339",
"https://todoist.com/showTask?id=2995104339",
{
"id": "2995104339",
"project_id": "2203306141",
"url": "https://todoist.com/showTask?id=2995104339",
},
)
},
)
@staticmethod
def create_task(credentials: TodoistCredentials, content: str, **kwargs):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
task = api.add_task(content=content, **kwargs)
task_dict = Task.to_dict(task)
return task.id, task.url, task_dict
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
due_date = (
input_data.due_date.strftime("%Y-%m-%d")
if input_data.due_date
else None
)
deadline_date = (
input_data.deadline_date.strftime("%Y-%m-%d")
if input_data.deadline_date
else None
)
task_args = {
"description": input_data.description,
"project_id": input_data.project_id,
"section_id": input_data.section_id,
"parent_id": input_data.parent_id,
"order": input_data.order,
"labels": input_data.labels,
"priority": input_data.priority,
"due_date": due_date,
"deadline_date": deadline_date,
"assignee_id": input_data.assignee_id,
"duration": input_data.duration,
"duration_unit": input_data.duration_unit,
}
id, url, complete_data = self.create_task(
credentials,
input_data.content,
**{k: v for k, v in task_args.items() if v is not None},
)
yield "id", id
yield "url", url
yield "complete_data", complete_data
except Exception as e:
yield "error", str(e)
class TodoistGetTasksBlock(Block):
"""Get active tasks from Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
project_id: Optional[str] = SchemaField(
description="Filter tasks by project ID", default=None, advanced=False
)
section_id: Optional[str] = SchemaField(
description="Filter tasks by section ID", default=None, advanced=True
)
label: Optional[str] = SchemaField(
description="Filter tasks by label name", default=None, advanced=True
)
filter: Optional[str] = SchemaField(
description="Filter by any supported filter, You can see How to use filters or create one of your one here - https://todoist.com/help/articles/introduction-to-filters-V98wIH",
default=None,
advanced=True,
)
lang: Optional[str] = SchemaField(
description="IETF language tag for filter language", default=None
)
ids: Optional[list[str]] = SchemaField(
description="List of task IDs to retrieve", default=None, advanced=False
)
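# Example filter queries, per Todoist's own filter syntax (illustrative only):
#   "today"           -> tasks due today
#   "overdue & p1"    -> overdue tasks with priority 1
#   "#Work & @email"  -> tasks in the "Work" project carrying the "email" label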
class Output(BlockSchema):
ids: list[str] = SchemaField(description="Task IDs")
urls: list[str] = SchemaField(description="Task URLs")
complete_data: list[dict] = SchemaField(
description="Complete task data as dictionary"
)
error: str = SchemaField(description="Error message if request failed")
def __init__(self):
super().__init__(
id="0b706e86-de15-11ef-a113-32d3674e8b7e",
description="Get active tasks from Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetTasksBlock.Input,
output_schema=TodoistGetTasksBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"project_id": "2203306141",
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["2995104339"]),
("urls", ["https://todoist.com/showTask?id=2995104339"]),
(
"complete_data",
[
{
"id": "2995104339",
"project_id": "2203306141",
"url": "https://todoist.com/showTask?id=2995104339",
"is_completed": False,
}
],
),
],
test_mock={
"get_tasks": lambda *args, **kwargs: [
{
"id": "2995104339",
"project_id": "2203306141",
"url": "https://todoist.com/showTask?id=2995104339",
"is_completed": False,
}
]
},
)
@staticmethod
def get_tasks(credentials: TodoistCredentials, **kwargs):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
tasks = api.get_tasks(**kwargs)
return [Task.to_dict(task) for task in tasks]
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
task_filters = {
"project_id": input_data.project_id,
"section_id": input_data.section_id,
"label": input_data.label,
"filter": input_data.filter,
"lang": input_data.lang,
"ids": input_data.ids,
}
tasks = self.get_tasks(
credentials, **{k: v for k, v in task_filters.items() if v is not None}
)
yield "ids", [task["id"] for task in tasks]
yield "urls", [task["url"] for task in tasks]
yield "complete_data", tasks
except Exception as e:
yield "error", str(e)
class TodoistGetTaskBlock(Block):
"""Get an active task from Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
task_id: str = SchemaField(description="Task ID to retrieve")
class Output(BlockSchema):
project_id: str = SchemaField(description="Project ID containing the task")
url: str = SchemaField(description="Task URL")
complete_data: dict = SchemaField(
description="Complete task data as dictionary"
)
error: str = SchemaField(description="Error message if request failed")
def __init__(self):
super().__init__(
id="16d7dc8c-de15-11ef-8ace-32d3674e8b7e",
description="Get an active task from Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistGetTaskBlock.Input,
output_schema=TodoistGetTaskBlock.Output,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "task_id": "2995104339"},
test_credentials=TEST_CREDENTIALS,
test_output=[
("project_id", "2203306141"),
("url", "https://todoist.com/showTask?id=2995104339"),
(
"complete_data",
{
"id": "2995104339",
"project_id": "2203306141",
"url": "https://todoist.com/showTask?id=2995104339",
},
),
],
test_mock={
"get_task": lambda *args, **kwargs: {
"project_id": "2203306141",
"id": "2995104339",
"url": "https://todoist.com/showTask?id=2995104339",
}
},
)
@staticmethod
def get_task(credentials: TodoistCredentials, task_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
task = api.get_task(task_id=task_id)
return Task.to_dict(task)
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
task_data = self.get_task(credentials, input_data.task_id)
if task_data:
yield "project_id", task_data["project_id"]
yield "url", task_data["url"]
yield "complete_data", task_data
except Exception as e:
yield "error", str(e)
class TodoistUpdateTaskBlock(Block):
"""Updates an existing task in Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
task_id: str = SchemaField(description="Task ID to update")
content: str = SchemaField(description="Task content", advanced=False)
description: Optional[str] = SchemaField(
description="Task description", default=None, advanced=False
)
project_id: Optional[str] = SchemaField(
description="Project ID this task should belong to",
default=None,
advanced=False,
)
section_id: Optional[str] = SchemaField(
description="Section ID this task should belong to",
default=None,
advanced=False,
)
parent_id: Optional[str] = SchemaField(
description="Parent task ID", default=None, advanced=True
)
order: Optional[int] = SchemaField(
description="Optional order among other tasks,[Non-zero integer value used by clients to sort tasks under the same parent]",
default=None,
advanced=True,
)
labels: Optional[list[str]] = SchemaField(
description="Task labels", default=None, advanced=True
)
priority: Optional[int] = SchemaField(
description="Task priority from 1 (normal) to 4 (urgent)",
default=None,
advanced=True,
)
due_date: Optional[datetime] = SchemaField(
description="Due date in YYYY-MM-DD format", advanced=True, default=None
)
deadline_date: Optional[datetime] = SchemaField(
description="Specific date in YYYY-MM-DD format relative to user's timezone",
default=None,
advanced=True,
)
assignee_id: Optional[str] = SchemaField(
description="Responsible user ID", default=None, advanced=True
)
duration_unit: Optional[str] = SchemaField(
description="Task duration unit (minute/day)", default=None, advanced=True
)
duration: Optional[int] = SchemaField(
description="Task duration amount, You need to selecct the duration unit first",
depends_on=["duration_unit"],
default=None,
advanced=True,
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the update was successful")
error: str = SchemaField(description="Error message if request failed")
def __init__(self):
super().__init__(
id="1eee6d32-de15-11ef-a2ff-32d3674e8b7e",
description="Updates an existing task in Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistUpdateTaskBlock.Input,
output_schema=TodoistUpdateTaskBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"task_id": "2995104339",
"content": "Buy Coffee",
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"update_task": lambda *args, **kwargs: True},
)
@staticmethod
def update_task(credentials: TodoistCredentials, task_id: str, **kwargs):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
is_success = api.update_task(task_id=task_id, **kwargs)
return is_success
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
due_date = (
input_data.due_date.strftime("%Y-%m-%d")
if input_data.due_date
else None
)
deadline_date = (
input_data.deadline_date.strftime("%Y-%m-%d")
if input_data.deadline_date
else None
)
task_updates = {}
if input_data.content is not None:
task_updates["content"] = input_data.content
if input_data.description is not None:
task_updates["description"] = input_data.description
if input_data.project_id is not None:
task_updates["project_id"] = input_data.project_id
if input_data.section_id is not None:
task_updates["section_id"] = input_data.section_id
if input_data.parent_id is not None:
task_updates["parent_id"] = input_data.parent_id
if input_data.order is not None:
task_updates["order"] = input_data.order
if input_data.labels is not None:
task_updates["labels"] = input_data.labels
if input_data.priority is not None:
task_updates["priority"] = input_data.priority
if due_date is not None:
task_updates["due_date"] = due_date
if deadline_date is not None:
task_updates["deadline_date"] = deadline_date
if input_data.assignee_id is not None:
task_updates["assignee_id"] = input_data.assignee_id
if input_data.duration is not None:
task_updates["duration"] = input_data.duration
if input_data.duration_unit is not None:
task_updates["duration_unit"] = input_data.duration_unit
self.update_task(
credentials,
input_data.task_id,
**{k: v for k, v in task_updates.items() if v is not None},
)
yield "success", True
except Exception as e:
yield "error", str(e)
class TodoistCloseTaskBlock(Block):
"""Closes a task in Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
task_id: str = SchemaField(description="Task ID to close")
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether the task was successfully closed"
)
error: str = SchemaField(description="Error message if request failed")
def __init__(self):
super().__init__(
id="29fac798-de15-11ef-b839-32d3674e8b7e",
description="Closes a task in Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistCloseTaskBlock.Input,
output_schema=TodoistCloseTaskBlock.Output,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "task_id": "2995104339"},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"close_task": lambda *args, **kwargs: True},
)
@staticmethod
def close_task(credentials: TodoistCredentials, task_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
is_success = api.close_task(task_id=task_id)
return is_success
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
is_success = self.close_task(credentials, input_data.task_id)
yield "success", is_success
except Exception as e:
yield "error", str(e)
class TodoistReopenTaskBlock(Block):
"""Reopens a task in Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
task_id: str = SchemaField(description="Task ID to reopen")
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether the task was successfully reopened"
)
error: str = SchemaField(description="Error message if request failed")
def __init__(self):
super().__init__(
id="2e6bf6f8-de15-11ef-ae7c-32d3674e8b7e",
description="Reopens a task in Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistReopenTaskBlock.Input,
output_schema=TodoistReopenTaskBlock.Output,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "task_id": "2995104339"},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"reopen_task": lambda *args, **kwargs: (True)},
)
@staticmethod
def reopen_task(credentials: TodoistCredentials, task_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
is_success = api.reopen_task(task_id=task_id)
return is_success
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
is_success = self.reopen_task(credentials, input_data.task_id)
yield "success", is_success
except Exception as e:
yield "error", str(e)
class TodoistDeleteTaskBlock(Block):
"""Deletes a task in Todoist"""
class Input(BlockSchema):
credentials: TodoistCredentialsInput = TodoistCredentialsField([])
task_id: str = SchemaField(description="Task ID to delete")
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether the task was successfully deleted"
)
error: str = SchemaField(description="Error message if request failed")
def __init__(self):
super().__init__(
id="33c29ada-de15-11ef-bcbb-32d3674e8b7e",
description="Deletes a task in Todoist",
categories={BlockCategory.PRODUCTIVITY},
input_schema=TodoistDeleteTaskBlock.Input,
output_schema=TodoistDeleteTaskBlock.Output,
test_input={"credentials": TEST_CREDENTIALS_INPUT, "task_id": "2995104339"},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"delete_task": lambda *args, **kwargs: (True)},
)
@staticmethod
def delete_task(credentials: TodoistCredentials, task_id: str):
try:
api = TodoistAPI(credentials.access_token.get_secret_value())
is_success = api.delete_task(task_id=task_id)
return is_success
except Exception as e:
raise e
def run(
self,
input_data: Input,
*,
credentials: TodoistCredentials,
**kwargs,
) -> BlockOutput:
try:
is_success = self.delete_task(credentials, input_data.task_id)
yield "success", is_success
except Exception as e:
yield "error", str(e)


@@ -0,0 +1,60 @@
from typing import Literal
from pydantic import SecretStr
from backend.data.model import (
CredentialsField,
CredentialsMetaInput,
OAuth2Credentials,
ProviderName,
)
from backend.integrations.oauth.twitter import TwitterOAuthHandler
from backend.util.settings import Secrets
# --8<-- [start:TwitterOAuthIsConfigured]
secrets = Secrets()
TWITTER_OAUTH_IS_CONFIGURED = bool(
secrets.twitter_client_id and secrets.twitter_client_secret
)
# --8<-- [end:TwitterOAuthIsConfigured]
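# A hypothetical use of this flag: blocks could disable themselves when OAuth is not
# configured, e.g. by passing `disabled=not TWITTER_OAUTH_IS_CONFIGURED` to Block.__init__.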
TwitterCredentials = OAuth2Credentials
TwitterCredentialsInput = CredentialsMetaInput[
Literal[ProviderName.TWITTER], Literal["oauth2"]
]
# Currently, we request all permissions from the Twitter API up front.
# In the future, if we need incremental permissions, we can use these requested scopes.
def TwitterCredentialsField(scopes: list[str]) -> TwitterCredentialsInput:
"""
Creates a Twitter credentials input on a block.
Params:
scopes: The authorization scopes needed for the block to work.
"""
return CredentialsField(
# required_scopes=set(scopes),
required_scopes=set(TwitterOAuthHandler.DEFAULT_SCOPES + scopes),
description="The Twitter integration requires OAuth2 authentication.",
)
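# Illustrative only: a block's Input schema would declare its credentials field as, e.g.,
#   credentials: TwitterCredentialsInput = TwitterCredentialsField(["tweet.read", "tweet.write"])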
TEST_CREDENTIALS = OAuth2Credentials(
id="01234567-89ab-cdef-0123-456789abcdef",
provider="twitter",
access_token=SecretStr("mock-twitter-access-token"),
refresh_token=SecretStr("mock-twitter-refresh-token"),
access_token_expires_at=1234567890,
scopes=["tweet.read", "tweet.write", "users.read", "offline.access"],
title="Mock Twitter OAuth2 Credentials",
username="mock-twitter-username",
refresh_token_expires_at=1234567890,
)
TEST_CREDENTIALS_INPUT = {
"provider": TEST_CREDENTIALS.provider,
"id": TEST_CREDENTIALS.id,
"type": TEST_CREDENTIALS.type,
"title": TEST_CREDENTIALS.title,
}


@@ -0,0 +1,418 @@
from datetime import datetime
from typing import Any, Dict
from backend.blocks.twitter._mappers import (
get_backend_expansion,
get_backend_field,
get_backend_list_expansion,
get_backend_list_field,
get_backend_media_field,
get_backend_place_field,
get_backend_poll_field,
get_backend_space_expansion,
get_backend_space_field,
get_backend_user_field,
)
from backend.blocks.twitter._types import ( # DMEventFieldFilter,
DMEventExpansionFilter,
DMEventTypeFilter,
DMMediaFieldFilter,
DMTweetFieldFilter,
ExpansionFilter,
ListExpansionsFilter,
ListFieldsFilter,
SpaceExpansionsFilter,
SpaceFieldsFilter,
TweetFieldsFilter,
TweetMediaFieldsFilter,
TweetPlaceFieldsFilter,
TweetPollFieldsFilter,
TweetReplySettingsFilter,
TweetUserFieldsFilter,
UserExpansionsFilter,
)
# Common Builder
class TweetExpansionsBuilder:
def __init__(self, param: Dict[str, Any]):
self.params: Dict[str, Any] = param
def add_expansions(self, expansions: ExpansionFilter | None):
if expansions:
filtered_expansions = [
name for name, value in expansions.dict().items() if value is True
]
if filtered_expansions:
self.params["expansions"] = ",".join(
[get_backend_expansion(exp) for exp in filtered_expansions]
)
return self
def add_media_fields(self, media_fields: TweetMediaFieldsFilter | None):
if media_fields:
filtered_fields = [
name for name, value in media_fields.dict().items() if value is True
]
if filtered_fields:
self.params["media.fields"] = ",".join(
[get_backend_media_field(field) for field in filtered_fields]
)
return self
def add_place_fields(self, place_fields: TweetPlaceFieldsFilter | None):
if place_fields:
filtered_fields = [
name for name, value in place_fields.dict().items() if value is True
]
if filtered_fields:
self.params["place.fields"] = ",".join(
[get_backend_place_field(field) for field in filtered_fields]
)
return self
def add_poll_fields(self, poll_fields: TweetPollFieldsFilter | None):
if poll_fields:
filtered_fields = [
name for name, value in poll_fields.dict().items() if value is True
]
if filtered_fields:
self.params["poll.fields"] = ",".join(
[get_backend_poll_field(field) for field in filtered_fields]
)
return self
def add_tweet_fields(self, tweet_fields: TweetFieldsFilter | None):
if tweet_fields:
filtered_fields = [
name for name, value in tweet_fields.dict().items() if value is True
]
if filtered_fields:
self.params["tweet.fields"] = ",".join(
[get_backend_field(field) for field in filtered_fields]
)
return self
def add_user_fields(self, user_fields: TweetUserFieldsFilter | None):
if user_fields:
filtered_fields = [
name for name, value in user_fields.dict().items() if value is True
]
if filtered_fields:
self.params["user.fields"] = ",".join(
[get_backend_user_field(field) for field in filtered_fields]
)
return self
def build(self):
return self.params
class UserExpansionsBuilder:
def __init__(self, param: Dict[str, Any]):
self.params: Dict[str, Any] = param
def add_expansions(self, expansions: UserExpansionsFilter | None):
if expansions:
filtered_expansions = [
name for name, value in expansions.dict().items() if value is True
]
if filtered_expansions:
self.params["expansions"] = ",".join(filtered_expansions)
return self
def add_tweet_fields(self, tweet_fields: TweetFieldsFilter | None):
if tweet_fields:
filtered_fields = [
name for name, value in tweet_fields.dict().items() if value is True
]
if filtered_fields:
self.params["tweet.fields"] = ",".join(
[get_backend_field(field) for field in filtered_fields]
)
return self
def add_user_fields(self, user_fields: TweetUserFieldsFilter | None):
if user_fields:
filtered_fields = [
name for name, value in user_fields.dict().items() if value is True
]
if filtered_fields:
self.params["user.fields"] = ",".join(
[get_backend_user_field(field) for field in filtered_fields]
)
return self
def build(self):
return self.params
class ListExpansionsBuilder:
def __init__(self, param: Dict[str, Any]):
self.params: Dict[str, Any] = param
def add_expansions(self, expansions: ListExpansionsFilter | None):
if expansions:
filtered_expansions = [
name for name, value in expansions.dict().items() if value is True
]
if filtered_expansions:
self.params["expansions"] = ",".join(
[get_backend_list_expansion(exp) for exp in filtered_expansions]
)
return self
def add_list_fields(self, list_fields: ListFieldsFilter | None):
if list_fields:
filtered_fields = [
name for name, value in list_fields.dict().items() if value is True
]
if filtered_fields:
self.params["list.fields"] = ",".join(
[get_backend_list_field(field) for field in filtered_fields]
)
return self
def add_user_fields(self, user_fields: TweetUserFieldsFilter | None):
if user_fields:
filtered_fields = [
name for name, value in user_fields.dict().items() if value is True
]
if filtered_fields:
self.params["user.fields"] = ",".join(
[get_backend_user_field(field) for field in filtered_fields]
)
return self
def build(self):
return self.params
class SpaceExpansionsBuilder:
def __init__(self, param: Dict[str, Any]):
self.params: Dict[str, Any] = param
def add_expansions(self, expansions: SpaceExpansionsFilter | None):
if expansions:
filtered_expansions = [
name for name, value in expansions.dict().items() if value is True
]
if filtered_expansions:
self.params["expansions"] = ",".join(
[get_backend_space_expansion(exp) for exp in filtered_expansions]
)
return self
def add_space_fields(self, space_fields: SpaceFieldsFilter | None):
if space_fields:
filtered_fields = [
name for name, value in space_fields.dict().items() if value is True
]
if filtered_fields:
self.params["space.fields"] = ",".join(
[get_backend_space_field(field) for field in filtered_fields]
)
return self
def add_user_fields(self, user_fields: TweetUserFieldsFilter | None):
if user_fields:
filtered_fields = [
name for name, value in user_fields.dict().items() if value is True
]
if filtered_fields:
self.params["user.fields"] = ",".join(
[get_backend_user_field(field) for field in filtered_fields]
)
return self
def build(self):
return self.params
class TweetDurationBuilder:
def __init__(self, param: Dict[str, Any]):
self.params: Dict[str, Any] = param
def add_start_time(self, start_time: datetime | None):
if start_time:
self.params["start_time"] = start_time
return self
def add_end_time(self, end_time: datetime | None):
if end_time:
self.params["end_time"] = end_time
return self
def add_since_id(self, since_id: str | None):
if since_id:
self.params["since_id"] = since_id
return self
def add_until_id(self, until_id: str | None):
if until_id:
self.params["until_id"] = until_id
return self
def add_sort_order(self, sort_order: str | None):
if sort_order:
self.params["sort_order"] = sort_order
return self
def build(self):
return self.params
class DMExpansionsBuilder:
def __init__(self, param: Dict[str, Any]):
self.params: Dict[str, Any] = param
def add_expansions(self, expansions: DMEventExpansionFilter):
if expansions:
filtered_expansions = [
name for name, value in expansions.dict().items() if value is True
]
if filtered_expansions:
self.params["expansions"] = ",".join(filtered_expansions)
return self
def add_event_types(self, event_types: DMEventTypeFilter):
if event_types:
filtered_types = [
name for name, value in event_types.dict().items() if value is True
]
if filtered_types:
self.params["event_types"] = ",".join(filtered_types)
return self
def add_media_fields(self, media_fields: DMMediaFieldFilter):
if media_fields:
filtered_fields = [
name for name, value in media_fields.dict().items() if value is True
]
if filtered_fields:
self.params["media.fields"] = ",".join(filtered_fields)
return self
def add_tweet_fields(self, tweet_fields: DMTweetFieldFilter):
if tweet_fields:
filtered_fields = [
name for name, value in tweet_fields.dict().items() if value is True
]
if filtered_fields:
self.params["tweet.fields"] = ",".join(filtered_fields)
return self
def add_user_fields(self, user_fields: TweetUserFieldsFilter):
if user_fields:
filtered_fields = [
name for name, value in user_fields.dict().items() if value is True
]
if filtered_fields:
self.params["user.fields"] = ",".join(filtered_fields)
return self
def build(self):
return self.params
# Specific Builders
class TweetSearchBuilder:
def __init__(self):
self.params: Dict[str, Any] = {"user_auth": False}
def add_query(self, query: str):
if query:
self.params["query"] = query
return self
def add_pagination(self, max_results: int, pagination: str | None):
if max_results:
self.params["max_results"] = max_results
if pagination:
self.params["pagination_token"] = pagination
return self
def build(self):
return self.params
class TweetPostBuilder:
def __init__(self):
self.params: Dict[str, Any] = {"user_auth": False}
def add_text(self, text: str | None):
if text:
self.params["text"] = text
return self
def add_media(self, media_ids: list, tagged_user_ids: list):
if media_ids:
self.params["media_ids"] = media_ids
if tagged_user_ids:
self.params["media_tagged_user_ids"] = tagged_user_ids
return self
def add_deep_link(self, link: str):
if link:
self.params["direct_message_deep_link"] = link
return self
def add_super_followers(self, for_super_followers: bool):
if for_super_followers:
self.params["for_super_followers_only"] = for_super_followers
return self
def add_place(self, place_id: str):
if place_id:
self.params["place_id"] = place_id
return self
def add_poll_options(self, poll_options: list):
if poll_options:
self.params["poll_options"] = poll_options
return self
def add_poll_duration(self, poll_duration_minutes: int):
if poll_duration_minutes:
self.params["poll_duration_minutes"] = poll_duration_minutes
return self
def add_quote(self, quote_id: str):
if quote_id:
self.params["quote_tweet_id"] = quote_id
return self
def add_reply_settings(
self,
exclude_user_ids: list,
reply_to_id: str,
settings: TweetReplySettingsFilter,
):
if exclude_user_ids:
self.params["exclude_reply_user_ids"] = exclude_user_ids
if reply_to_id:
self.params["in_reply_to_tweet_id"] = reply_to_id
if settings.All_Users:
self.params["reply_settings"] = None
elif settings.Following_Users_Only:
self.params["reply_settings"] = "following"
elif settings.Mentioned_Users_Only:
self.params["reply_settings"] = "mentionedUsers"
return self
def build(self):
return self.params
class TweetGetsBuilder:
def __init__(self):
self.params: Dict[str, Any] = {"user_auth": False}
def add_id(self, tweet_id: list[str]):
self.params["id"] = tweet_id
return self
def build(self):
return self.params
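
A minimal usage sketch (not part of the diff): the specific builders create the base params dict and the common builders layer on the optional expansion/field filters. The filter values below are illustrative; the real blocks splat the finished dict into the matching tweepy.Client call (for example client.get_list(**params) in the lists code further below).

from backend.blocks.twitter._builders import TweetExpansionsBuilder, TweetSearchBuilder
from backend.blocks.twitter._types import ExpansionFilter, TweetFieldsFilter

# Base parameters from the "specific" builder...
params = (
    TweetSearchBuilder()
    .add_query("from:TwitterDev -is:retweet")
    .add_pagination(10, None)
    .build()
)

# ...then optional expansions/fields from the "common" builder.
params = (
    TweetExpansionsBuilder(params)
    .add_expansions(ExpansionFilter(Author_User_ID=True))
    .add_tweet_fields(TweetFieldsFilter(Creation_Time=True, Tweet_Text=True))
    .build()
)

# params is now:
# {"user_auth": False, "query": "from:TwitterDev -is:retweet", "max_results": 10,
#  "expansions": "author_id", "tweet.fields": "created_at,text"}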

View File

@@ -0,0 +1,234 @@
# -------------- Tweets -----------------
# Tweet Expansions
EXPANSION_FRONTEND_TO_BACKEND_MAPPING = {
"Poll_IDs": "attachments.poll_ids",
"Media_Keys": "attachments.media_keys",
"Author_User_ID": "author_id",
"Edit_History_Tweet_IDs": "edit_history_tweet_ids",
"Mentioned_Usernames": "entities.mentions.username",
"Place_ID": "geo.place_id",
"Reply_To_User_ID": "in_reply_to_user_id",
"Referenced_Tweet_ID": "referenced_tweets.id",
"Referenced_Tweet_Author_ID": "referenced_tweets.id.author_id",
}
def get_backend_expansion(frontend_key: str) -> str:
result = EXPANSION_FRONTEND_TO_BACKEND_MAPPING.get(frontend_key)
if result is None:
raise KeyError(f"Invalid expansion key: {frontend_key}")
return result
# TweetReplySettings
REPLY_SETTINGS_FRONTEND_TO_BACKEND_MAPPING = {
"Mentioned_Users_Only": "mentionedUsers",
"Following_Users_Only": "following",
"All_Users": "all",
}
# TweetUserFields
def get_backend_reply_setting(frontend_key: str) -> str:
result = REPLY_SETTINGS_FRONTEND_TO_BACKEND_MAPPING.get(frontend_key)
if result is None:
raise KeyError(f"Invalid reply setting key: {frontend_key}")
return result
USER_FIELDS_FRONTEND_TO_BACKEND_MAPPING = {
"Account_Creation_Date": "created_at",
"User_Bio": "description",
"User_Entities": "entities",
"User_ID": "id",
"User_Location": "location",
"Latest_Tweet_ID": "most_recent_tweet_id",
"Display_Name": "name",
"Pinned_Tweet_ID": "pinned_tweet_id",
"Profile_Picture_URL": "profile_image_url",
"Is_Protected_Account": "protected",
"Account_Statistics": "public_metrics",
"Profile_URL": "url",
"Username": "username",
"Is_Verified": "verified",
"Verification_Type": "verified_type",
"Content_Withholding_Info": "withheld",
}
def get_backend_user_field(frontend_key: str) -> str:
result = USER_FIELDS_FRONTEND_TO_BACKEND_MAPPING.get(frontend_key)
if result is None:
raise KeyError(f"Invalid user field key: {frontend_key}")
return result
# TweetFields
FIELDS_FRONTEND_TO_BACKEND_MAPPING = {
"Tweet_Attachments": "attachments",
"Author_ID": "author_id",
"Context_Annotations": "context_annotations",
"Conversation_ID": "conversation_id",
"Creation_Time": "created_at",
"Edit_Controls": "edit_controls",
"Tweet_Entities": "entities",
"Geographic_Location": "geo",
"Tweet_ID": "id",
"Reply_To_User_ID": "in_reply_to_user_id",
"Language": "lang",
"Public_Metrics": "public_metrics",
"Sensitive_Content_Flag": "possibly_sensitive",
"Referenced_Tweets": "referenced_tweets",
"Reply_Settings": "reply_settings",
"Tweet_Source": "source",
"Tweet_Text": "text",
"Withheld_Content": "withheld",
}
def get_backend_field(frontend_key: str) -> str:
result = FIELDS_FRONTEND_TO_BACKEND_MAPPING.get(frontend_key)
if result is None:
raise KeyError(f"Invalid field key: {frontend_key}")
return result
# TweetPollFields
POLL_FIELDS_FRONTEND_TO_BACKEND_MAPPING = {
"Duration_Minutes": "duration_minutes",
"End_DateTime": "end_datetime",
"Poll_ID": "id",
"Poll_Options": "options",
"Voting_Status": "voting_status",
}
def get_backend_poll_field(frontend_key: str) -> str:
result = POLL_FIELDS_FRONTEND_TO_BACKEND_MAPPING.get(frontend_key)
if result is None:
raise KeyError(f"Invalid poll field key: {frontend_key}")
return result
PLACE_FIELDS_FRONTEND_TO_BACKEND_MAPPING = {
"Contained_Within_Places": "contained_within",
"Country": "country",
"Country_Code": "country_code",
"Full_Location_Name": "full_name",
"Geographic_Coordinates": "geo",
"Place_ID": "id",
"Place_Name": "name",
"Place_Type": "place_type",
}
def get_backend_place_field(frontend_key: str) -> str:
result = PLACE_FIELDS_FRONTEND_TO_BACKEND_MAPPING.get(frontend_key)
if result is None:
raise KeyError(f"Invalid place field key: {frontend_key}")
return result
# TweetMediaFields
MEDIA_FIELDS_FRONTEND_TO_BACKEND_MAPPING = {
"Duration_in_Milliseconds": "duration_ms",
"Height": "height",
"Media_Key": "media_key",
"Preview_Image_URL": "preview_image_url",
"Media_Type": "type",
"Media_URL": "url",
"Width": "width",
"Public_Metrics": "public_metrics",
"Non_Public_Metrics": "non_public_metrics",
"Organic_Metrics": "organic_metrics",
"Promoted_Metrics": "promoted_metrics",
"Alternative_Text": "alt_text",
"Media_Variants": "variants",
}
def get_backend_media_field(frontend_key: str) -> str:
result = MEDIA_FIELDS_FRONTEND_TO_BACKEND_MAPPING.get(frontend_key)
if result is None:
raise KeyError(f"Invalid media field key: {frontend_key}")
return result
# -------------- Spaces -----------------
# SpaceExpansions
EXPANSION_FRONTEND_TO_BACKEND_MAPPING_SPACE = {
"Invited_Users": "invited_user_ids",
"Speakers": "speaker_ids",
"Creator": "creator_id",
"Hosts": "host_ids",
"Topics": "topic_ids",
}
def get_backend_space_expansion(frontend_key: str) -> str:
result = EXPANSION_FRONTEND_TO_BACKEND_MAPPING_SPACE.get(frontend_key)
if result is None:
raise KeyError(f"Invalid expansion key: {frontend_key}")
return result
# SpaceFields
SPACE_FIELDS_FRONTEND_TO_BACKEND_MAPPING = {
"Space_ID": "id",
"Space_State": "state",
"Creation_Time": "created_at",
"End_Time": "ended_at",
"Host_User_IDs": "host_ids",
"Language": "lang",
"Is_Ticketed": "is_ticketed",
"Invited_User_IDs": "invited_user_ids",
"Participant_Count": "participant_count",
"Subscriber_Count": "subscriber_count",
"Scheduled_Start_Time": "scheduled_start",
"Speaker_User_IDs": "speaker_ids",
"Start_Time": "started_at",
"Space_Title": "title",
"Topic_IDs": "topic_ids",
"Last_Updated_Time": "updated_at",
}
def get_backend_space_field(frontend_key: str) -> str:
result = SPACE_FIELDS_FRONTEND_TO_BACKEND_MAPPING.get(frontend_key)
if result is None:
raise KeyError(f"Invalid space field key: {frontend_key}")
return result
# -------------- List Expansions -----------------
# ListExpansions
LIST_EXPANSION_FRONTEND_TO_BACKEND_MAPPING = {"List_Owner_ID": "owner_id"}
def get_backend_list_expansion(frontend_key: str) -> str:
result = LIST_EXPANSION_FRONTEND_TO_BACKEND_MAPPING.get(frontend_key)
if result is None:
raise KeyError(f"Invalid list expansion key: {frontend_key}")
return result
LIST_FIELDS_FRONTEND_TO_BACKEND_MAPPING = {
"List_ID": "id",
"List_Name": "name",
"Creation_Date": "created_at",
"Description": "description",
"Follower_Count": "follower_count",
"Member_Count": "member_count",
"Is_Private": "private",
"Owner_ID": "owner_id",
}
def get_backend_list_field(frontend_key: str) -> str:
result = LIST_FIELDS_FRONTEND_TO_BACKEND_MAPPING.get(frontend_key)
if result is None:
raise KeyError(f"Invalid list field key: {frontend_key}")
return result
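
A short illustration (not part of the diff) of what these mappers do: they translate the human-readable filter names used in the block UI into the Twitter API's field names, and they fail loudly on unknown keys.

from backend.blocks.twitter._mappers import get_backend_expansion, get_backend_user_field

assert get_backend_expansion("Author_User_ID") == "author_id"
assert get_backend_user_field("Profile_Picture_URL") == "profile_image_url"

try:
    get_backend_expansion("Not_A_Real_Key")
except KeyError as e:
    print(e)  # 'Invalid expansion key: Not_A_Real_Key'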

View File

@@ -0,0 +1,76 @@
from typing import Any, Dict, List
class BaseSerializer:
@staticmethod
def _serialize_value(value: Any) -> Any:
"""Helper method to serialize individual values"""
if hasattr(value, "data"):
return value.data
return value
class IncludesSerializer(BaseSerializer):
@classmethod
def serialize(cls, includes: Dict[str, Any]) -> Dict[str, Any]:
"""Serializes the includes dictionary"""
if not includes:
return {}
serialized_includes = {}
for key, value in includes.items():
if isinstance(value, list):
serialized_includes[key] = [
cls._serialize_value(item) for item in value
]
else:
serialized_includes[key] = cls._serialize_value(value)
return serialized_includes
class ResponseDataSerializer(BaseSerializer):
@classmethod
def serialize_dict(cls, item: Dict[str, Any]) -> Dict[str, Any]:
"""Serializes a single dictionary item"""
serialized_item = {}
if hasattr(item, "__dict__"):
items = item.__dict__.items()
else:
items = item.items()
for key, value in items:
if isinstance(value, list):
serialized_item[key] = [
cls._serialize_value(sub_item) for sub_item in value
]
else:
serialized_item[key] = cls._serialize_value(value)
return serialized_item
@classmethod
def serialize_list(cls, data: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
"""Serializes a list of dictionary items"""
return [cls.serialize_dict(item) for item in data]
class ResponseSerializer:
@classmethod
def serialize(cls, response) -> Dict[str, Any]:
"""Main serializer that handles both data and includes"""
result = {"data": None, "included": {}}
# Handle response.data
if response.data:
if isinstance(response.data, list):
result["data"] = ResponseDataSerializer.serialize_list(response.data)
else:
result["data"] = ResponseDataSerializer.serialize_dict(response.data)
# Handle includes
if hasattr(response, "includes") and response.includes:
result["included"] = IncludesSerializer.serialize(response.includes)
return result
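
A minimal sketch (not part of the diff) of the serializer's output. Tweepy model objects expose their raw payload on a .data attribute, which _serialize_value unwraps; the SimpleNamespace stand-ins below are assumptions used purely so the snippet runs without a live API response.

from types import SimpleNamespace

from backend.blocks.twitter._serializer import ResponseSerializer

user = SimpleNamespace(data={"id": "2244994945", "username": "TwitterDev"})
response = SimpleNamespace(
    data=[{"id": "1", "text": "hello"}],   # plain dicts pass through unchanged
    includes={"users": [user]},            # objects with .data are unwrapped
)

print(ResponseSerializer.serialize(response))
# {'data': [{'id': '1', 'text': 'hello'}],
#  'included': {'users': [{'id': '2244994945', 'username': 'TwitterDev'}]}}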

View File

@@ -0,0 +1,443 @@
from datetime import datetime
from enum import Enum
from pydantic import BaseModel
from backend.data.block import BlockSchema
from backend.data.model import SchemaField
# -------------- Tweets -----------------
class TweetReplySettingsFilter(BaseModel):
Mentioned_Users_Only: bool = False
Following_Users_Only: bool = False
All_Users: bool = False
class TweetUserFieldsFilter(BaseModel):
Account_Creation_Date: bool = False
User_Bio: bool = False
User_Entities: bool = False
User_ID: bool = False
User_Location: bool = False
Latest_Tweet_ID: bool = False
Display_Name: bool = False
Pinned_Tweet_ID: bool = False
Profile_Picture_URL: bool = False
Is_Protected_Account: bool = False
Account_Statistics: bool = False
Profile_URL: bool = False
Username: bool = False
Is_Verified: bool = False
Verification_Type: bool = False
Content_Withholding_Info: bool = False
class TweetFieldsFilter(BaseModel):
Tweet_Attachments: bool = False
Author_ID: bool = False
Context_Annotations: bool = False
Conversation_ID: bool = False
Creation_Time: bool = False
Edit_Controls: bool = False
Tweet_Entities: bool = False
Geographic_Location: bool = False
Tweet_ID: bool = False
Reply_To_User_ID: bool = False
Language: bool = False
Public_Metrics: bool = False
Sensitive_Content_Flag: bool = False
Referenced_Tweets: bool = False
Reply_Settings: bool = False
Tweet_Source: bool = False
Tweet_Text: bool = False
Withheld_Content: bool = False
class PersonalTweetFieldsFilter(BaseModel):
attachments: bool = False
author_id: bool = False
context_annotations: bool = False
conversation_id: bool = False
created_at: bool = False
edit_controls: bool = False
entities: bool = False
geo: bool = False
id: bool = False
in_reply_to_user_id: bool = False
lang: bool = False
non_public_metrics: bool = False
public_metrics: bool = False
organic_metrics: bool = False
promoted_metrics: bool = False
possibly_sensitive: bool = False
referenced_tweets: bool = False
reply_settings: bool = False
source: bool = False
text: bool = False
withheld: bool = False
class TweetPollFieldsFilter(BaseModel):
Duration_Minutes: bool = False
End_DateTime: bool = False
Poll_ID: bool = False
Poll_Options: bool = False
Voting_Status: bool = False
class TweetPlaceFieldsFilter(BaseModel):
Contained_Within_Places: bool = False
Country: bool = False
Country_Code: bool = False
Full_Location_Name: bool = False
Geographic_Coordinates: bool = False
Place_ID: bool = False
Place_Name: bool = False
Place_Type: bool = False
class TweetMediaFieldsFilter(BaseModel):
Duration_in_Milliseconds: bool = False
Height: bool = False
Media_Key: bool = False
Preview_Image_URL: bool = False
Media_Type: bool = False
Media_URL: bool = False
Width: bool = False
Public_Metrics: bool = False
Non_Public_Metrics: bool = False
Organic_Metrics: bool = False
Promoted_Metrics: bool = False
Alternative_Text: bool = False
Media_Variants: bool = False
class ExpansionFilter(BaseModel):
Poll_IDs: bool = False
Media_Keys: bool = False
Author_User_ID: bool = False
Edit_History_Tweet_IDs: bool = False
Mentioned_Usernames: bool = False
Place_ID: bool = False
Reply_To_User_ID: bool = False
Referenced_Tweet_ID: bool = False
Referenced_Tweet_Author_ID: bool = False
class TweetExcludesFilter(BaseModel):
retweets: bool = False
replies: bool = False
# -------------- Users -----------------
class UserExpansionsFilter(BaseModel):
pinned_tweet_id: bool = False
# -------------- DMs -----------------
class DMEventFieldFilter(BaseModel):
id: bool = False
text: bool = False
event_type: bool = False
created_at: bool = False
dm_conversation_id: bool = False
sender_id: bool = False
participant_ids: bool = False
referenced_tweets: bool = False
attachments: bool = False
class DMEventTypeFilter(BaseModel):
MessageCreate: bool = False
ParticipantsJoin: bool = False
ParticipantsLeave: bool = False
class DMEventExpansionFilter(BaseModel):
attachments_media_keys: bool = False
referenced_tweets_id: bool = False
sender_id: bool = False
participant_ids: bool = False
class DMMediaFieldFilter(BaseModel):
duration_ms: bool = False
height: bool = False
media_key: bool = False
preview_image_url: bool = False
type: bool = False
url: bool = False
width: bool = False
public_metrics: bool = False
alt_text: bool = False
variants: bool = False
class DMTweetFieldFilter(BaseModel):
attachments: bool = False
author_id: bool = False
context_annotations: bool = False
conversation_id: bool = False
created_at: bool = False
edit_controls: bool = False
entities: bool = False
geo: bool = False
id: bool = False
in_reply_to_user_id: bool = False
lang: bool = False
public_metrics: bool = False
possibly_sensitive: bool = False
referenced_tweets: bool = False
reply_settings: bool = False
source: bool = False
text: bool = False
withheld: bool = False
# -------------- Spaces -----------------
class SpaceExpansionsFilter(BaseModel):
Invited_Users: bool = False
Speakers: bool = False
Creator: bool = False
Hosts: bool = False
Topics: bool = False
class SpaceFieldsFilter(BaseModel):
Space_ID: bool = False
Space_State: bool = False
Creation_Time: bool = False
End_Time: bool = False
Host_User_IDs: bool = False
Language: bool = False
Is_Ticketed: bool = False
Invited_User_IDs: bool = False
Participant_Count: bool = False
Subscriber_Count: bool = False
Scheduled_Start_Time: bool = False
Speaker_User_IDs: bool = False
Start_Time: bool = False
Space_Title: bool = False
Topic_IDs: bool = False
Last_Updated_Time: bool = False
class SpaceStatesFilter(str, Enum):
live = "live"
scheduled = "scheduled"
all = "all"
# -------------- List Expansions -----------------
class ListExpansionsFilter(BaseModel):
List_Owner_ID: bool = False
class ListFieldsFilter(BaseModel):
List_ID: bool = False
List_Name: bool = False
Creation_Date: bool = False
Description: bool = False
Follower_Count: bool = False
Member_Count: bool = False
Is_Private: bool = False
Owner_ID: bool = False
# --------- [Input Types] -------------
class TweetExpansionInputs(BlockSchema):
expansions: ExpansionFilter | None = SchemaField(
description="Choose what extra information you want to get with your tweets. For example:\n- Select 'Media_Keys' to get media details\n- Select 'Author_User_ID' to get user information\n- Select 'Place_ID' to get location details",
placeholder="Pick the extra information you want to see",
default=None,
advanced=True,
)
media_fields: TweetMediaFieldsFilter | None = SchemaField(
description="Select what media information you want to see (images, videos, etc). To use this, you must first select 'Media_Keys' in the expansions above.",
placeholder="Choose what media details you want to see",
default=None,
advanced=True,
)
place_fields: TweetPlaceFieldsFilter | None = SchemaField(
description="Select what location information you want to see (country, coordinates, etc). To use this, you must first select 'Place_ID' in the expansions above.",
placeholder="Choose what location details you want to see",
default=None,
advanced=True,
)
poll_fields: TweetPollFieldsFilter | None = SchemaField(
description="Select what poll information you want to see (options, voting status, etc). To use this, you must first select 'Poll_IDs' in the expansions above.",
placeholder="Choose what poll details you want to see",
default=None,
advanced=True,
)
tweet_fields: TweetFieldsFilter | None = SchemaField(
description="Select what tweet information you want to see. For referenced tweets (like retweets), select 'Referenced_Tweet_ID' in the expansions above.",
placeholder="Choose what tweet details you want to see",
default=None,
advanced=True,
)
user_fields: TweetUserFieldsFilter | None = SchemaField(
description="Select what user information you want to see. To use this, you must first select one of these in expansions above:\n- 'Author_User_ID' for tweet authors\n- 'Mentioned_Usernames' for mentioned users\n- 'Reply_To_User_ID' for users being replied to\n- 'Referenced_Tweet_Author_ID' for authors of referenced tweets",
placeholder="Choose what user details you want to see",
default=None,
advanced=True,
)
class DMEventExpansionInputs(BlockSchema):
expansions: DMEventExpansionFilter | None = SchemaField(
description="Select expansions to include related data objects in the 'includes' section.",
placeholder="Enter expansions",
default=None,
advanced=True,
)
event_types: DMEventTypeFilter | None = SchemaField(
description="Select DM event types to include in the response.",
placeholder="Enter event types",
default=None,
advanced=True,
)
media_fields: DMMediaFieldFilter | None = SchemaField(
description="Select media fields to include in the response (requires expansions=attachments.media_keys).",
placeholder="Enter media fields",
default=None,
advanced=True,
)
tweet_fields: DMTweetFieldFilter | None = SchemaField(
description="Select tweet fields to include in the response (requires expansions=referenced_tweets.id).",
placeholder="Enter tweet fields",
default=None,
advanced=True,
)
user_fields: TweetUserFieldsFilter | None = SchemaField(
description="Select user fields to include in the response (requires expansions=sender_id or participant_ids).",
placeholder="Enter user fields",
default=None,
advanced=True,
)
class UserExpansionInputs(BlockSchema):
expansions: UserExpansionsFilter | None = SchemaField(
description="Choose what extra information you want to get with user data. Currently only 'pinned_tweet_id' is available to see a user's pinned tweet.",
placeholder="Select extra user information to include",
default=None,
advanced=True,
)
tweet_fields: TweetFieldsFilter | None = SchemaField(
description="Select what tweet information you want to see in pinned tweets. This only works if you select 'pinned_tweet_id' in expansions above.",
placeholder="Choose what details to see in pinned tweets",
default=None,
advanced=True,
)
user_fields: TweetUserFieldsFilter | None = SchemaField(
description="Select what user information you want to see, like username, bio, profile picture, etc.",
placeholder="Choose what user details you want to see",
default=None,
advanced=True,
)
class SpaceExpansionInputs(BlockSchema):
expansions: SpaceExpansionsFilter | None = SchemaField(
description="Choose additional information you want to get with your Twitter Spaces:\n- Select 'Invited_Users' to see who was invited\n- Select 'Speakers' to see who can speak\n- Select 'Creator' to get details about who made the Space\n- Select 'Hosts' to see who's hosting\n- Select 'Topics' to see Space topics",
placeholder="Pick what extra information you want to see about the Space",
default=None,
advanced=True,
)
space_fields: SpaceFieldsFilter | None = SchemaField(
description="Choose what Space details you want to see, such as:\n- Title\n- Start/End times\n- Number of participants\n- Language\n- State (live/scheduled)\n- And more",
placeholder="Choose what Space information you want to get",
default=SpaceFieldsFilter(Space_Title=True, Host_User_IDs=True),
advanced=True,
)
user_fields: TweetUserFieldsFilter | None = SchemaField(
description="Choose what user information you want to see. This works when you select any of these in expansions above:\n- 'Creator' for Space creator details\n- 'Hosts' for host information\n- 'Speakers' for speaker details\n- 'Invited_Users' for invited user information",
placeholder="Pick what details you want to see about the users",
default=None,
advanced=True,
)
class ListExpansionInputs(BlockSchema):
expansions: ListExpansionsFilter | None = SchemaField(
description="Choose what extra information you want to get with your Twitter Lists:\n- Select 'List_Owner_ID' to get details about who owns the list\n\nThis will let you see more details about the list owner when you also select user fields below.",
placeholder="Pick what extra list information you want to see",
default=ListExpansionsFilter(List_Owner_ID=True),
advanced=True,
)
user_fields: TweetUserFieldsFilter | None = SchemaField(
description="Choose what information you want to see about list owners. This only works when you select 'List_Owner_ID' in expansions above.\n\nYou can see things like:\n- Their username\n- Profile picture\n- Account details\n- And more",
placeholder="Select what details you want to see about list owners",
default=TweetUserFieldsFilter(User_ID=True, Username=True),
advanced=True,
)
list_fields: ListFieldsFilter | None = SchemaField(
description="Choose what information you want to see about the Twitter Lists themselves, such as:\n- List name\n- Description\n- Number of followers\n- Number of members\n- Whether it's private\n- Creation date\n- And more",
placeholder="Pick what list details you want to see",
default=ListFieldsFilter(Owner_ID=True),
advanced=True,
)
class TweetTimeWindowInputs(BlockSchema):
start_time: datetime | None = SchemaField(
description="Start time in YYYY-MM-DDTHH:mm:ssZ format",
placeholder="Enter start time",
default=None,
advanced=False,
)
end_time: datetime | None = SchemaField(
description="End time in YYYY-MM-DDTHH:mm:ssZ format",
placeholder="Enter end time",
default=None,
advanced=False,
)
since_id: str | None = SchemaField(
description="Returns results with Tweet ID greater than this (more recent than), we give priority to since_id over start_time",
placeholder="Enter since ID",
default=None,
advanced=True,
)
until_id: str | None = SchemaField(
description="Returns results with Tweet ID less than this (that is, older than), and used with since_id",
placeholder="Enter until ID",
default=None,
advanced=True,
)
sort_order: str | None = SchemaField(
description="Order of returned tweets (recency or relevancy)",
placeholder="Enter sort order",
default=None,
advanced=True,
)
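
For illustration (not part of the diff): roughly what the defaults declared on ListExpansionInputs above become once a block pushes them through ListExpansionsBuilder from _builders.py. The starting params dict is an assumption for this sketch, mirroring the one assembled in TwitterGetListBlock further below.

from backend.blocks.twitter._builders import ListExpansionsBuilder
from backend.blocks.twitter._types import (
    ListExpansionsFilter,
    ListFieldsFilter,
    TweetUserFieldsFilter,
)

params = (
    ListExpansionsBuilder({"id": "84839422", "user_auth": False})
    .add_expansions(ListExpansionsFilter(List_Owner_ID=True))
    .add_user_fields(TweetUserFieldsFilter(User_ID=True, Username=True))
    .add_list_fields(ListFieldsFilter(Owner_ID=True))
    .build()
)

print(params)
# {'id': '84839422', 'user_auth': False, 'expansions': 'owner_id',
#  'user.fields': 'id,username', 'list.fields': 'owner_id'}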

View File

@@ -0,0 +1,201 @@
# TODO: Add new type support
# from typing import cast
# import tweepy
# from tweepy.client import Response
# from backend.blocks.twitter._serializer import IncludesSerializer, ResponseDataSerializer
# from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
# from backend.data.model import SchemaField
# from backend.blocks.twitter._builders import DMExpansionsBuilder
# from backend.blocks.twitter._types import DMEventExpansion, DMEventExpansionInputs, DMEventType, DMMediaField, DMTweetField, TweetUserFields
# from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
# from backend.blocks.twitter._auth import (
# TEST_CREDENTIALS,
# TEST_CREDENTIALS_INPUT,
# TwitterCredentials,
# TwitterCredentialsField,
# TwitterCredentialsInput,
# )
# Requires a Pro or Enterprise plan [Manual Testing Required]
# class TwitterGetDMEventsBlock(Block):
# """
# Gets a list of Direct Message events for the authenticated user
# """
# class Input(DMEventExpansionInputs):
# credentials: TwitterCredentialsInput = TwitterCredentialsField(
# ["dm.read", "offline.access", "user.read", "tweet.read"]
# )
# dm_conversation_id: str = SchemaField(
# description="The ID of the Direct Message conversation",
# placeholder="Enter conversation ID",
# required=True
# )
# max_results: int = SchemaField(
# description="Maximum number of results to return (1-100)",
# placeholder="Enter max results",
# advanced=True,
# default=10,
# )
# pagination_token: str = SchemaField(
# description="Token for pagination",
# placeholder="Enter pagination token",
# advanced=True,
# default=""
# )
# class Output(BlockSchema):
# # Common outputs
# event_ids: list[str] = SchemaField(description="DM Event IDs")
# event_texts: list[str] = SchemaField(description="DM Event text contents")
# event_types: list[str] = SchemaField(description="Types of DM events")
# next_token: str = SchemaField(description="Token for next page of results")
# # Complete outputs
# data: list[dict] = SchemaField(description="Complete DM events data")
# included: dict = SchemaField(description="Additional data requested via expansions")
# meta: dict = SchemaField(description="Metadata about the response")
# error: str = SchemaField(description="Error message if request failed")
# def __init__(self):
# super().__init__(
# id="dc37a6d4-a62e-11ef-a3a5-03061375737b",
# description="This block retrieves Direct Message events for the authenticated user.",
# categories={BlockCategory.SOCIAL},
# input_schema=TwitterGetDMEventsBlock.Input,
# output_schema=TwitterGetDMEventsBlock.Output,
# test_input={
# "dm_conversation_id": "1234567890",
# "max_results": 10,
# "credentials": TEST_CREDENTIALS_INPUT,
# "expansions": [],
# "event_types": [],
# "media_fields": [],
# "tweet_fields": [],
# "user_fields": []
# },
# test_credentials=TEST_CREDENTIALS,
# test_output=[
# ("event_ids", ["1346889436626259968"]),
# ("event_texts", ["Hello just you..."]),
# ("event_types", ["MessageCreate"]),
# ("next_token", None),
# ("data", [{"id": "1346889436626259968", "text": "Hello just you...", "event_type": "MessageCreate"}]),
# ("included", {}),
# ("meta", {}),
# ("error", "")
# ],
# test_mock={
# "get_dm_events": lambda *args, **kwargs: (
# [{"id": "1346889436626259968", "text": "Hello just you...", "event_type": "MessageCreate"}],
# {},
# {},
# ["1346889436626259968"],
# ["Hello just you..."],
# ["MessageCreate"],
# None
# )
# }
# )
# @staticmethod
# def get_dm_events(
# credentials: TwitterCredentials,
# dm_conversation_id: str,
# max_results: int,
# pagination_token: str,
# expansions: list[DMEventExpansion],
# event_types: list[DMEventType],
# media_fields: list[DMMediaField],
# tweet_fields: list[DMTweetField],
# user_fields: list[TweetUserFields]
# ):
# try:
# client = tweepy.Client(
# bearer_token=credentials.access_token.get_secret_value()
# )
# params = {
# "dm_conversation_id": dm_conversation_id,
# "max_results": max_results,
# "pagination_token": None if pagination_token == "" else pagination_token,
# "user_auth": False
# }
# params = (DMExpansionsBuilder(params)
# .add_expansions(expansions)
# .add_event_types(event_types)
# .add_media_fields(media_fields)
# .add_tweet_fields(tweet_fields)
# .add_user_fields(user_fields)
# .build())
# response = cast(Response, client.get_direct_message_events(**params))
# meta = {}
# event_ids = []
# event_texts = []
# event_types = []
# next_token = None
# if response.meta:
# meta = response.meta
# next_token = meta.get("next_token")
# included = IncludesSerializer.serialize(response.includes)
# data = ResponseDataSerializer.serialize_list(response.data)
# if response.data:
# event_ids = [str(item.id) for item in response.data]
# event_texts = [item.text if hasattr(item, "text") else None for item in response.data]
# event_types = [item.event_type for item in response.data]
# return data, included, meta, event_ids, event_texts, event_types, next_token
# raise Exception("No DM events found")
# except tweepy.TweepyException:
# raise
# def run(
# self,
# input_data: Input,
# *,
# credentials: TwitterCredentials,
# **kwargs,
# ) -> BlockOutput:
# try:
# event_data, included, meta, event_ids, event_texts, event_types, next_token = self.get_dm_events(
# credentials,
# input_data.dm_conversation_id,
# input_data.max_results,
# input_data.pagination_token,
# input_data.expansions,
# input_data.event_types,
# input_data.media_fields,
# input_data.tweet_fields,
# input_data.user_fields
# )
# if event_ids:
# yield "event_ids", event_ids
# if event_texts:
# yield "event_texts", event_texts
# if event_types:
# yield "event_types", event_types
# if next_token:
# yield "next_token", next_token
# if event_data:
# yield "data", event_data
# if included:
# yield "included", included
# if meta:
# yield "meta", meta
# except Exception as e:
# yield "error", handle_tweepy_exception(e)

View File

@@ -0,0 +1,260 @@
# TODO: Add new type support
# from typing import cast
# import tweepy
# from tweepy.client import Response
# from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
# from backend.data.model import SchemaField
# from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
# from backend.blocks.twitter._auth import (
# TEST_CREDENTIALS,
# TEST_CREDENTIALS_INPUT,
# TwitterCredentials,
# TwitterCredentialsField,
# TwitterCredentialsInput,
# )
# Requires a Pro or Enterprise plan [Manual Testing Required]
# class TwitterSendDirectMessageBlock(Block):
# """
# Sends a direct message to a Twitter user
# """
# class Input(BlockSchema):
# credentials: TwitterCredentialsInput = TwitterCredentialsField(
# ["offline.access", "direct_messages.write"]
# )
# participant_id: str = SchemaField(
# description="The User ID of the account to send DM to",
# placeholder="Enter recipient user ID",
# default="",
# advanced=False
# )
# dm_conversation_id: str = SchemaField(
# description="The conversation ID to send message to",
# placeholder="Enter conversation ID",
# default="",
# advanced=False
# )
# text: str = SchemaField(
# description="Text of the Direct Message (up to 10,000 characters)",
# placeholder="Enter message text",
# default="",
# advanced=False
# )
# media_id: str = SchemaField(
# description="Media ID to attach to the message",
# placeholder="Enter media ID",
# default=""
# )
# class Output(BlockSchema):
# dm_event_id: str = SchemaField(description="ID of the sent direct message")
# dm_conversation_id_: str = SchemaField(description="ID of the conversation")
# error: str = SchemaField(description="Error message if sending failed")
# def __init__(self):
# super().__init__(
# id="f32f2786-a62e-11ef-a93d-a3ef199dde7f",
# description="This block sends a direct message to a specified Twitter user.",
# categories={BlockCategory.SOCIAL},
# input_schema=TwitterSendDirectMessageBlock.Input,
# output_schema=TwitterSendDirectMessageBlock.Output,
# test_input={
# "participant_id": "783214",
# "dm_conversation_id": "",
# "text": "Hello from Twitter API",
# "media_id": "",
# "credentials": TEST_CREDENTIALS_INPUT
# },
# test_credentials=TEST_CREDENTIALS,
# test_output=[
# ("dm_event_id", "0987654321"),
# ("dm_conversation_id_", "1234567890"),
# ("error", "")
# ],
# test_mock={
# "send_direct_message": lambda *args, **kwargs: (
# "0987654321",
# "1234567890"
# )
# },
# )
# @staticmethod
# def send_direct_message(
# credentials: TwitterCredentials,
# participant_id: str,
# dm_conversation_id: str,
# text: str,
# media_id: str
# ):
# try:
# client = tweepy.Client(
# bearer_token=credentials.access_token.get_secret_value()
# )
# response = cast(
# Response,
# client.create_direct_message(
# participant_id=None if participant_id == "" else participant_id,
# dm_conversation_id=None if dm_conversation_id == "" else dm_conversation_id,
# text=None if text == "" else text,
# media_id=None if media_id == "" else media_id,
# user_auth=False
# )
# )
# if not response.data:
# raise Exception("Failed to send direct message")
# return response.data["dm_event_id"], response.data["dm_conversation_id"]
# except tweepy.TweepyException:
# raise
# except Exception as e:
# print(f"Unexpected error: {str(e)}")
# raise
# def run(
# self,
# input_data: Input,
# *,
# credentials: TwitterCredentials,
# **kwargs,
# ) -> BlockOutput:
# try:
# dm_event_id, dm_conversation_id = self.send_direct_message(
# credentials,
# input_data.participant_id,
# input_data.dm_conversation_id,
# input_data.text,
# input_data.media_id
# )
# yield "dm_event_id", dm_event_id
# yield "dm_conversation_id", dm_conversation_id
# except Exception as e:
# yield "error", handle_tweepy_exception(e)
# class TwitterCreateDMConversationBlock(Block):
# """
# Creates a new group direct message conversation on Twitter
# """
# class Input(BlockSchema):
# credentials: TwitterCredentialsInput = TwitterCredentialsField(
# ["offline.access", "dm.write","dm.read","tweet.read","user.read"]
# )
# participant_ids: list[str] = SchemaField(
# description="Array of User IDs to create conversation with (max 50)",
# placeholder="Enter participant user IDs",
# default=[],
# advanced=False
# )
# text: str = SchemaField(
# description="Text of the Direct Message (up to 10,000 characters)",
# placeholder="Enter message text",
# default="",
# advanced=False
# )
# media_id: str = SchemaField(
# description="Media ID to attach to the message",
# placeholder="Enter media ID",
# default="",
# advanced=False
# )
# class Output(BlockSchema):
# dm_event_id: str = SchemaField(description="ID of the sent direct message")
# dm_conversation_id: str = SchemaField(description="ID of the conversation")
# error: str = SchemaField(description="Error message if sending failed")
# def __init__(self):
# super().__init__(
# id="ec11cabc-a62e-11ef-8c0e-3fe37ba2ec92",
# description="This block creates a new group DM conversation with specified Twitter users.",
# categories={BlockCategory.SOCIAL},
# input_schema=TwitterCreateDMConversationBlock.Input,
# output_schema=TwitterCreateDMConversationBlock.Output,
# test_input={
# "participant_ids": ["783214", "2244994945"],
# "text": "Hello from Twitter API",
# "media_id": "",
# "credentials": TEST_CREDENTIALS_INPUT
# },
# test_credentials=TEST_CREDENTIALS,
# test_output=[
# ("dm_event_id", "0987654321"),
# ("dm_conversation_id", "1234567890"),
# ("error", "")
# ],
# test_mock={
# "create_dm_conversation": lambda *args, **kwargs: (
# "0987654321",
# "1234567890"
# )
# },
# )
# @staticmethod
# def create_dm_conversation(
# credentials: TwitterCredentials,
# participant_ids: list[str],
# text: str,
# media_id: str
# ):
# try:
# client = tweepy.Client(
# bearer_token=credentials.access_token.get_secret_value()
# )
# response = cast(
# Response,
# client.create_direct_message_conversation(
# participant_ids=participant_ids,
# text=None if text == "" else text,
# media_id=None if media_id == "" else media_id,
# user_auth=False
# )
# )
# if not response.data:
# raise Exception("Failed to create DM conversation")
# return response.data["dm_event_id"], response.data["dm_conversation_id"]
# except tweepy.TweepyException:
# raise
# except Exception as e:
# print(f"Unexpected error: {str(e)}")
# raise
# def run(
# self,
# input_data: Input,
# *,
# credentials: TwitterCredentials,
# **kwargs,
# ) -> BlockOutput:
# try:
# dm_event_id, dm_conversation_id = self.create_dm_conversation(
# credentials,
# input_data.participant_ids,
# input_data.text,
# input_data.media_id
# )
# yield "dm_event_id", dm_event_id
# yield "dm_conversation_id", dm_conversation_id
# except Exception as e:
# yield "error", handle_tweepy_exception(e)

View File

@@ -0,0 +1,470 @@
# from typing import cast
import tweepy
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
# from backend.blocks.twitter._builders import UserExpansionsBuilder
# from backend.blocks.twitter._types import TweetFields, TweetUserFields, UserExpansionInputs, UserExpansions
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
# from tweepy.client import Response
class TwitterUnfollowListBlock(Block):
"""
Unfollows a Twitter list for the authenticated user
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["follows.write", "offline.access"]
)
list_id: str = SchemaField(
description="The ID of the List to unfollow",
placeholder="Enter list ID",
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the unfollow was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="1f43310a-a62f-11ef-8276-2b06a1bbae1a",
description="This block unfollows a specified Twitter list for the authenticated user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterUnfollowListBlock.Input,
output_schema=TwitterUnfollowListBlock.Output,
test_input={"list_id": "123456789", "credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"unfollow_list": lambda *args, **kwargs: True},
)
@staticmethod
def unfollow_list(credentials: TwitterCredentials, list_id: str):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.unfollow_list(list_id=list_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.unfollow_list(credentials, input_data.list_id)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterFollowListBlock(Block):
"""
Follows a Twitter list for the authenticated user
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "list.write", "offline.access"]
)
list_id: str = SchemaField(
description="The ID of the List to follow",
placeholder="Enter list ID",
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the follow was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="03d8acf6-a62f-11ef-b17f-b72b04a09e79",
description="This block follows a specified Twitter list for the authenticated user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterFollowListBlock.Input,
output_schema=TwitterFollowListBlock.Output,
test_input={"list_id": "123456789", "credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"follow_list": lambda *args, **kwargs: True},
)
@staticmethod
def follow_list(credentials: TwitterCredentials, list_id: str):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.follow_list(list_id=list_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.follow_list(credentials, input_data.list_id)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
# Enterprise level [manual testing required]; there is a high chance this call will return an error
# Needs typed input support
# class TwitterListGetFollowersBlock(Block):
# """
# Gets followers of a specified Twitter list
# """
# class Input(UserExpansionInputs):
# credentials: TwitterCredentialsInput = TwitterCredentialsField(
# ["tweet.read","users.read", "list.read", "offline.access"]
# )
# list_id: str = SchemaField(
# description="The ID of the List to get followers for",
# placeholder="Enter list ID",
# required=True
# )
# max_results: int = SchemaField(
# description="Max number of results per page (1-100)",
# placeholder="Enter max results",
# default=10,
# advanced=True,
# )
# pagination_token: str = SchemaField(
# description="Token for pagination",
# placeholder="Enter pagination token",
# default="",
# advanced=True,
# )
# class Output(BlockSchema):
# user_ids: list[str] = SchemaField(description="List of user IDs of followers")
# usernames: list[str] = SchemaField(description="List of usernames of followers")
# next_token: str = SchemaField(description="Token for next page of results")
# data: list[dict] = SchemaField(description="Complete follower data")
# included: dict = SchemaField(description="Additional data requested via expansions")
# meta: dict = SchemaField(description="Metadata about the response")
# error: str = SchemaField(description="Error message if the request failed")
# def __init__(self):
# super().__init__(
# id="16b289b4-a62f-11ef-95d4-bb29b849eb99",
# description="This block retrieves followers of a specified Twitter list.",
# categories={BlockCategory.SOCIAL},
# input_schema=TwitterListGetFollowersBlock.Input,
# output_schema=TwitterListGetFollowersBlock.Output,
# test_input={
# "list_id": "123456789",
# "max_results": 10,
# "pagination_token": None,
# "credentials": TEST_CREDENTIALS_INPUT,
# "expansions": [],
# "tweet_fields": [],
# "user_fields": []
# },
# test_credentials=TEST_CREDENTIALS,
# test_output=[
# ("user_ids", ["2244994945"]),
# ("usernames", ["testuser"]),
# ("next_token", None),
# ("data", {"followers": [{"id": "2244994945", "username": "testuser"}]}),
# ("included", {}),
# ("meta", {}),
# ("error", "")
# ],
# test_mock={
# "get_list_followers": lambda *args, **kwargs: ({
# "followers": [{"id": "2244994945", "username": "testuser"}]
# }, {}, {}, ["2244994945"], ["testuser"], None)
# }
# )
# @staticmethod
# def get_list_followers(
# credentials: TwitterCredentials,
# list_id: str,
# max_results: int,
# pagination_token: str,
# expansions: list[UserExpansions],
# tweet_fields: list[TweetFields],
# user_fields: list[TweetUserFields]
# ):
# try:
# client = tweepy.Client(
# bearer_token=credentials.access_token.get_secret_value(),
# )
# params = {
# "id": list_id,
# "max_results": max_results,
# "pagination_token": None if pagination_token == "" else pagination_token,
# "user_auth": False
# }
# params = (UserExpansionsBuilder(params)
# .add_expansions(expansions)
# .add_tweet_fields(tweet_fields)
# .add_user_fields(user_fields)
# .build())
# response = cast(
# Response,
# client.get_list_followers(**params)
# )
# meta = {}
# user_ids = []
# usernames = []
# next_token = None
# if response.meta:
# meta = response.meta
# next_token = meta.get("next_token")
# included = IncludesSerializer.serialize(response.includes)
# data = ResponseDataSerializer.serialize_list(response.data)
# if response.data:
# user_ids = [str(item.id) for item in response.data]
# usernames = [item.username for item in response.data]
# return data, included, meta, user_ids, usernames, next_token
# raise Exception("No followers found")
# except tweepy.TweepyException:
# raise
# def run(
# self,
# input_data: Input,
# *,
# credentials: TwitterCredentials,
# **kwargs,
# ) -> BlockOutput:
# try:
# followers_data, included, meta, user_ids, usernames, next_token = self.get_list_followers(
# credentials,
# input_data.list_id,
# input_data.max_results,
# input_data.pagination_token,
# input_data.expansions,
# input_data.tweet_fields,
# input_data.user_fields
# )
# if user_ids:
# yield "user_ids", user_ids
# if usernames:
# yield "usernames", usernames
# if next_token:
# yield "next_token", next_token
# if followers_data:
# yield "data", followers_data
# if included:
# yield "included", included
# if meta:
# yield "meta", meta
# except Exception as e:
# yield "error", handle_tweepy_exception(e)
# class TwitterGetFollowedListsBlock(Block):
# """
# Gets lists followed by a specified Twitter user
# """
# class Input(UserExpansionInputs):
# credentials: TwitterCredentialsInput = TwitterCredentialsField(
# ["follows.read", "users.read", "list.read", "offline.access"]
# )
# user_id: str = SchemaField(
# description="The user ID whose followed Lists to retrieve",
# placeholder="Enter user ID",
# required=True
# )
# max_results: int = SchemaField(
# description="Max number of results per page (1-100)",
# placeholder="Enter max results",
# default=10,
# advanced=True,
# )
# pagination_token: str = SchemaField(
# description="Token for pagination",
# placeholder="Enter pagination token",
# default="",
# advanced=True,
# )
# class Output(BlockSchema):
# list_ids: list[str] = SchemaField(description="List of list IDs")
# list_names: list[str] = SchemaField(description="List of list names")
# data: list[dict] = SchemaField(description="Complete list data")
# includes: dict = SchemaField(description="Additional data requested via expansions")
# meta: dict = SchemaField(description="Metadata about the response")
# next_token: str = SchemaField(description="Token for next page of results")
# error: str = SchemaField(description="Error message if the request failed")
# def __init__(self):
# super().__init__(
# id="0e18bbfc-a62f-11ef-94fa-1f1e174b809e",
# description="This block retrieves all Lists a specified user follows.",
# categories={BlockCategory.SOCIAL},
# input_schema=TwitterGetFollowedListsBlock.Input,
# output_schema=TwitterGetFollowedListsBlock.Output,
# test_input={
# "user_id": "123456789",
# "max_results": 10,
# "pagination_token": None,
# "credentials": TEST_CREDENTIALS_INPUT,
# "expansions": [],
# "tweet_fields": [],
# "user_fields": []
# },
# test_credentials=TEST_CREDENTIALS,
# test_output=[
# ("list_ids", ["12345"]),
# ("list_names", ["Test List"]),
# ("data", {"followed_lists": [{"id": "12345", "name": "Test List"}]}),
# ("includes", {}),
# ("meta", {}),
# ("next_token", None),
# ("error", "")
# ],
# test_mock={
# "get_followed_lists": lambda *args, **kwargs: ({
# "followed_lists": [{"id": "12345", "name": "Test List"}]
# }, {}, {}, ["12345"], ["Test List"], None)
# }
# )
# @staticmethod
# def get_followed_lists(
# credentials: TwitterCredentials,
# user_id: str,
# max_results: int,
# pagination_token: str,
# expansions: list[UserExpansions],
# tweet_fields: list[TweetFields],
# user_fields: list[TweetUserFields]
# ):
# try:
# client = tweepy.Client(
# bearer_token=credentials.access_token.get_secret_value(),
# )
# params = {
# "id": user_id,
# "max_results": max_results,
# "pagination_token": None if pagination_token == "" else pagination_token,
# "user_auth": False
# }
# params = (UserExpansionsBuilder(params)
# .add_expansions(expansions)
# .add_tweet_fields(tweet_fields)
# .add_user_fields(user_fields)
# .build())
# response = cast(
# Response,
# client.get_followed_lists(**params)
# )
# meta = {}
# list_ids = []
# list_names = []
# next_token = None
# if response.meta:
# meta = response.meta
# next_token = meta.get("next_token")
# included = IncludesSerializer.serialize(response.includes)
# data = ResponseDataSerializer.serialize_list(response.data)
# if response.data:
# list_ids = [str(item.id) for item in response.data]
# list_names = [item.name for item in response.data]
# return data, included, meta, list_ids, list_names, next_token
# raise Exception("No followed lists found")
# except tweepy.TweepyException:
# raise
# def run(
# self,
# input_data: Input,
# *,
# credentials: TwitterCredentials,
# **kwargs,
# ) -> BlockOutput:
# try:
# lists_data, included, meta, list_ids, list_names, next_token = self.get_followed_lists(
# credentials,
# input_data.user_id,
# input_data.max_results,
# input_data.pagination_token,
# input_data.expansions,
# input_data.tweet_fields,
# input_data.user_fields
# )
# if list_ids:
# yield "list_ids", list_ids
# if list_names:
# yield "list_names", list_names
# if next_token:
# yield "next_token", next_token
# if lists_data:
# yield "data", lists_data
# if included:
# yield "includes", included
# if meta:
# yield "meta", meta
# except Exception as e:
# yield "error", handle_tweepy_exception(e)

View File

@@ -0,0 +1,348 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import ListExpansionsBuilder
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
ListExpansionInputs,
ListExpansionsFilter,
ListFieldsFilter,
TweetUserFieldsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterGetListBlock(Block):
"""
Gets information about a Twitter List specified by ID
"""
class Input(ListExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "offline.access"]
)
list_id: str = SchemaField(
description="The ID of the List to lookup",
placeholder="Enter list ID",
required=True,
)
class Output(BlockSchema):
# Common outputs
id: str = SchemaField(description="ID of the Twitter List")
name: str = SchemaField(description="Name of the Twitter List")
owner_id: str = SchemaField(description="ID of the List owner")
owner_username: str = SchemaField(description="Username of the List owner")
# Complete outputs
data: dict = SchemaField(description="Complete list data")
included: dict = SchemaField(
description="Additional data requested via expansions"
)
meta: dict = SchemaField(description="Metadata about the response")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="34ebc80a-a62f-11ef-9c2a-3fcab6c07079",
description="This block retrieves information about a specified Twitter List.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetListBlock.Input,
output_schema=TwitterGetListBlock.Output,
test_input={
"list_id": "84839422",
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"list_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("id", "84839422"),
("name", "Official Twitter Accounts"),
("owner_id", "2244994945"),
("owner_username", "TwitterAPI"),
("data", {"id": "84839422", "name": "Official Twitter Accounts"}),
],
test_mock={
"get_list": lambda *args, **kwargs: (
{"id": "84839422", "name": "Official Twitter Accounts"},
{},
{},
"2244994945",
"TwitterAPI",
)
},
)
@staticmethod
def get_list(
credentials: TwitterCredentials,
list_id: str,
expansions: ListExpansionsFilter | None,
user_fields: TweetUserFieldsFilter | None,
list_fields: ListFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {"id": list_id, "user_auth": False}
params = (
ListExpansionsBuilder(params)
.add_expansions(expansions)
.add_user_fields(user_fields)
.add_list_fields(list_fields)
.build()
)
response = cast(Response, client.get_list(**params))
meta = {}
owner_id = ""
owner_username = ""
included = {}
if response.includes:
included = IncludesSerializer.serialize(response.includes)
if "users" in included:
owner_id = str(included["users"][0]["id"])
owner_username = included["users"][0]["username"]
if response.meta:
meta = response.meta
if response.data:
data_dict = ResponseDataSerializer.serialize_dict(response.data)
return data_dict, included, meta, owner_id, owner_username
raise Exception("List not found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
list_data, included, meta, owner_id, owner_username = self.get_list(
credentials,
input_data.list_id,
input_data.expansions,
input_data.user_fields,
input_data.list_fields,
)
yield "id", str(list_data["id"])
yield "name", list_data["name"]
if owner_id:
yield "owner_id", owner_id
if owner_username:
yield "owner_username", owner_username
yield "data", {"id": list_data["id"], "name": list_data["name"]}
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetOwnedListsBlock(Block):
"""
Gets all Lists owned by the specified user
"""
class Input(ListExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "list.read", "offline.access"]
)
user_id: str = SchemaField(
description="The user ID whose owned Lists to retrieve",
placeholder="Enter user ID",
required=True,
)
max_results: int | None = SchemaField(
description="Maximum number of results per page (1-100)",
placeholder="Enter max results (default 100)",
advanced=True,
default=10,
)
pagination_token: str | None = SchemaField(
description="Token for pagination",
placeholder="Enter pagination token",
advanced=True,
default="",
)
class Output(BlockSchema):
# Common outputs
list_ids: list[str] = SchemaField(description="IDs of the owned Lists")
list_names: list[str] = SchemaField(description="Names of the owned Lists")
next_token: str = SchemaField(description="Token for next page of results")
# Complete outputs
data: list[dict] = SchemaField(description="Complete owned lists data")
included: dict = SchemaField(
description="Additional data requested via expansions"
)
meta: dict = SchemaField(description="Metadata about the response")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="2b6bdb26-a62f-11ef-a9ce-ff89c2568726",
description="This block retrieves all Lists owned by a specified Twitter user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetOwnedListsBlock.Input,
output_schema=TwitterGetOwnedListsBlock.Output,
test_input={
"user_id": "2244994945",
"max_results": 10,
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"list_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("list_ids", ["84839422"]),
("list_names", ["Official Twitter Accounts"]),
("data", [{"id": "84839422", "name": "Official Twitter Accounts"}]),
],
test_mock={
"get_owned_lists": lambda *args, **kwargs: (
[{"id": "84839422", "name": "Official Twitter Accounts"}],
{},
{},
["84839422"],
["Official Twitter Accounts"],
None,
)
},
)
@staticmethod
def get_owned_lists(
credentials: TwitterCredentials,
user_id: str,
max_results: int | None,
pagination_token: str | None,
expansions: ListExpansionsFilter | None,
user_fields: TweetUserFieldsFilter | None,
list_fields: ListFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": user_id,
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
"user_auth": False,
}
params = (
ListExpansionsBuilder(params)
.add_expansions(expansions)
.add_user_fields(user_fields)
.add_list_fields(list_fields)
.build()
)
response = cast(Response, client.get_owned_lists(**params))
meta = {}
included = {}
list_ids = []
list_names = []
next_token = None
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
if response.includes:
included = IncludesSerializer.serialize(response.includes)
if response.data:
data = ResponseDataSerializer.serialize_list(response.data)
list_ids = [
str(item.id) for item in response.data if hasattr(item, "id")
]
list_names = [
item.name for item in response.data if hasattr(item, "name")
]
return data, included, meta, list_ids, list_names, next_token
raise Exception("User have no owned list")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
list_data, included, meta, list_ids, list_names, next_token = (
self.get_owned_lists(
credentials,
input_data.user_id,
input_data.max_results,
input_data.pagination_token,
input_data.expansions,
input_data.user_fields,
input_data.list_fields,
)
)
if list_ids:
yield "list_ids", list_ids
if list_names:
yield "list_names", list_names
if next_token:
yield "next_token", next_token
if list_data:
yield "data", list_data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)


@@ -0,0 +1,527 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import (
ListExpansionsBuilder,
UserExpansionsBuilder,
)
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
ListExpansionInputs,
ListExpansionsFilter,
ListFieldsFilter,
TweetFieldsFilter,
TweetUserFieldsFilter,
UserExpansionInputs,
UserExpansionsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterRemoveListMemberBlock(Block):
"""
Removes a member from a Twitter List that the authenticated user owns
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["list.write", "users.read", "tweet.read", "offline.access"]
)
list_id: str = SchemaField(
description="The ID of the List to remove the member from",
placeholder="Enter list ID",
required=True,
)
user_id: str = SchemaField(
description="The ID of the user to remove from the List",
placeholder="Enter user ID to remove",
required=True,
)
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether the member was successfully removed"
)
error: str = SchemaField(description="Error message if the removal failed")
def __init__(self):
super().__init__(
id="5a3d1320-a62f-11ef-b7ce-a79e7656bcb0",
description="This block removes a specified user from a Twitter List owned by the authenticated user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterRemoveListMemberBlock.Input,
output_schema=TwitterRemoveListMemberBlock.Output,
test_input={
"list_id": "123456789",
"user_id": "987654321",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"remove_list_member": lambda *args, **kwargs: True},
)
@staticmethod
def remove_list_member(credentials: TwitterCredentials, list_id: str, user_id: str):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.remove_list_member(id=list_id, user_id=user_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
except Exception:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.remove_list_member(
credentials, input_data.list_id, input_data.user_id
)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterAddListMemberBlock(Block):
"""
Adds a member to a Twitter List that the authenticated user owns
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["list.write", "users.read", "tweet.read", "offline.access"]
)
list_id: str = SchemaField(
description="The ID of the List to add the member to",
placeholder="Enter list ID",
required=True,
)
user_id: str = SchemaField(
description="The ID of the user to add to the List",
placeholder="Enter user ID to add",
required=True,
)
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether the member was successfully added"
)
error: str = SchemaField(description="Error message if the addition failed")
def __init__(self):
super().__init__(
id="3ee8284e-a62f-11ef-84e4-8f6e2cbf0ddb",
description="This block adds a specified user to a Twitter List owned by the authenticated user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterAddListMemberBlock.Input,
output_schema=TwitterAddListMemberBlock.Output,
test_input={
"list_id": "123456789",
"user_id": "987654321",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"add_list_member": lambda *args, **kwargs: True},
)
@staticmethod
def add_list_member(credentials: TwitterCredentials, list_id: str, user_id: str):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.add_list_member(id=list_id, user_id=user_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
except Exception:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.add_list_member(
credentials, input_data.list_id, input_data.user_id
)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetListMembersBlock(Block):
"""
Gets the members of a specified Twitter List
"""
class Input(UserExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["list.read", "offline.access"]
)
list_id: str = SchemaField(
description="The ID of the List to get members from",
placeholder="Enter list ID",
required=True,
)
max_results: int | None = SchemaField(
description="Maximum number of results per page (1-100)",
placeholder="Enter max results",
default=10,
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token for pagination of results",
placeholder="Enter pagination token",
default="",
advanced=True,
)
class Output(BlockSchema):
ids: list[str] = SchemaField(description="List of member user IDs")
usernames: list[str] = SchemaField(description="List of member usernames")
next_token: str = SchemaField(description="Next token for pagination")
data: list[dict] = SchemaField(
description="Complete user data for list members"
)
included: dict = SchemaField(
description="Additional data requested via expansions"
)
meta: dict = SchemaField(description="Metadata including pagination info")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="4dba046e-a62f-11ef-b69a-87240c84b4c7",
description="This block retrieves the members of a specified Twitter List.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetListMembersBlock.Input,
output_schema=TwitterGetListMembersBlock.Output,
test_input={
"list_id": "123456789",
"max_results": 2,
"pagination_token": None,
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["12345", "67890"]),
("usernames", ["testuser1", "testuser2"]),
(
"data",
[
{"id": "12345", "username": "testuser1"},
{"id": "67890", "username": "testuser2"},
],
),
],
test_mock={
"get_list_members": lambda *args, **kwargs: (
["12345", "67890"],
["testuser1", "testuser2"],
[
{"id": "12345", "username": "testuser1"},
{"id": "67890", "username": "testuser2"},
],
{},
{},
None,
)
},
)
@staticmethod
def get_list_members(
credentials: TwitterCredentials,
list_id: str,
max_results: int | None,
pagination_token: str | None,
expansions: UserExpansionsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": list_id,
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
"user_auth": False,
}
params = (
UserExpansionsBuilder(params)
.add_expansions(expansions)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_list_members(**params))
meta = {}
included = {}
next_token = None
user_ids = []
usernames = []
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
if response.includes:
included = IncludesSerializer.serialize(response.includes)
if response.data:
data = ResponseDataSerializer.serialize_list(response.data)
user_ids = [str(user.id) for user in response.data]
usernames = [user.username for user in response.data]
return user_ids, usernames, data, included, meta, next_token
raise Exception("List members not found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, usernames, data, included, meta, next_token = self.get_list_members(
credentials,
input_data.list_id,
input_data.max_results,
input_data.pagination_token,
input_data.expansions,
input_data.tweet_fields,
input_data.user_fields,
)
if ids:
yield "ids", ids
if usernames:
yield "usernames", usernames
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetListMembershipsBlock(Block):
"""
Gets all Lists that a specified user is a member of
"""
class Input(ListExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["list.read", "offline.access"]
)
user_id: str = SchemaField(
description="The ID of the user whose List memberships to retrieve",
placeholder="Enter user ID",
required=True,
)
max_results: int | None = SchemaField(
description="Maximum number of results per page (1-100)",
placeholder="Enter max results",
advanced=True,
default=10,
)
pagination_token: str | None = SchemaField(
description="Token for pagination of results",
placeholder="Enter pagination token",
advanced=True,
default="",
)
class Output(BlockSchema):
list_ids: list[str] = SchemaField(description="List of list IDs")
next_token: str = SchemaField(description="Next token for pagination")
data: list[dict] = SchemaField(description="List membership data")
included: dict = SchemaField(
description="Additional data requested via expansions"
)
meta: dict = SchemaField(description="Metadata about pagination")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="46e6429c-a62f-11ef-81c0-2b55bc7823ba",
description="This block retrieves all Lists that a specified user is a member of.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetListMembershipsBlock.Input,
output_schema=TwitterGetListMembershipsBlock.Output,
test_input={
"user_id": "123456789",
"max_results": 1,
"pagination_token": None,
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"list_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("list_ids", ["84839422"]),
("data", [{"id": "84839422"}]),
],
test_mock={
"get_list_memberships": lambda *args, **kwargs: (
[{"id": "84839422"}],
{},
{},
["84839422"],
None,
)
},
)
@staticmethod
def get_list_memberships(
credentials: TwitterCredentials,
user_id: str,
max_results: int | None,
pagination_token: str | None,
expansions: ListExpansionsFilter | None,
user_fields: TweetUserFieldsFilter | None,
list_fields: ListFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": user_id,
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
"user_auth": False,
}
params = (
ListExpansionsBuilder(params)
.add_expansions(expansions)
.add_user_fields(user_fields)
.add_list_fields(list_fields)
.build()
)
response = cast(Response, client.get_list_memberships(**params))
meta = {}
included = {}
next_token = None
list_ids = []
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
if response.includes:
included = IncludesSerializer.serialize(response.includes)
if response.data:
data = ResponseDataSerializer.serialize_list(response.data)
list_ids = [str(lst.id) for lst in response.data]
return data, included, meta, list_ids, next_token
raise Exception("List memberships not found")
except tweepy.TweepyException:
raise
except Exception:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
data, included, meta, list_ids, next_token = self.get_list_memberships(
credentials,
input_data.user_id,
input_data.max_results,
input_data.pagination_token,
input_data.expansions,
input_data.user_fields,
input_data.list_fields,
)
if list_ids:
yield "list_ids", list_ids
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)


@@ -0,0 +1,217 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import TweetExpansionsBuilder
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
ExpansionFilter,
TweetExpansionInputs,
TweetFieldsFilter,
TweetMediaFieldsFilter,
TweetPlaceFieldsFilter,
TweetPollFieldsFilter,
TweetUserFieldsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterGetListTweetsBlock(Block):
"""
Gets tweets from a specified Twitter list
"""
class Input(TweetExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "offline.access"]
)
list_id: str = SchemaField(
description="The ID of the List whose Tweets you would like to retrieve",
placeholder="Enter list ID",
required=True,
)
max_results: int | None = SchemaField(
description="Maximum number of results per page (1-100)",
placeholder="Enter max results",
default=10,
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token for paginating through results",
placeholder="Enter pagination token",
default="",
advanced=True,
)
class Output(BlockSchema):
# Common outputs
tweet_ids: list[str] = SchemaField(description="List of tweet IDs")
texts: list[str] = SchemaField(description="List of tweet texts")
next_token: str = SchemaField(description="Token for next page of results")
# Complete outputs
data: list[dict] = SchemaField(description="Complete list tweets data")
included: dict = SchemaField(
description="Additional data requested via expansions"
)
meta: dict = SchemaField(
description="Response metadata including pagination tokens"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="6657edb0-a62f-11ef-8c10-0326d832467d",
description="This block retrieves tweets from a specified Twitter list.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetListTweetsBlock.Input,
output_schema=TwitterGetListTweetsBlock.Output,
test_input={
"list_id": "84839422",
"max_results": 1,
"pagination_token": None,
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"media_fields": None,
"place_fields": None,
"poll_fields": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("tweet_ids", ["1234567890"]),
("texts", ["Test tweet"]),
("data", [{"id": "1234567890", "text": "Test tweet"}]),
],
test_mock={
"get_list_tweets": lambda *args, **kwargs: (
[{"id": "1234567890", "text": "Test tweet"}],
{},
{},
["1234567890"],
["Test tweet"],
None,
)
},
)
@staticmethod
def get_list_tweets(
credentials: TwitterCredentials,
list_id: str,
max_results: int | None,
pagination_token: str | None,
expansions: ExpansionFilter | None,
media_fields: TweetMediaFieldsFilter | None,
place_fields: TweetPlaceFieldsFilter | None,
poll_fields: TweetPollFieldsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": list_id,
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
"user_auth": False,
}
params = (
TweetExpansionsBuilder(params)
.add_expansions(expansions)
.add_media_fields(media_fields)
.add_place_fields(place_fields)
.add_poll_fields(poll_fields)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_list_tweets(**params))
meta = {}
included = {}
tweet_ids = []
texts = []
next_token = None
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
if response.includes:
included = IncludesSerializer.serialize(response.includes)
if response.data:
data = ResponseDataSerializer.serialize_list(response.data)
tweet_ids = [str(item.id) for item in response.data]
texts = [item.text for item in response.data]
return data, included, meta, tweet_ids, texts, next_token
raise Exception("No tweets found in this list")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
list_data, included, meta, tweet_ids, texts, next_token = (
self.get_list_tweets(
credentials,
input_data.list_id,
input_data.max_results,
input_data.pagination_token,
input_data.expansions,
input_data.media_fields,
input_data.place_fields,
input_data.poll_fields,
input_data.tweet_fields,
input_data.user_fields,
)
)
if tweet_ids:
yield "tweet_ids", tweet_ids
if texts:
yield "texts", texts
if next_token:
yield "next_token", next_token
if list_data:
yield "data", list_data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)


@@ -0,0 +1,278 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterDeleteListBlock(Block):
"""
Deletes a Twitter List owned by the authenticated user
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["list.write", "offline.access"]
)
list_id: str = SchemaField(
description="The ID of the List to be deleted",
placeholder="Enter list ID",
required=True,
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the deletion was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="843c6892-a62f-11ef-a5c8-b71239a78d3b",
description="This block deletes a specified Twitter List owned by the authenticated user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterDeleteListBlock.Input,
output_schema=TwitterDeleteListBlock.Output,
test_input={"list_id": "1234567890", "credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"delete_list": lambda *args, **kwargs: True},
)
@staticmethod
def delete_list(credentials: TwitterCredentials, list_id: str):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.delete_list(id=list_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
except Exception:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.delete_list(credentials, input_data.list_id)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterUpdateListBlock(Block):
"""
Updates a Twitter List owned by the authenticated user
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["list.write", "offline.access"]
)
list_id: str = SchemaField(
description="The ID of the List to be updated",
placeholder="Enter list ID",
advanced=False,
)
name: str | None = SchemaField(
description="New name for the List",
placeholder="Enter list name",
default="",
advanced=False,
)
description: str | None = SchemaField(
description="New description for the List",
placeholder="Enter list description",
default="",
advanced=False,
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the update was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="7d12630a-a62f-11ef-90c9-8f5a996612c3",
description="This block updates a specified Twitter List owned by the authenticated user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterUpdateListBlock.Input,
output_schema=TwitterUpdateListBlock.Output,
test_input={
"list_id": "1234567890",
"name": "Updated List Name",
"description": "Updated List Description",
"private": True,
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"update_list": lambda *args, **kwargs: True},
)
@staticmethod
def update_list(
credentials: TwitterCredentials,
list_id: str,
name: str | None,
description: str | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.update_list(
id=list_id,
name=None if name == "" else name,
description=None if description == "" else description,
user_auth=False,
)
return True
except tweepy.TweepyException:
raise
except Exception:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.update_list(
credentials, input_data.list_id, input_data.name, input_data.description
)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterCreateListBlock(Block):
"""
Creates a Twitter List owned by the authenticated user
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["list.write", "offline.access"]
)
name: str = SchemaField(
description="The name of the List to be created",
placeholder="Enter list name",
advanced=False,
default="",
)
description: str | None = SchemaField(
description="Description of the List",
placeholder="Enter list description",
advanced=False,
default="",
)
private: bool = SchemaField(
description="Whether the List should be private",
advanced=False,
default=False,
)
class Output(BlockSchema):
url: str = SchemaField(description="URL of the created list")
list_id: str = SchemaField(description="ID of the created list")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="724148ba-a62f-11ef-89ba-5349b813ef5f",
description="This block creates a new Twitter List for the authenticated user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterCreateListBlock.Input,
output_schema=TwitterCreateListBlock.Output,
test_input={
"name": "New List Name",
"description": "New List Description",
"private": True,
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("list_id", "1234567890"),
("url", "https://twitter.com/i/lists/1234567890"),
],
test_mock={"create_list": lambda *args, **kwargs: ("1234567890")},
)
@staticmethod
def create_list(
credentials: TwitterCredentials,
name: str,
description: str | None,
private: bool,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
response = cast(
Response,
client.create_list(
name=None if name == "" else name,
description=None if description == "" else description,
private=private,
user_auth=False,
),
)
list_id = str(response.data["id"])
return list_id
except tweepy.TweepyException:
raise
except Exception:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
list_id = self.create_list(
credentials, input_data.name, input_data.description, input_data.private
)
yield "list_id", list_id
yield "url", f"https://twitter.com/i/lists/{list_id}"
except Exception as e:
yield "error", handle_tweepy_exception(e)


@@ -0,0 +1,285 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import ListExpansionsBuilder
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
ListExpansionInputs,
ListExpansionsFilter,
ListFieldsFilter,
TweetUserFieldsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterUnpinListBlock(Block):
"""
Enables the authenticated user to unpin a List.
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["list.write", "users.read", "tweet.read", "offline.access"]
)
list_id: str = SchemaField(
description="The ID of the List to unpin",
placeholder="Enter list ID",
required=True,
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the unpin was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="a099c034-a62f-11ef-9622-47d0ceb73555",
description="This block allows the authenticated user to unpin a specified List.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterUnpinListBlock.Input,
output_schema=TwitterUnpinListBlock.Output,
test_input={"list_id": "123456789", "credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"unpin_list": lambda *args, **kwargs: True},
)
@staticmethod
def unpin_list(credentials: TwitterCredentials, list_id: str):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.unpin_list(list_id=list_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
except Exception:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.unpin_list(credentials, input_data.list_id)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterPinListBlock(Block):
"""
Enables the authenticated user to pin a List.
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["list.write", "users.read", "tweet.read", "offline.access"]
)
list_id: str = SchemaField(
description="The ID of the List to pin",
placeholder="Enter list ID",
required=True,
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the pin was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="8ec16e48-a62f-11ef-9f35-f3d6de43a802",
description="This block allows the authenticated user to pin a specified List.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterPinListBlock.Input,
output_schema=TwitterPinListBlock.Output,
test_input={"list_id": "123456789", "credentials": TEST_CREDENTIALS_INPUT},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"pin_list": lambda *args, **kwargs: True},
)
@staticmethod
def pin_list(credentials: TwitterCredentials, list_id: str):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.pin_list(list_id=list_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
except Exception:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.pin_list(credentials, input_data.list_id)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetPinnedListsBlock(Block):
"""
Returns the Lists pinned by the authenticated user.
"""
class Input(ListExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["lists.read", "users.read", "offline.access"]
)
class Output(BlockSchema):
list_ids: list[str] = SchemaField(description="List IDs of the pinned lists")
list_names: list[str] = SchemaField(
description="List names of the pinned lists"
)
data: list[dict] = SchemaField(
description="Response data containing pinned lists"
)
included: dict = SchemaField(
description="Additional data requested via expansions"
)
meta: dict = SchemaField(description="Metadata about the response")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="97e03aae-a62f-11ef-bc53-5b89cb02888f",
description="This block returns the Lists pinned by the authenticated user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetPinnedListsBlock.Input,
output_schema=TwitterGetPinnedListsBlock.Output,
test_input={
"expansions": None,
"list_fields": None,
"user_fields": None,
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("list_ids", ["84839422"]),
("list_names", ["Twitter List"]),
("data", [{"id": "84839422", "name": "Twitter List"}]),
],
test_mock={
"get_pinned_lists": lambda *args, **kwargs: (
[{"id": "84839422", "name": "Twitter List"}],
{},
{},
["84839422"],
["Twitter List"],
)
},
)
@staticmethod
def get_pinned_lists(
credentials: TwitterCredentials,
expansions: ListExpansionsFilter | None,
user_fields: TweetUserFieldsFilter | None,
list_fields: ListFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {"user_auth": False}
params = (
ListExpansionsBuilder(params)
.add_expansions(expansions)
.add_user_fields(user_fields)
.add_list_fields(list_fields)
.build()
)
response = cast(Response, client.get_pinned_lists(**params))
meta = {}
included = {}
list_ids = []
list_names = []
if response.meta:
meta = response.meta
if response.includes:
included = IncludesSerializer.serialize(response.includes)
if response.data:
data = ResponseDataSerializer.serialize_list(response.data)
list_ids = [str(item.id) for item in response.data]
list_names = [item.name for item in response.data]
return data, included, meta, list_ids, list_names
raise Exception("Lists not found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
list_data, included, meta, list_ids, list_names = self.get_pinned_lists(
credentials,
input_data.expansions,
input_data.user_fields,
input_data.list_fields,
)
if list_ids:
yield "list_ids", list_ids
if list_names:
yield "list_names", list_names
if list_data:
yield "data", list_data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)


@@ -0,0 +1,195 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import SpaceExpansionsBuilder
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
SpaceExpansionInputs,
SpaceExpansionsFilter,
SpaceFieldsFilter,
SpaceStatesFilter,
TweetUserFieldsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterSearchSpacesBlock(Block):
"""
Returns live or scheduled Spaces matching the specified search terms (past week only)
"""
class Input(SpaceExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["spaces.read", "users.read", "tweet.read", "offline.access"]
)
query: str = SchemaField(
description="Search term to find in Space titles",
placeholder="Enter search query",
)
max_results: int | None = SchemaField(
description="Maximum number of results to return (1-100)",
placeholder="Enter max results",
default=10,
advanced=True,
)
state: SpaceStatesFilter = SchemaField(
description="Type of Spaces to return (live, scheduled, or all)",
placeholder="Enter state filter",
default=SpaceStatesFilter.all,
)
class Output(BlockSchema):
# Common outputs
ids: list[str] = SchemaField(description="List of space IDs")
titles: list[str] = SchemaField(description="List of space titles")
host_ids: list = SchemaField(description="List of host IDs")
next_token: str = SchemaField(description="Next token for pagination")
# Complete outputs for advanced use
data: list[dict] = SchemaField(description="Complete space data")
includes: dict = SchemaField(
description="Additional data requested via expansions"
)
meta: dict = SchemaField(description="Metadata including pagination info")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="aaefdd48-a62f-11ef-a73c-3f44df63e276",
description="This block searches for Twitter Spaces based on specified terms.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterSearchSpacesBlock.Input,
output_schema=TwitterSearchSpacesBlock.Output,
test_input={
"query": "tech",
"max_results": 1,
"state": "live",
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"space_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["1234"]),
("titles", ["Tech Talk"]),
("host_ids", ["5678"]),
("data", [{"id": "1234", "title": "Tech Talk", "host_ids": ["5678"]}]),
],
test_mock={
"search_spaces": lambda *args, **kwargs: (
[{"id": "1234", "title": "Tech Talk", "host_ids": ["5678"]}],
{},
{},
["1234"],
["Tech Talk"],
["5678"],
None,
)
},
)
@staticmethod
def search_spaces(
credentials: TwitterCredentials,
query: str,
max_results: int | None,
state: SpaceStatesFilter,
expansions: SpaceExpansionsFilter | None,
space_fields: SpaceFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {"query": query, "max_results": max_results, "state": state.value}
params = (
SpaceExpansionsBuilder(params)
.add_expansions(expansions)
.add_space_fields(space_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.search_spaces(**params))
meta = {}
next_token = ""
if response.meta:
meta = response.meta
if "next_token" in meta:
next_token = meta["next_token"]
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
if response.data:
ids = [str(space["id"]) for space in response.data if "id" in space]
titles = [space["title"] for space in data if "title" in space]
host_ids = [space["host_ids"] for space in data if "host_ids" in space]
return data, included, meta, ids, titles, host_ids, next_token
raise Exception("Spaces not found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
data, included, meta, ids, titles, host_ids, next_token = (
self.search_spaces(
credentials,
input_data.query,
input_data.max_results,
input_data.state,
input_data.expansions,
input_data.space_fields,
input_data.user_fields,
)
)
if ids:
yield "ids", ids
if titles:
yield "titles", titles
if host_ids:
yield "host_ids", host_ids
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if included:
yield "includes", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)


@@ -0,0 +1,651 @@
from typing import Literal, Union, cast
import tweepy
from pydantic import BaseModel
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import (
SpaceExpansionsBuilder,
TweetExpansionsBuilder,
UserExpansionsBuilder,
)
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
ExpansionFilter,
SpaceExpansionInputs,
SpaceExpansionsFilter,
SpaceFieldsFilter,
TweetExpansionInputs,
TweetFieldsFilter,
TweetMediaFieldsFilter,
TweetPlaceFieldsFilter,
TweetPollFieldsFilter,
TweetUserFieldsFilter,
UserExpansionInputs,
UserExpansionsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class SpaceList(BaseModel):
discriminator: Literal["space_list"]
space_ids: list[str] = SchemaField(
description="List of Space IDs to lookup (up to 100)",
placeholder="Enter Space IDs",
default=[],
advanced=False,
)
class UserList(BaseModel):
discriminator: Literal["user_list"]
user_ids: list[str] = SchemaField(
description="List of user IDs to lookup their Spaces (up to 100)",
placeholder="Enter user IDs",
default=[],
advanced=False,
)
class TwitterGetSpacesBlock(Block):
"""
Gets information about multiple Twitter Spaces specified by Space IDs or creator user IDs
"""
class Input(SpaceExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["spaces.read", "users.read", "offline.access"]
)
identifier: Union[SpaceList, UserList] = SchemaField(
discriminator="discriminator",
description="Choose whether to lookup spaces by their IDs or by creator user IDs",
advanced=False,
)
class Output(BlockSchema):
# Common outputs
ids: list[str] = SchemaField(description="List of space IDs")
titles: list[str] = SchemaField(description="List of space titles")
# Complete outputs for advanced use
data: list[dict] = SchemaField(description="Complete space data")
includes: dict = SchemaField(
description="Additional data requested via expansions"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="d75bd7d8-a62f-11ef-b0d8-c7a9496f617f",
description="This block retrieves information about multiple Twitter Spaces.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetSpacesBlock.Input,
output_schema=TwitterGetSpacesBlock.Output,
test_input={
"identifier": {
"discriminator": "space_list",
"space_ids": ["1DXxyRYNejbKM"],
},
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"space_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["1DXxyRYNejbKM"]),
("titles", ["Test Space"]),
(
"data",
[
{
"id": "1DXxyRYNejbKM",
"title": "Test Space",
"host_id": "1234567",
}
],
),
],
test_mock={
"get_spaces": lambda *args, **kwargs: (
[
{
"id": "1DXxyRYNejbKM",
"title": "Test Space",
"host_id": "1234567",
}
],
{},
["1DXxyRYNejbKM"],
["Test Space"],
)
},
)
@staticmethod
def get_spaces(
credentials: TwitterCredentials,
identifier: Union[SpaceList, UserList],
expansions: SpaceExpansionsFilter | None,
space_fields: SpaceFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"ids": (
identifier.space_ids if isinstance(identifier, SpaceList) else None
),
"user_ids": (
identifier.user_ids if isinstance(identifier, UserList) else None
),
}
params = (
SpaceExpansionsBuilder(params)
.add_expansions(expansions)
.add_space_fields(space_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_spaces(**params))
ids = []
titles = []
included = IncludesSerializer.serialize(response.includes)
if response.data:
data = ResponseDataSerializer.serialize_list(response.data)
ids = [space["id"] for space in data if "id" in space]
titles = [space["title"] for space in data if "title" in space]
return data, included, ids, titles
raise Exception("No spaces found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
data, included, ids, titles = self.get_spaces(
credentials,
input_data.identifier,
input_data.expansions,
input_data.space_fields,
input_data.user_fields,
)
if ids:
yield "ids", ids
if titles:
yield "titles", titles
if data:
yield "data", data
if included:
yield "includes", included
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetSpaceByIdBlock(Block):
"""
Gets information about a single Twitter Space specified by Space ID
"""
class Input(SpaceExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["spaces.read", "users.read", "offline.access"]
)
space_id: str = SchemaField(
description="Space ID to lookup",
placeholder="Enter Space ID",
required=True,
)
class Output(BlockSchema):
# Common outputs
id: str = SchemaField(description="Space ID")
title: str = SchemaField(description="Space title")
host_ids: list[str] = SchemaField(description="Host ID")
# Complete outputs for advanced use
data: dict = SchemaField(description="Complete space data")
includes: dict = SchemaField(
description="Additional data requested via expansions"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="c79700de-a62f-11ef-ab20-fb32bf9d5a9d",
description="This block retrieves information about a single Twitter Space.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetSpaceByIdBlock.Input,
output_schema=TwitterGetSpaceByIdBlock.Output,
test_input={
"space_id": "1DXxyRYNejbKM",
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"space_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("id", "1DXxyRYNejbKM"),
("title", "Test Space"),
("host_ids", ["1234567"]),
(
"data",
{
"id": "1DXxyRYNejbKM",
"title": "Test Space",
"host_ids": ["1234567"],
},
),
],
test_mock={
"get_space": lambda *args, **kwargs: (
{
"id": "1DXxyRYNejbKM",
"title": "Test Space",
"host_ids": ["1234567"],
},
{},
)
},
)
@staticmethod
def get_space(
credentials: TwitterCredentials,
space_id: str,
expansions: SpaceExpansionsFilter | None,
space_fields: SpaceFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": space_id,
}
params = (
SpaceExpansionsBuilder(params)
.add_expansions(expansions)
.add_space_fields(space_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_space(**params))
includes = {}
if response.includes:
for key, value in response.includes.items():
if isinstance(value, list):
includes[key] = [
item.data if hasattr(item, "data") else item
for item in value
]
else:
includes[key] = value.data if hasattr(value, "data") else value
data = {}
if response.data:
for key, value in response.data.items():
if isinstance(value, list):
data[key] = [
item.data if hasattr(item, "data") else item
for item in value
]
else:
data[key] = value.data if hasattr(value, "data") else value
return data, includes
raise Exception("Space not found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
space_data, includes = self.get_space(
credentials,
input_data.space_id,
input_data.expansions,
input_data.space_fields,
input_data.user_fields,
)
# Common outputs
if space_data:
if "id" in space_data:
yield "id", space_data.get("id")
if "title" in space_data:
yield "title", space_data.get("title")
if "host_ids" in space_data:
yield "host_ids", space_data.get("host_ids")
if space_data:
yield "data", space_data
if includes:
yield "includes", includes
except Exception as e:
yield "error", handle_tweepy_exception(e)
# Not tested yet; may have issues
class TwitterGetSpaceBuyersBlock(Block):
"""
Gets list of users who purchased a ticket to the requested Space
"""
class Input(UserExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["spaces.read", "users.read", "offline.access"]
)
space_id: str = SchemaField(
description="Space ID to lookup buyers for",
placeholder="Enter Space ID",
required=True,
)
class Output(BlockSchema):
# Common outputs
buyer_ids: list[str] = SchemaField(description="List of buyer IDs")
usernames: list[str] = SchemaField(description="List of buyer usernames")
# Complete outputs for advanced use
data: list[dict] = SchemaField(description="Complete space buyers data")
includes: dict = SchemaField(
description="Additional data requested via expansions"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="c1c121a8-a62f-11ef-8b0e-d7b85f96a46f",
description="This block retrieves a list of users who purchased tickets to a Twitter Space.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetSpaceBuyersBlock.Input,
output_schema=TwitterGetSpaceBuyersBlock.Output,
test_input={
"space_id": "1DXxyRYNejbKM",
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("buyer_ids", ["2244994945"]),
("usernames", ["testuser"]),
(
"data",
[{"id": "2244994945", "username": "testuser", "name": "Test User"}],
),
],
test_mock={
"get_space_buyers": lambda *args, **kwargs: (
[{"id": "2244994945", "username": "testuser", "name": "Test User"}],
{},
["2244994945"],
["testuser"],
)
},
)
@staticmethod
def get_space_buyers(
credentials: TwitterCredentials,
space_id: str,
expansions: UserExpansionsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": space_id,
}
params = (
UserExpansionsBuilder(params)
.add_expansions(expansions)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_space_buyers(**params))
included = IncludesSerializer.serialize(response.includes)
if response.data:
data = ResponseDataSerializer.serialize_list(response.data)
buyer_ids = [buyer["id"] for buyer in data]
usernames = [buyer["username"] for buyer in data]
return data, included, buyer_ids, usernames
raise Exception("No buyers found for this Space")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
buyers_data, included, buyer_ids, usernames = self.get_space_buyers(
credentials,
input_data.space_id,
input_data.expansions,
input_data.user_fields,
)
if buyer_ids:
yield "buyer_ids", buyer_ids
if usernames:
yield "usernames", usernames
if buyers_data:
yield "data", buyers_data
if included:
yield "includes", included
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetSpaceTweetsBlock(Block):
"""
Gets list of Tweets shared in the requested Space
"""
class Input(TweetExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["spaces.read", "users.read", "offline.access"]
)
space_id: str = SchemaField(
description="Space ID to lookup tweets for",
placeholder="Enter Space ID",
required=True,
)
class Output(BlockSchema):
# Common outputs
tweet_ids: list[str] = SchemaField(description="List of tweet IDs")
texts: list[str] = SchemaField(description="List of tweet texts")
# Complete outputs for advanced use
data: list[dict] = SchemaField(description="Complete space tweets data")
includes: dict = SchemaField(
description="Additional data requested via expansions"
)
meta: dict = SchemaField(description="Response metadata")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="b69731e6-a62f-11ef-b2d4-1bf14dd6aee4",
description="This block retrieves tweets shared in a Twitter Space.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetSpaceTweetsBlock.Input,
output_schema=TwitterGetSpaceTweetsBlock.Output,
test_input={
"space_id": "1DXxyRYNejbKM",
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"media_fields": None,
"place_fields": None,
"poll_fields": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("tweet_ids", ["1234567890"]),
("texts", ["Test tweet"]),
("data", [{"id": "1234567890", "text": "Test tweet"}]),
],
test_mock={
"get_space_tweets": lambda *args, **kwargs: (
[{"id": "1234567890", "text": "Test tweet"}], # data
{},
["1234567890"],
["Test tweet"],
{},
)
},
)
@staticmethod
def get_space_tweets(
credentials: TwitterCredentials,
space_id: str,
expansions: ExpansionFilter | None,
media_fields: TweetMediaFieldsFilter | None,
place_fields: TweetPlaceFieldsFilter | None,
poll_fields: TweetPollFieldsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": space_id,
}
params = (
TweetExpansionsBuilder(params)
.add_expansions(expansions)
.add_media_fields(media_fields)
.add_place_fields(place_fields)
.add_poll_fields(poll_fields)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_space_tweets(**params))
included = IncludesSerializer.serialize(response.includes)
if response.data:
data = ResponseDataSerializer.serialize_list(response.data)
tweet_ids = [str(tweet["id"]) for tweet in data]
texts = [tweet["text"] for tweet in data]
meta = response.meta or {}
return data, included, tweet_ids, texts, meta
raise Exception("No tweets found for this Space")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
tweets_data, included, tweet_ids, texts, meta = self.get_space_tweets(
credentials,
input_data.space_id,
input_data.expansions,
input_data.media_fields,
input_data.place_fields,
input_data.poll_fields,
input_data.tweet_fields,
input_data.user_fields,
)
if tweet_ids:
yield "tweet_ids", tweet_ids
if texts:
yield "texts", texts
if tweets_data:
yield "data", tweets_data
if included:
yield "includes", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)


@@ -0,0 +1,20 @@
import tweepy
def handle_tweepy_exception(e: Exception) -> str:
if isinstance(e, tweepy.BadRequest):
return f"Bad Request (400): {str(e)}"
elif isinstance(e, tweepy.Unauthorized):
return f"Unauthorized (401): {str(e)}"
elif isinstance(e, tweepy.Forbidden):
return f"Forbidden (403): {str(e)}"
elif isinstance(e, tweepy.NotFound):
return f"Not Found (404): {str(e)}"
elif isinstance(e, tweepy.TooManyRequests):
return f"Too Many Requests (429): {str(e)}"
elif isinstance(e, tweepy.TwitterServerError):
return f"Twitter Server Error (5xx): {str(e)}"
elif isinstance(e, tweepy.TweepyException):
return f"Tweepy Error: {str(e)}"
else:
return f"Unexpected error: {str(e)}"

View File

@@ -0,0 +1,372 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import TweetExpansionsBuilder
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
ExpansionFilter,
TweetExpansionInputs,
TweetFieldsFilter,
TweetMediaFieldsFilter,
TweetPlaceFieldsFilter,
TweetPollFieldsFilter,
TweetUserFieldsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterBookmarkTweetBlock(Block):
"""
Bookmark a tweet on Twitter
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "bookmark.write", "users.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="ID of the tweet to bookmark",
placeholder="Enter tweet ID",
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the bookmark was successful")
error: str = SchemaField(description="Error message if the bookmark failed")
def __init__(self):
super().__init__(
id="f33d67be-a62f-11ef-a797-ff83ec29ee8e",
description="This block bookmarks a tweet on Twitter.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterBookmarkTweetBlock.Input,
output_schema=TwitterBookmarkTweetBlock.Output,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"bookmark_tweet": lambda *args, **kwargs: True},
)
@staticmethod
def bookmark_tweet(
credentials: TwitterCredentials,
tweet_id: str,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.bookmark(tweet_id)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.bookmark_tweet(credentials, input_data.tweet_id)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetBookmarkedTweetsBlock(Block):
"""
Gets all of your bookmarked tweets from Twitter
"""
class Input(TweetExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "bookmark.read", "users.read", "offline.access"]
)
max_results: int | None = SchemaField(
description="Maximum number of results to return (1-100)",
placeholder="Enter max results",
default=10,
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token for pagination",
placeholder="Enter pagination token",
default="",
advanced=True,
)
class Output(BlockSchema):
# Commonly used outputs
id: list[str] = SchemaField(description="All Tweet IDs")
text: list[str] = SchemaField(description="All Tweet texts")
userId: list[str] = SchemaField(description="IDs of the tweet authors")
userName: list[str] = SchemaField(description="Usernames of the tweet authors")
# Complete Outputs for advanced use
data: list[dict] = SchemaField(description="Complete Tweet data")
included: dict = SchemaField(
description="Additional data that you have requested (Optional) via Expansions field"
)
meta: dict = SchemaField(
description="Provides metadata such as pagination info (next_token) or result counts"
)
next_token: str = SchemaField(description="Next token for pagination")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="ed26783e-a62f-11ef-9a21-c77c57dd8a1f",
description="This block retrieves bookmarked tweets from Twitter.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetBookmarkedTweetsBlock.Input,
output_schema=TwitterGetBookmarkedTweetsBlock.Output,
test_input={
"max_results": 2,
"pagination_token": None,
"expansions": None,
"media_fields": None,
"place_fields": None,
"poll_fields": None,
"tweet_fields": None,
"user_fields": None,
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("id", ["1234567890"]),
("text", ["Test tweet"]),
("userId", ["12345"]),
("userName", ["testuser"]),
("data", [{"id": "1234567890", "text": "Test tweet"}]),
],
test_mock={
"get_bookmarked_tweets": lambda *args, **kwargs: (
["1234567890"],
["Test tweet"],
["12345"],
["testuser"],
[{"id": "1234567890", "text": "Test tweet"}],
{},
{},
None,
)
},
)
@staticmethod
def get_bookmarked_tweets(
credentials: TwitterCredentials,
max_results: int | None,
pagination_token: str | None,
expansions: ExpansionFilter | None,
media_fields: TweetMediaFieldsFilter | None,
place_fields: TweetPlaceFieldsFilter | None,
poll_fields: TweetPollFieldsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
}
params = (
TweetExpansionsBuilder(params)
.add_expansions(expansions)
.add_media_fields(media_fields)
.add_place_fields(place_fields)
.add_poll_fields(poll_fields)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(
Response,
client.get_bookmarks(**params),
)
meta = {}
tweet_ids = []
tweet_texts = []
user_ids = []
user_names = []
next_token = None
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
if response.data:
tweet_ids = [str(tweet.id) for tweet in response.data]
tweet_texts = [tweet.text for tweet in response.data]
if "users" in included:
for user in included["users"]:
user_ids.append(str(user["id"]))
user_names.append(user["username"])
return (
tweet_ids,
tweet_texts,
user_ids,
user_names,
data,
included,
meta,
next_token,
)
raise Exception("No bookmarked tweets found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, texts, user_ids, user_names, data, included, meta, next_token = (
self.get_bookmarked_tweets(
credentials,
input_data.max_results,
input_data.pagination_token,
input_data.expansions,
input_data.media_fields,
input_data.place_fields,
input_data.poll_fields,
input_data.tweet_fields,
input_data.user_fields,
)
)
if ids:
yield "id", ids
if texts:
yield "text", texts
if user_ids:
yield "userId", user_ids
if user_names:
yield "userName", user_names
if data:
yield "data", data
if included:
yield "included", included
if meta:
yield "meta", meta
if next_token:
yield "next_token", next_token
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterRemoveBookmarkTweetBlock(Block):
"""
Remove a bookmark for a tweet on Twitter
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "bookmark.write", "users.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="ID of the tweet to remove bookmark from",
placeholder="Enter tweet ID",
)
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether the bookmark was successfully removed"
)
error: str = SchemaField(
description="Error message if the bookmark removal failed"
)
def __init__(self):
super().__init__(
id="e4100684-a62f-11ef-9be9-770cb41a2616",
description="This block removes a bookmark from a tweet on Twitter.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterRemoveBookmarkTweetBlock.Input,
output_schema=TwitterRemoveBookmarkTweetBlock.Output,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"remove_bookmark_tweet": lambda *args, **kwargs: True},
)
@staticmethod
def remove_bookmark_tweet(
credentials: TwitterCredentials,
tweet_id: str,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.remove_bookmark(tweet_id)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.remove_bookmark_tweet(credentials, input_data.tweet_id)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)

View File

@@ -0,0 +1,154 @@
import tweepy
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterHideReplyBlock(Block):
"""
Hides a reply to one of your tweets
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "tweet.moderate.write", "users.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="ID of the tweet reply to hide",
placeholder="Enter tweet ID",
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the operation was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="07d58b3e-a630-11ef-a030-93701d1a465e",
description="This block hides a reply to a tweet.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterHideReplyBlock.Input,
output_schema=TwitterHideReplyBlock.Output,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"hide_reply": lambda *args, **kwargs: True},
)
@staticmethod
def hide_reply(
credentials: TwitterCredentials,
tweet_id: str,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.hide_reply(id=tweet_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.hide_reply(
credentials,
input_data.tweet_id,
)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterUnhideReplyBlock(Block):
"""
Unhides a reply to a tweet
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "tweet.moderate.write", "users.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="ID of the tweet reply to unhide",
placeholder="Enter tweet ID",
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the operation was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="fcf9e4e4-a62f-11ef-9d85-57d3d06b616a",
description="This block unhides a reply to a tweet.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterUnhideReplyBlock.Input,
output_schema=TwitterUnhideReplyBlock.Output,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"unhide_reply": lambda *args, **kwargs: True},
)
@staticmethod
def unhide_reply(
credentials: TwitterCredentials,
tweet_id: str,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.unhide_reply(id=tweet_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.unhide_reply(
credentials,
input_data.tweet_id,
)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)

View File

@@ -0,0 +1,576 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import (
TweetExpansionsBuilder,
UserExpansionsBuilder,
)
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
ExpansionFilter,
TweetExpansionInputs,
TweetFieldsFilter,
TweetMediaFieldsFilter,
TweetPlaceFieldsFilter,
TweetPollFieldsFilter,
TweetUserFieldsFilter,
UserExpansionInputs,
UserExpansionsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterLikeTweetBlock(Block):
"""
Likes a tweet
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "like.write", "users.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="ID of the tweet to like",
placeholder="Enter tweet ID",
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the operation was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="4d0b4c5c-a630-11ef-8e08-1b14c507b347",
description="This block likes a tweet.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterLikeTweetBlock.Input,
output_schema=TwitterLikeTweetBlock.Output,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"like_tweet": lambda *args, **kwargs: True},
)
@staticmethod
def like_tweet(
credentials: TwitterCredentials,
tweet_id: str,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.like(tweet_id=tweet_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.like_tweet(
credentials,
input_data.tweet_id,
)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetLikingUsersBlock(Block):
"""
Gets information about users who liked one of your tweets
"""
class Input(UserExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "like.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="ID of the tweet to get liking users for",
placeholder="Enter tweet ID",
)
max_results: int | None = SchemaField(
description="Maximum number of results to return (1-100)",
placeholder="Enter max results",
default=10,
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token for getting next/previous page of results",
placeholder="Enter pagination token",
default="",
advanced=True,
)
class Output(BlockSchema):
# Commonly used outputs
id: list[str] = SchemaField(description="All User IDs who liked the tweet")
username: list[str] = SchemaField(
description="All User usernames who liked the tweet"
)
next_token: str = SchemaField(description="Next token for pagination")
# Complete Outputs for advanced use
data: list[dict] = SchemaField(description="Complete Tweet data")
included: dict = SchemaField(
description="Additional data that you have requested (Optional) via Expansions field"
)
meta: dict = SchemaField(
description="Provides metadata such as pagination info (next_token) or result counts"
)
# error
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="34275000-a630-11ef-b01e-5f00d9077c08",
description="This block gets information about users who liked a tweet.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetLikingUsersBlock.Input,
output_schema=TwitterGetLikingUsersBlock.Output,
test_input={
"tweet_id": "1234567890",
"max_results": 1,
"pagination_token": None,
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("id", ["1234567890"]),
("username", ["testuser"]),
("data", [{"id": "1234567890", "username": "testuser"}]),
],
test_mock={
"get_liking_users": lambda *args, **kwargs: (
["1234567890"],
["testuser"],
[{"id": "1234567890", "username": "testuser"}],
{},
{},
None,
)
},
)
@staticmethod
def get_liking_users(
credentials: TwitterCredentials,
tweet_id: str,
max_results: int | None,
pagination_token: str | None,
expansions: UserExpansionsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": tweet_id,
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
"user_auth": False,
}
params = (
UserExpansionsBuilder(params)
.add_expansions(expansions)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_liking_users(**params))
if not response.data and not response.meta:
raise Exception("No liking users found")
meta = {}
user_ids = []
usernames = []
next_token = None
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
if response.data:
user_ids = [str(user.id) for user in response.data]
usernames = [user.username for user in response.data]
return user_ids, usernames, data, included, meta, next_token
raise Exception("No liking users found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, usernames, data, included, meta, next_token = self.get_liking_users(
credentials,
input_data.tweet_id,
input_data.max_results,
input_data.pagination_token,
input_data.expansions,
input_data.tweet_fields,
input_data.user_fields,
)
if ids:
yield "id", ids
if usernames:
yield "username", usernames
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetLikedTweetsBlock(Block):
"""
Gets information about tweets liked by a user
"""
class Input(TweetExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "like.read", "offline.access"]
)
user_id: str = SchemaField(
description="ID of the user to get liked tweets for",
placeholder="Enter user ID",
)
max_results: int | None = SchemaField(
description="Maximum number of results to return (5-100)",
placeholder="100",
default=10,
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token for getting next/previous page of results",
placeholder="Enter pagination token",
default="",
advanced=True,
)
class Output(BlockSchema):
# Commonly used outputs
ids: list[str] = SchemaField(description="All Tweet IDs")
texts: list[str] = SchemaField(description="All Tweet texts")
userIds: list[str] = SchemaField(
description="List of user ids that authored the tweets"
)
userNames: list[str] = SchemaField(
description="List of user names that authored the tweets"
)
next_token: str = SchemaField(description="Next token for pagination")
# Complete Outputs for advanced use
data: list[dict] = SchemaField(description="Complete Tweet data")
included: dict = SchemaField(
description="Additional data that you have requested (Optional) via Expansions field"
)
meta: dict = SchemaField(
description="Provides metadata such as pagination info (next_token) or result counts"
)
# error
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="292e7c78-a630-11ef-9f40-df5dffaca106",
description="This block gets information about tweets liked by a user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetLikedTweetsBlock.Input,
output_schema=TwitterGetLikedTweetsBlock.Output,
test_input={
"user_id": "1234567890",
"max_results": 2,
"pagination_token": None,
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"media_fields": None,
"place_fields": None,
"poll_fields": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["12345", "67890"]),
("texts", ["Tweet 1", "Tweet 2"]),
("userIds", ["67890", "67891"]),
("userNames", ["testuser1", "testuser2"]),
(
"data",
[
{"id": "12345", "text": "Tweet 1"},
{"id": "67890", "text": "Tweet 2"},
],
),
],
test_mock={
"get_liked_tweets": lambda *args, **kwargs: (
["12345", "67890"],
["Tweet 1", "Tweet 2"],
["67890", "67891"],
["testuser1", "testuser2"],
[
{"id": "12345", "text": "Tweet 1"},
{"id": "67890", "text": "Tweet 2"},
],
{},
{},
None,
)
},
)
@staticmethod
def get_liked_tweets(
credentials: TwitterCredentials,
user_id: str,
max_results: int | None,
pagination_token: str | None,
expansions: ExpansionFilter | None,
media_fields: TweetMediaFieldsFilter | None,
place_fields: TweetPlaceFieldsFilter | None,
poll_fields: TweetPollFieldsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": user_id,
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
"user_auth": False,
}
params = (
TweetExpansionsBuilder(params)
.add_expansions(expansions)
.add_media_fields(media_fields)
.add_place_fields(place_fields)
.add_poll_fields(poll_fields)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_liked_tweets(**params))
if not response.data and not response.meta:
raise Exception("No liked tweets found")
meta = {}
tweet_ids = []
tweet_texts = []
user_ids = []
user_names = []
next_token = None
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
if response.data:
tweet_ids = [str(tweet.id) for tweet in response.data]
tweet_texts = [tweet.text for tweet in response.data]
if "users" in response.includes:
user_ids = [str(user["id"]) for user in response.includes["users"]]
user_names = [
user["username"] for user in response.includes["users"]
]
return (
tweet_ids,
tweet_texts,
user_ids,
user_names,
data,
included,
meta,
next_token,
)
raise Exception("No liked tweets found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, texts, user_ids, user_names, data, included, meta, next_token = (
self.get_liked_tweets(
credentials,
input_data.user_id,
input_data.max_results,
input_data.pagination_token,
input_data.expansions,
input_data.media_fields,
input_data.place_fields,
input_data.poll_fields,
input_data.tweet_fields,
input_data.user_fields,
)
)
if ids:
yield "ids", ids
if texts:
yield "texts", texts
if user_ids:
yield "userIds", user_ids
if user_names:
yield "userNames", user_names
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterUnlikeTweetBlock(Block):
"""
Unlikes a tweet that was previously liked
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "like.write", "users.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="ID of the tweet to unlike",
placeholder="Enter tweet ID",
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the operation was successful")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="1ed5eab8-a630-11ef-8e21-cbbbc80cbb85",
description="This block unlikes a tweet.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterUnlikeTweetBlock.Input,
output_schema=TwitterUnlikeTweetBlock.Output,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"unlike_tweet": lambda *args, **kwargs: True},
)
@staticmethod
def unlike_tweet(
credentials: TwitterCredentials,
tweet_id: str,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.unlike(tweet_id=tweet_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.unlike_tweet(
credentials,
input_data.tweet_id,
)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)

View File

@@ -0,0 +1,546 @@
from datetime import datetime
from typing import List, Literal, Optional, Union, cast
import tweepy
from pydantic import BaseModel
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import (
TweetDurationBuilder,
TweetExpansionsBuilder,
TweetPostBuilder,
TweetSearchBuilder,
)
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
ExpansionFilter,
TweetExpansionInputs,
TweetFieldsFilter,
TweetMediaFieldsFilter,
TweetPlaceFieldsFilter,
TweetPollFieldsFilter,
TweetReplySettingsFilter,
TweetTimeWindowInputs,
TweetUserFieldsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class Media(BaseModel):
discriminator: Literal["media"]
media_ids: Optional[List[str]] = None
media_tagged_user_ids: Optional[List[str]] = None
class DeepLink(BaseModel):
discriminator: Literal["deep_link"]
direct_message_deep_link: Optional[str] = None
class Poll(BaseModel):
discriminator: Literal["poll"]
poll_options: Optional[List[str]] = None
poll_duration_minutes: Optional[int] = None
class Place(BaseModel):
discriminator: Literal["place"]
place_id: Optional[str] = None
class Quote(BaseModel):
discriminator: Literal["quote"]
quote_tweet_id: Optional[str] = None
class TwitterPostTweetBlock(Block):
"""
Create a tweet on Twitter, with the option to attach one additional element such as media, a poll, a place, a quote, or a deep link.
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "tweet.write", "users.read", "offline.access"]
)
tweet_text: str | None = SchemaField(
description="Text of the tweet to post",
placeholder="Enter your tweet",
default=None,
advanced=False,
)
for_super_followers_only: bool = SchemaField(
description="Tweet exclusively for Super Followers",
placeholder="Enter for super followers only",
advanced=True,
default=False,
)
attachment: Union[Media, DeepLink, Poll, Place, Quote] | None = SchemaField(
discriminator="discriminator",
description="Additional tweet data (media, deep link, poll, place or quote)",
advanced=False,
default=Media(discriminator="media"),
)
exclude_reply_user_ids: Optional[List[str]] = SchemaField(
description="User IDs to exclude from reply Tweet thread. [ex - 6253282]",
placeholder="Enter user IDs to exclude",
advanced=True,
default=None,
)
in_reply_to_tweet_id: Optional[str] = SchemaField(
description="Tweet ID being replied to. Please note that in_reply_to_tweet_id needs to be in the request if exclude_reply_user_ids is present",
default=None,
placeholder="Enter in reply to tweet ID",
advanced=True,
)
reply_settings: TweetReplySettingsFilter = SchemaField(
description="Who can reply to the Tweet (mentionedUsers or following)",
placeholder="Enter reply settings",
advanced=True,
default=TweetReplySettingsFilter(All_Users=True),
)
class Output(BlockSchema):
tweet_id: str = SchemaField(description="ID of the created tweet")
tweet_url: str = SchemaField(description="URL to the tweet")
error: str = SchemaField(
description="Error message if the tweet posting failed"
)
def __init__(self):
super().__init__(
id="7bb0048a-a630-11ef-aeb8-abc0dadb9b12",
description="This block posts a tweet on Twitter.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterPostTweetBlock.Input,
output_schema=TwitterPostTweetBlock.Output,
test_input={
"tweet_text": "This is a test tweet.",
"credentials": TEST_CREDENTIALS_INPUT,
"attachment": {
"discriminator": "deep_link",
"direct_message_deep_link": "https://twitter.com/messages/compose",
},
"for_super_followers_only": False,
"exclude_reply_user_ids": [],
"in_reply_to_tweet_id": "",
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("tweet_id", "1234567890"),
("tweet_url", "https://twitter.com/user/status/1234567890"),
],
test_mock={
"post_tweet": lambda *args, **kwargs: (
"1234567890",
"https://twitter.com/user/status/1234567890",
)
},
)
def post_tweet(
self,
credentials: TwitterCredentials,
input_txt: str | None,
attachment: Union[Media, DeepLink, Poll, Place, Quote] | None,
for_super_followers_only: bool,
exclude_reply_user_ids: Optional[List[str]],
in_reply_to_tweet_id: Optional[str],
reply_settings: TweetReplySettingsFilter,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = (
TweetPostBuilder()
.add_text(input_txt)
.add_super_followers(for_super_followers_only)
.add_reply_settings(
exclude_reply_user_ids or [],
in_reply_to_tweet_id or "",
reply_settings,
)
)
if isinstance(attachment, Media):
params.add_media(
attachment.media_ids or [], attachment.media_tagged_user_ids or []
)
elif isinstance(attachment, DeepLink):
params.add_deep_link(attachment.direct_message_deep_link or "")
elif isinstance(attachment, Poll):
params.add_poll_options(attachment.poll_options or [])
params.add_poll_duration(attachment.poll_duration_minutes or 0)
elif isinstance(attachment, Place):
params.add_place(attachment.place_id or "")
elif isinstance(attachment, Quote):
params.add_quote(attachment.quote_tweet_id or "")
tweet = cast(Response, client.create_tweet(**params.build()))
if not tweet.data:
raise Exception("Failed to create tweet")
tweet_id = tweet.data["id"]
tweet_url = f"https://twitter.com/user/status/{tweet_id}"
return str(tweet_id), tweet_url
except tweepy.TweepyException:
raise
except Exception:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
tweet_id, tweet_url = self.post_tweet(
credentials,
input_data.tweet_text,
input_data.attachment,
input_data.for_super_followers_only,
input_data.exclude_reply_user_ids,
input_data.in_reply_to_tweet_id,
input_data.reply_settings,
)
yield "tweet_id", tweet_id
yield "tweet_url", tweet_url
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterDeleteTweetBlock(Block):
"""
Deletes a tweet on Twitter by its tweet ID
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "tweet.write", "users.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="ID of the tweet to delete",
placeholder="Enter tweet ID",
)
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether the tweet was successfully deleted"
)
error: str = SchemaField(
description="Error message if the tweet deletion failed"
)
def __init__(self):
super().__init__(
id="761babf0-a630-11ef-a03d-abceb082f58f",
description="This block deletes a tweet on Twitter.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterDeleteTweetBlock.Input,
output_schema=TwitterDeleteTweetBlock.Output,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"delete_tweet": lambda *args, **kwargs: True},
)
@staticmethod
def delete_tweet(credentials: TwitterCredentials, tweet_id: str):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.delete_tweet(id=tweet_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
except Exception:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.delete_tweet(
credentials,
input_data.tweet_id,
)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterSearchRecentTweetsBlock(Block):
"""
Searches recent public Tweets (Twitter's recent search endpoint covers roughly the last seven days)
"""
class Input(TweetExpansionInputs, TweetTimeWindowInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "offline.access"]
)
query: str = SchemaField(
description="Search query (up to 1024 characters)",
placeholder="Enter search query",
)
max_results: int = SchemaField(
description="Maximum number of results per page (10-500)",
placeholder="Enter max results",
default=10,
advanced=True,
)
pagination: str | None = SchemaField(
description="Token for pagination",
default="",
placeholder="Enter pagination token",
advanced=True,
)
class Output(BlockSchema):
# Commonly used outputs
tweet_ids: list[str] = SchemaField(description="All Tweet IDs")
tweet_texts: list[str] = SchemaField(description="All Tweet texts")
next_token: str = SchemaField(description="Next token for pagination")
# Complete Outputs for advanced use
data: list[dict] = SchemaField(description="Complete Tweet data")
included: dict = SchemaField(
description="Additional data that you have requested (Optional) via Expansions field"
)
meta: dict = SchemaField(
description="Provides metadata such as pagination info (next_token) or result counts"
)
# error
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="53e5cf8e-a630-11ef-ba85-df6d666fa5d5",
description="This block searches all public Tweets in Twitter history.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterSearchRecentTweetsBlock.Input,
output_schema=TwitterSearchRecentTweetsBlock.Output,
test_input={
"query": "from:twitterapi #twitterapi",
"credentials": TEST_CREDENTIALS_INPUT,
"max_results": 2,
"start_time": "2024-12-14T18:30:00.000Z",
"end_time": "2024-12-17T18:30:00.000Z",
"since_id": None,
"until_id": None,
"sort_order": None,
"pagination": None,
"expansions": None,
"media_fields": None,
"place_fields": None,
"poll_fields": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("tweet_ids", ["1373001119480344583", "1372627771717869568"]),
(
"tweet_texts",
[
"Looking to get started with the Twitter API but new to APIs in general?",
"Thanks to everyone who joined and made today a great session!",
],
),
(
"data",
[
{
"id": "1373001119480344583",
"text": "Looking to get started with the Twitter API but new to APIs in general?",
},
{
"id": "1372627771717869568",
"text": "Thanks to everyone who joined and made today a great session!",
},
],
),
],
test_mock={
"search_tweets": lambda *args, **kwargs: (
["1373001119480344583", "1372627771717869568"],
[
"Looking to get started with the Twitter API but new to APIs in general?",
"Thanks to everyone who joined and made today a great session!",
],
[
{
"id": "1373001119480344583",
"text": "Looking to get started with the Twitter API but new to APIs in general?",
},
{
"id": "1372627771717869568",
"text": "Thanks to everyone who joined and made today a great session!",
},
],
{},
{},
None,
)
},
)
@staticmethod
def search_tweets(
credentials: TwitterCredentials,
query: str,
max_results: int,
start_time: datetime | None,
end_time: datetime | None,
since_id: str | None,
until_id: str | None,
sort_order: str | None,
pagination: str | None,
expansions: ExpansionFilter | None,
media_fields: TweetMediaFieldsFilter | None,
place_fields: TweetPlaceFieldsFilter | None,
poll_fields: TweetPollFieldsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
# Building common params
params = (
TweetSearchBuilder()
.add_query(query)
.add_pagination(max_results, pagination)
.build()
)
# Adding expansions to params if required by the user
params = (
TweetExpansionsBuilder(params)
.add_expansions(expansions)
.add_media_fields(media_fields)
.add_place_fields(place_fields)
.add_poll_fields(poll_fields)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
# Adding time window to params if required by the user
params = (
TweetDurationBuilder(params)
.add_start_time(start_time)
.add_end_time(end_time)
.add_since_id(since_id)
.add_until_id(until_id)
.add_sort_order(sort_order)
.build()
)
response = cast(Response, client.search_recent_tweets(**params))
if not response.data and not response.meta:
raise Exception("No tweets found")
meta = {}
tweet_ids = []
tweet_texts = []
next_token = None
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
if response.data:
tweet_ids = [str(tweet.id) for tweet in response.data]
tweet_texts = [tweet.text for tweet in response.data]
return tweet_ids, tweet_texts, data, included, meta, next_token
raise Exception("No tweets found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, texts, data, included, meta, next_token = self.search_tweets(
credentials,
input_data.query,
input_data.max_results,
input_data.start_time,
input_data.end_time,
input_data.since_id,
input_data.until_id,
input_data.sort_order,
input_data.pagination,
input_data.expansions,
input_data.media_fields,
input_data.place_fields,
input_data.poll_fields,
input_data.tweet_fields,
input_data.user_fields,
)
if ids:
yield "tweet_ids", ids
if texts:
yield "tweet_texts", texts
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)

View File

@@ -0,0 +1,222 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import TweetExpansionsBuilder
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
ExpansionFilter,
TweetExcludesFilter,
TweetExpansionInputs,
TweetFieldsFilter,
TweetMediaFieldsFilter,
TweetPlaceFieldsFilter,
TweetPollFieldsFilter,
TweetUserFieldsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterGetQuoteTweetsBlock(Block):
"""
Gets quote tweets for a specified tweet ID
"""
class Input(TweetExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="ID of the tweet to get quotes for",
placeholder="Enter tweet ID",
)
max_results: int | None = SchemaField(
description="Number of results to return (max 100)",
default=10,
advanced=True,
)
exclude: TweetExcludesFilter | None = SchemaField(
description="Types of tweets to exclude", advanced=True, default=None
)
pagination_token: str | None = SchemaField(
description="Token for pagination",
advanced=True,
default="",
)
class Output(BlockSchema):
# Commonly used outputs
ids: list = SchemaField(description="All Tweet IDs ")
texts: list = SchemaField(description="All Tweet texts")
next_token: str = SchemaField(description="Next token for pagination")
# Complete Outputs for advanced use
data: list[dict] = SchemaField(description="Complete Tweet data")
included: dict = SchemaField(
description="Additional data that you have requested (Optional) via Expansions field"
)
meta: dict = SchemaField(
description="Provides metadata such as pagination info (next_token) or result counts"
)
# error
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="9fbdd208-a630-11ef-9b97-ab7a3a695ca3",
description="This block gets quote tweets for a specific tweet.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetQuoteTweetsBlock.Input,
output_schema=TwitterGetQuoteTweetsBlock.Output,
test_input={
"tweet_id": "1234567890",
"max_results": 2,
"pagination_token": None,
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["12345", "67890"]),
("texts", ["Tweet 1", "Tweet 2"]),
(
"data",
[
{"id": "12345", "text": "Tweet 1"},
{"id": "67890", "text": "Tweet 2"},
],
),
],
test_mock={
"get_quote_tweets": lambda *args, **kwargs: (
["12345", "67890"],
["Tweet 1", "Tweet 2"],
[
{"id": "12345", "text": "Tweet 1"},
{"id": "67890", "text": "Tweet 2"},
],
{},
{},
None,
)
},
)
@staticmethod
def get_quote_tweets(
credentials: TwitterCredentials,
tweet_id: str,
max_results: int | None,
exclude: TweetExcludesFilter | None,
pagination_token: str | None,
expansions: ExpansionFilter | None,
media_fields: TweetMediaFieldsFilter | None,
place_fields: TweetPlaceFieldsFilter | None,
poll_fields: TweetPollFieldsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": tweet_id,
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
"exclude": None if exclude == TweetExcludesFilter() else exclude,
"user_auth": False,
}
params = (
TweetExpansionsBuilder(params)
.add_expansions(expansions)
.add_media_fields(media_fields)
.add_place_fields(place_fields)
.add_poll_fields(poll_fields)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_quote_tweets(**params))
meta = {}
tweet_ids = []
tweet_texts = []
next_token = None
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
if response.data:
tweet_ids = [str(tweet.id) for tweet in response.data]
tweet_texts = [tweet.text for tweet in response.data]
return tweet_ids, tweet_texts, data, included, meta, next_token
raise Exception("No quote tweets found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, texts, data, included, meta, next_token = self.get_quote_tweets(
credentials,
input_data.tweet_id,
input_data.max_results,
input_data.exclude,
input_data.pagination_token,
input_data.expansions,
input_data.media_fields,
input_data.place_fields,
input_data.poll_fields,
input_data.tweet_fields,
input_data.user_fields,
)
if ids:
yield "ids", ids
if texts:
yield "texts", texts
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)

View File

@@ -0,0 +1,363 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import UserExpansionsBuilder
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
TweetFieldsFilter,
TweetUserFieldsFilter,
UserExpansionInputs,
UserExpansionsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterRetweetBlock(Block):
"""
Retweets a tweet on Twitter
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "tweet.write", "users.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="ID of the tweet to retweet",
placeholder="Enter tweet ID",
)
class Output(BlockSchema):
success: bool = SchemaField(description="Whether the retweet was successful")
error: str = SchemaField(description="Error message if the retweet failed")
def __init__(self):
super().__init__(
id="bd7b8d3a-a630-11ef-be96-6f4aa4c3c4f4",
description="This block retweets a tweet on Twitter.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterRetweetBlock.Input,
output_schema=TwitterRetweetBlock.Output,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"retweet": lambda *args, **kwargs: True},
)
@staticmethod
def retweet(
credentials: TwitterCredentials,
tweet_id: str,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.retweet(
tweet_id=tweet_id,
user_auth=False,
)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.retweet(
credentials,
input_data.tweet_id,
)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterRemoveRetweetBlock(Block):
"""
Removes a retweet on Twitter
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "tweet.write", "users.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="ID of the tweet to remove retweet",
placeholder="Enter tweet ID",
)
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether the retweet was successfully removed"
)
error: str = SchemaField(description="Error message if the removal failed")
def __init__(self):
super().__init__(
id="b6e663f0-a630-11ef-a7f0-8b9b0c542ff8",
description="This block removes a retweet on Twitter.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterRemoveRetweetBlock.Input,
output_schema=TwitterRemoveRetweetBlock.Output,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"remove_retweet": lambda *args, **kwargs: True},
)
@staticmethod
def remove_retweet(
credentials: TwitterCredentials,
tweet_id: str,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.unretweet(
source_tweet_id=tweet_id,
user_auth=False,
)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.remove_retweet(
credentials,
input_data.tweet_id,
)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetRetweetersBlock(Block):
"""
Gets information about who has retweeted a tweet
"""
class Input(UserExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="ID of the tweet to get retweeters for",
placeholder="Enter tweet ID",
)
max_results: int | None = SchemaField(
description="Maximum number of results per page (1-100)",
default=10,
placeholder="Enter max results",
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token for pagination",
placeholder="Enter pagination token",
default="",
)
class Output(BlockSchema):
# Commonly used outputs
ids: list = SchemaField(description="List of user ids who retweeted")
names: list = SchemaField(description="List of user names who retweeted")
usernames: list = SchemaField(
description="List of user usernames who retweeted"
)
next_token: str = SchemaField(description="Token for next page of results")
# Complete Outputs for advanced use
data: list[dict] = SchemaField(description="Complete Tweet data")
included: dict = SchemaField(
description="Additional data that you have requested (Optional) via Expansions field"
)
meta: dict = SchemaField(
description="Provides metadata such as pagination info (next_token) or result counts"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="ad7aa6fa-a630-11ef-a6b0-e7ca640aa030",
description="This block gets information about who has retweeted a tweet.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetRetweetersBlock.Input,
output_schema=TwitterGetRetweetersBlock.Output,
test_input={
"tweet_id": "1234567890",
"credentials": TEST_CREDENTIALS_INPUT,
"max_results": 1,
"pagination_token": "",
"expansions": None,
"media_fields": None,
"place_fields": None,
"poll_fields": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["12345"]),
("names", ["Test User"]),
("usernames", ["testuser"]),
(
"data",
[{"id": "12345", "name": "Test User", "username": "testuser"}],
),
],
test_mock={
"get_retweeters": lambda *args, **kwargs: (
[{"id": "12345", "name": "Test User", "username": "testuser"}],
{},
{},
["12345"],
["Test User"],
["testuser"],
None,
)
},
)
@staticmethod
def get_retweeters(
credentials: TwitterCredentials,
tweet_id: str,
max_results: int | None,
pagination_token: str | None,
expansions: UserExpansionsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": tweet_id,
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
"user_auth": False,
}
params = (
UserExpansionsBuilder(params)
.add_expansions(expansions)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_retweeters(**params))
meta = {}
ids = []
names = []
usernames = []
next_token = None
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
if response.data:
ids = [str(user.id) for user in response.data]
names = [user.name for user in response.data]
usernames = [user.username for user in response.data]
return data, included, meta, ids, names, usernames, next_token
raise Exception("No retweeters found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
data, included, meta, ids, names, usernames, next_token = (
self.get_retweeters(
credentials,
input_data.tweet_id,
input_data.max_results,
input_data.pagination_token,
input_data.expansions,
input_data.tweet_fields,
input_data.user_fields,
)
)
if ids:
yield "ids", ids
if names:
yield "names", names
if usernames:
yield "usernames", usernames
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)

View File

@@ -0,0 +1,757 @@
from datetime import datetime
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import (
TweetDurationBuilder,
TweetExpansionsBuilder,
)
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
ExpansionFilter,
TweetExpansionInputs,
TweetFieldsFilter,
TweetMediaFieldsFilter,
TweetPlaceFieldsFilter,
TweetPollFieldsFilter,
TweetTimeWindowInputs,
TweetUserFieldsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterGetUserMentionsBlock(Block):
"""
Returns Tweets that mention a single user, identified by their user ID
"""
class Input(TweetExpansionInputs, TweetTimeWindowInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "offline.access"]
)
user_id: str = SchemaField(
description="Unique identifier of the user for whom to return Tweets mentioning the user",
placeholder="Enter user ID",
)
max_results: int | None = SchemaField(
description="Number of tweets to retrieve (5-100)",
default=10,
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token for pagination", default="", advanced=True
)
class Output(BlockSchema):
# Commonly used outputs
ids: list[str] = SchemaField(description="List of Tweet IDs")
texts: list[str] = SchemaField(description="All Tweet texts")
userIds: list[str] = SchemaField(
description="List of user ids that mentioned the user"
)
userNames: list[str] = SchemaField(
description="List of user names that mentioned the user"
)
next_token: str = SchemaField(description="Next token for pagination")
# Complete Outputs for advanced use
data: list[dict] = SchemaField(description="Complete Tweet data")
included: dict = SchemaField(
description="Additional data that you have requested (Optional) via Expansions field"
)
meta: dict = SchemaField(
description="Provides metadata such as pagination info (next_token) or result counts"
)
# error
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="e01c890c-a630-11ef-9e20-37da24888bd0",
description="This block retrieves Tweets mentioning a specific user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetUserMentionsBlock.Input,
output_schema=TwitterGetUserMentionsBlock.Output,
test_input={
"user_id": "12345",
"credentials": TEST_CREDENTIALS_INPUT,
"max_results": 2,
"start_time": "2024-12-14T18:30:00.000Z",
"end_time": "2024-12-17T18:30:00.000Z",
"since_id": "",
"until_id": "",
"sort_order": None,
"pagination_token": None,
"expansions": None,
"media_fields": None,
"place_fields": None,
"poll_fields": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["1373001119480344583", "1372627771717869568"]),
("texts", ["Test mention 1", "Test mention 2"]),
("userIds", ["67890", "67891"]),
("userNames", ["testuser1", "testuser2"]),
(
"data",
[
{"id": "1373001119480344583", "text": "Test mention 1"},
{"id": "1372627771717869568", "text": "Test mention 2"},
],
),
],
test_mock={
"get_mentions": lambda *args, **kwargs: (
["1373001119480344583", "1372627771717869568"],
["Test mention 1", "Test mention 2"],
["67890", "67891"],
["testuser1", "testuser2"],
[
{"id": "1373001119480344583", "text": "Test mention 1"},
{"id": "1372627771717869568", "text": "Test mention 2"},
],
{},
{},
None,
)
},
)
@staticmethod
def get_mentions(
credentials: TwitterCredentials,
user_id: str,
max_results: int | None,
start_time: datetime | None,
end_time: datetime | None,
since_id: str | None,
until_id: str | None,
sort_order: str | None,
pagination: str | None,
expansions: ExpansionFilter | None,
media_fields: TweetMediaFieldsFilter | None,
place_fields: TweetPlaceFieldsFilter | None,
poll_fields: TweetPollFieldsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": user_id,
"max_results": max_results,
"pagination_token": None if pagination == "" else pagination,
"user_auth": False,
}
# Add expansions to params if required by the user
params = (
TweetExpansionsBuilder(params)
.add_expansions(expansions)
.add_media_fields(media_fields)
.add_place_fields(place_fields)
.add_poll_fields(poll_fields)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
# Add time window parameters if required by the user
params = (
TweetDurationBuilder(params)
.add_start_time(start_time)
.add_end_time(end_time)
.add_since_id(since_id)
.add_until_id(until_id)
.add_sort_order(sort_order)
.build()
)
response = cast(
Response,
client.get_users_mentions(**params),
)
if not response.data and not response.meta:
raise Exception("No tweets found")
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
meta = response.meta or {}
next_token = meta.get("next_token", "")
tweet_ids = []
tweet_texts = []
user_ids = []
user_names = []
if response.data:
tweet_ids = [str(tweet.id) for tweet in response.data]
tweet_texts = [tweet.text for tweet in response.data]
if "users" in included:
user_ids = [str(user["id"]) for user in included["users"]]
user_names = [user["username"] for user in included["users"]]
return (
tweet_ids,
tweet_texts,
user_ids,
user_names,
data,
included,
meta,
next_token,
)
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, texts, user_ids, user_names, data, included, meta, next_token = (
self.get_mentions(
credentials,
input_data.user_id,
input_data.max_results,
input_data.start_time,
input_data.end_time,
input_data.since_id,
input_data.until_id,
input_data.sort_order,
input_data.pagination_token,
input_data.expansions,
input_data.media_fields,
input_data.place_fields,
input_data.poll_fields,
input_data.tweet_fields,
input_data.user_fields,
)
)
if ids:
yield "ids", ids
if texts:
yield "texts", texts
if user_ids:
yield "userIds", user_ids
if user_names:
yield "userNames", user_names
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetHomeTimelineBlock(Block):
"""
Returns a collection of the most recent Tweets and Retweets posted by you and users you follow
"""
class Input(TweetExpansionInputs, TweetTimeWindowInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "offline.access"]
)
max_results: int | None = SchemaField(
description="Number of tweets to retrieve (5-100)",
default=10,
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token for pagination", default="", advanced=True
)
class Output(BlockSchema):
# Commonly used outputs
ids: list[str] = SchemaField(description="List of Tweet IDs")
texts: list[str] = SchemaField(description="All Tweet texts")
userIds: list[str] = SchemaField(
description="List of user ids that authored the tweets"
)
userNames: list[str] = SchemaField(
description="List of user names that authored the tweets"
)
next_token: str = SchemaField(description="Next token for pagination")
# Complete Outputs for advanced use
data: list[dict] = SchemaField(description="Complete Tweet data")
included: dict = SchemaField(
description="Additional data that you have requested (Optional) via Expansions field"
)
meta: dict = SchemaField(
description="Provides metadata such as pagination info (next_token) or result counts"
)
# error
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="d222a070-a630-11ef-a18a-3f52f76c6962",
description="This block retrieves the authenticated user's home timeline.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetHomeTimelineBlock.Input,
output_schema=TwitterGetHomeTimelineBlock.Output,
test_input={
"credentials": TEST_CREDENTIALS_INPUT,
"max_results": 2,
"start_time": "2024-12-14T18:30:00.000Z",
"end_time": "2024-12-17T18:30:00.000Z",
"since_id": None,
"until_id": None,
"sort_order": None,
"pagination_token": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["1373001119480344583", "1372627771717869568"]),
("texts", ["Test tweet 1", "Test tweet 2"]),
("userIds", ["67890", "67891"]),
("userNames", ["testuser1", "testuser2"]),
(
"data",
[
{"id": "1373001119480344583", "text": "Test tweet 1"},
{"id": "1372627771717869568", "text": "Test tweet 2"},
],
),
],
test_mock={
"get_timeline": lambda *args, **kwargs: (
["1373001119480344583", "1372627771717869568"],
["Test tweet 1", "Test tweet 2"],
["67890", "67891"],
["testuser1", "testuser2"],
[
{"id": "1373001119480344583", "text": "Test tweet 1"},
{"id": "1372627771717869568", "text": "Test tweet 2"},
],
{},
{},
None,
)
},
)
@staticmethod
def get_timeline(
credentials: TwitterCredentials,
max_results: int | None,
start_time: datetime | None,
end_time: datetime | None,
since_id: str | None,
until_id: str | None,
sort_order: str | None,
pagination: str | None,
expansions: ExpansionFilter | None,
media_fields: TweetMediaFieldsFilter | None,
place_fields: TweetPlaceFieldsFilter | None,
poll_fields: TweetPollFieldsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"max_results": max_results,
"pagination_token": None if pagination == "" else pagination,
"user_auth": False,
}
# Add expansions to params if required by the user
params = (
TweetExpansionsBuilder(params)
.add_expansions(expansions)
.add_media_fields(media_fields)
.add_place_fields(place_fields)
.add_poll_fields(poll_fields)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
# Add time window parameters if required by the user
params = (
TweetDurationBuilder(params)
.add_start_time(start_time)
.add_end_time(end_time)
.add_since_id(since_id)
.add_until_id(until_id)
.add_sort_order(sort_order)
.build()
)
response = cast(
Response,
client.get_home_timeline(**params),
)
if not response.data and not response.meta:
raise Exception("No tweets found")
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
meta = response.meta or {}
next_token = meta.get("next_token", "")
tweet_ids = []
tweet_texts = []
user_ids = []
user_names = []
if response.data:
tweet_ids = [str(tweet.id) for tweet in response.data]
tweet_texts = [tweet.text for tweet in response.data]
if "users" in included:
user_ids = [str(user["id"]) for user in included["users"]]
user_names = [user["username"] for user in included["users"]]
return (
tweet_ids,
tweet_texts,
user_ids,
user_names,
data,
included,
meta,
next_token,
)
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, texts, user_ids, user_names, data, included, meta, next_token = (
self.get_timeline(
credentials,
input_data.max_results,
input_data.start_time,
input_data.end_time,
input_data.since_id,
input_data.until_id,
input_data.sort_order,
input_data.pagination_token,
input_data.expansions,
input_data.media_fields,
input_data.place_fields,
input_data.poll_fields,
input_data.tweet_fields,
input_data.user_fields,
)
)
if ids:
yield "ids", ids
if texts:
yield "texts", texts
if user_ids:
yield "userIds", user_ids
if user_names:
yield "userNames", user_names
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetUserTweetsBlock(Block):
"""
Returns Tweets composed by a single user, specified by the requested user ID
"""
class Input(TweetExpansionInputs, TweetTimeWindowInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "offline.access"]
)
user_id: str = SchemaField(
description="Unique identifier of the Twitter account (user ID) for whom to return results",
placeholder="Enter user ID",
)
max_results: int | None = SchemaField(
description="Number of tweets to retrieve (5-100)",
default=10,
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token for pagination", default="", advanced=True
)
class Output(BlockSchema):
# Commonly used outputs
ids: list[str] = SchemaField(description="List of Tweet IDs")
texts: list[str] = SchemaField(description="All Tweet texts")
userIds: list[str] = SchemaField(
description="List of user ids that authored the tweets"
)
userNames: list[str] = SchemaField(
description="List of user names that authored the tweets"
)
next_token: str = SchemaField(description="Next token for pagination")
# Complete Outputs for advanced use
data: list[dict] = SchemaField(description="Complete Tweet data")
included: dict = SchemaField(
description="Additional data that you have requested (Optional) via Expansions field"
)
meta: dict = SchemaField(
description="Provides metadata such as pagination info (next_token) or result counts"
)
# error
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="c44c3ef2-a630-11ef-9ff7-eb7b5ea3a5cb",
description="This block retrieves Tweets composed by a single user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetUserTweetsBlock.Input,
output_schema=TwitterGetUserTweetsBlock.Output,
test_input={
"user_id": "12345",
"credentials": TEST_CREDENTIALS_INPUT,
"max_results": 2,
"start_time": "2024-12-14T18:30:00.000Z",
"end_time": "2024-12-17T18:30:00.000Z",
"since_id": None,
"until_id": None,
"sort_order": None,
"pagination_token": None,
"expansions": None,
"media_fields": None,
"place_fields": None,
"poll_fields": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["1373001119480344583", "1372627771717869568"]),
("texts", ["Test tweet 1", "Test tweet 2"]),
("userIds", ["67890", "67891"]),
("userNames", ["testuser1", "testuser2"]),
(
"data",
[
{"id": "1373001119480344583", "text": "Test tweet 1"},
{"id": "1372627771717869568", "text": "Test tweet 2"},
],
),
],
test_mock={
"get_user_tweets": lambda *args, **kwargs: (
["1373001119480344583", "1372627771717869568"],
["Test tweet 1", "Test tweet 2"],
["67890", "67891"],
["testuser1", "testuser2"],
[
{"id": "1373001119480344583", "text": "Test tweet 1"},
{"id": "1372627771717869568", "text": "Test tweet 2"},
],
{},
{},
None,
)
},
)
@staticmethod
def get_user_tweets(
credentials: TwitterCredentials,
user_id: str,
max_results: int | None,
start_time: datetime | None,
end_time: datetime | None,
since_id: str | None,
until_id: str | None,
sort_order: str | None,
pagination: str | None,
expansions: ExpansionFilter | None,
media_fields: TweetMediaFieldsFilter | None,
place_fields: TweetPlaceFieldsFilter | None,
poll_fields: TweetPollFieldsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": user_id,
"max_results": max_results,
"pagination_token": None if pagination == "" else pagination,
"user_auth": False,
}
# Add expansions to params if required by the user
params = (
TweetExpansionsBuilder(params)
.add_expansions(expansions)
.add_media_fields(media_fields)
.add_place_fields(place_fields)
.add_poll_fields(poll_fields)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
# Add time window parameters if required by the user
params = (
TweetDurationBuilder(params)
.add_start_time(start_time)
.add_end_time(end_time)
.add_since_id(since_id)
.add_until_id(until_id)
.add_sort_order(sort_order)
.build()
)
response = cast(
Response,
client.get_users_tweets(**params),
)
if not response.data and not response.meta:
raise Exception("No tweets found")
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
meta = response.meta or {}
next_token = meta.get("next_token", "")
tweet_ids = []
tweet_texts = []
user_ids = []
user_names = []
if response.data:
tweet_ids = [str(tweet.id) for tweet in response.data]
tweet_texts = [tweet.text for tweet in response.data]
if "users" in included:
user_ids = [str(user["id"]) for user in included["users"]]
user_names = [user["username"] for user in included["users"]]
return (
tweet_ids,
tweet_texts,
user_ids,
user_names,
data,
included,
meta,
next_token,
)
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, texts, user_ids, user_names, data, included, meta, next_token = (
self.get_user_tweets(
credentials,
input_data.user_id,
input_data.max_results,
input_data.start_time,
input_data.end_time,
input_data.since_id,
input_data.until_id,
input_data.sort_order,
input_data.pagination_token,
input_data.expansions,
input_data.media_fields,
input_data.place_fields,
input_data.poll_fields,
input_data.tweet_fields,
input_data.user_fields,
)
)
if ids:
yield "ids", ids
if texts:
yield "texts", texts
if user_ids:
yield "userIds", user_ids
if user_names:
yield "userNames", user_names
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)
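# --- Illustrative usage sketch (not part of this diff) -----------------------
# Each block above exposes run() as a generator of (output_name, value) pairs,
# so a caller can fold the stream into a dict. `input_data` is assumed to be a
# fully populated Input instance and `credentials` a valid TwitterCredentials
# object; their construction is not shown here.
def collect_block_outputs(
    block: Block, input_data, credentials: TwitterCredentials
) -> dict:
    outputs: dict = {}
    for name, value in block.run(input_data, credentials=credentials):
        outputs[name] = value
    return outputs
# e.g. collect_block_outputs(TwitterGetUserTweetsBlock(), input_data, credentials)
# returns a dict with keys such as "ids", "texts" and "data", or "error" on failure.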


@@ -0,0 +1,361 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import TweetExpansionsBuilder
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
ExpansionFilter,
TweetExpansionInputs,
TweetFieldsFilter,
TweetMediaFieldsFilter,
TweetPlaceFieldsFilter,
TweetPollFieldsFilter,
TweetUserFieldsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterGetTweetBlock(Block):
"""
Returns information about a single Tweet specified by the requested ID
"""
class Input(TweetExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "offline.access"]
)
tweet_id: str = SchemaField(
description="Unique identifier of the Tweet to request (ex: 1460323737035677698)",
placeholder="Enter tweet ID",
)
class Output(BlockSchema):
# Commonly used outputs
id: str = SchemaField(description="Tweet ID")
text: str = SchemaField(description="Tweet text")
userId: str = SchemaField(description="ID of the tweet author")
userName: str = SchemaField(description="Username of the tweet author")
# Complete Outputs for advanced use
data: dict = SchemaField(description="Tweet data")
included: dict = SchemaField(
description="Additional data that you have requested (Optional) via Expansions field"
)
meta: dict = SchemaField(description="Metadata about the tweet")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="f5155c3a-a630-11ef-9cc1-a309988b4d92",
description="This block retrieves information about a specific Tweet.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetTweetBlock.Input,
output_schema=TwitterGetTweetBlock.Output,
test_input={
"tweet_id": "1460323737035677698",
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"media_fields": None,
"place_fields": None,
"poll_fields": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("id", "1460323737035677698"),
("text", "Test tweet content"),
("userId", "12345"),
("userName", "testuser"),
("data", {"id": "1460323737035677698", "text": "Test tweet content"}),
("included", {"users": [{"id": "12345", "username": "testuser"}]}),
("meta", {"result_count": 1}),
],
test_mock={
"get_tweet": lambda *args, **kwargs: (
{"id": "1460323737035677698", "text": "Test tweet content"},
{"users": [{"id": "12345", "username": "testuser"}]},
{"result_count": 1},
"12345",
"testuser",
)
},
)
@staticmethod
def get_tweet(
credentials: TwitterCredentials,
tweet_id: str,
expansions: ExpansionFilter | None,
media_fields: TweetMediaFieldsFilter | None,
place_fields: TweetPlaceFieldsFilter | None,
poll_fields: TweetPollFieldsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {"id": tweet_id, "user_auth": False}
# Add expansions to params if required by the user
params = (
TweetExpansionsBuilder(params)
.add_expansions(expansions)
.add_media_fields(media_fields)
.add_place_fields(place_fields)
.add_poll_fields(poll_fields)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_tweet(**params))
meta = {}
user_id = ""
user_name = ""
if response.meta:
meta = response.meta
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_dict(response.data)
if included and "users" in included:
user_id = str(included["users"][0]["id"])
user_name = included["users"][0]["username"]
if response.data:
return data, included, meta, user_id, user_name
raise Exception("Tweet not found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
tweet_data, included, meta, user_id, user_name = self.get_tweet(
credentials,
input_data.tweet_id,
input_data.expansions,
input_data.media_fields,
input_data.place_fields,
input_data.poll_fields,
input_data.tweet_fields,
input_data.user_fields,
)
yield "id", str(tweet_data["id"])
yield "text", tweet_data["text"]
if user_id:
yield "userId", user_id
if user_name:
yield "userName", user_name
yield "data", tweet_data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetTweetsBlock(Block):
"""
Returns information about multiple Tweets specified by the requested IDs
"""
class Input(TweetExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["tweet.read", "users.read", "offline.access"]
)
tweet_ids: list[str] = SchemaField(
description="List of Tweet IDs to request (up to 100)",
placeholder="Enter tweet IDs",
)
class Output(BlockSchema):
# Commonly used outputs
ids: list[str] = SchemaField(description="All Tweet IDs")
texts: list[str] = SchemaField(description="All Tweet texts")
userIds: list[str] = SchemaField(
description="List of user ids that authored the tweets"
)
userNames: list[str] = SchemaField(
description="List of user names that authored the tweets"
)
# Complete Outputs for advanced use
data: list[dict] = SchemaField(description="Complete Tweet data")
included: dict = SchemaField(
description="Additional data that you have requested (Optional) via Expansions field"
)
meta: dict = SchemaField(description="Metadata about the tweets")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="e7cc5420-a630-11ef-bfaf-13bdd8096a51",
description="This block retrieves information about multiple Tweets.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetTweetsBlock.Input,
output_schema=TwitterGetTweetsBlock.Output,
test_input={
"tweet_ids": ["1460323737035677698"],
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"media_fields": None,
"place_fields": None,
"poll_fields": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["1460323737035677698"]),
("texts", ["Test tweet content"]),
("userIds", ["67890"]),
("userNames", ["testuser1"]),
("data", [{"id": "1460323737035677698", "text": "Test tweet content"}]),
("included", {"users": [{"id": "67890", "username": "testuser1"}]}),
("meta", {"result_count": 1}),
],
test_mock={
"get_tweets": lambda *args, **kwargs: (
["1460323737035677698"], # ids
["Test tweet content"], # texts
["67890"], # user_ids
["testuser1"], # user_names
[
{"id": "1460323737035677698", "text": "Test tweet content"}
], # data
{"users": [{"id": "67890", "username": "testuser1"}]}, # included
{"result_count": 1}, # meta
)
},
)
@staticmethod
def get_tweets(
credentials: TwitterCredentials,
tweet_ids: list[str],
expansions: ExpansionFilter | None,
media_fields: TweetMediaFieldsFilter | None,
place_fields: TweetPlaceFieldsFilter | None,
poll_fields: TweetPollFieldsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {"ids": tweet_ids, "user_auth": False}
# Add expansions to params if required by the user
params = (
TweetExpansionsBuilder(params)
.add_expansions(expansions)
.add_media_fields(media_fields)
.add_place_fields(place_fields)
.add_poll_fields(poll_fields)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_tweets(**params))
if not response.data and not response.meta:
raise Exception("No tweets found")
tweet_ids = []
tweet_texts = []
user_ids = []
user_names = []
meta = {}
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
if response.data:
tweet_ids = [str(tweet.id) for tweet in response.data]
tweet_texts = [tweet.text for tweet in response.data]
if included and "users" in included:
for user in included["users"]:
user_ids.append(str(user["id"]))
user_names.append(user["username"])
if response.meta:
meta = response.meta
return tweet_ids, tweet_texts, user_ids, user_names, data, included, meta
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, texts, user_ids, user_names, data, included, meta = self.get_tweets(
credentials,
input_data.tweet_ids,
input_data.expansions,
input_data.media_fields,
input_data.place_fields,
input_data.poll_fields,
input_data.tweet_fields,
input_data.user_fields,
)
if ids:
yield "ids", ids
if texts:
yield "texts", texts
if user_ids:
yield "userIds", user_ids
if user_names:
yield "userNames", user_names
if data:
yield "data", data
if included:
yield "included", included
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)
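# --- Illustrative sketch (not part of this diff) ------------------------------
# Approximate shape of the parameters that get_tweets() above passes to
# tweepy.Client.get_tweets when no expansions or field filters are supplied.
# This assumes the TweetExpansionsBuilder add_* methods leave params untouched
# for None inputs, which is how the builder is used throughout this file.
EXAMPLE_GET_TWEETS_PARAMS = {
    "ids": ["1460323737035677698"],  # up to 100 Tweet IDs per request
    "user_auth": False,  # bearer-token (OAuth 2.0) auth rather than user-context auth
}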


@@ -0,0 +1,175 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import UserExpansionsBuilder
from backend.blocks.twitter._serializer import IncludesSerializer
from backend.blocks.twitter._types import (
TweetFieldsFilter,
TweetUserFieldsFilter,
UserExpansionInputs,
UserExpansionsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterGetBlockedUsersBlock(Block):
"""
Get a list of users who are blocked by the authenticating user
"""
class Input(UserExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["users.read", "offline.access", "block.read"]
)
max_results: int | None = SchemaField(
description="Maximum number of results to return (1-1000, default 100)",
placeholder="Enter max results",
default=10,
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token for retrieving next/previous page of results",
placeholder="Enter pagination token",
default="",
advanced=True,
)
class Output(BlockSchema):
user_ids: list[str] = SchemaField(description="List of blocked user IDs")
usernames_: list[str] = SchemaField(description="List of blocked usernames")
included: dict = SchemaField(
description="Additional data requested via expansions"
)
meta: dict = SchemaField(description="Metadata including pagination info")
next_token: str = SchemaField(description="Next token for pagination")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="05f409e8-a631-11ef-ae89-93de863ee30d",
description="This block retrieves a list of users blocked by the authenticating user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetBlockedUsersBlock.Input,
output_schema=TwitterGetBlockedUsersBlock.Output,
test_input={
"max_results": 10,
"pagination_token": "",
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("user_ids", ["12345", "67890"]),
("usernames_", ["testuser1", "testuser2"]),
],
test_mock={
"get_blocked_users": lambda *args, **kwargs: (
{}, # included
{}, # meta
["12345", "67890"], # user_ids
["testuser1", "testuser2"], # usernames
None, # next_token
)
},
)
@staticmethod
def get_blocked_users(
credentials: TwitterCredentials,
max_results: int | None,
pagination_token: str | None,
expansions: UserExpansionsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
"user_auth": False,
}
params = (
UserExpansionsBuilder(params)
.add_expansions(expansions)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_blocked(**params))
meta = {}
user_ids = []
usernames = []
next_token = None
included = IncludesSerializer.serialize(response.includes)
if response.data:
for user in response.data:
user_ids.append(str(user.id))
usernames.append(user.username)
if response.meta:
meta = response.meta
if "next_token" in meta:
next_token = meta["next_token"]
if user_ids and usernames:
return included, meta, user_ids, usernames, next_token
else:
raise tweepy.TweepyException("No blocked users found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
included, meta, user_ids, usernames, next_token = self.get_blocked_users(
credentials,
input_data.max_results,
input_data.pagination_token,
input_data.expansions,
input_data.tweet_fields,
input_data.user_fields,
)
if user_ids:
yield "user_ids", user_ids
if usernames:
yield "usernames_", usernames
if included:
yield "included", included
if meta:
yield "meta", meta
if next_token:
yield "next_token", next_token
except Exception as e:
yield "error", handle_tweepy_exception(e)
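# --- Illustrative pagination sketch (not part of this diff) -------------------
# get_blocked_users() returns a next_token while more pages exist, so a caller
# could walk every page like this. `credentials` is assumed to be a valid
# TwitterCredentials instance; note the helper raises tweepy.TweepyException
# if the account blocks nobody at all.
def iterate_all_blocked_user_ids(credentials: TwitterCredentials) -> list[str]:
    all_ids: list[str] = []
    token: str | None = ""
    while True:
        _, _, user_ids, _, next_token = TwitterGetBlockedUsersBlock.get_blocked_users(
            credentials,
            max_results=100,
            pagination_token=token,
            expansions=None,
            tweet_fields=None,
            user_fields=None,
        )
        all_ids.extend(user_ids)
        if not next_token:
            return all_ids
        token = next_token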


@@ -0,0 +1,510 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import UserExpansionsBuilder
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
TweetFieldsFilter,
TweetUserFieldsFilter,
UserExpansionInputs,
UserExpansionsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterUnfollowUserBlock(Block):
"""
Allows a user to unfollow another user specified by target user ID.
The request succeeds with no action when the authenticated user sends a request to a user they're not following or have already unfollowed.
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["users.read", "users.write", "follows.write", "offline.access"]
)
target_user_id: str = SchemaField(
description="The user ID of the user that you would like to unfollow",
placeholder="Enter target user ID",
)
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether the unfollow action was successful"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="37e386a4-a631-11ef-b7bd-b78204b35fa4",
description="This block unfollows a specified Twitter user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterUnfollowUserBlock.Input,
output_schema=TwitterUnfollowUserBlock.Output,
test_input={
"target_user_id": "12345",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"unfollow_user": lambda *args, **kwargs: True},
)
@staticmethod
def unfollow_user(credentials: TwitterCredentials, target_user_id: str):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.unfollow_user(target_user_id=target_user_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.unfollow_user(credentials, input_data.target_user_id)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterFollowUserBlock(Block):
"""
Allows a user to follow another user specified by target user ID. If the target user does not have public Tweets,
this endpoint will send a follow request. The request succeeds with no action when the authenticated user sends a
request to a user they're already following, or if they're sending a follow request to a user that does not have
public Tweets.
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["users.read", "users.write", "follows.write", "offline.access"]
)
target_user_id: str = SchemaField(
description="The user ID of the user that you would like to follow",
placeholder="Enter target user ID",
)
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether the follow action was successful"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="1aae6a5e-a631-11ef-a090-435900c6d429",
description="This block follows a specified Twitter user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterFollowUserBlock.Input,
output_schema=TwitterFollowUserBlock.Output,
test_input={
"target_user_id": "12345",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[("success", True)],
test_mock={"follow_user": lambda *args, **kwargs: True},
)
@staticmethod
def follow_user(credentials: TwitterCredentials, target_user_id: str):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.follow_user(target_user_id=target_user_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.follow_user(credentials, input_data.target_user_id)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetFollowersBlock(Block):
"""
Retrieves a list of followers for a specified Twitter user ID
"""
class Input(UserExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["users.read", "offline.access", "follows.read"]
)
target_user_id: str = SchemaField(
description="The user ID whose followers you would like to retrieve",
placeholder="Enter target user ID",
)
max_results: int | None = SchemaField(
description="Maximum number of results to return (1-1000, default 100)",
placeholder="Enter max results",
default=10,
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token for retrieving next/previous page of results",
placeholder="Enter pagination token",
default="",
advanced=True,
)
class Output(BlockSchema):
ids: list[str] = SchemaField(description="List of follower user IDs")
usernames: list[str] = SchemaField(description="List of follower usernames")
next_token: str = SchemaField(description="Next token for pagination")
data: list[dict] = SchemaField(description="Complete user data for followers")
includes: dict = SchemaField(
description="Additional data requested via expansions"
)
meta: dict = SchemaField(description="Metadata including pagination info")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="30f66410-a631-11ef-8fe7-d7f888b4f43c",
description="This block retrieves followers of a specified Twitter user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetFollowersBlock.Input,
output_schema=TwitterGetFollowersBlock.Output,
test_input={
"target_user_id": "12345",
"max_results": 1,
"pagination_token": "",
"expansions": None,
"tweet_fields": None,
"user_fields": None,
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["1234567890"]),
("usernames", ["testuser"]),
("data", [{"id": "1234567890", "username": "testuser"}]),
],
test_mock={
"get_followers": lambda *args, **kwargs: (
["1234567890"],
["testuser"],
[{"id": "1234567890", "username": "testuser"}],
{},
{},
None,
)
},
)
@staticmethod
def get_followers(
credentials: TwitterCredentials,
target_user_id: str,
max_results: int | None,
pagination_token: str | None,
expansions: UserExpansionsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": target_user_id,
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
"user_auth": False,
}
params = (
UserExpansionsBuilder(params)
.add_expansions(expansions)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_users_followers(**params))
meta = {}
follower_ids = []
follower_usernames = []
next_token = None
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
if response.data:
follower_ids = [str(user.id) for user in response.data]
follower_usernames = [user.username for user in response.data]
return (
follower_ids,
follower_usernames,
data,
included,
meta,
next_token,
)
raise Exception("Followers not found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, usernames, data, includes, meta, next_token = self.get_followers(
credentials,
input_data.target_user_id,
input_data.max_results,
input_data.pagination_token,
input_data.expansions,
input_data.tweet_fields,
input_data.user_fields,
)
if ids:
yield "ids", ids
if usernames:
yield "usernames", usernames
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if includes:
yield "includes", includes
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetFollowingBlock(Block):
"""
Retrieves a list of users that a specified Twitter user ID is following
"""
class Input(UserExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["users.read", "offline.access", "follows.read"]
)
target_user_id: str = SchemaField(
description="The user ID whose following you would like to retrieve",
placeholder="Enter target user ID",
)
max_results: int | None = SchemaField(
description="Maximum number of results to return (1-1000, default 100)",
placeholder="Enter max results",
default=10,
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token for retrieving next/previous page of results",
placeholder="Enter pagination token",
default="",
advanced=True,
)
class Output(BlockSchema):
ids: list[str] = SchemaField(description="List of following user IDs")
usernames: list[str] = SchemaField(description="List of following usernames")
next_token: str = SchemaField(description="Next token for pagination")
data: list[dict] = SchemaField(description="Complete user data for following")
includes: dict = SchemaField(
description="Additional data requested via expansions"
)
meta: dict = SchemaField(description="Metadata including pagination info")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="264a399c-a631-11ef-a97d-bfde4ca91173",
description="This block retrieves the users that a specified Twitter user is following.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetFollowingBlock.Input,
output_schema=TwitterGetFollowingBlock.Output,
test_input={
"target_user_id": "12345",
"max_results": 1,
"pagination_token": None,
"expansions": None,
"tweet_fields": None,
"user_fields": None,
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["1234567890"]),
("usernames", ["testuser"]),
("data", [{"id": "1234567890", "username": "testuser"}]),
],
test_mock={
"get_following": lambda *args, **kwargs: (
["1234567890"],
["testuser"],
[{"id": "1234567890", "username": "testuser"}],
{},
{},
None,
)
},
)
@staticmethod
def get_following(
credentials: TwitterCredentials,
target_user_id: str,
max_results: int | None,
pagination_token: str | None,
expansions: UserExpansionsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": target_user_id,
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
"user_auth": False,
}
params = (
UserExpansionsBuilder(params)
.add_expansions(expansions)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_users_following(**params))
meta = {}
following_ids = []
following_usernames = []
next_token = None
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
if response.data:
following_ids = [str(user.id) for user in response.data]
following_usernames = [user.username for user in response.data]
return (
following_ids,
following_usernames,
data,
included,
meta,
next_token,
)
raise Exception("Following not found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, usernames, data, includes, meta, next_token = self.get_following(
credentials,
input_data.target_user_id,
input_data.max_results,
input_data.pagination_token,
input_data.expansions,
input_data.tweet_fields,
input_data.user_fields,
)
if ids:
yield "ids", ids
if usernames:
yield "usernames", usernames
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if includes:
yield "includes", includes
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)
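# --- Illustrative sketch (not part of this diff) ------------------------------
# The follow helper and follower lookup above can be combined directly. The
# user IDs below are placeholders; `credentials` is assumed to be a valid
# TwitterCredentials instance carrying the scopes listed on each block.
def follow_then_list_followers(
    credentials: TwitterCredentials, target_user_id: str, own_user_id: str
):
    TwitterFollowUserBlock.follow_user(credentials, target_user_id)
    ids, usernames, _, _, _, next_token = TwitterGetFollowersBlock.get_followers(
        credentials,
        target_user_id=own_user_id,
        max_results=100,
        pagination_token="",
        expansions=None,
        tweet_fields=None,
        user_fields=None,
    )
    return ids, usernames, next_token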


@@ -0,0 +1,328 @@
from typing import cast
import tweepy
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import UserExpansionsBuilder
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
TweetFieldsFilter,
TweetUserFieldsFilter,
UserExpansionInputs,
UserExpansionsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class TwitterUnmuteUserBlock(Block):
"""
Allows a user to unmute another user specified by target user ID.
The request succeeds with no action when the user sends a request to a user they're not muting or have already unmuted.
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["users.read", "users.write", "offline.access"]
)
target_user_id: str = SchemaField(
description="The user ID of the user that you would like to unmute",
placeholder="Enter target user ID",
)
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether the unmute action was successful"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="40458504-a631-11ef-940b-eff92be55422",
description="This block unmutes a specified Twitter user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterUnmuteUserBlock.Input,
output_schema=TwitterUnmuteUserBlock.Output,
test_input={
"target_user_id": "12345",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"unmute_user": lambda *args, **kwargs: True},
)
@staticmethod
def unmute_user(credentials: TwitterCredentials, target_user_id: str):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.unmute(target_user_id=target_user_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.unmute_user(credentials, input_data.target_user_id)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterGetMutedUsersBlock(Block):
"""
Returns a list of users who are muted by the authenticating user
"""
class Input(UserExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["users.read", "offline.access"]
)
max_results: int | None = SchemaField(
description="The maximum number of results to be returned per page (1-1000). Default is 100.",
placeholder="Enter max results",
default=10,
advanced=True,
)
pagination_token: str | None = SchemaField(
description="Token to request next/previous page of results",
placeholder="Enter pagination token",
default="",
advanced=True,
)
class Output(BlockSchema):
ids: list[str] = SchemaField(description="List of muted user IDs")
usernames: list[str] = SchemaField(description="List of muted usernames")
next_token: str = SchemaField(description="Next token for pagination")
data: list[dict] = SchemaField(description="Complete user data for muted users")
includes: dict = SchemaField(
description="Additional data requested via expansions"
)
meta: dict = SchemaField(description="Metadata including pagination info")
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="475024da-a631-11ef-9ccd-f724b8b03cda",
description="This block gets a list of users muted by the authenticating user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetMutedUsersBlock.Input,
output_schema=TwitterGetMutedUsersBlock.Output,
test_input={
"max_results": 2,
"pagination_token": "",
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["12345", "67890"]),
("usernames", ["testuser1", "testuser2"]),
(
"data",
[
{"id": "12345", "username": "testuser1"},
{"id": "67890", "username": "testuser2"},
],
),
],
test_mock={
"get_muted_users": lambda *args, **kwargs: (
["12345", "67890"],
["testuser1", "testuser2"],
[
{"id": "12345", "username": "testuser1"},
{"id": "67890", "username": "testuser2"},
],
{},
{},
None,
)
},
)
@staticmethod
def get_muted_users(
credentials: TwitterCredentials,
max_results: int | None,
pagination_token: str | None,
expansions: UserExpansionsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"max_results": max_results,
"pagination_token": (
None if pagination_token == "" else pagination_token
),
"user_auth": False,
}
params = (
UserExpansionsBuilder(params)
.add_expansions(expansions)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_muted(**params))
meta = {}
user_ids = []
usernames = []
next_token = None
if response.meta:
meta = response.meta
next_token = meta.get("next_token")
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
if response.data:
user_ids = [str(item.id) for item in response.data]
usernames = [item.username for item in response.data]
return user_ids, usernames, data, included, meta, next_token
raise Exception("Muted users not found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
ids, usernames, data, includes, meta, next_token = self.get_muted_users(
credentials,
input_data.max_results,
input_data.pagination_token,
input_data.expansions,
input_data.tweet_fields,
input_data.user_fields,
)
if ids:
yield "ids", ids
if usernames:
yield "usernames", usernames
if next_token:
yield "next_token", next_token
if data:
yield "data", data
if includes:
yield "includes", includes
if meta:
yield "meta", meta
except Exception as e:
yield "error", handle_tweepy_exception(e)
class TwitterMuteUserBlock(Block):
"""
Allows a user to mute another user specified by target user ID
"""
class Input(BlockSchema):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["users.read", "users.write", "offline.access"]
)
target_user_id: str = SchemaField(
description="The user ID of the user that you would like to mute",
placeholder="Enter target user ID",
)
class Output(BlockSchema):
success: bool = SchemaField(
description="Whether the mute action was successful"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="4d1919d0-a631-11ef-90ab-3b73af9ce8f1",
description="This block mutes a specified Twitter user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterMuteUserBlock.Input,
output_schema=TwitterMuteUserBlock.Output,
test_input={
"target_user_id": "12345",
"credentials": TEST_CREDENTIALS_INPUT,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("success", True),
],
test_mock={"mute_user": lambda *args, **kwargs: True},
)
@staticmethod
def mute_user(credentials: TwitterCredentials, target_user_id: str):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
client.mute(target_user_id=target_user_id, user_auth=False)
return True
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
success = self.mute_user(credentials, input_data.target_user_id)
yield "success", success
except Exception as e:
yield "error", handle_tweepy_exception(e)
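# --- Illustrative sketch (not part of this diff) ------------------------------
# The mute/unmute blocks yield either ("success", True) or ("error", message),
# so a thin wrapper can reduce the stream to a boolean. TEST_CREDENTIALS_INPUT
# is used purely as a placeholder, mirroring the block's own test_input.
def mute_succeeded(credentials: TwitterCredentials, target_user_id: str) -> bool:
    block = TwitterMuteUserBlock()
    input_data = TwitterMuteUserBlock.Input(
        credentials=TEST_CREDENTIALS_INPUT, target_user_id=target_user_id
    )
    return dict(block.run(input_data, credentials=credentials)).get("success", False)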


@@ -0,0 +1,383 @@
from typing import Literal, Union, cast
import tweepy
from pydantic import BaseModel
from tweepy.client import Response
from backend.blocks.twitter._auth import (
TEST_CREDENTIALS,
TEST_CREDENTIALS_INPUT,
TwitterCredentials,
TwitterCredentialsField,
TwitterCredentialsInput,
)
from backend.blocks.twitter._builders import UserExpansionsBuilder
from backend.blocks.twitter._serializer import (
IncludesSerializer,
ResponseDataSerializer,
)
from backend.blocks.twitter._types import (
TweetFieldsFilter,
TweetUserFieldsFilter,
UserExpansionInputs,
UserExpansionsFilter,
)
from backend.blocks.twitter.tweepy_exceptions import handle_tweepy_exception
from backend.data.block import Block, BlockCategory, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class UserId(BaseModel):
discriminator: Literal["user_id"]
user_id: str = SchemaField(description="The ID of the user to lookup", default="")
class Username(BaseModel):
discriminator: Literal["username"]
username: str = SchemaField(
description="The Twitter username (handle) of the user", default=""
)
class TwitterGetUserBlock(Block):
"""
Gets information about a single Twitter user specified by ID or username
"""
class Input(UserExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["users.read", "offline.access"]
)
identifier: Union[UserId, Username] = SchemaField(
discriminator="discriminator",
description="Choose whether to identify the user by their unique Twitter ID or by their username",
advanced=False,
)
class Output(BlockSchema):
# Common outputs
id: str = SchemaField(description="User ID")
username_: str = SchemaField(description="User username")
name_: str = SchemaField(description="User name")
# Complete outputs
data: dict = SchemaField(description="Complete user data")
included: dict = SchemaField(
description="Additional data requested via expansions"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="5446db8e-a631-11ef-812a-cf315d373ee9",
description="This block retrieves information about a specified Twitter user.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetUserBlock.Input,
output_schema=TwitterGetUserBlock.Output,
test_input={
"identifier": {"discriminator": "username", "username": "twitter"},
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("id", "783214"),
("username_", "twitter"),
("name_", "Twitter"),
(
"data",
{
"user": {
"id": "783214",
"username": "twitter",
"name": "Twitter",
}
},
),
],
test_mock={
"get_user": lambda *args, **kwargs: (
{
"user": {
"id": "783214",
"username": "twitter",
"name": "Twitter",
}
},
{},
"twitter",
"783214",
"Twitter",
)
},
)
@staticmethod
def get_user(
credentials: TwitterCredentials,
identifier: Union[UserId, Username],
expansions: UserExpansionsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"id": identifier.user_id if isinstance(identifier, UserId) else None,
"username": (
identifier.username if isinstance(identifier, Username) else None
),
"user_auth": False,
}
params = (
UserExpansionsBuilder(params)
.add_expansions(expansions)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_user(**params))
username = ""
id = ""
name = ""
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_dict(response.data)
if response.data:
username = response.data.username
id = str(response.data.id)
name = response.data.name
if username and id:
return data, included, username, id, name
else:
raise tweepy.TweepyException("User not found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
data, included, username, id, name = self.get_user(
credentials,
input_data.identifier,
input_data.expansions,
input_data.tweet_fields,
input_data.user_fields,
)
if id:
yield "id", id
if username:
yield "username_", username
if name:
yield "name_", name
if data:
yield "data", data
if included:
yield "included", included
except Exception as e:
yield "error", handle_tweepy_exception(e)
class UserIdList(BaseModel):
discriminator: Literal["user_id_list"]
user_ids: list[str] = SchemaField(
description="List of user IDs to lookup (max 100)",
placeholder="Enter user IDs",
default=[],
advanced=False,
)
class UsernameList(BaseModel):
discriminator: Literal["username_list"]
usernames: list[str] = SchemaField(
description="List of Twitter usernames/handles to lookup (max 100)",
placeholder="Enter usernames",
default=[],
advanced=False,
)
class TwitterGetUsersBlock(Block):
"""
Gets information about multiple Twitter users specified by IDs or usernames
"""
class Input(UserExpansionInputs):
credentials: TwitterCredentialsInput = TwitterCredentialsField(
["users.read", "offline.access"]
)
identifier: Union[UserIdList, UsernameList] = SchemaField(
discriminator="discriminator",
description="Choose whether to identify users by their unique Twitter IDs or by their usernames",
advanced=False,
)
class Output(BlockSchema):
# Common outputs
ids: list[str] = SchemaField(description="User IDs")
usernames_: list[str] = SchemaField(description="User usernames")
names_: list[str] = SchemaField(description="User names")
# Complete outputs
data: list[dict] = SchemaField(description="Complete users data")
included: dict = SchemaField(
description="Additional data requested via expansions"
)
error: str = SchemaField(description="Error message if the request failed")
def __init__(self):
super().__init__(
id="5abc857c-a631-11ef-8cfc-f7b79354f7a1",
description="This block retrieves information about multiple Twitter users.",
categories={BlockCategory.SOCIAL},
input_schema=TwitterGetUsersBlock.Input,
output_schema=TwitterGetUsersBlock.Output,
test_input={
"identifier": {
"discriminator": "username_list",
"usernames": ["twitter", "twitterdev"],
},
"credentials": TEST_CREDENTIALS_INPUT,
"expansions": None,
"tweet_fields": None,
"user_fields": None,
},
test_credentials=TEST_CREDENTIALS,
test_output=[
("ids", ["783214", "2244994945"]),
("usernames_", ["twitter", "twitterdev"]),
("names_", ["Twitter", "Twitter Dev"]),
(
"data",
[
{"id": "783214", "username": "twitter", "name": "Twitter"},
{
"id": "2244994945",
"username": "twitterdev",
"name": "Twitter Dev",
},
],
),
],
test_mock={
"get_users": lambda *args, **kwargs: (
[
{"id": "783214", "username": "twitter", "name": "Twitter"},
{
"id": "2244994945",
"username": "twitterdev",
"name": "Twitter Dev",
},
],
{},
["twitter", "twitterdev"],
["783214", "2244994945"],
["Twitter", "Twitter Dev"],
)
},
)
@staticmethod
def get_users(
credentials: TwitterCredentials,
identifier: Union[UserIdList, UsernameList],
expansions: UserExpansionsFilter | None,
tweet_fields: TweetFieldsFilter | None,
user_fields: TweetUserFieldsFilter | None,
):
try:
client = tweepy.Client(
bearer_token=credentials.access_token.get_secret_value()
)
params = {
"ids": (
",".join(identifier.user_ids)
if isinstance(identifier, UserIdList)
else None
),
"usernames": (
",".join(identifier.usernames)
if isinstance(identifier, UsernameList)
else None
),
"user_auth": False,
}
params = (
UserExpansionsBuilder(params)
.add_expansions(expansions)
.add_tweet_fields(tweet_fields)
.add_user_fields(user_fields)
.build()
)
response = cast(Response, client.get_users(**params))
usernames = []
ids = []
names = []
included = IncludesSerializer.serialize(response.includes)
data = ResponseDataSerializer.serialize_list(response.data)
if response.data:
for user in response.data:
usernames.append(user.username)
ids.append(str(user.id))
names.append(user.name)
if usernames and ids:
return data, included, usernames, ids, names
else:
raise tweepy.TweepyException("Users not found")
except tweepy.TweepyException:
raise
def run(
self,
input_data: Input,
*,
credentials: TwitterCredentials,
**kwargs,
) -> BlockOutput:
try:
data, included, usernames, ids, names = self.get_users(
credentials,
input_data.identifier,
input_data.expansions,
input_data.tweet_fields,
input_data.user_fields,
)
if ids:
yield "ids", ids
if usernames:
yield "usernames_", usernames
if names:
yield "names_", names
if data:
yield "data", data
if included:
yield "included", included
except Exception as e:
yield "error", handle_tweepy_exception(e)
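# --- Illustrative sketch (not part of this diff) ------------------------------
# The identifier field on TwitterGetUserBlock is a discriminated union, so the
# same input accepts either of the payloads below (mirroring test_input).
# Values are placeholders.
LOOKUP_BY_USERNAME = {"discriminator": "username", "username": "twitter"}
LOOKUP_BY_ID = {"discriminator": "user_id", "user_id": "783214"}
# e.g. TwitterGetUserBlock.Input(
#     credentials=TEST_CREDENTIALS_INPUT,
#     identifier=LOOKUP_BY_USERNAME,
#     expansions=None, tweet_fields=None, user_fields=None,
# )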


@@ -0,0 +1,37 @@
from gravitasml.parser import Parser
from gravitasml.token import tokenize
from backend.data.block import Block, BlockOutput, BlockSchema
from backend.data.model import SchemaField
class XMLParserBlock(Block):
class Input(BlockSchema):
input_xml: str = SchemaField(description="input xml to be parsed")
class Output(BlockSchema):
parsed_xml: dict = SchemaField(description="output parsed xml to dict")
error: str = SchemaField(description="Error in parsing")
def __init__(self):
super().__init__(
id="286380af-9529-4b55-8be0-1d7c854abdb5",
description="Parses XML using gravitasml to tokenize and convert it to a dict",
input_schema=XMLParserBlock.Input,
output_schema=XMLParserBlock.Output,
test_input={"input_xml": "<tag1><tag2>content</tag2></tag1>"},
test_output=[
("parsed_xml", {"tag1": {"tag2": "content"}}),
],
)
def run(self, input_data: Input, **kwargs) -> BlockOutput:
try:
tokens = tokenize(input_data.input_xml)
parser = Parser(tokens)
parsed_result = parser.parse()
yield "parsed_xml", parsed_result
except ValueError as val_e:
raise ValueError(f"Validation error for dict:{val_e}") from val_e
except SyntaxError as syn_e:
raise SyntaxError(f"Error in input xml syntax: {syn_e}") from syn_e

View File

@@ -221,7 +221,8 @@ def event():
@test.command()
@click.argument("server_address")
@click.argument("graph_id")
def websocket(server_address: str, graph_id: str):
@click.argument("graph_version")
def websocket(server_address: str, graph_id: str, graph_version: int):
"""
Tests the websocket connection.
"""
@@ -237,7 +238,9 @@ def websocket(server_address: str, graph_id: str):
try:
msg = WsMessage(
method=Methods.SUBSCRIBE,
data=ExecutionSubscription(graph_id=graph_id).model_dump(),
data=ExecutionSubscription(
graph_id=graph_id, graph_version=graph_version
).model_dump(),
).model_dump_json()
await websocket.send(msg)
print(f"Sending: {msg}")

View File

@@ -64,6 +64,9 @@ class BlockCategory(Enum):
SAFETY = (
"Block that provides AI safety mechanisms such as detecting harmful content"
)
PRODUCTIVITY = "Block that helps with productivity"
ISSUE_TRACKING = "Block that helps with issue tracking"
MULTIMEDIA = "Block that interacts with multimedia content"
def dict(self) -> dict[str, str]:
return {"category": self.name, "description": self.value}
@@ -395,6 +398,7 @@ class Block(ABC, Generic[BlockSchemaInputType, BlockSchemaOutputType]):
}
def execute(self, input_data: BlockInput, **kwargs) -> BlockOutput:
# Merge the input data with the extra execution arguments, preferring the args for security
if error := self.input_schema.validate_data(input_data):
raise ValueError(
f"Unable to execute block with invalid input data: {error}"

View File

@@ -35,6 +35,8 @@ from backend.integrations.credentials_store import (
# =============== Configure the cost for each LLM Model call =============== #
MODEL_COST: dict[LlmModel, int] = {
LlmModel.O3_MINI: 2, # $1.10 / $4.40
LlmModel.O1: 16, # $15 / $60
LlmModel.O1_PREVIEW: 16,
LlmModel.O1_MINI: 4,
LlmModel.GPT4O_MINI: 1,
@@ -42,19 +44,21 @@ MODEL_COST: dict[LlmModel, int] = {
LlmModel.GPT4_TURBO: 10,
LlmModel.GPT3_5_TURBO: 1,
LlmModel.CLAUDE_3_5_SONNET: 4,
LlmModel.CLAUDE_3_5_HAIKU: 1, # $0.80 / $4.00
LlmModel.CLAUDE_3_HAIKU: 1,
LlmModel.LLAMA3_8B: 1,
LlmModel.LLAMA3_70B: 1,
LlmModel.MIXTRAL_8X7B: 1,
LlmModel.GEMMA_7B: 1,
LlmModel.GEMMA2_9B: 1,
LlmModel.LLAMA3_1_405B: 1,
LlmModel.LLAMA3_1_70B: 1,
LlmModel.LLAMA3_3_70B: 1, # $0.59 / $0.79
LlmModel.LLAMA3_1_8B: 1,
LlmModel.OLLAMA_LLAMA3_3: 1,
LlmModel.OLLAMA_LLAMA3_2: 1,
LlmModel.OLLAMA_LLAMA3_8B: 1,
LlmModel.OLLAMA_LLAMA3_405B: 1,
LlmModel.DEEPSEEK_LLAMA_70B: 1, # ? / ?
LlmModel.OLLAMA_DOLPHIN: 1,
LlmModel.GEMINI_FLASH_1_5_8B: 1,
LlmModel.GEMINI_FLASH_1_5: 1,
LlmModel.GROK_BETA: 5,
LlmModel.MISTRAL_NEMO: 1,
LlmModel.COHERE_COMMAND_R_08_2024: 1,

View File

@@ -10,7 +10,6 @@ class BlockCostType(str, Enum):
RUN = "run" # cost X credits per run
BYTE = "byte" # cost X credits per byte
SECOND = "second" # cost X credits per second
DOLLAR = "dollar" # cost X dollars per run
class BlockCost(BaseModel):

View File

@@ -1,40 +1,65 @@
import logging
from abc import ABC, abstractmethod
from collections import defaultdict
from datetime import datetime, timezone
import stripe
from prisma import Json
from prisma.enums import CreditTransactionType
from prisma.errors import UniqueViolationError
from prisma.models import CreditTransaction
from prisma.models import CreditTransaction, User
from prisma.types import CreditTransactionCreateInput, CreditTransactionWhereInput
from pydantic import BaseModel
from backend.data import db
from backend.data.block import Block, BlockInput, get_block
from backend.data.block_cost_config import BLOCK_COSTS
from backend.data.cost import BlockCost, BlockCostType
from backend.util.settings import Config
from backend.data.execution import NodeExecutionEntry
from backend.data.model import AutoTopUpConfig, TransactionHistory, UserTransaction
from backend.data.user import get_user_by_id
from backend.util.settings import Settings
config = Config()
settings = Settings()
stripe.api_key = settings.secrets.stripe_api_key
logger = logging.getLogger(__name__)
class UserCreditBase(ABC):
def __init__(self, num_user_credits_refill: int):
self.num_user_credits_refill = num_user_credits_refill
@abstractmethod
async def get_or_refill_credit(self, user_id: str) -> int:
async def get_credits(self, user_id: str) -> int:
"""
Get the current credit for the user and refill if no transaction has been made in the current cycle.
Get the current credits for the user.
Returns:
int: The current credit for the user.
int: The current credits for the user.
"""
pass
@abstractmethod
async def get_transaction_history(
self,
user_id: str,
transaction_time: datetime,
transaction_count_limit: int,
) -> TransactionHistory:
"""
Get the credit transactions for the user.
Args:
user_id (str): The user ID.
transaction_time (datetime): The upper bound of the transaction time.
transaction_count_limit (int): The transaction count limit.
Returns:
TransactionHistory: The credit transactions for the user.
"""
pass
@abstractmethod
async def spend_credits(
self,
user_id: str,
user_credit: int,
block_id: str,
input_data: BlockInput,
entry: NodeExecutionEntry,
data_size: float,
run_time: float,
) -> int:
@@ -42,10 +67,7 @@ class UserCreditBase(ABC):
Spend the credits for the user based on the block usage.
Args:
user_id (str): The user ID.
user_credit (int): The current credit for the user.
block_id (str): The block ID.
input_data (BlockInput): The input data for the block.
entry (NodeExecutionEntry): The node execution identifiers & data.
data_size (float): The size of the data being processed.
run_time (float): The time taken to run the block.
@@ -57,7 +79,7 @@ class UserCreditBase(ABC):
@abstractmethod
async def top_up_credits(self, user_id: str, amount: int):
"""
Top up the credits for the user.
Top up the credits for the user immediately.
Args:
user_id (str): The user ID.
@@ -65,51 +87,181 @@ class UserCreditBase(ABC):
"""
pass
@abstractmethod
async def top_up_intent(self, user_id: str, amount: int) -> str:
"""
Create a payment intent to top up the credits for the user.
class UserCredit(UserCreditBase):
async def get_or_refill_credit(self, user_id: str) -> int:
cur_time = self.time_now()
cur_month = cur_time.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
nxt_month = (
cur_month.replace(month=cur_month.month + 1)
if cur_month.month < 12
else cur_month.replace(year=cur_month.year + 1, month=1)
)
Args:
user_id (str): The user ID.
amount (int): The amount of credits to top up.
user_credit = await CreditTransaction.prisma().group_by(
by=["userId"],
sum={"amount": True},
Returns:
str: The redirect url to the payment page.
"""
pass
@abstractmethod
async def fulfill_checkout(
self, *, session_id: str | None = None, user_id: str | None = None
):
"""
Fulfill the Stripe checkout session.
Args:
session_id (str | None): The checkout session ID. Will try to fulfill most recent if None.
user_id (str | None): The user ID must be provided if session_id is None.
"""
pass
@staticmethod
def time_now() -> datetime:
return datetime.now(timezone.utc)
# ====== Transaction Helper Methods ====== #
# Any modifications to the transaction table should only be done through these methods #
async def _get_credits(self, user_id: str) -> tuple[int, datetime]:
"""
Returns the current balance of the user & the latest balance snapshot time.
"""
top_time = self.time_now()
snapshot = await CreditTransaction.prisma().find_first(
where={
"userId": user_id,
"createdAt": {"gte": cur_month, "lt": nxt_month},
"createdAt": {"lte": top_time},
"isActive": True,
"runningBalance": {"not": None}, # type: ignore
},
order={"createdAt": "desc"},
)
datetime_min = datetime.min.replace(tzinfo=timezone.utc)
snapshot_balance = snapshot.runningBalance or 0 if snapshot else 0
snapshot_time = snapshot.createdAt if snapshot else datetime_min
# Get transactions after the snapshot, this should not exist, but just in case.
transactions = await CreditTransaction.prisma().group_by(
by=["userId"],
sum={"amount": True},
max={"createdAt": True},
where={
"userId": user_id,
"createdAt": {
"gt": snapshot_time,
"lte": top_time,
},
"isActive": True,
},
)
if user_credit:
credit_sum = user_credit[0].get("_sum") or {}
return credit_sum.get("amount", 0)
key = f"MONTHLY-CREDIT-TOP-UP-{cur_month}"
try:
await CreditTransaction.prisma().create(
data={
"amount": self.num_user_credits_refill,
"type": CreditTransactionType.TOP_UP,
"userId": user_id,
"transactionKey": key,
"createdAt": self.time_now(),
}
transaction_balance = (
int(transactions[0].get("_sum", {}).get("amount", 0) + snapshot_balance)
if transactions
else snapshot_balance
)
transaction_time = (
datetime.fromisoformat(
str(transactions[0].get("_max", {}).get("createdAt", datetime_min))
)
except UniqueViolationError:
pass # Already refilled this month
if transactions
else snapshot_time
)
return transaction_balance, transaction_time
return self.num_user_credits_refill
async def _enable_transaction(
self, transaction_key: str, user_id: str, metadata: Json
):
@staticmethod
def time_now():
return datetime.now(timezone.utc)
transaction = await CreditTransaction.prisma().find_first_or_raise(
where={"transactionKey": transaction_key, "userId": user_id}
)
if transaction.isActive:
return
async with db.locked_transaction(f"usr_trx_{user_id}"):
user_balance, _ = await self._get_credits(user_id)
await CreditTransaction.prisma().update(
where={
"creditTransactionIdentifier": {
"transactionKey": transaction_key,
"userId": user_id,
}
},
data={
"isActive": True,
"runningBalance": user_balance + transaction.amount,
"createdAt": self.time_now(),
"metadata": metadata,
},
)
async def _add_transaction(
self,
user_id: str,
amount: int,
transaction_type: CreditTransactionType,
is_active: bool = True,
transaction_key: str | None = None,
ceiling_balance: int | None = None,
metadata: Json = Json({}),
) -> tuple[int, str]:
"""
Add a new transaction for the user.
This is the only method that should be used to add a new transaction.
Args:
user_id (str): The user ID.
amount (int): The amount of credits to add.
transaction_type (CreditTransactionType): The type of transaction.
is_active (bool): Whether the transaction is active or needs to be manually activated through _enable_transaction.
transaction_key (str | None): The transaction key. Avoids adding transaction if the key already exists.
ceiling_balance (int | None): The ceiling balance. Avoids adding more credits if the balance is already above the ceiling.
metadata (Json): The metadata of the transaction.
Returns:
tuple[int, str]: The new balance & the transaction key.
"""
async with db.locked_transaction(f"usr_trx_{user_id}"):
# Get latest balance snapshot
user_balance, _ = await self._get_credits(user_id)
if ceiling_balance and user_balance >= ceiling_balance:
raise ValueError(
f"You already have enough balance for user {user_id}, balance: {user_balance}, ceiling: {ceiling_balance}"
)
if amount < 0 and user_balance < abs(amount):
raise ValueError(
f"Insufficient balance of ${user_balance/100} to run the block that costs ${abs(amount)/100}"
)
# Create the transaction
transaction_data: CreditTransactionCreateInput = {
"userId": user_id,
"amount": amount,
"runningBalance": user_balance + amount,
"type": transaction_type,
"metadata": metadata,
"isActive": is_active,
"createdAt": self.time_now(),
}
if transaction_key:
transaction_data["transactionKey"] = transaction_key
tx = await CreditTransaction.prisma().create(data=transaction_data)
return user_balance + amount, tx.transactionKey
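Conceptually, _get_credits reads the newest active transaction that carries a runningBalance snapshot and adds any active amounts recorded after it, while _add_transaction persists the next snapshot under the per-user advisory lock. A standalone sketch of just that arithmetic (plain Python, no Prisma):
def current_balance(snapshot_balance: int, later_amounts: list[int]) -> int:
    # snapshot_balance: runningBalance of the latest snapshot row
    # later_amounts: amounts of active transactions created after that snapshot
    return snapshot_balance + sum(later_amounts)

# e.g. a 700-credit snapshot followed by a -5 usage and a +1000 top-up
assert current_balance(700, [-5, 1000]) == 1695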
class UsageTransactionMetadata(BaseModel):
graph_exec_id: str | None = None
graph_id: str | None = None
node_id: str | None = None
node_exec_id: str | None = None
block_id: str | None = None
block: str | None = None
input: BlockInput | None = None
class UserCredit(UserCreditBase):
def _block_usage_cost(
self,
@@ -148,8 +300,8 @@ class UserCredit(UserCreditBase):
) -> bool:
"""
Filter rules:
- If costFilter is an object, then check if costFilter is the subset of inputValues
- Otherwise, check if costFilter is equal to inputValues.
- If cost_filter is an object, then check if cost_filter is the subset of input_data
- Otherwise, check if cost_filter is equal to input_data.
- Undefined, null, and empty string are considered as equal.
"""
if not isinstance(cost_filter, dict) or not isinstance(input_data, dict):
@@ -163,72 +315,366 @@ class UserCredit(UserCreditBase):
async def spend_credits(
self,
user_id: str,
user_credit: int,
block_id: str,
input_data: BlockInput,
entry: NodeExecutionEntry,
data_size: float,
run_time: float,
validate_balance: bool = True,
) -> int:
block = get_block(block_id)
block = get_block(entry.block_id)
if not block:
raise ValueError(f"Block not found: {block_id}")
raise ValueError(f"Block not found: {entry.block_id}")
cost, matching_filter = self._block_usage_cost(
block=block, input_data=input_data, data_size=data_size, run_time=run_time
block=block, input_data=entry.data, data_size=data_size, run_time=run_time
)
if cost <= 0:
if cost == 0:
return 0
if validate_balance and user_credit < cost:
raise ValueError(f"Insufficient credit: {user_credit} < {cost}")
await CreditTransaction.prisma().create(
data={
"userId": user_id,
"amount": -cost,
"type": CreditTransactionType.USAGE,
"blockId": block.id,
"metadata": Json(
{
"block": block.name,
"input": matching_filter,
}
),
"createdAt": self.time_now(),
}
balance, _ = await self._add_transaction(
user_id=entry.user_id,
amount=-cost,
transaction_type=CreditTransactionType.USAGE,
metadata=Json(
UsageTransactionMetadata(
graph_exec_id=entry.graph_exec_id,
graph_id=entry.graph_id,
node_id=entry.node_id,
node_exec_id=entry.node_exec_id,
block_id=entry.block_id,
block=block.name,
input=matching_filter,
).model_dump()
),
)
user_id = entry.user_id
# Auto top-up if balance is below threshold.
auto_top_up = await get_auto_top_up(user_id)
if auto_top_up.threshold and balance < auto_top_up.threshold:
try:
await self._top_up_credits(
user_id=user_id,
amount=auto_top_up.amount,
# Avoid multiple auto top-ups within the same graph execution.
key=f"AUTO-TOP-UP-{user_id}-{entry.graph_exec_id}",
ceiling_balance=auto_top_up.threshold,
)
except Exception as e:
# Failed top-up is not critical, we can move on.
logger.error(
f"Auto top-up failed for user {user_id}, balance: {balance}, amount: {auto_top_up.amount}, error: {e}"
)
return cost
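A hedged sketch of charging a node run with the new entry-based signature. The NodeExecutionEntry fields match the model extended in backend/data/execution.py later in this compare view; the IDs and input data below are placeholders.
# Illustrative only; run inside an async context with a configured database.
async def charge_node_run() -> int:
    user_credit = get_user_credit_model()
    entry = NodeExecutionEntry(
        user_id="user-123",            # placeholder IDs
        graph_exec_id="exec-456",
        graph_id="graph-789",
        node_exec_id="node-exec-000",
        node_id="node-abc",
        block_id="5abc857c-a631-11ef-8cfc-f7b79354f7a1",
        data={},                       # BlockInput used for cost-filter matching
    )
    # Deducts the block cost, writes a USAGE transaction, and may auto top up.
    return await user_credit.spend_credits(entry, data_size=0.0, run_time=1.2)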
async def top_up_credits(self, user_id: str, amount: int):
await CreditTransaction.prisma().create(
data={
await self._top_up_credits(user_id, amount)
async def _top_up_credits(
self,
user_id: str,
amount: int,
key: str | None = None,
ceiling_balance: int | None = None,
):
if amount < 0:
raise ValueError(f"Top up amount must not be negative: {amount}")
if key is not None and (
await CreditTransaction.prisma().find_first(
where={"transactionKey": key, "userId": user_id}
)
):
raise ValueError(f"Transaction key {key} already exists for user {user_id}")
_, transaction_key = await self._add_transaction(
user_id=user_id,
amount=amount,
transaction_type=CreditTransactionType.TOP_UP,
is_active=False,
transaction_key=key,
ceiling_balance=ceiling_balance,
)
customer_id = await get_stripe_customer_id(user_id)
payment_methods = stripe.PaymentMethod.list(customer=customer_id, type="card")
if not payment_methods:
raise ValueError("No payment method found, please add it on the platform.")
for payment_method in payment_methods:
if amount == 0:
setup_intent = stripe.SetupIntent.create(
customer=customer_id,
usage="off_session",
confirm=True,
payment_method=payment_method.id,
automatic_payment_methods={
"enabled": True,
"allow_redirects": "never",
},
)
if setup_intent.status == "succeeded":
return
else:
payment_intent = stripe.PaymentIntent.create(
amount=amount,
currency="usd",
description="AutoGPT Platform Credits",
customer=customer_id,
off_session=True,
confirm=True,
payment_method=payment_method.id,
automatic_payment_methods={
"enabled": True,
"allow_redirects": "never",
},
)
if payment_intent.status == "succeeded":
await self._enable_transaction(
transaction_key=transaction_key,
user_id=user_id,
metadata=Json({"payment_intent": payment_intent}),
)
return
raise ValueError(
f"Out of {len(payment_methods)} payment methods tried, none is supported"
)
async def top_up_intent(self, user_id: str, amount: int) -> str:
if amount < 500 or amount % 100 != 0:
raise ValueError(
f"Top up amount must be at least 500 credits and multiple of 100 but is {amount}"
)
# Create checkout session
# https://docs.stripe.com/checkout/quickstart?client=react
# unit_amount param is always in the smallest currency unit (so cents for usd)
# which is equal to amount of credits
checkout_session = stripe.checkout.Session.create(
customer=await get_stripe_customer_id(user_id),
line_items=[
{
"price_data": {
"currency": "usd",
"product_data": {
"name": "AutoGPT Platform Credits",
},
"unit_amount": amount,
},
"quantity": 1,
}
],
mode="payment",
ui_mode="hosted",
payment_intent_data={"setup_future_usage": "off_session"},
saved_payment_method_options={"payment_method_save": "enabled"},
success_url=settings.config.frontend_base_url
+ "/marketplace/credits?topup=success",
cancel_url=settings.config.frontend_base_url
+ "/marketplace/credits?topup=cancel",
allow_promotion_codes=True,
)
await self._add_transaction(
user_id=user_id,
amount=amount,
transaction_type=CreditTransactionType.TOP_UP,
transaction_key=checkout_session.id,
is_active=False,
metadata=Json({"checkout_session": checkout_session}),
)
return checkout_session.url or ""
# https://docs.stripe.com/checkout/fulfillment
async def fulfill_checkout(
self, *, session_id: str | None = None, user_id: str | None = None
):
if (not session_id and not user_id) or (session_id and user_id):
raise ValueError("Either session_id or user_id must be provided")
# Retrieve CreditTransaction
find_filter: CreditTransactionWhereInput = {
"type": CreditTransactionType.TOP_UP,
"isActive": False,
}
if session_id:
find_filter["transactionKey"] = session_id
if user_id:
find_filter["userId"] = user_id
# Find the most recent inactive top-up transaction
credit_transaction = await CreditTransaction.prisma().find_first(
where=find_filter,
order={"createdAt": "desc"},
)
# This can be called multiple times for one id, so ignore if already fulfilled
if not credit_transaction:
return
# Retrieve the Checkout Session from the API
checkout_session = stripe.checkout.Session.retrieve(
credit_transaction.transactionKey
)
# Check the Checkout Session's payment_status property
# to determine if fulfillment should be performed
if checkout_session.payment_status in ["paid", "no_payment_required"]:
await self._enable_transaction(
transaction_key=credit_transaction.transactionKey,
user_id=credit_transaction.userId,
metadata=Json({"checkout_session": checkout_session}),
)
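fulfill_checkout is idempotent (it only touches inactive top-ups and re-checks payment_status), so it can be driven both from the success redirect and from a Stripe webhook. A hedged sketch of the webhook side; the route, secret name, and FastAPI wiring are illustrative and not part of this diff:
import stripe
from fastapi import APIRouter, Request

router = APIRouter()
user_credit = get_user_credit_model()

@router.post("/stripe/webhook")
async def stripe_webhook(request: Request):
    event = stripe.Webhook.construct_event(
        await request.body(),
        request.headers.get("stripe-signature", ""),
        settings.secrets.stripe_webhook_secret,  # assumed secret name
    )
    if event["type"] == "checkout.session.completed":
        await user_credit.fulfill_checkout(session_id=event["data"]["object"]["id"])
    return {"status": "ok"}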
async def get_credits(self, user_id: str) -> int:
balance, _ = await self._get_credits(user_id)
return balance
async def get_transaction_history(
self,
user_id: str,
transaction_time: datetime,
transaction_count_limit: int,
) -> TransactionHistory:
transactions = await CreditTransaction.prisma().find_many(
where={
"userId": user_id,
"amount": amount,
"type": CreditTransactionType.TOP_UP,
"createdAt": self.time_now(),
}
"createdAt": {"lt": transaction_time},
"isActive": True,
},
order={"createdAt": "desc"},
take=transaction_count_limit,
)
grouped_transactions: dict[str, UserTransaction] = defaultdict(
lambda: UserTransaction()
)
tx_time = None
for t in transactions:
metadata = (
UsageTransactionMetadata.model_validate(t.metadata)
if t.metadata
else UsageTransactionMetadata()
)
tx_time = t.createdAt.replace(tzinfo=None)
if t.type == CreditTransactionType.USAGE and metadata.graph_exec_id:
gt = grouped_transactions[metadata.graph_exec_id]
gid = metadata.graph_id[:8] if metadata.graph_id else "UNKNOWN"
gt.description = f"Graph #{gid} Execution"
gt.usage_node_count += 1
gt.usage_start_time = min(gt.usage_start_time, tx_time)
gt.usage_execution_id = metadata.graph_exec_id
gt.usage_graph_id = metadata.graph_id
else:
gt = grouped_transactions[t.transactionKey]
gt.description = f"{t.type} Transaction"
gt.amount += t.amount
gt.transaction_type = t.type
if tx_time > gt.transaction_time:
gt.transaction_time = tx_time
gt.balance = t.runningBalance or 0
return TransactionHistory(
transactions=list(grouped_transactions.values()),
next_transaction_time=(
tx_time if len(transactions) == transaction_count_limit else None
),
)
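next_transaction_time doubles as a pagination cursor: feed it back in as the new upper bound until it comes back as None. A minimal sketch, assuming the model returned by get_user_credit_model():
from datetime import datetime, timezone

async def iter_transaction_history(user_id: str, page_size: int = 50):
    user_credit = get_user_credit_model()
    cursor = datetime.now(timezone.utc)
    while cursor is not None:
        page = await user_credit.get_transaction_history(user_id, cursor, page_size)
        for tx in page.transactions:
            yield tx
        cursor = page.next_transaction_time  # None once the last page is reached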
class BetaUserCredit(UserCredit):
"""
This is a temporary class to handle the test user utilizing monthly credit refill.
TODO: Remove this class & its feature toggle.
"""
def __init__(self, num_user_credits_refill: int):
self.num_user_credits_refill = num_user_credits_refill
async def get_credits(self, user_id: str) -> int:
cur_time = self.time_now().date()
balance, snapshot_time = await self._get_credits(user_id)
if (snapshot_time.year, snapshot_time.month) == (cur_time.year, cur_time.month):
return balance
try:
balance, _ = await self._add_transaction(
user_id=user_id,
amount=max(self.num_user_credits_refill - balance, 0),
transaction_type=CreditTransactionType.TOP_UP,
transaction_key=f"MONTHLY-CREDIT-TOP-UP-{cur_time}",
)
return balance
except UniqueViolationError:
# Already refilled this month
return (await self._get_credits(user_id))[0]
class DisabledUserCredit(UserCreditBase):
async def get_or_refill_credit(self, *args, **kwargs) -> int:
async def get_credits(self, *args, **kwargs) -> int:
return 0
async def get_transaction_history(self, *args, **kwargs) -> TransactionHistory:
return TransactionHistory(transactions=[], next_transaction_time=None)
async def spend_credits(self, *args, **kwargs) -> int:
return 0
async def top_up_credits(self, *args, **kwargs):
pass
async def top_up_intent(self, *args, **kwargs) -> str:
return ""
async def fulfill_checkout(self, *args, **kwargs):
pass
def get_user_credit_model() -> UserCreditBase:
if config.enable_credit.lower() == "true":
return UserCredit(config.num_user_credits_refill)
else:
return DisabledUserCredit(0)
if not settings.config.enable_credit:
return DisabledUserCredit()
if settings.config.enable_beta_monthly_credit:
return BetaUserCredit(settings.config.num_user_credits_refill)
return UserCredit()
def get_block_costs() -> dict[str, list[BlockCost]]:
return {block().id: costs for block, costs in BLOCK_COSTS.items()}
async def get_stripe_customer_id(user_id: str) -> str:
user = await get_user_by_id(user_id)
if user.stripeCustomerId:
return user.stripeCustomerId
customer = stripe.Customer.create(name=user.name or "", email=user.email)
await User.prisma().update(
where={"id": user_id}, data={"stripeCustomerId": customer.id}
)
return customer.id
async def set_auto_top_up(user_id: str, config: AutoTopUpConfig):
await User.prisma().update(
where={"id": user_id},
data={"topUpConfig": Json(config.model_dump())},
)
async def get_auto_top_up(user_id: str) -> AutoTopUpConfig:
user = await get_user_by_id(user_id)
if not user.topUpConfig:
return AutoTopUpConfig(threshold=0, amount=0)
return AutoTopUpConfig.model_validate(user.topUpConfig)
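Putting the two helpers together, enabling auto top-up for a user is a one-liner each way (values illustrative; amounts are in credits/cents as elsewhere in this module):
# Inside an async context.
await set_auto_top_up("user-123", AutoTopUpConfig(threshold=500, amount=2000))
config = await get_auto_top_up("user-123")  # AutoTopUpConfig(threshold=500, amount=2000)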

View File

@@ -1,5 +1,6 @@
import logging
import os
import zlib
from contextlib import asynccontextmanager
from uuid import uuid4
@@ -54,6 +55,14 @@ async def transaction():
yield tx
@asynccontextmanager
async def locked_transaction(key: str):
lock_key = zlib.crc32(key.encode("utf-8"))
async with transaction() as tx:
await tx.execute_raw(f"SELECT pg_advisory_xact_lock({lock_key})")
yield tx
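locked_transaction serializes concurrent writers on the same key by holding a Postgres advisory lock (crc32 of the key) for the lifetime of the transaction; the credit code above uses the key pattern f"usr_trx_{user_id}". A minimal sketch of the usage pattern:
async def update_user_rows_atomically(user_id: str):
    # Any coroutine using the same key waits here until this transaction ends.
    async with locked_transaction(f"usr_trx_{user_id}") as tx:
        await tx.execute_raw("SELECT 1")  # placeholder for the real writes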
class BaseDbModel(BaseModel):
id: str = Field(default_factory=lambda: str(uuid4()))

View File

@@ -1,9 +1,11 @@
from collections import defaultdict
from datetime import datetime, timezone
from multiprocessing import Manager
from typing import Any, AsyncGenerator, Generator, Generic, TypeVar
from typing import Any, AsyncGenerator, Generator, Generic, Optional, Type, TypeVar
from prisma import Json
from prisma.enums import AgentExecutionStatus
from prisma.errors import PrismaError
from prisma.models import (
AgentGraphExecution,
AgentNodeExecution,
@@ -14,7 +16,8 @@ from pydantic import BaseModel
from backend.data.block import BlockData, BlockInput, CompletedBlockOutput
from backend.data.includes import EXECUTION_RESULT_INCLUDE, GRAPH_EXECUTION_INCLUDE
from backend.data.queue import AsyncRedisEventBus, RedisEventBus
from backend.util import json, mock
from backend.server.v2.store.exceptions import DatabaseError
from backend.util import mock, type
from backend.util.settings import Config
@@ -22,6 +25,7 @@ class GraphExecutionEntry(BaseModel):
user_id: str
graph_exec_id: str
graph_id: str
graph_version: int
start_node_execs: list["NodeExecutionEntry"]
@@ -31,6 +35,7 @@ class NodeExecutionEntry(BaseModel):
graph_id: str
node_exec_id: str
node_id: str
block_id: str
data: BlockInput
@@ -98,16 +103,16 @@ class ExecutionResult(BaseModel):
def from_db(execution: AgentNodeExecution):
if execution.executionData:
# Execution that has been queued for execution will persist its data.
input_data = json.loads(execution.executionData, target_type=dict[str, Any])
input_data = type.convert(execution.executionData, dict[str, Any])
else:
# For incomplete execution, executionData will not be yet available.
input_data: BlockInput = defaultdict()
for data in execution.Input or []:
input_data[data.name] = json.loads(data.data)
input_data[data.name] = type.convert(data.data, Type[Any])
output_data: CompletedBlockOutput = defaultdict(list)
for data in execution.Output or []:
output_data[data.name].append(json.loads(data.data))
output_data[data.name].append(type.convert(data.data, Type[Any]))
graph_execution: AgentGraphExecution | None = execution.AgentGraphExecution
@@ -136,6 +141,7 @@ async def create_graph_execution(
graph_version: int,
nodes_input: list[tuple[str, BlockInput]],
user_id: str,
preset_id: str | None = None,
) -> tuple[str, list[ExecutionResult]]:
"""
Create a new AgentGraphExecution record.
@@ -154,7 +160,7 @@ async def create_graph_execution(
"executionStatus": ExecutionStatus.INCOMPLETE,
"Input": {
"create": [
{"name": name, "data": json.dumps(data)}
{"name": name, "data": Json(data)}
for name, data in node_input.items()
]
},
@@ -163,6 +169,7 @@ async def create_graph_execution(
]
},
"userId": user_id,
"agentPresetId": preset_id,
},
include=GRAPH_EXECUTION_INCLUDE,
)
@@ -206,7 +213,7 @@ async def upsert_execution_input(
order={"addedTime": "asc"},
include={"Input": True},
)
json_input_data = json.dumps(input_data)
json_input_data = Json(input_data)
if existing_execution:
await AgentNodeExecutionInputOutput.prisma().create(
@@ -218,7 +225,7 @@ async def upsert_execution_input(
)
return existing_execution.id, {
**{
input_data.name: json.loads(input_data.data)
input_data.name: type.convert(input_data.data, Type[Any])
for input_data in existing_execution.Input or []
},
input_name: input_data,
@@ -252,7 +259,7 @@ async def upsert_execution_output(
await AgentNodeExecutionInputOutput.prisma().create(
data={
"name": output_name,
"data": json.dumps(output_data),
"data": Json(output_data),
"referencedByOutputExecId": node_exec_id,
}
)
@@ -277,7 +284,7 @@ async def update_graph_execution_stats(
where={"id": graph_exec_id},
data={
"executionStatus": status,
"stats": json.dumps(stats),
"stats": Json(stats),
},
)
if not res:
@@ -289,7 +296,7 @@ async def update_graph_execution_stats(
async def update_node_execution_stats(node_exec_id: str, stats: dict[str, Any]):
await AgentNodeExecution.prisma().update(
where={"id": node_exec_id},
data={"stats": json.dumps(stats)},
data={"stats": Json(stats)},
)
@@ -309,8 +316,8 @@ async def update_execution_status(
**({"startedTime": now} if status == ExecutionStatus.RUNNING else {}),
**({"endedTime": now} if status == ExecutionStatus.FAILED else {}),
**({"endedTime": now} if status == ExecutionStatus.COMPLETED else {}),
**({"executionData": json.dumps(execution_data)} if execution_data else {}),
**({"stats": json.dumps(stats)} if stats else {}),
**({"executionData": Json(execution_data)} if execution_data else {}),
**({"stats": Json(stats)} if stats else {}),
}
res = await AgentNodeExecution.prisma().update(
@@ -324,6 +331,30 @@ async def update_execution_status(
return ExecutionResult.from_db(res)
async def get_execution(
execution_id: str, user_id: str
) -> Optional[AgentNodeExecution]:
"""
Get an execution by ID. Returns None if not found.
Args:
execution_id: The ID of the execution to retrieve
Returns:
The execution if found, None otherwise
"""
try:
execution = await AgentNodeExecution.prisma().find_unique(
where={
"id": execution_id,
"userId": user_id,
}
)
return execution
except PrismaError:
return None
async def get_execution_results(graph_exec_id: str) -> list[ExecutionResult]:
executions = await AgentNodeExecution.prisma().find_many(
where={"agentGraphExecutionId": graph_exec_id},
@@ -337,6 +368,31 @@ async def get_execution_results(graph_exec_id: str) -> list[ExecutionResult]:
return res
async def get_executions_in_timerange(
user_id: str, start_time: str, end_time: str
) -> list[ExecutionResult]:
try:
executions = await AgentGraphExecution.prisma().find_many(
where={
"AND": [
{
"startedAt": {
"gte": datetime.fromisoformat(start_time),
"lte": datetime.fromisoformat(end_time),
}
},
{"userId": user_id},
]
},
include=GRAPH_EXECUTION_INCLUDE,
)
return [ExecutionResult.from_graph(execution) for execution in executions]
except Exception as e:
raise DatabaseError(
f"Failed to get executions in timerange {start_time} to {end_time} for user {user_id}: {e}"
) from e
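Both bounds are plain ISO-8601 strings parsed with datetime.fromisoformat; the function is also exposed through DatabaseManager further down. A quick usage sketch (window values illustrative):
# Inside an async context.
executions = await get_executions_in_timerange(
    user_id="user-123",
    start_time="2025-02-01T00:00:00+00:00",
    end_time="2025-02-02T00:00:00+00:00",
)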
LIST_SPLIT = "_$_"
DICT_SPLIT = "_#_"
OBJC_SPLIT = "_@_"
@@ -420,8 +476,7 @@ async def get_latest_execution(node_id: str, graph_eid: str) -> ExecutionResult
where={
"agentNodeId": node_id,
"agentGraphExecutionId": graph_eid,
"executionStatus": {"not": ExecutionStatus.INCOMPLETE},
"executionData": {"not": None}, # type: ignore
"executionStatus": {"not": ExecutionStatus.INCOMPLETE}, # type: ignore
},
order={"queuedTime": "desc"},
include=EXECUTION_RESULT_INCLUDE,

View File

@@ -6,13 +6,20 @@ from datetime import datetime, timezone
from typing import Any, Literal, Optional, Type
import prisma
from prisma.models import AgentGraph, AgentGraphExecution, AgentNode, AgentNodeLink
from prisma import Json
from prisma.models import (
AgentGraph,
AgentGraphExecution,
AgentNode,
AgentNodeLink,
StoreListingVersion,
)
from prisma.types import AgentGraphWhereInput
from pydantic.fields import computed_field
from backend.blocks.agent import AgentExecutorBlock
from backend.blocks.basic import AgentInputBlock, AgentOutputBlock
from backend.util import json
from backend.util import type
from .block import BlockInput, BlockType, get_block, get_blocks
from .db import BaseDbModel, transaction
@@ -68,8 +75,8 @@ class NodeModel(Node):
obj = NodeModel(
id=node.id,
block_id=node.AgentBlock.id,
input_default=json.loads(node.constantInput, target_type=dict[str, Any]),
metadata=json.loads(node.metadata, target_type=dict[str, Any]),
input_default=type.convert(node.constantInput, dict[str, Any]),
metadata=type.convert(node.metadata, dict[str, Any]),
graph_id=node.agentGraphId,
graph_version=node.agentGraphVersion,
webhook_id=node.webhookId,
@@ -119,7 +126,7 @@ class GraphExecution(BaseDbModel):
total_run_time = duration
try:
stats = json.loads(execution.stats or "{}", target_type=dict[str, Any])
stats = type.convert(execution.stats or {}, dict[str, Any])
except ValueError:
stats = {}
@@ -396,11 +403,9 @@ class GraphModel(Graph):
if for_export:
# Remove credentials from node input
if node.constantInput:
constant_input = json.loads(
node.constantInput, target_type=dict[str, Any]
)
constant_input = type.convert(node.constantInput, dict[str, Any])
constant_input = GraphModel._hide_node_input_credentials(constant_input)
node.constantInput = json.dumps(constant_input)
node.constantInput = Json(constant_input)
# Remove webhook info
node.webhookId = None
@@ -529,7 +534,7 @@ async def get_execution(user_id: str, execution_id: str) -> GraphExecution | Non
async def get_graph(
graph_id: str,
version: int | None = None,
template: bool = False,
template: bool = False, # note: currently not in use; TODO: remove from DB entirely
user_id: str | None = None,
for_export: bool = False,
) -> GraphModel | None:
@@ -543,21 +548,35 @@ async def get_graph(
where_clause: AgentGraphWhereInput = {
"id": graph_id,
}
if version is not None:
where_clause["version"] = version
elif not template:
where_clause["isActive"] = True
# TODO: Fix hack workaround to get adding store agents to work
if user_id is not None and not template:
where_clause["userId"] = user_id
graph = await AgentGraph.prisma().find_first(
where=where_clause,
include=AGENT_GRAPH_INCLUDE,
order={"version": "desc"},
)
return GraphModel.from_db(graph, for_export) if graph else None
# For access, the graph must be owned by the user or listed in the store
if graph is None or (
graph.userId != user_id
and not (
await StoreListingVersion.prisma().find_first(
where={
"agentId": graph_id,
"agentVersion": version or graph.version,
"isDeleted": False,
"StoreListing": {"is": {"isApproved": True}},
}
)
)
):
return None
return GraphModel.from_db(graph, for_export)
async def set_graph_active_version(graph_id: str, version: int, user_id: str) -> None:
@@ -634,8 +653,8 @@ async def __create_graph(tx, graph: Graph, user_id: str):
{
"id": node.id,
"agentBlockId": node.block_id,
"constantInput": json.dumps(node.input_default),
"metadata": json.dumps(node.metadata),
"constantInput": Json(node.input_default),
"metadata": Json(node.metadata),
}
for node in graph.nodes
]
@@ -722,7 +741,7 @@ async def fix_llm_provider_credentials():
raise RuntimeError(f"Impossible state while processing node {node}")
node_id: str = node["node_id"]
node_preset_input: dict = json.loads(node["node_preset_input"])
node_preset_input: dict = node["node_preset_input"]
credentials_meta: dict = node_preset_input["credentials"]
credentials = next(
@@ -758,5 +777,5 @@ async def fix_llm_provider_credentials():
store.update_creds(user_id, credentials)
await AgentNode.prisma().update(
where={"id": node_id},
data={"constantInput": json.dumps(node_preset_input)},
data={"constantInput": Json(node_preset_input)},
)

View File

@@ -1,6 +1,8 @@
from __future__ import annotations
import base64
import logging
from datetime import datetime
from typing import (
TYPE_CHECKING,
Annotated,
@@ -16,6 +18,7 @@ from typing import (
)
from uuid import uuid4
from prisma.enums import CreditTransactionType
from pydantic import (
BaseModel,
ConfigDict,
@@ -199,33 +202,49 @@ class OAuth2Credentials(_BaseCredentials):
scopes: list[str]
metadata: dict[str, Any] = Field(default_factory=dict)
def bearer(self) -> str:
def auth_header(self) -> str:
return f"Bearer {self.access_token.get_secret_value()}"
class APIKeyCredentials(_BaseCredentials):
type: Literal["api_key"] = "api_key"
api_key: SecretStr
expires_at: Optional[int]
expires_at: Optional[int] = Field(
default=None,
description="Unix timestamp (seconds) indicating when the API key expires (if at all)",
)
"""Unix timestamp (seconds) indicating when the API key expires (if at all)"""
def bearer(self) -> str:
def auth_header(self) -> str:
return f"Bearer {self.api_key.get_secret_value()}"
class UserPasswordCredentials(_BaseCredentials):
type: Literal["user_password"] = "user_password"
username: SecretStr
password: SecretStr
def auth_header(self) -> str:
# Converting the string to bytes using encode()
# Base64 encoding it with base64.b64encode()
# Converting the resulting bytes back to a string with decode()
return f"Basic {base64.b64encode(f'{self.username.get_secret_value()}:{self.password.get_secret_value()}'.encode()).decode()}"
Credentials = Annotated[
OAuth2Credentials | APIKeyCredentials,
OAuth2Credentials | APIKeyCredentials | UserPasswordCredentials,
Field(discriminator="type"),
]
CredentialsType = Literal["api_key", "oauth2"]
CredentialsType = Literal["api_key", "oauth2", "user_password"]
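UserPasswordCredentials produces a standard HTTP Basic header. The encoding step, shown standalone without the SecretStr wrappers:
import base64

def basic_auth_header(username: str, password: str) -> str:
    # Same construction as UserPasswordCredentials.auth_header above.
    return f"Basic {base64.b64encode(f'{username}:{password}'.encode()).decode()}"

assert basic_auth_header("alice", "s3cret") == "Basic YWxpY2U6czNjcmV0"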
class OAuthState(BaseModel):
token: str
provider: str
expires_at: int
code_verifier: Optional[str] = None
"""Unix timestamp (seconds) indicating when this OAuth state expires"""
scopes: list[str]
@@ -346,3 +365,27 @@ def CredentialsField(
class ContributorDetails(BaseModel):
name: str = Field(title="Name", description="The name of the contributor.")
class AutoTopUpConfig(BaseModel):
amount: int
"""Amount of credits to top up."""
threshold: int
"""Threshold to trigger auto top up."""
class UserTransaction(BaseModel):
transaction_time: datetime = datetime.min
transaction_type: CreditTransactionType = CreditTransactionType.USAGE
amount: int = 0
balance: int = 0
description: str | None = None
usage_graph_id: str | None = None
usage_execution_id: str | None = None
usage_node_count: int = 0
usage_start_time: datetime = datetime.max
class TransactionHistory(BaseModel):
transactions: list[UserTransaction]
next_transaction_time: datetime | None

View File

@@ -0,0 +1,360 @@
import logging
from datetime import datetime, timedelta
from enum import Enum
from typing import Annotated, Generic, Optional, TypeVar, Union
from prisma import Json
from prisma.enums import NotificationType
from prisma.models import NotificationEvent, UserNotificationBatch
from prisma.types import UserNotificationBatchWhereInput
# from backend.notifications.models import NotificationEvent
from pydantic import BaseModel, EmailStr, Field, field_validator
from backend.server.v2.store.exceptions import DatabaseError
from .db import transaction
logger = logging.getLogger(__name__)
T_co = TypeVar("T_co", bound="BaseNotificationData", covariant=True)
class BatchingStrategy(Enum):
IMMEDIATE = "immediate" # Send right away (errors, critical notifications)
HOURLY = "hourly" # Batch for up to an hour (usage reports)
DAILY = "daily" # Daily digest (summary notifications)
BACKOFF = "backoff" # Backoff strategy (exponential backoff)
class BaseNotificationData(BaseModel):
pass
class AgentRunData(BaseNotificationData):
agent_name: str
credits_used: float
# remaining_balance: float
execution_time: float
graph_id: str
node_count: int = Field(..., description="Number of nodes executed")
class ZeroBalanceData(BaseNotificationData):
last_transaction: float
last_transaction_time: datetime
top_up_link: str
class LowBalanceData(BaseNotificationData):
current_balance: float
threshold_amount: float
top_up_link: str
recent_usage: float = Field(..., description="Usage in the last 24 hours")
class BlockExecutionFailedData(BaseNotificationData):
block_name: str
block_id: str
error_message: str
graph_id: str
node_id: str
execution_id: str
class ContinuousAgentErrorData(BaseNotificationData):
agent_name: str
error_message: str
graph_id: str
execution_id: str
start_time: datetime
error_time: datetime
attempts: int = Field(..., description="Number of retry attempts made")
class BaseSummaryData(BaseNotificationData):
total_credits_used: float
total_executions: int
most_used_agent: str
total_execution_time: float
successful_runs: int
failed_runs: int
average_execution_time: float
cost_breakdown: dict[str, float]
class DailySummaryData(BaseSummaryData):
date: datetime
class WeeklySummaryData(BaseSummaryData):
start_date: datetime
end_date: datetime
week_number: int
year: int
class MonthlySummaryData(BaseSummaryData):
month: int
year: int
NotificationData = Annotated[
Union[
AgentRunData,
ZeroBalanceData,
LowBalanceData,
BlockExecutionFailedData,
ContinuousAgentErrorData,
MonthlySummaryData,
],
Field(discriminator="type"),
]
class NotificationEventDTO(BaseModel):
user_id: str
type: NotificationType
data: dict
created_at: datetime = Field(default_factory=datetime.now)
class NotificationEventModel(BaseModel, Generic[T_co]):
user_id: str
type: NotificationType
data: T_co
created_at: datetime = Field(default_factory=datetime.now)
@property
def strategy(self) -> BatchingStrategy:
return NotificationTypeOverride(self.type).strategy
@field_validator("type", mode="before")
def uppercase_type(cls, v):
if isinstance(v, str):
return v.upper()
return v
@property
def template(self) -> str:
return NotificationTypeOverride(self.type).template
def get_data_type(
notification_type: NotificationType,
) -> type[BaseNotificationData]:
return {
NotificationType.AGENT_RUN: AgentRunData,
NotificationType.ZERO_BALANCE: ZeroBalanceData,
NotificationType.LOW_BALANCE: LowBalanceData,
NotificationType.BLOCK_EXECUTION_FAILED: BlockExecutionFailedData,
NotificationType.CONTINUOUS_AGENT_ERROR: ContinuousAgentErrorData,
NotificationType.DAILY_SUMMARY: DailySummaryData,
NotificationType.WEEKLY_SUMMARY: WeeklySummaryData,
NotificationType.MONTHLY_SUMMARY: MonthlySummaryData,
}[notification_type]
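get_data_type is what lets a stored payload be re-validated into its concrete data model; create_or_add_to_user_notification_batch below relies on the same pattern. A sketch of the round trip (field values illustrative):
event = NotificationEventModel[get_data_type(NotificationType.LOW_BALANCE)](
    user_id="user-123",
    type=NotificationType.LOW_BALANCE,
    data=LowBalanceData(
        current_balance=42.0,
        threshold_amount=100.0,
        top_up_link="https://example.com/credits",  # placeholder link
        recent_usage=10.0,
    ),
)
parsed = NotificationEventModel[get_data_type(event.type)].model_validate_json(
    event.model_dump_json()
)
assert isinstance(parsed.data, LowBalanceData)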
class NotificationBatch(BaseModel):
user_id: str
events: list[NotificationEvent]
strategy: BatchingStrategy
last_update: datetime = datetime.now()
class NotificationResult(BaseModel):
success: bool
message: Optional[str] = None
class NotificationTypeOverride:
def __init__(self, notification_type: NotificationType):
self.notification_type = notification_type
@property
def strategy(self) -> BatchingStrategy:
BATCHING_RULES = {
# These are batched by the notification service
NotificationType.AGENT_RUN: BatchingStrategy.IMMEDIATE,
# These are batched by the notification service, but with a backoff strategy
NotificationType.ZERO_BALANCE: BatchingStrategy.BACKOFF,
NotificationType.LOW_BALANCE: BatchingStrategy.BACKOFF,
NotificationType.BLOCK_EXECUTION_FAILED: BatchingStrategy.BACKOFF,
NotificationType.CONTINUOUS_AGENT_ERROR: BatchingStrategy.BACKOFF,
# These aren't batched by the notification service, so we send them right away
NotificationType.DAILY_SUMMARY: BatchingStrategy.IMMEDIATE,
NotificationType.WEEKLY_SUMMARY: BatchingStrategy.IMMEDIATE,
NotificationType.MONTHLY_SUMMARY: BatchingStrategy.IMMEDIATE,
}
return BATCHING_RULES.get(self.notification_type, BatchingStrategy.HOURLY)
@property
def template(self) -> str:
"""Returns template name for this notification type"""
return {
NotificationType.AGENT_RUN: "agent_run.html",
NotificationType.ZERO_BALANCE: "zero_balance.html",
NotificationType.LOW_BALANCE: "low_balance.html",
NotificationType.BLOCK_EXECUTION_FAILED: "block_failed.html",
NotificationType.CONTINUOUS_AGENT_ERROR: "agent_error.html",
NotificationType.DAILY_SUMMARY: "daily_summary.html",
NotificationType.WEEKLY_SUMMARY: "weekly_summary.html",
NotificationType.MONTHLY_SUMMARY: "monthly_summary.html",
}[self.notification_type]
class NotificationPreference(BaseModel):
user_id: str
email: EmailStr
preferences: dict[NotificationType, bool] = Field(
default_factory=dict, description="Which notifications the user wants"
)
daily_limit: int = 10 # Max emails per day
emails_sent_today: int = 0
last_reset_date: datetime = Field(default_factory=datetime.now)
def get_batch_delay(notification_type: NotificationType) -> timedelta:
return {
NotificationType.AGENT_RUN: timedelta(seconds=1),
NotificationType.ZERO_BALANCE: timedelta(minutes=60),
NotificationType.LOW_BALANCE: timedelta(minutes=60),
NotificationType.BLOCK_EXECUTION_FAILED: timedelta(minutes=60),
NotificationType.CONTINUOUS_AGENT_ERROR: timedelta(minutes=60),
}[notification_type]
async def create_or_add_to_user_notification_batch(
user_id: str,
notification_type: NotificationType,
data: str, # type: 'NotificationEventModel'
) -> dict:
try:
logger.info(
f"Creating or adding to notification batch for {user_id} with type {notification_type} and data {data}"
)
notification_data = NotificationEventModel[
get_data_type(notification_type)
].model_validate_json(data)
# Serialize the data
json_data: Json = Json(notification_data.data.model_dump_json())
# First try to find existing batch
existing_batch = await UserNotificationBatch.prisma().find_unique(
where={
"userId_type": {
"userId": user_id,
"type": notification_type,
}
},
include={"notifications": True},
)
if not existing_batch:
async with transaction() as tx:
notification_event = await tx.notificationevent.create(
data={
"type": notification_type,
"data": json_data,
}
)
# Create new batch
resp = await tx.usernotificationbatch.create(
data={
"userId": user_id,
"type": notification_type,
"notifications": {"connect": [{"id": notification_event.id}]},
},
include={"notifications": True},
)
return resp.model_dump()
else:
async with transaction() as tx:
notification_event = await tx.notificationevent.create(
data={
"type": notification_type,
"data": json_data,
"UserNotificationBatch": {"connect": {"id": existing_batch.id}},
}
)
# Add to existing batch
resp = await tx.usernotificationbatch.update(
where={"id": existing_batch.id},
data={
"notifications": {"connect": [{"id": notification_event.id}]}
},
include={"notifications": True},
)
if not resp:
raise DatabaseError(
f"Failed to add notification event {notification_event.id} to existing batch {existing_batch.id}"
)
return resp.model_dump()
except Exception as e:
raise DatabaseError(
f"Failed to create or add to notification batch for user {user_id} and type {notification_type}: {e}"
) from e
async def get_user_notification_last_message_in_batch(
user_id: str,
notification_type: NotificationType,
) -> NotificationEvent | None:
try:
batch = await UserNotificationBatch.prisma().find_first(
where={"userId": user_id, "type": notification_type},
order={"createdAt": "desc"},
)
if not batch:
return None
if not batch.notifications:
return None
return batch.notifications[-1]
except Exception as e:
raise DatabaseError(
f"Failed to get user notification last message in batch for user {user_id} and type {notification_type}: {e}"
) from e
async def empty_user_notification_batch(
user_id: str, notification_type: NotificationType
) -> None:
try:
async with transaction() as tx:
await tx.notificationevent.delete_many(
where={
"UserNotificationBatch": {
"is": {"userId": user_id, "type": notification_type}
}
}
)
await tx.usernotificationbatch.delete_many(
where=UserNotificationBatchWhereInput(
userId=user_id,
type=notification_type,
)
)
except Exception as e:
raise DatabaseError(
f"Failed to empty user notification batch for user {user_id} and type {notification_type}: {e}"
) from e
async def get_user_notification_batch(
user_id: str,
notification_type: NotificationType,
) -> UserNotificationBatch | None:
try:
return await UserNotificationBatch.prisma().find_first(
where={"userId": user_id, "type": notification_type},
include={"notifications": True},
)
except Exception as e:
raise DatabaseError(
f"Failed to get user notification batch for user {user_id} and type {notification_type}: {e}"
) from e

View File

@@ -0,0 +1,296 @@
import logging
from abc import ABC, abstractmethod
from enum import Enum
from typing import Awaitable, Optional
import aio_pika
import pika
import pika.adapters.blocking_connection
from pika.spec import BasicProperties
from pydantic import BaseModel
from backend.util.retry import conn_retry
from backend.util.settings import Settings
logger = logging.getLogger(__name__)
class ExchangeType(str, Enum):
DIRECT = "direct"
FANOUT = "fanout"
TOPIC = "topic"
HEADERS = "headers"
class Exchange(BaseModel):
name: str
type: ExchangeType
durable: bool = True
auto_delete: bool = False
class Queue(BaseModel):
name: str
durable: bool = True
auto_delete: bool = False
# Optional exchange binding configuration
exchange: Optional[Exchange] = None
routing_key: Optional[str] = None
arguments: Optional[dict] = None
class RabbitMQConfig(BaseModel):
"""Configuration for a RabbitMQ service instance"""
vhost: str = "/"
exchanges: list[Exchange]
queues: list[Queue]
class RabbitMQBase(ABC):
"""Base class for RabbitMQ connections with shared configuration"""
def __init__(self, config: RabbitMQConfig):
settings = Settings()
self.host = settings.config.rabbitmq_host
self.port = settings.config.rabbitmq_port
self.username = settings.secrets.rabbitmq_default_user
self.password = settings.secrets.rabbitmq_default_pass
self.config = config
self._connection = None
self._channel = None
@property
def is_connected(self) -> bool:
"""Check if we have a valid connection"""
return bool(self._connection)
@property
def is_ready(self) -> bool:
"""Check if we have a valid channel"""
return bool(self.is_connected and self._channel)
@abstractmethod
def connect(self) -> None | Awaitable[None]:
"""Establish connection to RabbitMQ"""
pass
@abstractmethod
def disconnect(self) -> None | Awaitable[None]:
"""Close connection to RabbitMQ"""
pass
@abstractmethod
def declare_infrastructure(self) -> None | Awaitable[None]:
"""Declare exchanges and queues for this service"""
pass
class SyncRabbitMQ(RabbitMQBase):
"""Synchronous RabbitMQ client"""
@property
def is_connected(self) -> bool:
return bool(self._connection and self._connection.is_open)
@property
def is_ready(self) -> bool:
return bool(self.is_connected and self._channel and self._channel.is_open)
@conn_retry("RabbitMQ", "Acquiring connection")
def connect(self) -> None:
if self.is_connected:
return
credentials = pika.PlainCredentials(self.username, self.password)
parameters = pika.ConnectionParameters(
host=self.host,
port=self.port,
virtual_host=self.config.vhost,
credentials=credentials,
heartbeat=600,
blocked_connection_timeout=300,
)
self._connection = pika.BlockingConnection(parameters)
self._channel = self._connection.channel()
self._channel.basic_qos(prefetch_count=1)
self.declare_infrastructure()
def disconnect(self) -> None:
if self._channel:
if self._channel.is_open:
self._channel.close()
self._channel = None
if self._connection:
if self._connection.is_open:
self._connection.close()
self._connection = None
def declare_infrastructure(self) -> None:
"""Declare exchanges and queues for this service"""
if not self.is_ready:
self.connect()
if self._channel is None:
raise RuntimeError("Channel should be established after connect")
# Declare exchanges
for exchange in self.config.exchanges:
self._channel.exchange_declare(
exchange=exchange.name,
exchange_type=exchange.type.value,
durable=exchange.durable,
auto_delete=exchange.auto_delete,
)
# Declare queues and bind them to exchanges
for queue in self.config.queues:
self._channel.queue_declare(
queue=queue.name,
durable=queue.durable,
auto_delete=queue.auto_delete,
arguments=queue.arguments,
)
if queue.exchange:
self._channel.queue_bind(
queue=queue.name,
exchange=queue.exchange.name,
routing_key=queue.routing_key or queue.name,
)
def publish_message(
self,
routing_key: str,
message: str,
exchange: Optional[Exchange] = None,
properties: Optional[BasicProperties] = None,
mandatory: bool = True,
) -> None:
if not self.is_ready:
self.connect()
if self._channel is None:
raise RuntimeError("Channel should be established after connect")
self._channel.basic_publish(
exchange=exchange.name if exchange else "",
routing_key=routing_key,
body=message.encode(),
properties=properties or BasicProperties(delivery_mode=2),
mandatory=mandatory,
)
def get_channel(self) -> pika.adapters.blocking_connection.BlockingChannel:
if not self.is_ready:
self.connect()
if self._channel is None:
raise RuntimeError("Channel should be established after connect")
return self._channel
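A hedged sketch of wiring up the synchronous client; only the classes come from this file, the exchange and queue names are illustrative:
exchange = Exchange(name="notifications", type=ExchangeType.TOPIC)
config = RabbitMQConfig(
    vhost="/",
    exchanges=[exchange],
    queues=[
        Queue(
            name="notifications.immediate",
            exchange=exchange,
            routing_key="notification.immediate.#",
        )
    ],
)
client = SyncRabbitMQ(config)
client.connect()  # also declares the exchange, queue, and binding
client.publish_message(
    routing_key="notification.immediate.user-123",
    message='{"hello": "world"}',
    exchange=exchange,
)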
class AsyncRabbitMQ(RabbitMQBase):
"""Asynchronous RabbitMQ client"""
@property
def is_connected(self) -> bool:
return bool(self._connection and not self._connection.is_closed)
@property
def is_ready(self) -> bool:
return bool(self.is_connected and self._channel and not self._channel.is_closed)
@conn_retry("AsyncRabbitMQ", "Acquiring async connection")
async def connect(self):
if self.is_connected:
return
self._connection = await aio_pika.connect_robust(
host=self.host,
port=self.port,
login=self.username,
password=self.password,
virtualhost=self.config.vhost.lstrip("/"),
)
self._channel = await self._connection.channel()
await self._channel.set_qos(prefetch_count=1)
await self.declare_infrastructure()
async def disconnect(self):
if self._channel:
await self._channel.close()
self._channel = None
if self._connection:
await self._connection.close()
self._connection = None
async def declare_infrastructure(self):
"""Declare exchanges and queues for this service"""
if not self.is_ready:
await self.connect()
if self._channel is None:
raise RuntimeError("Channel should be established after connect")
# Declare exchanges
for exchange in self.config.exchanges:
await self._channel.declare_exchange(
name=exchange.name,
type=exchange.type.value,
durable=exchange.durable,
auto_delete=exchange.auto_delete,
)
# Declare queues and bind them to exchanges
for queue in self.config.queues:
queue_obj = await self._channel.declare_queue(
name=queue.name,
durable=queue.durable,
auto_delete=queue.auto_delete,
arguments=queue.arguments,
)
if queue.exchange:
exchange = await self._channel.get_exchange(queue.exchange.name)
await queue_obj.bind(
exchange, routing_key=queue.routing_key or queue.name
)
async def publish_message(
self,
routing_key: str,
message: str,
exchange: Optional[Exchange] = None,
persistent: bool = True,
) -> None:
if not self.is_ready:
await self.connect()
if self._channel is None:
raise RuntimeError("Channel should be established after connect")
if exchange:
exchange_obj = await self._channel.get_exchange(exchange.name)
else:
exchange_obj = self._channel.default_exchange
await exchange_obj.publish(
aio_pika.Message(
body=message.encode(),
delivery_mode=(
aio_pika.DeliveryMode.PERSISTENT
if persistent
else aio_pika.DeliveryMode.NOT_PERSISTENT
),
),
routing_key=routing_key,
)
async def get_channel(self) -> aio_pika.abc.AbstractChannel:
if not self.is_ready:
await self.connect()
if self._channel is None:
raise RuntimeError("Channel should be established after connect")
return self._channel
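The async counterpart connects lazily in the same way; consumption is left to aio_pika's queue iterator. A hedged sketch reusing the config from the synchronous example above (queue name illustrative):
async def consume(config: RabbitMQConfig):
    client = AsyncRabbitMQ(config)
    await client.connect()
    channel = await client.get_channel()
    queue = await channel.get_queue("notifications.immediate")
    async with queue.iterator() as messages:
        async for message in messages:
            async with message.process():  # acks on successful exit
                print(message.body.decode())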

View File

@@ -1,44 +1,54 @@
import logging
from datetime import datetime, timedelta
from typing import Optional, cast
from autogpt_libs.auth.models import DEFAULT_USER_ID
from fastapi import HTTPException
from prisma import Json
from prisma.enums import NotificationType
from prisma.models import User
from backend.data.db import prisma
from backend.data.model import UserIntegrations, UserMetadata, UserMetadataRaw
from backend.data.notifications import NotificationPreference
from backend.server.v2.store.exceptions import DatabaseError
from backend.util.encryption import JSONCryptor
logger = logging.getLogger(__name__)
async def get_or_create_user(user_data: dict) -> User:
user_id = user_data.get("sub")
if not user_id:
raise HTTPException(status_code=401, detail="User ID not found in token")
try:
user_id = user_data.get("sub")
if not user_id:
raise HTTPException(status_code=401, detail="User ID not found in token")
user_email = user_data.get("email")
if not user_email:
raise HTTPException(status_code=401, detail="Email not found in token")
user_email = user_data.get("email")
if not user_email:
raise HTTPException(status_code=401, detail="Email not found in token")
user = await prisma.user.find_unique(where={"id": user_id})
if not user:
user = await prisma.user.create(
data={
"id": user_id,
"email": user_email,
"name": user_data.get("user_metadata", {}).get("name"),
}
)
return User.model_validate(user)
except Exception as e:
raise DatabaseError(f"Failed to get or create user {user_data}: {e}") from e
async def get_user_by_id(user_id: str) -> User:
user = await prisma.user.find_unique(where={"id": user_id})
if not user:
user = await prisma.user.create(
data={
"id": user_id,
"email": user_email,
"name": user_data.get("user_metadata", {}).get("name"),
}
)
raise ValueError(f"User not found with ID: {user_id}")
return User.model_validate(user)
async def get_user_by_id(user_id: str) -> Optional[User]:
user = await prisma.user.find_unique(where={"id": user_id})
return User.model_validate(user) if user else None
async def create_default_user() -> Optional[User]:
user = await prisma.user.find_unique(where={"id": DEFAULT_USER_ID})
if not user:
@@ -128,3 +138,70 @@ async def migrate_and_encrypt_user_integrations():
where={"id": user.id},
data={"metadata": Json(raw_metadata)},
)
async def get_active_user_ids_in_timerange(start_time: str, end_time: str) -> list[str]:
try:
users = await User.prisma().find_many(
where={
"AgentGraphExecutions": {
"some": {
"createdAt": {
"gte": datetime.fromisoformat(start_time),
"lte": datetime.fromisoformat(end_time),
}
}
}
},
)
return [user.id for user in users]
except Exception as e:
raise DatabaseError(
f"Failed to get active user ids in timerange {start_time} to {end_time}: {e}"
) from e
async def get_active_users_ids() -> list[str]:
user_ids = await get_active_user_ids_in_timerange(
(datetime.now() - timedelta(days=30)).isoformat(),
datetime.now().isoformat(),
)
return user_ids
async def get_user_notification_preference(user_id: str) -> NotificationPreference:
try:
user = await User.prisma().find_unique_or_raise(
where={"id": user_id},
)
# Enable notifications by default if the user has no notification preference set (shouldn't ever happen, though)
preferences: dict[NotificationType, bool] = {
NotificationType.AGENT_RUN: user.notifyOnAgentRun or True,
NotificationType.ZERO_BALANCE: user.notifyOnZeroBalance or True,
NotificationType.LOW_BALANCE: user.notifyOnLowBalance or True,
NotificationType.BLOCK_EXECUTION_FAILED: user.notifyOnBlockExecutionFailed
or True,
NotificationType.CONTINUOUS_AGENT_ERROR: user.notifyOnContinuousAgentError
or True,
NotificationType.DAILY_SUMMARY: user.notifyOnDailySummary or True,
NotificationType.WEEKLY_SUMMARY: user.notifyOnWeeklySummary or True,
NotificationType.MONTHLY_SUMMARY: user.notifyOnMonthlySummary or True,
}
daily_limit = user.maxEmailsPerDay or 3
notification_preference = NotificationPreference(
user_id=user.id,
email=user.email,
preferences=preferences,
daily_limit=daily_limit,
# TODO: handle this with other changes later; for now we just email them
emails_sent_today=0,
last_reset_date=datetime.now(),
)
return NotificationPreference.model_validate(notification_preference)
except Exception as e:
raise DatabaseError(
f"Failed to upsert user notification preference for user {user_id}: {e}"
) from e

View File

@@ -4,9 +4,11 @@ from typing import Any, Callable, Concatenate, Coroutine, ParamSpec, TypeVar, ca
from backend.data.credit import get_user_credit_model
from backend.data.execution import (
ExecutionResult,
NodeExecutionEntry,
RedisExecutionEventBus,
create_graph_execution,
get_execution_results,
get_executions_in_timerange,
get_incomplete_executions,
get_latest_execution,
update_execution_status,
@@ -16,9 +18,19 @@ from backend.data.execution import (
upsert_execution_output,
)
from backend.data.graph import get_graph, get_node
from backend.data.notifications import (
create_or_add_to_user_notification_batch,
empty_user_notification_batch,
get_user_notification_batch,
get_user_notification_last_message_in_batch,
)
from backend.data.user import (
get_active_user_ids_in_timerange,
get_active_users_ids,
get_user_by_id,
get_user_integrations,
get_user_metadata,
get_user_notification_preference,
update_user_integrations,
update_user_metadata,
)
@@ -71,6 +83,7 @@ class DatabaseManager(AppService):
update_node_execution_stats = exposed_run_and_wait(update_node_execution_stats)
upsert_execution_input = exposed_run_and_wait(upsert_execution_input)
upsert_execution_output = exposed_run_and_wait(upsert_execution_output)
get_executions_in_timerange = exposed_run_and_wait(get_executions_in_timerange)
# Graphs
get_node = exposed_run_and_wait(get_node)
@@ -78,17 +91,31 @@ class DatabaseManager(AppService):
# Credits
user_credit_model = get_user_credit_model()
get_or_refill_credit = cast(
Callable[[Any, str], int],
exposed_run_and_wait(user_credit_model.get_or_refill_credit),
)
spend_credits = cast(
Callable[[Any, str, int, str, dict[str, str], float, float], int],
Callable[[Any, NodeExecutionEntry, float, float], int],
exposed_run_and_wait(user_credit_model.spend_credits),
)
# User + User Metadata + User Integrations
# User + User Metadata + User Integrations + User Notification Preferences
get_user_metadata = exposed_run_and_wait(get_user_metadata)
update_user_metadata = exposed_run_and_wait(update_user_metadata)
get_user_integrations = exposed_run_and_wait(get_user_integrations)
update_user_integrations = exposed_run_and_wait(update_user_integrations)
get_active_user_ids_in_timerange = exposed_run_and_wait(
get_active_user_ids_in_timerange
)
get_user_by_id = exposed_run_and_wait(get_user_by_id)
get_user_notification_preference = exposed_run_and_wait(
get_user_notification_preference
)
get_active_users_ids = exposed_run_and_wait(get_active_users_ids)
# Notifications
create_or_add_to_user_notification_batch = exposed_run_and_wait(
create_or_add_to_user_notification_batch
)
get_user_notification_last_message_in_batch = exposed_run_and_wait(
get_user_notification_last_message_in_batch
)
empty_user_notification_batch = exposed_run_and_wait(empty_user_notification_batch)
get_user_notification_batch = exposed_run_and_wait(get_user_notification_batch)

View File

@@ -8,7 +8,7 @@ import threading
from concurrent.futures import Future, ProcessPoolExecutor
from contextlib import contextmanager
from multiprocessing.pool import AsyncResult, Pool
from typing import TYPE_CHECKING, Any, Generator, TypeVar, cast
from typing import TYPE_CHECKING, Any, Generator, Optional, TypeVar, cast
from redis.lock import Lock as RedisLock
@@ -40,6 +40,7 @@ from backend.data.graph import GraphModel, Link, Node
from backend.integrations.creds_manager import IntegrationCredentialsManager
from backend.util import json
from backend.util.decorator import error_logged, time_measured
from backend.util.file import clean_exec_files
from backend.util.logging import configure_logging
from backend.util.process import set_service_name
from backend.util.service import (
@@ -162,6 +163,7 @@ def execute_node(
# AgentExecutorBlock specifically separates the node's input_data from its input_default.
if isinstance(node_block, AgentExecutorBlock):
input_data = {**node.input_default, "data": input_data}
data.data = input_data
# Execute the node
input_data_str = json.dumps(input_data)
@@ -169,7 +171,15 @@ def execute_node(
log_metadata.info("Executed node with input", input=input_data_str)
update_execution(ExecutionStatus.RUNNING)
extra_exec_kwargs = {}
# Inject extra execution arguments for the blocks via kwargs
extra_exec_kwargs: dict = {
"graph_id": graph_id,
"node_id": node_id,
"graph_exec_id": graph_exec_id,
"node_exec_id": node_exec_id,
"user_id": user_id,
}
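The dict above means every block invocation now receives its execution context through **kwargs, alongside any credentials fields resolved below. A minimal, hypothetical block-side sketch of consuming these kwargs (the real Block base class defines richer input/output schemas; the class and output names here are illustrative):

class EchoContextBlock:
    # Hypothetical block: yields (output_name, output_data) pairs like real blocks do.
    def run(self, input_data: dict, **kwargs):
        # Injected by the executor; credentials kwargs may also be present.
        context = {
            "graph_id": kwargs["graph_id"],
            "node_id": kwargs["node_id"],
            "graph_exec_id": kwargs["graph_exec_id"],
            "node_exec_id": kwargs["node_exec_id"],
            "user_id": kwargs["user_id"],
        }
        yield "output", {"input": input_data, "context": context}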
# Last-minute fetch credentials + acquire a system-wide read-write lock to prevent
# changes during execution. ⚠️ This means a set of credentials can only be used by
# one (running) block at a time; simultaneous execution of blocks using same
@@ -182,12 +192,12 @@ def execute_node(
extra_exec_kwargs[field_name] = credentials
output_size = 0
end_status = ExecutionStatus.COMPLETED
credit = db_client.get_or_refill_credit(user_id)
if credit < 0:
raise ValueError(f"Insufficient credit: {credit}")
try:
# Charge the user for the execution before running the block.
# TODO: We assume the block is executed within 0 seconds.
# This is fine for now because no block is charged by execution time.
db_client.spend_credits(data, input_size + output_size, 0)
for output_name, output_data in node_block.execute(
input_data, **extra_exec_kwargs
):
@@ -206,11 +216,12 @@ def execute_node(
):
yield execution
update_execution(ExecutionStatus.COMPLETED)
except Exception as e:
end_status = ExecutionStatus.FAILED
error_msg = str(e)
log_metadata.exception(f"Node execution failed with error {error_msg}")
db_client.upsert_execution_output(node_exec_id, "error", error_msg)
update_execution(ExecutionStatus.FAILED)
for execution in _enqueue_next_nodes(
db_client=db_client,
@@ -232,17 +243,6 @@ def execute_node(
except Exception as e:
log_metadata.error(f"Failed to release credentials lock: {e}")
# Update execution status and spend credits
res = update_execution(end_status)
if end_status == ExecutionStatus.COMPLETED:
s = input_size + output_size
t = (
(res.end_time - res.start_time).total_seconds()
if res.end_time and res.start_time
else 0
)
db_client.spend_credits(user_id, credit, node_block.id, input_data, s, t)
# Update execution stats
if execution_stats is not None:
execution_stats.update(node_block.execution_stats)
@@ -260,7 +260,7 @@ def _enqueue_next_nodes(
log_metadata: LogMetadata,
) -> list[NodeExecutionEntry]:
def add_enqueued_execution(
node_exec_id: str, node_id: str, data: BlockInput
node_exec_id: str, node_id: str, block_id: str, data: BlockInput
) -> NodeExecutionEntry:
exec_update = db_client.update_execution_status(
node_exec_id, ExecutionStatus.QUEUED, data
@@ -272,6 +272,7 @@ def _enqueue_next_nodes(
graph_id=graph_id,
node_exec_id=node_exec_id,
node_id=node_id,
block_id=block_id,
data=data,
)
@@ -325,7 +326,12 @@ def _enqueue_next_nodes(
# Input is complete, enqueue the execution.
log_metadata.info(f"Enqueued {suffix}")
enqueued_executions.append(
add_enqueued_execution(next_node_exec_id, next_node_id, next_node_input)
add_enqueued_execution(
node_exec_id=next_node_exec_id,
node_id=next_node_id,
block_id=next_node.block_id,
data=next_node_input,
)
)
# Next execution stops here if the link is not static.
@@ -355,7 +361,12 @@ def _enqueue_next_nodes(
continue
log_metadata.info(f"Enqueueing static-link execution {suffix}")
enqueued_executions.append(
add_enqueued_execution(iexec.node_exec_id, next_node_id, idata)
add_enqueued_execution(
node_exec_id=iexec.node_exec_id,
node_id=next_node_id,
block_id=next_node.block_id,
data=idata,
)
)
return enqueued_executions
@@ -548,9 +559,15 @@ class Executor:
q.add(execution)
log_metadata.info(f"Finished node execution {node_exec.node_exec_id}")
except Exception as e:
log_metadata.exception(
f"Failed node execution {node_exec.node_exec_id}: {e}"
)
# Avoid user error being marked as an actual error.
if isinstance(e, ValueError):
log_metadata.info(
f"Failed node execution {node_exec.node_exec_id}: {e}"
)
else:
log_metadata.exception(
f"Failed node execution {node_exec.node_exec_id}: {e}"
)
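By this convention, a ValueError signals a problem with the user's input rather than a platform failure, so it is logged at info level without a stack trace. A hypothetical block body illustrating the convention (block name, input field, and output are made up):

class FetchUrlBlock:
    # Hypothetical block: missing input is a user error, not a platform failure.
    def run(self, input_data: dict, **kwargs):
        url = input_data.get("url")
        if not url:
            # Surfaces in the executor above as an info-level "Failed node execution" log.
            raise ValueError("Missing required input: url")
        yield "url", url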
@classmethod
def on_graph_executor_start(cls):
@@ -648,6 +665,10 @@ class Executor:
try:
queue = ExecutionQueue[NodeExecutionEntry]()
for node_exec in graph_exec.start_node_execs:
exec_update = cls.db_client.update_execution_status(
node_exec.node_exec_id, ExecutionStatus.QUEUED, node_exec.data
)
cls.db_client.send_execution_update(exec_update)
queue.add(node_exec)
running_executions: dict[str, AsyncResult] = {}
@@ -716,6 +737,7 @@ class Executor:
finished = True
cancel.set()
cancel_thread.join()
clean_exec_files(graph_exec.graph_exec_id)
return (
exec_stats,
@@ -780,7 +802,8 @@ class ExecutionManager(AppService):
graph_id: str,
data: BlockInput,
user_id: str,
graph_version: int | None = None,
graph_version: Optional[int] = None,
preset_id: str | None = None,
) -> GraphExecutionEntry:
graph: GraphModel | None = self.db_client.get_graph(
graph_id=graph_id, user_id=user_id, version=graph_version
@@ -802,9 +825,9 @@ class ExecutionManager(AppService):
# Extract request input data, and assign it to the input pin.
if block.block_type == BlockType.INPUT:
name = node.input_default.get("name")
if name and name in data:
input_data = {"value": data[name]}
input_name = node.input_default.get("name")
if input_name and input_name in data:
input_data = {"value": data[input_name]}
# Extract webhook payload, and assign it to the input pin
webhook_payload_key = f"webhook_{node.webhook_id}_payload"
@@ -829,6 +852,7 @@ class ExecutionManager(AppService):
graph_version=graph.version,
nodes_input=nodes_input,
user_id=user_id,
preset_id=preset_id,
)
starting_node_execs = []
@@ -840,17 +864,15 @@ class ExecutionManager(AppService):
graph_id=node_exec.graph_id,
node_exec_id=node_exec.node_exec_id,
node_id=node_exec.node_id,
block_id=node_exec.block_id,
data=node_exec.input_data,
)
)
exec_update = self.db_client.update_execution_status(
node_exec.node_exec_id, ExecutionStatus.QUEUED, node_exec.input_data
)
self.db_client.send_execution_update(exec_update)
graph_exec = GraphExecutionEntry(
user_id=user_id,
graph_id=graph_id,
graph_version=graph_version or 0,
graph_exec_id=graph_exec_id,
start_node_execs=starting_node_execs,
)

View File

@@ -63,7 +63,10 @@ def execute_graph(**kwargs):
try:
log(f"Executing recurring job for graph #{args.graph_id}")
get_execution_client().add_execution(
args.graph_id, args.input_data, args.user_id
graph_id=args.graph_id,
data=args.input_data,
user_id=args.user_id,
graph_version=args.graph_version,
)
except Exception as e:
logger.exception(f"Error executing graph {args.graph_id}: {e}")

View File

@@ -1,6 +1,8 @@
import base64
import hashlib
import secrets
from datetime import datetime, timedelta, timezone
from typing import TYPE_CHECKING
from typing import TYPE_CHECKING, Optional
from pydantic import SecretStr
@@ -21,6 +23,15 @@ from backend.util.settings import Settings
settings = Settings()
# This is an override since ollama doesn't actually require an API key, but the credential system enforces that one be attached
ollama_credentials = APIKeyCredentials(
id="744fdc56-071a-4761-b5a5-0af0ce10a2b5",
provider="ollama",
api_key=SecretStr("FAKE_API_KEY"),
title="Use Credits for Ollama",
expires_at=None,
)
revid_credentials = APIKeyCredentials(
id="fdb7f412-f519-48d1-9b5f-d2f73d0e01fe",
provider="revid",
@@ -91,9 +102,52 @@ open_router_credentials = APIKeyCredentials(
title="Use Credits for Open Router",
expires_at=None,
)
fal_credentials = APIKeyCredentials(
id="6c0f5bd0-9008-4638-9d79-4b40b631803e",
provider="fal",
api_key=SecretStr(settings.secrets.fal_api_key),
title="Use Credits for FAL",
expires_at=None,
)
exa_credentials = APIKeyCredentials(
id="96153e04-9c6c-4486-895f-5bb683b1ecec",
provider="exa",
api_key=SecretStr(settings.secrets.exa_api_key),
title="Use Credits for Exa search",
expires_at=None,
)
e2b_credentials = APIKeyCredentials(
id="78d19fd7-4d59-4a16-8277-3ce310acf2b7",
provider="e2b",
api_key=SecretStr(settings.secrets.e2b_api_key),
title="Use Credits for E2B",
expires_at=None,
)
nvidia_credentials = APIKeyCredentials(
id="96b83908-2789-4dec-9968-18f0ece4ceb3",
provider="nvidia",
api_key=SecretStr(settings.secrets.nvidia_api_key),
title="Use Credits for Nvidia",
expires_at=None,
)
screenshotone_credentials = APIKeyCredentials(
id="3b1bdd16-8818-4bc2-8cbb-b23f9a3439ed",
provider="screenshotone",
api_key=SecretStr(settings.secrets.screenshotone_api_key),
title="Use Credits for ScreenshotOne",
expires_at=None,
)
mem0_credentials = APIKeyCredentials(
id="ed55ac19-356e-4243-a6cb-bc599e9b716f",
provider="mem0",
api_key=SecretStr(settings.secrets.mem0_api_key),
title="Use Credits for Mem0",
expires_at=None,
)
DEFAULT_CREDENTIALS = [
ollama_credentials,
revid_credentials,
ideogram_credentials,
replicate_credentials,
@@ -104,6 +158,12 @@ DEFAULT_CREDENTIALS = [
jina_credentials,
unreal_credentials,
open_router_credentials,
fal_credentials,
exa_credentials,
e2b_credentials,
mem0_credentials,
nvidia_credentials,
screenshotone_credentials,
]
@@ -135,6 +195,10 @@ class IntegrationCredentialsStore:
def get_all_creds(self, user_id: str) -> list[Credentials]:
users_credentials = self._get_user_integrations(user_id).credentials
all_credentials = users_credentials
# These will always be added
all_credentials.append(ollama_credentials)
# These will only be added if the API key is set
if settings.secrets.revid_api_key:
all_credentials.append(revid_credentials)
if settings.secrets.ideogram_api_key:
@@ -155,6 +219,18 @@ class IntegrationCredentialsStore:
all_credentials.append(unreal_credentials)
if settings.secrets.open_router_api_key:
all_credentials.append(open_router_credentials)
if settings.secrets.fal_api_key:
all_credentials.append(fal_credentials)
if settings.secrets.exa_api_key:
all_credentials.append(exa_credentials)
if settings.secrets.e2b_api_key:
all_credentials.append(e2b_credentials)
if settings.secrets.nvidia_api_key:
all_credentials.append(nvidia_credentials)
if settings.secrets.screenshotone_api_key:
all_credentials.append(screenshotone_credentials)
if settings.secrets.mem0_api_key:
all_credentials.append(mem0_credentials)
return all_credentials
def get_creds_by_id(self, user_id: str, credentials_id: str) -> Credentials | None:
@@ -210,18 +286,24 @@ class IntegrationCredentialsStore:
]
self._set_user_integration_creds(user_id, filtered_credentials)
def store_state_token(self, user_id: str, provider: str, scopes: list[str]) -> str:
def store_state_token(
self, user_id: str, provider: str, scopes: list[str], use_pkce: bool = False
) -> tuple[str, str]:
token = secrets.token_urlsafe(32)
expires_at = datetime.now(timezone.utc) + timedelta(minutes=10)
(code_challenge, code_verifier) = self._generate_code_challenge()
state = OAuthState(
token=token,
provider=provider,
code_verifier=code_verifier,
expires_at=int(expires_at.timestamp()),
scopes=scopes,
)
with self.locked_user_integrations(user_id):
user_integrations = self._get_user_integrations(user_id)
oauth_states = user_integrations.oauth_states
oauth_states.append(state)
@@ -231,39 +313,21 @@ class IntegrationCredentialsStore:
user_id=user_id, data=user_integrations
)
return token
return token, code_challenge
def get_any_valid_scopes_from_state_token(
def _generate_code_challenge(self) -> tuple[str, str]:
"""
Generate code challenge using SHA256 from the code verifier.
Currently only SHA256 is supported. (If we want to support more methods in the future, we can add them here.)
"""
code_verifier = secrets.token_urlsafe(128)
sha256_hash = hashlib.sha256(code_verifier.encode("utf-8")).digest()
code_challenge = base64.urlsafe_b64encode(sha256_hash).decode("utf-8")
return code_challenge.replace("=", ""), code_verifier
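Putting store_state_token, _generate_code_challenge, and the new handler signatures together, the PKCE (S256) round trip looks roughly like this: the snippet below re-derives the challenge exactly as above, and the commented calls sketch how the persisted verifier is handed back at exchange time (the objects named `store` and `handler` are illustrative):

import base64
import hashlib
import secrets

# challenge = BASE64URL(SHA256(verifier)) with the trailing '=' padding removed.
code_verifier = secrets.token_urlsafe(128)
code_challenge = (
    base64.urlsafe_b64encode(hashlib.sha256(code_verifier.encode("utf-8")).digest())
    .decode("utf-8")
    .replace("=", "")
)

# token, code_challenge = store.store_state_token(user_id, "twitter", scopes, use_pkce=True)
# login_url = handler.get_login_url(scopes, state=token, code_challenge=code_challenge)
#
# Later, on the OAuth callback:
# state = store.verify_state_token(user_id, token, "twitter")  # returns OAuthState or None
# if state:
#     creds = handler.exchange_code_for_tokens(code, state.scopes, state.code_verifier)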
def verify_state_token(
self, user_id: str, token: str, provider: str
) -> list[str]:
"""
Get the valid scopes from the OAuth state token. This will return any valid scopes
from any OAuth state token for the given provider. If no valid scopes are found,
an empty list is returned. DO NOT RELY ON THIS TOKEN TO AUTHENTICATE A USER, AS IT
IS TO CHECK IF THE USER HAS GIVEN PERMISSIONS TO THE APPLICATION BEFORE EXCHANGING
THE CODE FOR TOKENS.
"""
user_integrations = self._get_user_integrations(user_id)
oauth_states = user_integrations.oauth_states
now = datetime.now(timezone.utc)
valid_state = next(
(
state
for state in oauth_states
if state.token == token
and state.provider == provider
and state.expires_at > now.timestamp()
),
None,
)
if valid_state:
return valid_state.scopes
return []
def verify_state_token(self, user_id: str, token: str, provider: str) -> bool:
) -> Optional[OAuthState]:
with self.locked_user_integrations(user_id):
user_integrations = self._get_user_integrations(user_id)
oauth_states = user_integrations.oauth_states
@@ -285,9 +349,9 @@ class IntegrationCredentialsStore:
oauth_states.remove(valid_state)
user_integrations.oauth_states = oauth_states
self.db_manager.update_user_integrations(user_id, user_integrations)
return True
return valid_state
return False
return None
def _set_user_integration_creds(
self, user_id: str, credentials: list[Credentials]

View File

@@ -1,8 +1,12 @@
from typing import TYPE_CHECKING
from backend.integrations.oauth.todoist import TodoistOAuthHandler
from .github import GitHubOAuthHandler
from .google import GoogleOAuthHandler
from .linear import LinearOAuthHandler
from .notion import NotionOAuthHandler
from .twitter import TwitterOAuthHandler
if TYPE_CHECKING:
from ..providers import ProviderName
@@ -15,6 +19,9 @@ HANDLERS_BY_NAME: dict["ProviderName", type["BaseOAuthHandler"]] = {
GitHubOAuthHandler,
GoogleOAuthHandler,
NotionOAuthHandler,
TwitterOAuthHandler,
LinearOAuthHandler,
TodoistOAuthHandler,
]
}
# --8<-- [end:HANDLERS_BY_NAMEExample]
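Since HANDLERS_BY_NAME is keyed by each handler class's PROVIDER_NAME, resolving and instantiating a handler for a provider is a plain dict lookup. A hypothetical usage sketch, assuming the registry is exported from the package shown above and with placeholder credential values:

from backend.integrations.oauth import HANDLERS_BY_NAME
from backend.integrations.providers import ProviderName

handler_cls = HANDLERS_BY_NAME[ProviderName.TODOIST]  # -> TodoistOAuthHandler
handler = handler_cls(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    redirect_uri="https://example.com/oauth/callback",
)
login_url = handler.get_login_url(scopes=[], state="opaque-state-token", code_challenge=None)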

View File

@@ -1,7 +1,7 @@
import logging
import time
from abc import ABC, abstractmethod
from typing import ClassVar
from typing import ClassVar, Optional
from backend.data.model import OAuth2Credentials
from backend.integrations.providers import ProviderName
@@ -23,7 +23,9 @@ class BaseOAuthHandler(ABC):
@abstractmethod
# --8<-- [start:BaseOAuthHandler3]
def get_login_url(self, scopes: list[str], state: str) -> str:
def get_login_url(
self, scopes: list[str], state: str, code_challenge: Optional[str]
) -> str:
# --8<-- [end:BaseOAuthHandler3]
"""Constructs a login URL that the user can be redirected to"""
...
@@ -31,7 +33,7 @@ class BaseOAuthHandler(ABC):
@abstractmethod
# --8<-- [start:BaseOAuthHandler4]
def exchange_code_for_tokens(
self, code: str, scopes: list[str]
self, code: str, scopes: list[str], code_verifier: Optional[str]
) -> OAuth2Credentials:
# --8<-- [end:BaseOAuthHandler4]
"""Exchanges the acquired authorization code from login for a set of tokens"""

View File

@@ -34,7 +34,9 @@ class GitHubOAuthHandler(BaseOAuthHandler):
self.token_url = "https://github.com/login/oauth/access_token"
self.revoke_url = "https://api.github.com/applications/{client_id}/token"
def get_login_url(self, scopes: list[str], state: str) -> str:
def get_login_url(
self, scopes: list[str], state: str, code_challenge: Optional[str]
) -> str:
params = {
"client_id": self.client_id,
"redirect_uri": self.redirect_uri,
@@ -44,7 +46,7 @@ class GitHubOAuthHandler(BaseOAuthHandler):
return f"{self.auth_base_url}?{urlencode(params)}"
def exchange_code_for_tokens(
self, code: str, scopes: list[str]
self, code: str, scopes: list[str], code_verifier: Optional[str]
) -> OAuth2Credentials:
return self._request_tokens({"code": code, "redirect_uri": self.redirect_uri})

View File

@@ -1,4 +1,5 @@
import logging
from typing import Optional
from google.auth.external_account_authorized_user import (
Credentials as ExternalAccountCredentials,
@@ -38,7 +39,9 @@ class GoogleOAuthHandler(BaseOAuthHandler):
self.token_uri = "https://oauth2.googleapis.com/token"
self.revoke_uri = "https://oauth2.googleapis.com/revoke"
def get_login_url(self, scopes: list[str], state: str) -> str:
def get_login_url(
self, scopes: list[str], state: str, code_challenge: Optional[str]
) -> str:
all_scopes = list(set(scopes + self.DEFAULT_SCOPES))
logger.debug(f"Setting up OAuth flow with scopes: {all_scopes}")
flow = self._setup_oauth_flow(all_scopes)
@@ -52,7 +55,7 @@ class GoogleOAuthHandler(BaseOAuthHandler):
return authorization_url
def exchange_code_for_tokens(
self, code: str, scopes: list[str]
self, code: str, scopes: list[str], code_verifier: Optional[str]
) -> OAuth2Credentials:
logger.debug(f"Exchanging code for tokens with scopes: {scopes}")

View File

@@ -0,0 +1,165 @@
import json
from typing import Optional
from urllib.parse import urlencode
from pydantic import SecretStr
from backend.blocks.linear._api import LinearAPIException
from backend.data.model import APIKeyCredentials, OAuth2Credentials
from backend.integrations.providers import ProviderName
from backend.util.request import requests
from .base import BaseOAuthHandler
class LinearOAuthHandler(BaseOAuthHandler):
"""
OAuth2 handler for Linear.
"""
PROVIDER_NAME = ProviderName.LINEAR
def __init__(self, client_id: str, client_secret: str, redirect_uri: str):
self.client_id = client_id
self.client_secret = client_secret
self.redirect_uri = redirect_uri
self.auth_base_url = "https://linear.app/oauth/authorize"
self.token_url = "https://api.linear.app/oauth/token" # Correct token URL
self.revoke_url = "https://api.linear.app/oauth/revoke"
def get_login_url(
self, scopes: list[str], state: str, code_challenge: Optional[str]
) -> str:
params = {
"client_id": self.client_id,
"redirect_uri": self.redirect_uri,
"response_type": "code", # Important: include "response_type"
"scope": ",".join(scopes), # Comma-separated, not space-separated
"state": state,
}
return f"{self.auth_base_url}?{urlencode(params)}"
def exchange_code_for_tokens(
self, code: str, scopes: list[str], code_verifier: Optional[str]
) -> OAuth2Credentials:
return self._request_tokens({"code": code, "redirect_uri": self.redirect_uri})
def revoke_tokens(self, credentials: OAuth2Credentials) -> bool:
if not credentials.access_token:
raise ValueError("No access token to revoke")
headers = {
"Authorization": f"Bearer {credentials.access_token.get_secret_value()}"
}
response = requests.post(self.revoke_url, headers=headers)
if not response.ok:
try:
error_data = response.json()
error_message = error_data.get("error", "Unknown error")
except json.JSONDecodeError:
error_message = response.text
raise LinearAPIException(
f"Failed to revoke Linear tokens ({response.status_code}): {error_message}",
response.status_code,
)
return True # Linear doesn't return JSON on successful revoke
def _refresh_tokens(self, credentials: OAuth2Credentials) -> OAuth2Credentials:
if not credentials.refresh_token:
raise ValueError(
"No refresh token available."
) # Linear uses non-expiring tokens
return self._request_tokens(
{
"refresh_token": credentials.refresh_token.get_secret_value(),
"grant_type": "refresh_token",
}
)
def _request_tokens(
self,
params: dict[str, str],
current_credentials: Optional[OAuth2Credentials] = None,
) -> OAuth2Credentials:
request_body = {
"client_id": self.client_id,
"client_secret": self.client_secret,
"grant_type": "authorization_code", # Ensure grant_type is correct
**params,
}
headers = {
"Content-Type": "application/x-www-form-urlencoded"
} # Correct header for token request
response = requests.post(self.token_url, data=request_body, headers=headers)
if not response.ok:
try:
error_data = response.json()
error_message = error_data.get("error", "Unknown error")
except json.JSONDecodeError:
error_message = response.text
raise LinearAPIException(
f"Failed to fetch Linear tokens ({response.status_code}): {error_message}",
response.status_code,
)
token_data = response.json()
# Note: Linear access tokens do not expire, so we set expires_at to None
new_credentials = OAuth2Credentials(
provider=self.PROVIDER_NAME,
title=current_credentials.title if current_credentials else None,
username=token_data.get("user", {}).get(
"name", "Unknown User"
),  # extract the display name, falling back to "Unknown User"
access_token=token_data["access_token"],
scopes=token_data["scope"].split(
","
), # Linear returns comma-separated scopes
refresh_token=token_data.get(
"refresh_token"
), # Linear uses non-expiring tokens so this might be null
access_token_expires_at=None,
refresh_token_expires_at=None,
)
if current_credentials:
new_credentials.id = current_credentials.id
return new_credentials
def _request_username(self, access_token: str) -> Optional[str]:
# Use the LinearClient to fetch user details using GraphQL
from backend.blocks.linear._api import LinearClient
try:
linear_client = LinearClient(
APIKeyCredentials(
api_key=SecretStr(access_token),
title="temp",
provider=self.PROVIDER_NAME,
expires_at=None,
)
) # Temporary credentials for this request
query = """
query Viewer {
viewer {
name
}
}
"""
response = linear_client.query(query)
return response["viewer"]["name"]
except Exception as e: # Handle any errors
print(f"Error fetching username: {e}")
return None

View File

@@ -1,4 +1,5 @@
from base64 import b64encode
from typing import Optional
from urllib.parse import urlencode
from backend.data.model import OAuth2Credentials
@@ -26,7 +27,9 @@ class NotionOAuthHandler(BaseOAuthHandler):
self.auth_base_url = "https://api.notion.com/v1/oauth/authorize"
self.token_url = "https://api.notion.com/v1/oauth/token"
def get_login_url(self, scopes: list[str], state: str) -> str:
def get_login_url(
self, scopes: list[str], state: str, code_challenge: Optional[str]
) -> str:
params = {
"client_id": self.client_id,
"redirect_uri": self.redirect_uri,
@@ -37,7 +40,7 @@ class NotionOAuthHandler(BaseOAuthHandler):
return f"{self.auth_base_url}?{urlencode(params)}"
def exchange_code_for_tokens(
self, code: str, scopes: list[str]
self, code: str, scopes: list[str], code_verifier: Optional[str]
) -> OAuth2Credentials:
request_body = {
"grant_type": "authorization_code",

View File

@@ -0,0 +1,81 @@
import urllib.parse
from typing import ClassVar, Optional
import requests
from backend.data.model import OAuth2Credentials, ProviderName
from backend.integrations.oauth.base import BaseOAuthHandler
class TodoistOAuthHandler(BaseOAuthHandler):
PROVIDER_NAME = ProviderName.TODOIST
DEFAULT_SCOPES: ClassVar[list[str]] = [
"task:add",
"data:read",
"data:read_write",
"data:delete",
"project:delete",
]
AUTHORIZE_URL = "https://todoist.com/oauth/authorize"
TOKEN_URL = "https://todoist.com/oauth/access_token"
def __init__(self, client_id: str, client_secret: str, redirect_uri: str):
self.client_id = client_id
self.client_secret = client_secret
self.redirect_uri = redirect_uri
def get_login_url(
self, scopes: list[str], state: str, code_challenge: Optional[str]
) -> str:
params = {
"client_id": self.client_id,
"scope": ",".join(self.DEFAULT_SCOPES),
"state": state,
}
return f"{self.AUTHORIZE_URL}?{urllib.parse.urlencode(params)}"
def exchange_code_for_tokens(
self, code: str, scopes: list[str], code_verifier: Optional[str]
) -> OAuth2Credentials:
"""Exchange authorization code for access tokens"""
data = {
"client_id": self.client_id,
"client_secret": self.client_secret,
"code": code,
"redirect_uri": self.redirect_uri,
}
response = requests.post(self.TOKEN_URL, data=data)
response.raise_for_status()
tokens = response.json()
response = requests.post(
"https://api.todoist.com/sync/v9/sync",
headers={"Authorization": f"Bearer {tokens['access_token']}"},
data={"sync_token": "*", "resource_types": '["user"]'},
)
response.raise_for_status()
user_info = response.json()
user_email = user_info["user"].get("email")
return OAuth2Credentials(
provider=self.PROVIDER_NAME,
title=None,
username=user_email,
access_token=tokens["access_token"],
refresh_token=None,
access_token_expires_at=None,
refresh_token_expires_at=None,
scopes=scopes,
)
def _refresh_tokens(self, credentials: OAuth2Credentials) -> OAuth2Credentials:
# Todoist does not support token refresh
return credentials
def revoke_tokens(self, credentials: OAuth2Credentials) -> bool:
return False

View File

@@ -0,0 +1,171 @@
import time
import urllib.parse
from typing import ClassVar, Optional
import requests
from backend.data.model import OAuth2Credentials, ProviderName
from backend.integrations.oauth.base import BaseOAuthHandler
class TwitterOAuthHandler(BaseOAuthHandler):
PROVIDER_NAME = ProviderName.TWITTER
DEFAULT_SCOPES: ClassVar[list[str]] = [
"tweet.read",
"tweet.write",
"tweet.moderate.write",
"users.read",
"follows.read",
"follows.write",
"offline.access",
"space.read",
"mute.read",
"mute.write",
"like.read",
"like.write",
"list.read",
"list.write",
"block.read",
"block.write",
"bookmark.read",
"bookmark.write",
]
AUTHORIZE_URL = "https://twitter.com/i/oauth2/authorize"
TOKEN_URL = "https://api.x.com/2/oauth2/token"
USERNAME_URL = "https://api.x.com/2/users/me"
REVOKE_URL = "https://api.x.com/2/oauth2/revoke"
def __init__(self, client_id: str, client_secret: str, redirect_uri: str):
self.client_id = client_id
self.client_secret = client_secret
self.redirect_uri = redirect_uri
def get_login_url(
self, scopes: list[str], state: str, code_challenge: Optional[str]
) -> str:
"""Generate Twitter OAuth 2.0 authorization URL"""
# scopes = self.handle_default_scopes(scopes)
if code_challenge is None:
raise ValueError("code_challenge is required for Twitter OAuth")
params = {
"response_type": "code",
"client_id": self.client_id,
"redirect_uri": self.redirect_uri,
"scope": " ".join(self.DEFAULT_SCOPES),
"state": state,
"code_challenge": code_challenge,
"code_challenge_method": "S256",
}
return f"{self.AUTHORIZE_URL}?{urllib.parse.urlencode(params)}"
def exchange_code_for_tokens(
self, code: str, scopes: list[str], code_verifier: Optional[str]
) -> OAuth2Credentials:
"""Exchange authorization code for access tokens"""
headers = {"Content-Type": "application/x-www-form-urlencoded"}
data = {
"code": code,
"grant_type": "authorization_code",
"redirect_uri": self.redirect_uri,
"code_verifier": code_verifier,
}
auth = (self.client_id, self.client_secret)
response = requests.post(self.TOKEN_URL, headers=headers, data=data, auth=auth)
response.raise_for_status()
tokens = response.json()
username = self._get_username(tokens["access_token"])
return OAuth2Credentials(
provider=self.PROVIDER_NAME,
title=None,
username=username,
access_token=tokens["access_token"],
refresh_token=tokens.get("refresh_token"),
access_token_expires_at=int(time.time()) + tokens["expires_in"],
refresh_token_expires_at=None,
scopes=scopes,
)
def _get_username(self, access_token: str) -> str:
"""Get the username from the access token"""
headers = {"Authorization": f"Bearer {access_token}"}
params = {"user.fields": "username"}
response = requests.get(
f"{self.USERNAME_URL}?{urllib.parse.urlencode(params)}", headers=headers
)
response.raise_for_status()
return response.json()["data"]["username"]
def _refresh_tokens(self, credentials: OAuth2Credentials) -> OAuth2Credentials:
"""Refresh access tokens using refresh token"""
if not credentials.refresh_token:
raise ValueError("No refresh token available")
header = {"Content-Type": "application/x-www-form-urlencoded"}
data = {
"grant_type": "refresh_token",
"refresh_token": credentials.refresh_token.get_secret_value(),
}
auth = (self.client_id, self.client_secret)
response = requests.post(self.TOKEN_URL, headers=header, data=data, auth=auth)
try:
response.raise_for_status()
except requests.exceptions.HTTPError as e:
print("HTTP Error:", e)
print("Response Content:", response.text)
raise
tokens = response.json()
username = self._get_username(tokens["access_token"])
return OAuth2Credentials(
id=credentials.id,
provider=self.PROVIDER_NAME,
title=None,
username=username,
access_token=tokens["access_token"],
refresh_token=tokens["refresh_token"],
access_token_expires_at=int(time.time()) + tokens["expires_in"],
scopes=credentials.scopes,
refresh_token_expires_at=None,
)
def revoke_tokens(self, credentials: OAuth2Credentials) -> bool:
"""Revoke the access token"""
header = {"Content-Type": "application/x-www-form-urlencoded"}
data = {
"token": credentials.access_token.get_secret_value(),
"token_type_hint": "access_token",
}
auth = (self.client_id, self.client_secret)
response = requests.post(self.REVOKE_URL, headers=header, data=data, auth=auth)
try:
response.raise_for_status()
except requests.exceptions.HTTPError as e:
print("HTTP Error:", e)
print("Response Content:", response.text)
raise
return response.status_code == 200

View File

@@ -17,7 +17,9 @@ class ProviderName(str, Enum):
HUBSPOT = "hubspot"
IDEOGRAM = "ideogram"
JINA = "jina"
LINEAR = "linear"
MEDIUM = "medium"
MEM0 = "mem0"
NOTION = "notion"
NVIDIA = "nvidia"
OLLAMA = "ollama"
@@ -25,8 +27,13 @@ class ProviderName(str, Enum):
OPENWEATHERMAP = "openweathermap"
OPEN_ROUTER = "open_router"
PINECONE = "pinecone"
REDDIT = "reddit"
REPLICATE = "replicate"
REVID = "revid"
SCREENSHOTONE = "screenshotone"
SLANT3D = "slant3d"
SMTP = "smtp"
TWITTER = "twitter"
TODOIST = "todoist"
UNREAL_SPEECH = "unreal_speech"
# --8<-- [end:ProviderName]

View File

@@ -168,7 +168,7 @@ class BaseWebhooksManager(ABC, Generic[WT]):
id = str(uuid4())
secret = secrets.token_hex(32)
provider_name = self.PROVIDER_NAME
provider_name: ProviderName = self.PROVIDER_NAME
ingress_url = webhook_ingress_url(provider_name=provider_name, webhook_id=id)
if register:
if not credentials:

View File

@@ -1,7 +1,7 @@
import logging
from backend.data import integrations
from backend.data.model import APIKeyCredentials, Credentials, OAuth2Credentials
from backend.data.model import Credentials
from ._base import WT, BaseWebhooksManager
@@ -25,6 +25,6 @@ class ManualWebhookManagerBase(BaseWebhooksManager[WT]):
async def _deregister_webhook(
self,
webhook: integrations.Webhook,
credentials: OAuth2Credentials | APIKeyCredentials,
credentials: Credentials,
) -> None:
pass

View File

@@ -67,7 +67,7 @@ class GithubWebhooksManager(BaseWebhooksManager):
headers = {
**self.GITHUB_API_DEFAULT_HEADERS,
"Authorization": credentials.bearer(),
"Authorization": credentials.auth_header(),
}
repo, github_hook_id = webhook.resource, webhook.provider_webhook_id
@@ -96,7 +96,7 @@ class GithubWebhooksManager(BaseWebhooksManager):
headers = {
**self.GITHUB_API_DEFAULT_HEADERS,
"Authorization": credentials.bearer(),
"Authorization": credentials.auth_header(),
}
webhook_data = {
"name": "web",
@@ -142,7 +142,7 @@ class GithubWebhooksManager(BaseWebhooksManager):
headers = {
**self.GITHUB_API_DEFAULT_HEADERS,
"Authorization": credentials.bearer(),
"Authorization": credentials.auth_header(),
}
if webhook_type == self.WebhookType.REPO:

Some files were not shown because too many files have changed in this diff.