Mirror of https://github.com/Significant-Gravitas/AutoGPT.git, synced 2026-04-30 03:00:41 -04:00
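The most consequential change in this group is FastAPI 0.128.0 dropping `pydantic.v1` support (PR #14609): any request/response models still imported through the `pydantic.v1` compatibility layer must become plain Pydantic v2 models before taking this bump. A minimal before/after sketch, using a hypothetical `CreditTopUp` model (not from this repo):

```python
# FastAPI >= 0.128.0 no longer accepts `pydantic.v1` models.
# Before (no longer supported):  from pydantic.v1 import BaseModel
from pydantic import BaseModel  # Pydantic v2 import


class CreditTopUp(BaseModel):  # hypothetical model, for illustration only
    user_id: str
    amount: int


# v2 API: model_validate()/model_dump() replace v1's parse_obj()/dict()
payload = CreditTopUp.model_validate({"user_id": "u1", "amount": 100})
assert payload.model_dump() == {"user_id": "u1", "amount": 100}
```

Code that only ever imported from `pydantic` directly is unaffected; the deprecation warnings added in 0.127.0 (PR #14583) flag the remaining `pydantic.v1` usage before 0.128.0 removes it.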
Bumps the production-dependencies group with 8 updates in the /autogpt_platform/autogpt_libs directory: | Package | From | To | | --- | --- | --- | | [fastapi](https://github.com/fastapi/fastapi) | `0.116.1` | `0.128.0` | | [google-cloud-logging](https://github.com/googleapis/python-logging) | `3.12.1` | `3.13.0` | | [launchdarkly-server-sdk](https://github.com/launchdarkly/python-server-sdk) | `9.12.0` | `9.14.1` | | [pydantic](https://github.com/pydantic/pydantic) | `2.11.7` | `2.12.5` | | [pydantic-settings](https://github.com/pydantic/pydantic-settings) | `2.10.1` | `2.12.0` | | [pyjwt](https://github.com/jpadilla/pyjwt) | `2.10.1` | `2.11.0` | | [supabase](https://github.com/supabase/supabase-py) | `2.16.0` | `2.27.2` | | [uvicorn](https://github.com/Kludex/uvicorn) | `0.35.0` | `0.40.0` | Updates `fastapi` from 0.116.1 to 0.128.0 <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/fastapi/fastapi/releases">fastapi's releases</a>.</em></p> <blockquote> <h2>0.128.0</h2> <h3>Breaking Changes</h3> <ul> <li>➖ Drop support for <code>pydantic.v1</code>. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14609">#14609</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> </ul> <h3>Internal</h3> <ul> <li>✅ Run performance tests only on Pydantic v2. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14608">#14608</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> </ul> <h2>0.127.1</h2> <h3>Refactors</h3> <ul> <li>🔊 Add a custom <code>FastAPIDeprecationWarning</code>. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14605">#14605</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> </ul> <h3>Docs</h3> <ul> <li>📝 Add documentary to website. 
PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14600">#14600</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> </ul> <h3>Translations</h3> <ul> <li>🌐 Update translations for de (update-outdated). PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14602">#14602</a> by <a href="https://github.com/nilslindemann"><code>@nilslindemann</code></a>.</li> <li>🌐 Update translations for de (update-outdated). PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14581">#14581</a> by <a href="https://github.com/nilslindemann"><code>@nilslindemann</code></a>.</li> </ul> <h3>Internal</h3> <ul> <li>🔧 Update pre-commit to use local Ruff instead of hook. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14604">#14604</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> <li>✅ Add missing tests for code examples. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14569">#14569</a> by <a href="https://github.com/YuriiMotov"><code>@YuriiMotov</code></a>.</li> <li>👷 Remove <code>lint</code> job from <code>test</code> CI workflow. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14593">#14593</a> by <a href="https://github.com/YuriiMotov"><code>@YuriiMotov</code></a>.</li> <li>👷 Update secrets check. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14592">#14592</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> <li>👷 Run CodSpeed tests in parallel to other tests to speed up CI. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14586">#14586</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> <li>🔨 Update scripts and pre-commit to autofix files. 
PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14585">#14585</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> </ul> <h2>0.127.0</h2> <h3>Breaking Changes</h3> <ul> <li>🔊 Add deprecation warnings when using <code>pydantic.v1</code>. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14583">#14583</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> </ul> <h3>Translations</h3> <ul> <li>🔧 Add LLM prompt file for Korean, generated from the existing translations. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14546">#14546</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> <li>🔧 Add LLM prompt file for Japanese, generated from the existing translations. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14545">#14545</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> </ul> <h3>Internal</h3> <ul> <li>⬆️ Upgrade OpenAI model for translations to gpt-5.2. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14579">#14579</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> </ul> <h2>0.126.0</h2> <h3>Upgrades</h3> <ul> <li>➖ Drop support for Pydantic v1, keeping short temporary support for Pydantic v2's <code>pydantic.v1</code>. PR <a href="https://redirect.github.com/fastapi/fastapi/pull/14575">#14575</a> by <a href="https://github.com/tiangolo"><code>@tiangolo</code></a>.</li> </ul> <!-- raw HTML omitted --> </blockquote> <p>... 
(truncated)</p> </details> <details> <summary>Commits</summary> <ul> <li><a href="8322a4445a"><code>8322a44</code></a> 🔖 Release version 0.128.0</li> <li><a href="4b2cfcfd34"><code>4b2cfcf</code></a> 📝 Update release notes</li> <li><a href="e300630551"><code>e300630</code></a> ➖ Drop support for <code>pydantic.v1</code> (<a href="https://redirect.github.com/fastapi/fastapi/issues/14609">#14609</a>)</li> <li><a href="1b3bea8b6b"><code>1b3bea8</code></a> 📝 Update release notes</li> <li><a href="34e884156f"><code>34e8841</code></a> ✅ Run performance tests only on Pydantic v2 (<a href="https://redirect.github.com/fastapi/fastapi/issues/14608">#14608</a>)</li> <li><a href="cd90c78391"><code>cd90c78</code></a> 🔖 Release version 0.127.1</li> <li><a href="93f4dfd88b"><code>93f4dfd</code></a> 📝 Update release notes</li> <li><a href="535b5daa31"><code>535b5da</code></a> 🔊 Add a custom <code>FastAPIDeprecationWarning</code> (<a href="https://redirect.github.com/fastapi/fastapi/issues/14605">#14605</a>)</li> <li><a href="6b53786f62"><code>6b53786</code></a> 📝 Update release notes</li> <li><a href="d98f4eb56e"><code>d98f4eb</code></a> 🔧 Update pre-commit to use local Ruff instead of hook (<a href="https://redirect.github.com/fastapi/fastapi/issues/14604">#14604</a>)</li> <li>Additional commits viewable in <a href="https://github.com/fastapi/fastapi/compare/0.116.1...0.128.0">compare view</a></li> </ul> </details> <br /> Updates `google-cloud-logging` from 3.12.1 to 3.13.0 <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/googleapis/python-logging/releases">google-cloud-logging's releases</a>.</em></p> <blockquote> <h2>google-cloud-logging 3.13.0</h2> <h2><a href="https://github.com/googleapis/python-logging/compare/v3.12.1...v3.13.0">3.13.0</a> (2025-12-15)</h2> <h3>Features</h3> <ul> <li>Add support for python 3.14 (<a href="https://redirect.github.com/googleapis/python-logging/issues/1065">#1065</a>) (<a 
href="https://github.com/googleapis/python-logging/commit/6be3df6a">6be3df6a</a>)</li> </ul> <h3>Bug Fixes</h3> <ul> <li>remove setup.cfg configuration for creating universal wheels (<a href="https://redirect.github.com/googleapis/python-logging/issues/981">#981</a>) (<a href="https://github.com/googleapis/python-logging/commit/70f612c3">70f612c3</a>)</li> </ul> </blockquote> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/googleapis/python-logging/blob/main/CHANGELOG.md">google-cloud-logging's changelog</a>.</em></p> <blockquote> <h2><a href="https://github.com/googleapis/python-logging/compare/v3.12.1...v3.13.0">3.13.0</a> (2025-12-15)</h2> <h3>Features</h3> <ul> <li>Add support for python 3.14 (<a href="https://redirect.github.com/googleapis/python-logging/issues/1065">#1065</a>) (<a href="6be3df6aa9">6be3df6aa94539cd2ab22a4fac55b343862228b2</a>)</li> </ul> <h3>Bug Fixes</h3> <ul> <li>remove setup.cfg configuration for creating universal wheels (<a href="https://redirect.github.com/googleapis/python-logging/issues/981">#981</a>) (<a href="70f612c328">70f612c3281f1df13f3aba6b19bc4e9397297f3d</a>)</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="1415883be0"><code>1415883</code></a> chore: librarian release pull request: 20251215T134006Z (<a href="https://redirect.github.com/googleapis/python-logging/issues/1066">#1066</a>)</li> <li><a href="6be3df6aa9"><code>6be3df6</code></a> feat: Add support for python 3.14 (<a href="https://redirect.github.com/googleapis/python-logging/issues/1065">#1065</a>)</li> <li><a href="36fb4270b3"><code>36fb427</code></a> chore(librarian): onboard to librarian (<a href="https://redirect.github.com/googleapis/python-logging/issues/1061">#1061</a>)</li> <li><a href="eb189bf712"><code>eb189bf</code></a> chore: update Python generator version to 1.25.1 (<a href="https://redirect.github.com/googleapis/python-logging/issues/1003">#1003</a>)</li> 
<li><a href="a7a28d1b93"><code>a7a28d1</code></a> test: ignore DeprecationWarning for <code>credentials_file</code> argument and Python ve...</li> <li><a href="70f612c328"><code>70f612c</code></a> fix: remove setup.cfg configuration for creating universal wheels (<a href="https://redirect.github.com/googleapis/python-logging/issues/981">#981</a>)</li> <li><a href="e4c445a856"><code>e4c445a</code></a> chore: Update gapic-generator-python to 1.25.0 (<a href="https://redirect.github.com/googleapis/python-logging/issues/985">#985</a>)</li> <li><a href="14364a534a"><code>14364a5</code></a> test: Added cleanup of old sink storage buckets (<a href="https://redirect.github.com/googleapis/python-logging/issues/991">#991</a>)</li> <li>See full diff in <a href="https://github.com/googleapis/python-logging/compare/v3.12.1...v3.13.0">compare view</a></li> </ul> </details> <br /> Updates `launchdarkly-server-sdk` from 9.12.0 to 9.14.1 <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/launchdarkly/python-server-sdk/releases">launchdarkly-server-sdk's releases</a>.</em></p> <blockquote> <h2>v9.14.1</h2> <h2><a href="https://github.com/launchdarkly/python-server-sdk/compare/9.14.0...9.14.1">9.14.1</a> (2025-12-15)</h2> <h3>Bug Fixes</h3> <ul> <li>Remove all synchronizers in daemon mode (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/388">#388</a>) (<a href="441a5ecb3d">441a5ec</a>)</li> </ul> <hr /> <p>This PR was generated with <a href="https://github.com/googleapis/release-please">Release Please</a>. 
See <a href="https://github.com/googleapis/release-please#release-please">documentation</a>.</p> <!-- raw HTML omitted --> <h2>v9.14.0</h2> <h2><a href="https://github.com/launchdarkly/python-server-sdk/compare/9.13.1...9.14.0">9.14.0</a> (2025-12-04)</h2> <h3>Features</h3> <ul> <li>adding data system option to create file datasource initializer (<a href="e5b121f92a">e5b121f</a>)</li> <li>adding file data source as an initializer (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/381">#381</a>) (<a href="3700d1ddd9">3700d1d</a>)</li> </ul> <h3>Bug Fixes</h3> <ul> <li>Add warning if relying on Redis <code>max_connections</code> parameter (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/387">#387</a>) (<a href="e6395fa531">e6395fa</a>), closes <a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/386">#386</a></li> <li>modified initializer behavior to spec (<a href="064f65c761">064f65c</a>)</li> </ul> <hr /> <p>This PR was generated with <a href="https://github.com/googleapis/release-please">Release Please</a>. See <a href="https://github.com/googleapis/release-please#release-please">documentation</a>.</p> <!-- raw HTML omitted --> <h2>v9.13.1</h2> <h2><a href="https://github.com/launchdarkly/python-server-sdk/compare/9.13.0...9.13.1">9.13.1</a> (2025-11-19)</h2> <h3>Bug Fixes</h3> <ul> <li>Include ldclient.datasystem in docs (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/379">#379</a>) (<a href="318c6fea07">318c6fe</a>)</li> </ul> <hr /> <p>This PR was generated with <a href="https://github.com/googleapis/release-please">Release Please</a>. See <a href="https://github.com/googleapis/release-please#release-please">documentation</a>.</p> <!-- raw HTML omitted --> <h2>v9.13.0</h2> <h2><a href="https://github.com/launchdarkly/python-server-sdk/compare/9.12.3...9.13.0">9.13.0</a> (2025-11-19)</h2> <!-- raw HTML omitted --> </blockquote> <p>... 
(truncated)</p> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/launchdarkly/python-server-sdk/blob/main/CHANGELOG.md">launchdarkly-server-sdk's changelog</a>.</em></p> <blockquote> <h2><a href="https://github.com/launchdarkly/python-server-sdk/compare/9.14.0...9.14.1">9.14.1</a> (2025-12-15)</h2> <h3>Bug Fixes</h3> <ul> <li>Remove all synchronizers in daemon mode (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/388">#388</a>) (<a href="441a5ecb3d">441a5ec</a>)</li> </ul> <h2><a href="https://github.com/launchdarkly/python-server-sdk/compare/9.13.1...9.14.0">9.14.0</a> (2025-12-04)</h2> <h3>Features</h3> <ul> <li>adding data system option to create file datasource intializer (<a href="e5b121f92a">e5b121f</a>)</li> <li>adding file data source as an intializer (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/381">#381</a>) (<a href="3700d1ddd9">3700d1d</a>)</li> </ul> <h3>Bug Fixes</h3> <ul> <li>Add warning if relying on Redis <code>max_connections</code> parameter (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/387">#387</a>) (<a href="e6395fa531">e6395fa</a>), closes <a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/386">#386</a></li> <li>modified initializer behavior to spec (<a href="064f65c761">064f65c</a>)</li> </ul> <h2><a href="https://github.com/launchdarkly/python-server-sdk/compare/9.13.0...9.13.1">9.13.1</a> (2025-11-19)</h2> <h3>Bug Fixes</h3> <ul> <li>Include ldclient.datasystem in docs (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/379">#379</a>) (<a href="318c6fea07">318c6fe</a>)</li> </ul> <h2><a href="https://github.com/launchdarkly/python-server-sdk/compare/9.12.3...9.13.0">9.13.0</a> (2025-11-19)</h2> <h3>Features</h3> <ul> <li><strong>experimental:</strong> Release EAP support for FDv2 data system (<a 
href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/376">#376</a>) (<a href="0e7c32b4df">0e7c32b</a>)</li> </ul> <h2><a href="https://github.com/launchdarkly/python-server-sdk/compare/9.12.2...9.12.3">9.12.3</a> (2025-10-30)</h2> <h3>Bug Fixes</h3> <ul> <li>Fix overly generic type hint on File data source (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/365">#365</a>) (<a href="52a7499f7c">52a7499</a>), closes <a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/364">#364</a></li> </ul> <h2><a href="https://github.com/launchdarkly/python-server-sdk/compare/9.12.1...9.12.2">9.12.2</a> (2025-10-27)</h2> <h3>Bug Fixes</h3> <ul> <li>Fix incorrect event count in failure message (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/359">#359</a>) (<a href="91f416329b">91f4163</a>)</li> </ul> <h2><a href="https://github.com/launchdarkly/python-server-sdk/compare/9.12.0...9.12.1">9.12.1</a> (2025-09-30)</h2> <!-- raw HTML omitted --> </blockquote> <p>... 
(truncated)</p> </details> <details> <summary>Commits</summary> <ul> <li><a href="54e62cc706"><code>54e62cc</code></a> chore(main): release 9.14.1 (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/389">#389</a>)</li> <li><a href="441a5ecb3d"><code>441a5ec</code></a> fix: Remove all synchronizers in daemon mode (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/388">#388</a>)</li> <li><a href="7bb537827f"><code>7bb5378</code></a> chore(main): release 9.14.0 (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/382">#382</a>)</li> <li><a href="e6395fa531"><code>e6395fa</code></a> fix: Add warning if relying on Redis <code>max_connections</code> parameter (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/387">#387</a>)</li> <li><a href="45786a9a7e"><code>45786a9</code></a> chore: Expose flag change listeners from data system (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/384">#384</a>)</li> <li><a href="2b7eedc836"><code>2b7eedc</code></a> chore: Clean up unused _data_availability (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/383">#383</a>)</li> <li><a href="3700d1ddd9"><code>3700d1d</code></a> feat: adding file data source as an intializer (<a href="https://redirect.github.com/launchdarkly/python-server-sdk/issues/381">#381</a>)</li> <li><a href="04a2c538e5"><code>04a2c53</code></a> chore: PR comments</li> <li><a href="064f65c761"><code>064f65c</code></a> fix: modified initializer behavior to spec</li> <li><a href="e5b121f92a"><code>e5b121f</code></a> feat: adding data system option to create file datasource intializer</li> <li>Additional commits viewable in <a href="https://github.com/launchdarkly/python-server-sdk/compare/9.12.0...9.14.1">compare view</a></li> </ul> </details> <br /> Updates `pydantic` from 2.11.7 to 2.12.5 <details> <summary>Release notes</summary> <p><em>Sourced from <a 
href="https://github.com/pydantic/pydantic/releases">pydantic's releases</a>.</em></p> <blockquote> <h2>v2.12.5 2025-11-26</h2> <h2>v2.12.5 (2025-11-26)</h2> <p>This is the fifth 2.12 patch release, addressing an issue with the <code>MISSING</code> sentinel and providing several documentation improvements.</p> <p>The next 2.13 minor release will be published in a couple weeks, and will include a new <em>polymorphic serialization</em> feature addressing the remaining unexpected changes to the <em>serialize as any</em> behavior.</p> <ul> <li>Fix pickle error when using <code>model_construct()</code> on a model with <code>MISSING</code> as a default value by <a href="https://github.com/ornariece"><code>@ornariece</code></a> in <a href="https://redirect.github.com/pydantic/pydantic/pull/12522">#12522</a>.</li> <li>Several updates to the documentation by <a href="https://github.com/Viicos"><code>@Viicos</code></a>.</li> </ul> <p><strong>Full Changelog</strong>: <a href="https://github.com/pydantic/pydantic/compare/v2.12.4...v2.12.5">https://github.com/pydantic/pydantic/compare/v2.12.4...v2.12.5</a></p> <h2>v2.12.4 2025-11-05</h2> <h2>v2.12.4 (2025-11-05)</h2> <p>This is the fourth 2.12 patch release, fixing more regressions, and reverting a change in the <code>build()</code> method of the <a href="https://docs.pydantic.dev/latest/api/networks/"><code>AnyUrl</code> and Dsn types</a>.</p> <p>This patch release also fixes an issue with the serialization of IP address types, when <code>serialize_as_any</code> is used. 
The next patch release will try to address the remaining issues with <em>serialize as any</em> behavior by introducing a new <em>polymorphic serialization</em> feature, that should be used in most cases in place of <em>serialize as any</em>.</p> <ul> <li> <p>Fix issue with forward references in parent <code>TypedDict</code> classes by <a href="https://github.com/Viicos"><code>@Viicos</code></a> in <a href="https://redirect.github.com/pydantic/pydantic/pull/12427">#12427</a>.</p> <p>This issue is only relevant on Python 3.14 and greater.</p> </li> <li> <p>Exclude fields with <code>exclude_if</code> from JSON Schema required fields by <a href="https://github.com/Viicos"><code>@Viicos</code></a> in <a href="https://redirect.github.com/pydantic/pydantic/pull/12430">#12430</a></p> </li> <li> <p>Revert URL percent-encoding of credentials in the <code>build()</code> method of the <a href="https://docs.pydantic.dev/latest/api/networks/"><code>AnyUrl</code> and Dsn types</a> by <a href="https://github.com/davidhewitt"><code>@davidhewitt</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-core/pull/1833">pydantic-core#1833</a>.</p> <p>This was initially considered as a bugfix, but caused regressions and as such was fully reverted. 
The next release will include an opt-in option to percent-encode components of the URL.</p> </li> <li> <p>Add type inference for IP address types by <a href="https://github.com/davidhewitt"><code>@davidhewitt</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-core/pull/1868">pydantic-core#1868</a>.</p> <p>The 2.12 changes to the <code>serialize_as_any</code> behavior made it so that IP address types could not properly serialize to JSON.</p> </li> <li> <p>Avoid getting default values from defaultdict by <a href="https://github.com/davidhewitt"><code>@davidhewitt</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-core/pull/1853">pydantic-core#1853</a>.</p> <p>This fixes a subtle regression in the validation behavior of the <a href="https://docs.python.org/3/library/collections.html#collections.defaultdict"><code>collections.defaultdict</code></a> type.</p> </li> <li> <p>Fix issue with field serializers on nested typed dictionaries by <a href="https://github.com/davidhewitt"><code>@davidhewitt</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-core/pull/1879">pydantic-core#1879</a>.</p> </li> <li> <p>Add more <code>pydantic-core</code> builds for the free-threaded version of Python 3.14 by <a href="https://github.com/davidhewitt"><code>@davidhewitt</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-core/pull/1864">pydantic-core#1864</a>.</p> </li> </ul> <p><strong>Full Changelog</strong>: <a href="https://github.com/pydantic/pydantic/compare/v2.12.3...v2.12.4">https://github.com/pydantic/pydantic/compare/v2.12.3...v2.12.4</a></p> <h2>v2.12.3 2025-10-17</h2> <h2>v2.12.3 (2025-10-17)</h2> <h3>What's Changed</h3> <p>This is the third 2.12 patch release, fixing issues related to the <code>FieldInfo</code> class, and reverting a change to the supported <a href="https://docs.pydantic.dev/latest/concepts/validators/#model-validators"><em>after</em> model validator</a> function signatures.</p> <!-- raw 
HTML omitted --> </blockquote> <p>... (truncated)</p> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/pydantic/pydantic/blob/main/HISTORY.md">pydantic's changelog</a>.</em></p> <blockquote> <h2>v2.12.5 (2025-11-26)</h2> <p><a href="https://github.com/pydantic/pydantic/releases/tag/v2.12.5">GitHub release</a></p> <p>This is the fifth 2.12 patch release, addressing an issue with the <code>MISSING</code> sentinel and providing several documentation improvements.</p> <p>The next 2.13 minor release will be published in a couple weeks, and will include a new <em>polymorphic serialization</em> feature addressing the remaining unexpected changes to the <em>serialize as any</em> behavior.</p> <ul> <li>Fix pickle error when using <code>model_construct()</code> on a model with <code>MISSING</code> as a default value by <a href="https://github.com/ornariece"><code>@ornariece</code></a> in <a href="https://redirect.github.com/pydantic/pydantic/pull/12522">#12522</a>.</li> <li>Several updates to the documentation by <a href="https://github.com/Viicos"><code>@Viicos</code></a>.</li> </ul> <h2>v2.12.4 (2025-11-05)</h2> <p><a href="https://github.com/pydantic/pydantic/releases/tag/v2.12.4">GitHub release</a></p> <p>This is the fourth 2.12 patch release, fixing more regressions, and reverting a change in the <code>build()</code> method of the <a href="https://docs.pydantic.dev/latest/api/networks/"><code>AnyUrl</code> and Dsn types</a>.</p> <p>This patch release also fixes an issue with the serialization of IP address types, when <code>serialize_as_any</code> is used. 
The next patch release will try to address the remaining issues with <em>serialize as any</em> behavior by introducing a new <em>polymorphic serialization</em> feature, that should be used in most cases in place of <em>serialize as any</em>.</p> <ul> <li> <p>Fix issue with forward references in parent <code>TypedDict</code> classes by <a href="https://github.com/Viicos"><code>@Viicos</code></a> in <a href="https://redirect.github.com/pydantic/pydantic/pull/12427">#12427</a>.</p> <p>This issue is only relevant on Python 3.14 and greater.</p> </li> <li> <p>Exclude fields with <code>exclude_if</code> from JSON Schema required fields by <a href="https://github.com/Viicos"><code>@Viicos</code></a> in <a href="https://redirect.github.com/pydantic/pydantic/pull/12430">#12430</a></p> </li> <li> <p>Revert URL percent-encoding of credentials in the <code>build()</code> method of the <a href="https://docs.pydantic.dev/latest/api/networks/"><code>AnyUrl</code> and Dsn types</a> by <a href="https://github.com/davidhewitt"><code>@davidhewitt</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-core/pull/1833">pydantic-core#1833</a>.</p> <p>This was initially considered as a bugfix, but caused regressions and as such was fully reverted. 
The next release will include an opt-in option to percent-encode components of the URL.</p> </li> <li> <p>Add type inference for IP address types by <a href="https://github.com/davidhewitt"><code>@davidhewitt</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-core/pull/1868">pydantic-core#1868</a>.</p> <p>The 2.12 changes to the <code>serialize_as_any</code> behavior made it so that IP address types could not properly serialize to JSON.</p> </li> <li> <p>Avoid getting default values from defaultdict by <a href="https://github.com/davidhewitt"><code>@davidhewitt</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-core/pull/1853">pydantic-core#1853</a>.</p> <p>This fixes a subtle regression in the validation behavior of the <a href="https://docs.python.org/3/library/collections.html#collections.defaultdict"><code>collections.defaultdict</code></a> type.</p> </li> <li> <p>Fix issue with field serializers on nested typed dictionaries by <a href="https://github.com/davidhewitt"><code>@davidhewitt</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-core/pull/1879">pydantic-core#1879</a>.</p> </li> <li> <p>Add more <code>pydantic-core</code> builds for the free-threaded version of Python 3.14 by <a href="https://github.com/davidhewitt"><code>@davidhewitt</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-core/pull/1864">pydantic-core#1864</a>.</p> </li> </ul> <h2>v2.12.3 (2025-10-17)</h2> <p><a href="https://github.com/pydantic/pydantic/releases/tag/v2.12.3">GitHub release</a></p> <!-- raw HTML omitted --> </blockquote> <p>... 
(truncated)</p> </details> <details> <summary>Commits</summary> <ul> <li><a href="bd2d0dd013"><code>bd2d0dd</code></a> Prepare release v2.12.5</li> <li><a href="7d0302ec7e"><code>7d0302e</code></a> Document security implications when using <code>create_model()</code></li> <li><a href="e9ef980def"><code>e9ef980</code></a> Fix typo in Standard Library Types documentation</li> <li><a href="f2c20c00c2"><code>f2c20c0</code></a> Add <code>pydantic-docs</code> dev dependency, make use of versioning blocks</li> <li><a href="a76c1aa26f"><code>a76c1aa</code></a> Update documentation about JSON Schema</li> <li><a href="8cbc72ca48"><code>8cbc72c</code></a> Add documentation about custom <code>__init__()</code></li> <li><a href="99eba59906"><code>99eba59</code></a> Add additional test for <code>FieldInfo.get_default()</code></li> <li><a href="c71076988e"><code>c710769</code></a> Special case <code>MISSING</code> sentinel in <code>smart_deepcopy()</code></li> <li><a href="20a9d771c2"><code>20a9d77</code></a> Do not delete mock validator/serializer in <code>rebuild_dataclass()</code></li> <li><a href="c86515a3a8"><code>c86515a</code></a> Update parts of the model and <code>revalidate_instances</code> documentation</li> <li>Additional commits viewable in <a href="https://github.com/pydantic/pydantic/compare/v2.11.7...v2.12.5">compare view</a></li> </ul> </details> <br /> Updates `pydantic-settings` from 2.10.1 to 2.12.0 <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/pydantic/pydantic-settings/releases">pydantic-settings's releases</a>.</em></p> <blockquote> <h2>v2.12.0</h2> <h2>What's Changed</h2> <ul> <li>Support for enum kebab case. 
by <a href="https://github.com/kschwab"><code>@kschwab</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/686">pydantic/pydantic-settings#686</a></li> <li>Apply source order: init > env > dotenv > secrets > defaults and pres… by <a href="https://github.com/chbndrhnns"><code>@chbndrhnns</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/688">pydantic/pydantic-settings#688</a></li> <li>Add NestedSecretsSettings source by <a href="https://github.com/makukha"><code>@makukha</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/690">pydantic/pydantic-settings#690</a></li> <li>Strip non-explicit default values. by <a href="https://github.com/kschwab"><code>@kschwab</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/692">pydantic/pydantic-settings#692</a></li> <li>Coerce env vars if strict is True. by <a href="https://github.com/kschwab"><code>@kschwab</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/693">pydantic/pydantic-settings#693</a></li> <li>Restore init kwarg names before returning final state dictionary. by <a href="https://github.com/kschwab"><code>@kschwab</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/700">pydantic/pydantic-settings#700</a></li> <li>Drop Python3.9 support by <a href="https://github.com/hramezani"><code>@hramezani</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/699">pydantic/pydantic-settings#699</a></li> <li>Adapt test_protected_namespace_defaults for dev. 
Pydantic by <a href="https://github.com/musicinmybrain"><code>@musicinmybrain</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/637">pydantic/pydantic-settings#637</a></li> <li>Add Python 3.14 by <a href="https://github.com/hramezani"><code>@hramezani</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/704">pydantic/pydantic-settings#704</a></li> <li>Prepare release 2.12 by <a href="https://github.com/hramezani"><code>@hramezani</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/705">pydantic/pydantic-settings#705</a></li> </ul> <h2>New Contributors</h2> <ul> <li><a href="https://github.com/chbndrhnns"><code>@chbndrhnns</code></a> made their first contribution in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/688">pydantic/pydantic-settings#688</a></li> </ul> <p><strong>Full Changelog</strong>: <a href="https://github.com/pydantic/pydantic-settings/compare/v2.11.0...v2.12.0">https://github.com/pydantic/pydantic-settings/compare/v2.11.0...v2.12.0</a></p> <h2>v2.11.0</h2> <h2>What's Changed</h2> <ul> <li>CLI Serialize Support by <a href="https://github.com/kschwab"><code>@kschwab</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/643">pydantic/pydantic-settings#643</a></li> <li>Inspect type aliases to determine if an annotation is complex by <a href="https://github.com/tselepakis"><code>@tselepakis</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/644">pydantic/pydantic-settings#644</a></li> <li>Revert "fix: Respect 'cli_parse_args' from model_config with settings_customise_sources (<a href="https://redirect.github.com/pydantic/pydantic-settings/issues/611">#611</a>)" by <a href="https://github.com/hramezani"><code>@hramezani</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/655">pydantic/pydantic-settings#655</a></li> <li>Remove parsing of command line 
arguments from <code>CliSettingsSource.__init__</code>. by <a href="https://github.com/trygve-baerland"><code>@trygve-baerland</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/656">pydantic/pydantic-settings#656</a></li> <li>turn off allow_abbrev on subparsers by <a href="https://github.com/mroch"><code>@mroch</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/658">pydantic/pydantic-settings#658</a></li> <li>CLI Serialization Fixes by <a href="https://github.com/kschwab"><code>@kschwab</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/649">pydantic/pydantic-settings#649</a></li> <li>Fix PydanticModel type checking. by <a href="https://github.com/kschwab"><code>@kschwab</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/659">pydantic/pydantic-settings#659</a></li> <li>Avoid env_prefix falling back to env vars without prefix by <a href="https://github.com/tselepakis"><code>@tselepakis</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/648">pydantic/pydantic-settings#648</a></li> <li>Warn if model_config sets unused keys for missing settings sources by <a href="https://github.com/HomerusJa"><code>@HomerusJa</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/663">pydantic/pydantic-settings#663</a></li> <li>Included endpoint_url kwarg in AWSSecretsManagerSettingsSource class by <a href="https://github.com/adrianohrl"><code>@adrianohrl</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/664">pydantic/pydantic-settings#664</a></li> <li>Fix typo ("Accesing") in the "Adding sources" docs by <a href="https://github.com/deepyaman"><code>@deepyaman</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/668">pydantic/pydantic-settings#668</a></li> <li>CLI Windows Path Fix by <a 
href="https://github.com/kschwab"><code>@kschwab</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/669">pydantic/pydantic-settings#669</a></li> <li>Cli root model support by <a href="https://github.com/kschwab"><code>@kschwab</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/677">pydantic/pydantic-settings#677</a></li> <li>Snake case conversion in Azure Key Vault by <a href="https://github.com/AndreuCodina"><code>@AndreuCodina</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/680">pydantic/pydantic-settings#680</a></li> <li>Make <code>InitSettingsSource</code> resolution deterministic by <a href="https://github.com/enrico-stauss"><code>@enrico-stauss</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/681">pydantic/pydantic-settings#681</a></li> <li>Update deps by <a href="https://github.com/hramezani"><code>@hramezani</code></a> in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/683">pydantic/pydantic-settings#683</a></li> </ul> <h2>New Contributors</h2> <ul> <li><a href="https://github.com/tselepakis"><code>@tselepakis</code></a> made their first contribution in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/644">pydantic/pydantic-settings#644</a></li> <li><a href="https://github.com/trygve-baerland"><code>@trygve-baerland</code></a> made their first contribution in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/656">pydantic/pydantic-settings#656</a></li> <li><a href="https://github.com/mroch"><code>@mroch</code></a> made their first contribution in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/658">pydantic/pydantic-settings#658</a></li> <li><a href="https://github.com/HomerusJa"><code>@HomerusJa</code></a> made their first contribution in <a 
href="https://redirect.github.com/pydantic/pydantic-settings/pull/663">pydantic/pydantic-settings#663</a></li> <li><a href="https://github.com/adrianohrl"><code>@adrianohrl</code></a> made their first contribution in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/664">pydantic/pydantic-settings#664</a></li> <li><a href="https://github.com/deepyaman"><code>@deepyaman</code></a> made their first contribution in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/668">pydantic/pydantic-settings#668</a></li> <li><a href="https://github.com/enrico-stauss"><code>@enrico-stauss</code></a> made their first contribution in <a href="https://redirect.github.com/pydantic/pydantic-settings/pull/681">pydantic/pydantic-settings#681</a></li> </ul> <p><strong>Full Changelog</strong>: <a href="https://github.com/pydantic/pydantic-settings/compare/2.10.1...v2.11.0">https://github.com/pydantic/pydantic-settings/compare/2.10.1...v2.11.0</a></p> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="584983d253"><code>584983d</code></a> Prepare release 2.12 (<a href="https://redirect.github.com/pydantic/pydantic-settings/issues/705">#705</a>)</li> <li><a href="6b4d87e776"><code>6b4d87e</code></a> Add Python 3.14 (<a href="https://redirect.github.com/pydantic/pydantic-settings/issues/704">#704</a>)</li> <li><a href="02de5b622b"><code>02de5b6</code></a> Adapt test_protected_namespace_defaults for dev. Pydantic (<a href="https://redirect.github.com/pydantic/pydantic-settings/issues/637">#637</a>)</li> <li><a href="4239ea460a"><code>4239ea4</code></a> Drop Python3.9 support (<a href="https://redirect.github.com/pydantic/pydantic-settings/issues/699">#699</a>)</li> <li><a href="5008c694f6"><code>5008c69</code></a> Restore init kwarg names before returning final state dictionary. 
(<a href="https://redirect.github.com/pydantic/pydantic-settings/issues/700">#700</a>)</li> <li><a href="4433101fef"><code>4433101</code></a> Coerce env vars if strict is True. (<a href="https://redirect.github.com/pydantic/pydantic-settings/issues/693">#693</a>)</li> <li><a href="4d2ebfd543"><code>4d2ebfd</code></a> Strip non-explicit default values. (<a href="https://redirect.github.com/pydantic/pydantic-settings/issues/692">#692</a>)</li> <li><a href="4a6ffcaeae"><code>4a6ffca</code></a> Add NestedSecretsSettings source (<a href="https://redirect.github.com/pydantic/pydantic-settings/issues/690">#690</a>)</li> <li><a href="7a6e96ebfc"><code>7a6e96e</code></a> Apply source order: init > env > dotenv > secrets > defaults and pres… (<a href="https://redirect.github.com/pydantic/pydantic-settings/issues/688">#688</a>)</li> <li><a href="68563eddc0"><code>68563ed</code></a> Support for enum kebab case. (<a href="https://redirect.github.com/pydantic/pydantic-settings/issues/686">#686</a>)</li> <li>Additional commits viewable in <a href="https://github.com/pydantic/pydantic-settings/compare/2.10.1...v2.12.0">compare view</a></li> </ul> </details> <br /> Updates `pyjwt` from 2.10.1 to 2.11.0 <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/jpadilla/pyjwt/releases">pyjwt's releases</a>.</em></p> <blockquote> <h2>2.11.0</h2> <h2>What's Changed</h2> <ul> <li>Fixed type error in comment by <a href="https://github.com/shuhaib-aot"><code>@shuhaib-aot</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1026">jpadilla/pyjwt#1026</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1018">jpadilla/pyjwt#1018</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a 
href="https://redirect.github.com/jpadilla/pyjwt/pull/1033">jpadilla/pyjwt#1033</a></li> <li>Make note of use of leeway with nbf by <a href="https://github.com/djw8605"><code>@djw8605</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1034">jpadilla/pyjwt#1034</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1035">jpadilla/pyjwt#1035</a></li> <li>Fixes <a href="https://redirect.github.com/jpadilla/pyjwt/issues/964">#964</a>: Validate key against allowed types for Algorithm family by <a href="https://github.com/pachewise"><code>@pachewise</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/985">jpadilla/pyjwt#985</a></li> <li>Feat <a href="https://redirect.github.com/jpadilla/pyjwt/issues/1024">#1024</a>: Add iterator for PyJWKSet by <a href="https://github.com/pachewise"><code>@pachewise</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1041">jpadilla/pyjwt#1041</a></li> <li>Fixes <a href="https://redirect.github.com/jpadilla/pyjwt/issues/1039">#1039</a>: Add iss, issuer type checks by <a href="https://github.com/pachewise"><code>@pachewise</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1040">jpadilla/pyjwt#1040</a></li> <li>Fixes <a href="https://redirect.github.com/jpadilla/pyjwt/issues/660">#660</a>: Improve typing/logic for <code>options</code> in decode, decode_complete; Improve docs by <a href="https://github.com/pachewise"><code>@pachewise</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1045">jpadilla/pyjwt#1045</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1042">jpadilla/pyjwt#1042</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a 
href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1052">jpadilla/pyjwt#1052</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1053">jpadilla/pyjwt#1053</a></li> <li>Fix <a href="https://redirect.github.com/jpadilla/pyjwt/issues/1022">#1022</a>: Map <code>algorithm=None</code> to "none" by <a href="https://github.com/qqii"><code>@qqii</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1056">jpadilla/pyjwt#1056</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1055">jpadilla/pyjwt#1055</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1058">jpadilla/pyjwt#1058</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1060">jpadilla/pyjwt#1060</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1061">jpadilla/pyjwt#1061</a></li> <li>Fixes <a href="https://redirect.github.com/jpadilla/pyjwt/issues/1047">#1047</a>: Correct <code>PyJWKClient.get_signing_key_from_jwt</code> annotation by <a href="https://github.com/khvn26"><code>@khvn26</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1048">jpadilla/pyjwt#1048</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a 
href="https://redirect.github.com/jpadilla/pyjwt/pull/1062">jpadilla/pyjwt#1062</a></li> <li>Fixed doc string typo in _validate_jti() function <a href="https://redirect.github.com/jpadilla/pyjwt/issues/1063">#1063</a> by <a href="https://github.com/kuldeepkhatke"><code>@kuldeepkhatke</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1064">jpadilla/pyjwt#1064</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1065">jpadilla/pyjwt#1065</a></li> <li>Update SECURITY.md by <a href="https://github.com/auvipy"><code>@auvipy</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1057">jpadilla/pyjwt#1057</a></li> <li>Typing fix: use <code>float</code> instead of <code>int</code> for <code>lifespan</code> and <code>timeout</code> by <a href="https://github.com/nikitagashkov"><code>@nikitagashkov</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1068">jpadilla/pyjwt#1068</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1067">jpadilla/pyjwt#1067</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1071">jpadilla/pyjwt#1071</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1076">jpadilla/pyjwt#1076</a></li> <li>Fix TYP header documentation by <a href="https://github.com/fobiasmog"><code>@fobiasmog</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1046">jpadilla/pyjwt#1046</a></li> <li>doc: Document claims sub and jti by <a 
href="https://github.com/cleder"><code>@cleder</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1088">jpadilla/pyjwt#1088</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1077">jpadilla/pyjwt#1077</a></li> <li>Bump actions/setup-python from 5 to 6 by <a href="https://github.com/dependabot"><code>@dependabot</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1089">jpadilla/pyjwt#1089</a></li> <li>Bump actions/stale from 8 to 10 by <a href="https://github.com/dependabot"><code>@dependabot</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1090">jpadilla/pyjwt#1090</a></li> <li>Bump actions/checkout from 4 to 5 by <a href="https://github.com/dependabot"><code>@dependabot</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1083">jpadilla/pyjwt#1083</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1091">jpadilla/pyjwt#1091</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1093">jpadilla/pyjwt#1093</a></li> <li>[pre-commit.ci] pre-commit autoupdate by <a href="https://github.com/pre-commit-ci"><code>@pre-commit-ci</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1096">jpadilla/pyjwt#1096</a></li> <li>Resolve package build warnings by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1105">jpadilla/pyjwt#1105</a></li> <li>Support Python 3.14, and test against PyPy 3.10+ by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <a 
href="https://redirect.github.com/jpadilla/pyjwt/pull/1104">jpadilla/pyjwt#1104</a></li> <li>Fix a <code>SyntaxWarning</code> caused by invalid escape sequences by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1103">jpadilla/pyjwt#1103</a></li> <li>Standardize CHANGELOG links to PRs by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1110">jpadilla/pyjwt#1110</a></li> <li>Migrate from <code>pep517</code>, which is deprecated, to <code>build</code> by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1108">jpadilla/pyjwt#1108</a></li> <li>Fix incorrectly-named test suite function by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1116">jpadilla/pyjwt#1116</a></li> <li>Fix Read the Docs builds by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1111">jpadilla/pyjwt#1111</a></li> <li>Bump actions/download-artifact from 4 to 6 by <a href="https://github.com/dependabot"><code>@dependabot</code></a>[bot] in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1118">jpadilla/pyjwt#1118</a></li> <li>Escalate test suite warnings to errors by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1107">jpadilla/pyjwt#1107</a></li> <li>Add pyupgrade as a pre-commit hook by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1109">jpadilla/pyjwt#1109</a></li> <li>Simplify the test suite decorators by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <a 
href="https://redirect.github.com/jpadilla/pyjwt/pull/1113">jpadilla/pyjwt#1113</a></li> <li>Improve coverage config and eliminate unused test suite code by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1115">jpadilla/pyjwt#1115</a></li> <li>Build a shared wheel once in the test suite by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <a href="https://redirect.github.com/jpadilla/pyjwt/pull/1114">jpadilla/pyjwt#1114</a></li> </ul> <!-- raw HTML omitted --> </blockquote> <p>... (truncated)</p> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/jpadilla/pyjwt/blob/master/CHANGELOG.rst">pyjwt's changelog</a>.</em></p> <blockquote> <h2><code>v2.11.0 <https://github.com/jpadilla/pyjwt/compare/2.10.1...2.11.0></code>__</h2> <p>Fixed</p> <pre><code> - Enforce ECDSA curve validation per RFC 7518 Section 3.4. - Fix build system warnings by @kurtmckee in `[#1105](https://github.com/jpadilla/pyjwt/issues/1105) <https://github.com/jpadilla/pyjwt/pull/1105>`__ - Validate key against allowed types for Algorithm family in `[#964](https://github.com/jpadilla/pyjwt/issues/964) <https://github.com/jpadilla/pyjwt/pull/964>`__ - Add iterator for JWKSet in `[#1041](https://github.com/jpadilla/pyjwt/issues/1041) <https://github.com/jpadilla/pyjwt/pull/1041>`__ - Validate `iss` claim is a string during encoding and decoding by @pachewise in `[#1040](https://github.com/jpadilla/pyjwt/issues/1040) <https://github.com/jpadilla/pyjwt/pull/1040>`__ - Improve typing/logic for `options` in decode, decode_complete by @pachewise in `[#1045](https://github.com/jpadilla/pyjwt/issues/1045) <https://github.com/jpadilla/pyjwt/pull/1045>`__ - Declare float supported type for lifespan and timeout by @nikitagashkov in `[#1068](https://github.com/jpadilla/pyjwt/issues/1068) <https://github.com/jpadilla/pyjwt/pull/1068>`__ - Fix 
``SyntaxWarning``\s/``DeprecationWarning``\s caused by invalid escape sequences by @kurtmckee in `[#1103](https://github.com/jpadilla/pyjwt/issues/1103) <https://github.com/jpadilla/pyjwt/pull/1103>`__ - Development: Build a shared wheel once to speed up test suite setup times by @kurtmckee in `[#1114](https://github.com/jpadilla/pyjwt/issues/1114) <https://github.com/jpadilla/pyjwt/pull/1114>`__ - Development: Test type annotations across all supported Python versions, increase the strictness of the type checking, and remove the mypy pre-commit hook by @kurtmckee in `[#1112](https://github.com/jpadilla/pyjwt/issues/1112) <https://github.com/jpadilla/pyjwt/pull/1112>`__ <p>Added </code></pre></p> <ul> <li>Support Python 3.14, and test against PyPy 3.10 and 3.11 by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <code>[#1104](https://github.com/jpadilla/pyjwt/issues/1104) <https://github.com/jpadilla/pyjwt/pull/1104></code>__</li> <li>Development: Migrate to <code>build</code> to test package building in CI by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <code>[#1108](https://github.com/jpadilla/pyjwt/issues/1108) <https://github.com/jpadilla/pyjwt/pull/1108></code>__</li> <li>Development: Improve coverage config and eliminate unused test suite code by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <code>[#1115](https://github.com/jpadilla/pyjwt/issues/1115) <https://github.com/jpadilla/pyjwt/pull/1115></code>__</li> <li>Docs: Standardize CHANGELOG links to PRs by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <code>[#1110](https://github.com/jpadilla/pyjwt/issues/1110) <https://github.com/jpadilla/pyjwt/pull/1110></code>__</li> <li>Docs: Fix Read the Docs builds by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <code>[#1111](https://github.com/jpadilla/pyjwt/issues/1111) <https://github.com/jpadilla/pyjwt/pull/1111></code>__</li> <li>Docs: Add example 
of using leeway with nbf by <a href="https://github.com/djw8605"><code>@djw8605</code></a> in <code>[#1034](https://github.com/jpadilla/pyjwt/issues/1034) <https://github.com/jpadilla/pyjwt/pull/1034></code>__</li> <li>Docs: Refactored docs with <code>autodoc</code>; added <code>PyJWS</code> and <code>jwt.algorithms</code> docs by <a href="https://github.com/pachewise"><code>@pachewise</code></a> in <code>[#1045](https://github.com/jpadilla/pyjwt/issues/1045) <https://github.com/jpadilla/pyjwt/pull/1045></code>__</li> <li>Docs: Documentation improvements for "sub" and "jti" claims by <a href="https://github.com/cleder"><code>@cleder</code></a> in <code>[#1088](https://github.com/jpadilla/pyjwt/issues/1088) <https://github.com/jpadilla/pyjwt/pull/1088></code>__</li> <li>Development: Add pyupgrade as a pre-commit hook by <a href="https://github.com/kurtmckee"><code>@kurtmckee</code></a> in <code>[#1109](https://github.com/jpadilla/pyjwt/issues/1109) <https://github.com/jpadilla/pyjwt/pull/1109></code>__</li> <li>Add minimum key length validation for HMAC and RSA keys (CWE-326). Warns by default via <code>InsecureKeyLengthWarning</code> when keys are below minimum recommended lengths per RFC 7518 Section 3.2 (HMAC) and NIST SP 800-131A (RSA). 
Pass <code>enforce_minimum_key_length=True</code> in options to <code>PyJWT</code> or <code>PyJWS</code> to raise <code>InvalidKeyError</code> instead.</li> <li>Refactor <code>PyJWT</code> to own an internal <code>PyJWS</code> instance instead of calling global <code>api_jws</code> functions.</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="697344d259"><code>697344d</code></a> bump up version</li> <li><a href="e4d0aec024"><code>e4d0aec</code></a> fix: pre-commit</li> <li><a href="df9a6a0c44"><code>df9a6a0</code></a> fix: failing test</li> <li><a href="2b2e53cd23"><code>2b2e53c</code></a> fix: docs</li> <li><a href="635c8d89dd"><code>635c8d8</code></a> fix: failing mypy</li> <li><a href="96ae3563b9"><code>96ae356</code></a> feat: add minimum key length validation for HMAC and RSA</li> <li><a href="5b86227733"><code>5b86227</code></a> fix: enforce ECDSA curve validation per RFC 7518 Section 3.4</li> <li><a href="04947d75dc"><code>04947d7</code></a> Bump actions/download-artifact from 6 to 7 (<a href="https://redirect.github.com/jpadilla/pyjwt/issues/1125">#1125</a>)</li> <li><a href="dd448344c3"><code>dd44834</code></a> Fix leeway value in usage documentation (<a href="https://redirect.github.com/jpadilla/pyjwt/issues/1124">#1124</a>)</li> <li><a href="407f0bde99"><code>407f0bd</code></a> Thoroughly test type annotations, and resolve errors (<a href="https://redirect.github.com/jpadilla/pyjwt/issues/1112">#1112</a>)</li> <li>Additional commits viewable in <a href="https://github.com/jpadilla/pyjwt/compare/2.10.1...2.11.0">compare view</a></li> </ul> </details> <br /> Updates `supabase` from 2.16.0 to 2.27.2 <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/supabase/supabase-py/releases">supabase's releases</a>.</em></p> <blockquote> <h2>v2.27.2</h2> <h2><a href="https://github.com/supabase/supabase-py/compare/v2.27.1...v2.27.2">2.27.2</a> (2026-01-14)</h2> <h3>Bug Fixes</h3> <ul> 
<li><strong>ci:</strong> generate new token for release-please (<a href="https://redirect.github.com/supabase/supabase-py/issues/1348">#1348</a>) (<a href="c2ad37f9dc">c2ad37f</a>)</li> <li><strong>ci:</strong> run CI when .github files change (<a href="https://redirect.github.com/supabase/supabase-py/issues/1349">#1349</a>) (<a href="a221aac029">a221aac</a>)</li> <li><strong>realtime:</strong> ammend reconnect logic to not unsubscribe (<a href="https://redirect.github.com/supabase/supabase-py/issues/1346">#1346</a>) (<a href="cfbe5943cb">cfbe594</a>)</li> </ul> <h2>v2.27.1</h2> <h2><a href="https://github.com/supabase/supabase-py/compare/v2.27.0...v2.27.1">2.27.1</a> (2026-01-06)</h2> <h3>Bug Fixes</h3> <ul> <li><strong>realtime:</strong> use 'event' instead of 'events' in postgres_changes protocol (<a href="https://redirect.github.com/supabase/supabase-py/issues/1339">#1339</a>) (<a href="c1e7986c5e">c1e7986</a>)</li> <li><strong>storage:</strong> catch bad responses from server (<a href="https://redirect.github.com/supabase/supabase-py/issues/1344">#1344</a>) (<a href="ddb50547db">ddb5054</a>)</li> </ul> <h2>v2.27.0</h2> <h2><a href="https://github.com/supabase/supabase-py/compare/v2.26.0...v2.27.0">2.27.0</a> (2025-12-16)</h2> <h3>Features</h3> <ul> <li><strong>auth:</strong> add X (OAuth 2.0) provider (<a href="https://redirect.github.com/supabase/supabase-py/issues/1335">#1335</a>) (<a href="f600f96b52">f600f96</a>)</li> </ul> <h3>Bug Fixes</h3> <ul> <li><strong>storage:</strong> replace deprecated pydantic Extra with literal values (<a href="https://redirect.github.com/supabase/supabase-py/issues/1334">#1334</a>) (<a href="6df3545785">6df3545</a>)</li> </ul> <h2>v2.26.... 
_Description has been truncated_ --------- Signed-off-by: dependabot[bot] <support@github.com> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: claude[bot] <41898282+claude[bot]@users.noreply.github.com> Co-authored-by: Nicholas Tindle <ntindle@users.noreply.github.com> Co-authored-by: Nicholas Tindle <nicholas.tindle@agpt.co> Co-authored-by: Nick Tindle <nick@ntindle.com>
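The pydantic-settings change noted above ("Apply source order: init > env > dotenv > secrets > defaults", #688) amounts to a first-match-wins merge across settings sources. A minimal, dependency-free sketch of that precedence rule — the dict names here are illustrative stand-ins, not pydantic-settings API:

```python
def resolve_settings(*sources: dict) -> dict:
    """Merge settings dicts where earlier sources take precedence:
    the first source that defines a key wins."""
    resolved: dict = {}
    for source in sources:
        for key, value in source.items():
            # setdefault keeps any value set by a higher-priority source
            resolved.setdefault(key, value)
    return resolved


init_kwargs = {"host": "cli.example"}
env_vars = {"host": "env.example", "port": 8080}
dotenv = {"port": 9090, "debug": True}
secrets: dict = {}
defaults = {"host": "localhost", "port": 80, "debug": False, "tls": False}

# init > env > dotenv > secrets > defaults
settings = resolve_settings(init_kwargs, env_vars, dotenv, secrets, defaults)
# → {"host": "cli.example", "port": 8080, "debug": True, "tls": False}
```

Each key resolves at the highest-priority source that sets it: `host` from init kwargs, `port` from the environment, `debug` from the dotenv file, and `tls` from the defaults.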
1408 lines
52 KiB
Python
import time
from datetime import datetime
from enum import Enum
from typing import Annotated, Any, Dict, List, Optional

from exa_py import AsyncExa, Exa
from exa_py.websets.types import (
    CreateCriterionParameters,
    CreateEnrichmentParameters,
    CreateWebsetParameters,
    CreateWebsetParametersSearch,
    ExcludeItem,
    Format,
    ImportItem,
    ImportSource,
    Option,
    ScopeItem,
    ScopeRelationship,
    ScopeSourceType,
    WebsetArticleEntity,
    WebsetCompanyEntity,
    WebsetCustomEntity,
    WebsetPersonEntity,
    WebsetResearchPaperEntity,
    WebsetStatus,
)
from pydantic import Field

from backend.sdk import (
    APIKeyCredentials,
    BaseModel,
    Block,
    BlockCategory,
    BlockOutput,
    BlockSchemaInput,
    BlockSchemaOutput,
    CredentialsMetaInput,
    SchemaField,
)

from ._config import exa

class SearchEntityType(str, Enum):
    COMPANY = "company"
    PERSON = "person"
    ARTICLE = "article"
    RESEARCH_PAPER = "research_paper"
    CUSTOM = "custom"
    AUTO = "auto"


class SearchType(str, Enum):
    IMPORT = "import"
    WEBSET = "webset"


class EnrichmentFormat(str, Enum):
    TEXT = "text"
    DATE = "date"
    NUMBER = "number"
    OPTIONS = "options"
    EMAIL = "email"
    PHONE = "phone"

class Webset(BaseModel):
    id: str
    status: WebsetStatus | None = Field(..., title="WebsetStatus")
    """
    The status of the webset
    """
    external_id: Annotated[Optional[str], Field(alias="externalId")] = None
    """
    The external identifier for the webset

    NOTE: Returning dict to avoid ui crashing due to nested objects
    """
    searches: List[dict[str, Any]] | None = None
    """
    The searches that have been performed on the webset.

    NOTE: Returning dict to avoid ui crashing due to nested objects
    """
    enrichments: List[dict[str, Any]] | None = None
    """
    The Enrichments to apply to the Webset Items.

    NOTE: Returning dict to avoid ui crashing due to nested objects
    """
    monitors: List[dict[str, Any]] | None = None
    """
    The Monitors for the Webset.

    NOTE: Returning dict to avoid ui crashing due to nested objects
    """
    metadata: Optional[Dict[str, Any]] = {}
    """
    Set of key-value pairs you want to associate with this object.
    """
    created_at: Annotated[datetime | None, Field(alias="createdAt")] = None
    """
    The date and time the webset was created
    """
    updated_at: Annotated[datetime | None, Field(alias="updatedAt")] = None
    """
    The date and time the webset was last updated
    """

class ExaCreateWebsetBlock(Block):
    class Input(BlockSchemaInput):
        credentials: CredentialsMetaInput = exa.credentials_field(
            description="The Exa integration requires an API Key."
        )

        # Search parameters (flattened)
        search_query: str = SchemaField(
            description="Your search query. Use this to describe what you are looking for. Any URL provided will be crawled and used as context for the search.",
            placeholder="Marketing agencies based in the US, that focus on consumer products",
        )
        search_count: Optional[int] = SchemaField(
            default=10,
            description="Number of items the search will attempt to find. The actual number of items found may be less than this number depending on the search complexity.",
            ge=1,
            le=1000,
        )
        search_entity_type: SearchEntityType = SchemaField(
            default=SearchEntityType.AUTO,
            description="Entity type: 'company', 'person', 'article', 'research_paper', or 'custom'. If not provided, we automatically detect the entity from the query.",
            advanced=True,
        )
        search_entity_description: Optional[str] = SchemaField(
            default=None,
            description="Description for custom entity type (required when search_entity_type is 'custom')",
            advanced=True,
        )

        # Search criteria (flattened)
        search_criteria: list[str] = SchemaField(
            default_factory=list,
            description="List of criteria descriptions that every item will be evaluated against. If not provided, we automatically detect the criteria from the query.",
            advanced=True,
        )

        # Search exclude sources (flattened)
        search_exclude_sources: list[str] = SchemaField(
            default_factory=list,
            description="List of source IDs (imports or websets) to exclude from search results",
            advanced=True,
        )
        search_exclude_types: list[SearchType] = SchemaField(
            default_factory=list,
            description="List of source types corresponding to exclude sources ('import' or 'webset')",
            advanced=True,
        )

        # Search scope sources (flattened)
        search_scope_sources: list[str] = SchemaField(
            default_factory=list,
            description="List of source IDs (imports or websets) to limit search scope to",
            advanced=True,
        )
        search_scope_types: list[SearchType] = SchemaField(
            default_factory=list,
            description="List of source types corresponding to scope sources ('import' or 'webset')",
            advanced=True,
        )
        search_scope_relationships: list[str] = SchemaField(
            default_factory=list,
            description="List of relationship definitions for hop searches (optional, one per scope source)",
            advanced=True,
        )
        search_scope_relationship_limits: list[int] = SchemaField(
            default_factory=list,
            description="List of limits on the number of related entities to find (optional, one per scope relationship)",
            advanced=True,
        )

        # Import parameters (flattened)
        import_sources: list[str] = SchemaField(
            default_factory=list,
            description="List of source IDs to import from",
            advanced=True,
        )
        import_types: list[SearchType] = SchemaField(
            default_factory=list,
            description="List of source types corresponding to import sources ('import' or 'webset')",
            advanced=True,
        )

        # Enrichment parameters (flattened)
        enrichment_descriptions: list[str] = SchemaField(
            default_factory=list,
            description="List of enrichment task descriptions to perform on each webset item",
            advanced=True,
        )
        enrichment_formats: list[EnrichmentFormat] = SchemaField(
            default_factory=list,
            description="List of formats for enrichment responses ('text', 'date', 'number', 'options', 'email', 'phone'). If not specified, we automatically select the best format.",
            advanced=True,
        )
        enrichment_options: list[list[str]] = SchemaField(
            default_factory=list,
            description="List of option lists for enrichments with 'options' format. Each inner list contains the option labels.",
            advanced=True,
        )
        enrichment_metadata: list[dict] = SchemaField(
            default_factory=list,
            description="List of metadata dictionaries for enrichments",
            advanced=True,
        )

        # Webset metadata
        external_id: Optional[str] = SchemaField(
            default=None,
            description="External identifier for the webset. You can use this to reference the webset by your own internal identifiers.",
            placeholder="my-webset-123",
|
|
advanced=True,
|
|
)
|
|
metadata: Optional[dict] = SchemaField(
|
|
default_factory=dict,
|
|
description="Key-value pairs to associate with this webset",
|
|
advanced=True,
|
|
)
|
|
|
|
# Polling parameters
|
|
wait_for_initial_results: bool = SchemaField(
|
|
default=True,
|
|
description="Wait for the initial search to complete before returning. This ensures you get results immediately.",
|
|
)
|
|
polling_timeout: int = SchemaField(
|
|
default=300,
|
|
description="Maximum time to wait for completion in seconds (only used if wait_for_initial_results is True)",
|
|
advanced=True,
|
|
ge=1,
|
|
le=600,
|
|
)
|
|
|
|
    class Output(BlockSchemaOutput):
        webset: Webset = SchemaField(description="The created webset with full details")
        initial_item_count: Optional[int] = SchemaField(
            description="Number of items found in the initial search (only if wait_for_initial_results was True)"
        )
        completion_time: Optional[float] = SchemaField(
            description="Time taken to complete the initial search in seconds (only if wait_for_initial_results was True)"
        )

    def __init__(self):
        super().__init__(
            id="0cda29ff-c549-4a19-8805-c982b7d4ec34",
            description="Create a new Exa Webset for persistent web search collections with optional waiting for initial results",
            categories={BlockCategory.SEARCH},
            input_schema=ExaCreateWebsetBlock.Input,
            output_schema=ExaCreateWebsetBlock.Output,
        )

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:

        exa = Exa(credentials.api_key.get_secret_value())

        entity = None
        if input_data.search_entity_type == SearchEntityType.COMPANY:
            entity = WebsetCompanyEntity(type="company")
        elif input_data.search_entity_type == SearchEntityType.PERSON:
            entity = WebsetPersonEntity(type="person")
        elif input_data.search_entity_type == SearchEntityType.ARTICLE:
            entity = WebsetArticleEntity(type="article")
        elif input_data.search_entity_type == SearchEntityType.RESEARCH_PAPER:
            entity = WebsetResearchPaperEntity(type="research_paper")
        elif (
            input_data.search_entity_type == SearchEntityType.CUSTOM
            and input_data.search_entity_description
        ):
            entity = WebsetCustomEntity(
                type="custom", description=input_data.search_entity_description
            )

        criteria = None
        if input_data.search_criteria:
            criteria = [
                CreateCriterionParameters(description=item)
                for item in input_data.search_criteria
            ]

        exclude_items = None
        if input_data.search_exclude_sources:
            exclude_items = []
            for idx, src_id in enumerate(input_data.search_exclude_sources):
                src_type = None
                if input_data.search_exclude_types and idx < len(
                    input_data.search_exclude_types
                ):
                    src_type = input_data.search_exclude_types[idx]
                # Default to IMPORT if type missing
                if src_type == SearchType.WEBSET:
                    source_enum = ImportSource.webset
                else:
                    source_enum = ImportSource.import_
                exclude_items.append(ExcludeItem(source=source_enum, id=src_id))

        scope_items = None
        if input_data.search_scope_sources:
            scope_items = []
            for idx, src_id in enumerate(input_data.search_scope_sources):
                src_type = None
                if input_data.search_scope_types and idx < len(
                    input_data.search_scope_types
                ):
                    src_type = input_data.search_scope_types[idx]
                relationship = None
                if input_data.search_scope_relationships and idx < len(
                    input_data.search_scope_relationships
                ):
                    rel_def = input_data.search_scope_relationships[idx]
                    lim = None
                    if input_data.search_scope_relationship_limits and idx < len(
                        input_data.search_scope_relationship_limits
                    ):
                        lim = input_data.search_scope_relationship_limits[idx]
                    relationship = ScopeRelationship(definition=rel_def, limit=lim)
                if src_type == SearchType.WEBSET:
                    src_enum = ScopeSourceType.webset
                else:
                    src_enum = ScopeSourceType.import_
                scope_items.append(
                    ScopeItem(source=src_enum, id=src_id, relationship=relationship)
                )

        search_params = None
        if input_data.search_query:
            search_params = CreateWebsetParametersSearch(
                query=input_data.search_query,
                count=input_data.search_count,
                entity=entity,
                criteria=criteria,
                exclude=exclude_items,
                scope=scope_items,
            )

        imports_params = None
        if input_data.import_sources:
            imports_params = []
            for idx, src_id in enumerate(input_data.import_sources):
                src_type = None
                if input_data.import_types and idx < len(input_data.import_types):
                    src_type = input_data.import_types[idx]
                if src_type == SearchType.WEBSET:
                    source_enum = ImportSource.webset
                else:
                    source_enum = ImportSource.import_
                imports_params.append(ImportItem(source=source_enum, id=src_id))

        enrichments_params = None
        if input_data.enrichment_descriptions:
            enrichments_params = []
            for idx, desc in enumerate(input_data.enrichment_descriptions):
                fmt = None
                if input_data.enrichment_formats and idx < len(
                    input_data.enrichment_formats
                ):
                    fmt_enum = input_data.enrichment_formats[idx]
                    if fmt_enum is not None:
                        fmt = Format(
                            fmt_enum.value if isinstance(fmt_enum, Enum) else fmt_enum
                        )
                options_list = None
                if input_data.enrichment_options and idx < len(
                    input_data.enrichment_options
                ):
                    raw_opts = input_data.enrichment_options[idx]
                    if raw_opts:
                        options_list = [Option(label=o) for o in raw_opts]
                metadata_obj = None
                if input_data.enrichment_metadata and idx < len(
                    input_data.enrichment_metadata
                ):
                    metadata_obj = input_data.enrichment_metadata[idx]
                enrichments_params.append(
                    CreateEnrichmentParameters(
                        description=desc,
                        format=fmt,
                        options=options_list,
                        metadata=metadata_obj,
                    )
                )

        try:
            start_time = time.time()
            webset = exa.websets.create(
                params=CreateWebsetParameters(
                    search=search_params,
                    imports=imports_params,
                    enrichments=enrichments_params,
                    external_id=input_data.external_id,
                    metadata=input_data.metadata,
                )
            )

            webset_result = Webset.model_validate(webset.model_dump(by_alias=True))

            # If wait_for_initial_results is True, poll for completion
            if input_data.wait_for_initial_results and search_params:
                final_webset = exa.websets.wait_until_idle(
                    id=webset_result.id,
                    timeout=input_data.polling_timeout,
                    poll_interval=5,
                )
                completion_time = time.time() - start_time

                item_count = 0
                if final_webset.searches:
                    for search in final_webset.searches:
                        if search.progress:
                            item_count += search.progress.found

                yield "webset", webset_result
                yield "initial_item_count", item_count
                yield "completion_time", completion_time
            else:
                yield "webset", webset_result

        except ValueError as e:
            raise ValueError(f"Invalid webset configuration: {e}") from e


class ExaCreateOrFindWebsetBlock(Block):
    """Create a new webset or return existing one if external_id already exists (idempotent)."""

    class Input(BlockSchemaInput):
        credentials: CredentialsMetaInput = exa.credentials_field(
            description="The Exa integration requires an API Key."
        )

        external_id: str = SchemaField(
            description="External identifier for this webset - used to find existing or create new",
            placeholder="my-unique-webset-id",
        )

        search_query: Optional[str] = SchemaField(
            default=None,
            description="Search query (optional - only needed if creating new webset)",
            placeholder="Marketing agencies based in the US",
        )
        search_count: int = SchemaField(
            default=10,
            description="Number of items to find in initial search",
            ge=1,
            le=1000,
        )

        metadata: Optional[dict] = SchemaField(
            default=None,
            description="Key-value pairs to associate with the webset",
            advanced=True,
        )

    class Output(BlockSchemaOutput):
        webset: Webset = SchemaField(
            description="The webset (existing or newly created)"
        )
        was_created: bool = SchemaField(
            description="True if webset was newly created, False if it already existed"
        )

    def __init__(self):
        super().__init__(
            id="214542b6-3603-4bea-bc07-f51c2871cbd9",
            description="Create a new webset or return existing one by external_id (idempotent operation)",
            categories={BlockCategory.SEARCH},
            input_schema=ExaCreateOrFindWebsetBlock.Input,
            output_schema=ExaCreateOrFindWebsetBlock.Output,
        )

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        import httpx

        aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())

        try:
            webset = await aexa.websets.get(id=input_data.external_id)
            webset_result = Webset.model_validate(webset.model_dump(by_alias=True))

            yield "webset", webset_result
            yield "was_created", False

        except httpx.HTTPStatusError as e:
            if e.response.status_code == 404:
                # Not found - create new webset
                search_params = None
                if input_data.search_query:
                    search_params = CreateWebsetParametersSearch(
                        query=input_data.search_query,
                        count=input_data.search_count,
                    )

                webset = await aexa.websets.create(
                    params=CreateWebsetParameters(
                        search=search_params,
                        external_id=input_data.external_id,
                        metadata=input_data.metadata,
                    )
                )

                webset_result = Webset.model_validate(webset.model_dump(by_alias=True))

                yield "webset", webset_result
                yield "was_created", True
            else:
                # Other HTTP errors should propagate
                raise


class ExaUpdateWebsetBlock(Block):
    class Input(BlockSchemaInput):
        credentials: CredentialsMetaInput = exa.credentials_field(
            description="The Exa integration requires an API Key."
        )
        webset_id: str = SchemaField(
            description="The ID or external ID of the Webset to update",
            placeholder="webset-id-or-external-id",
        )
        metadata: Optional[dict] = SchemaField(
            default=None,
            description="Key-value pairs to associate with this webset (set to null to clear)",
        )

    class Output(BlockSchemaOutput):
        webset_id: str = SchemaField(description="The unique identifier for the webset")
        status: str = SchemaField(description="The status of the webset")
        external_id: Optional[str] = SchemaField(
            description="The external identifier for the webset"
        )
        metadata: dict = SchemaField(description="Updated metadata for the webset")
        updated_at: str = SchemaField(
            description="The date and time the webset was updated"
        )

    def __init__(self):
        super().__init__(
            id="89ccd99a-3c2b-4fbf-9e25-0ffa398d0314",
            description="Update metadata for an existing Webset",
            categories={BlockCategory.SEARCH},
            input_schema=ExaUpdateWebsetBlock.Input,
            output_schema=ExaUpdateWebsetBlock.Output,
        )

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())

        payload = {}
        if input_data.metadata is not None:
            payload["metadata"] = input_data.metadata

        sdk_webset = await aexa.websets.update(id=input_data.webset_id, params=payload)

        status_str = (
            sdk_webset.status.value
            if hasattr(sdk_webset.status, "value")
            else str(sdk_webset.status)
        )

        yield "webset_id", sdk_webset.id
        yield "status", status_str
        yield "external_id", sdk_webset.external_id
        yield "metadata", sdk_webset.metadata or {}
        yield "updated_at", (
            sdk_webset.updated_at.isoformat() if sdk_webset.updated_at else ""
        )


class ExaListWebsetsBlock(Block):
    class Input(BlockSchemaInput):
        credentials: CredentialsMetaInput = exa.credentials_field(
            description="The Exa integration requires an API Key."
        )
        trigger: Any | None = SchemaField(
            default=None,
            description="Trigger for the webset, value is ignored!",
            advanced=False,
        )
        cursor: Optional[str] = SchemaField(
            default=None,
            description="Cursor for pagination through results",
            advanced=True,
        )
        limit: int = SchemaField(
            default=25,
            description="Number of websets to return (1-100)",
            ge=1,
            le=100,
            advanced=True,
        )

    class Output(BlockSchemaOutput):
        websets: list[Webset] = SchemaField(description="List of websets")
        has_more: bool = SchemaField(
            description="Whether there are more results to paginate through"
        )
        next_cursor: Optional[str] = SchemaField(
            description="Cursor for the next page of results"
        )

    def __init__(self):
        super().__init__(
            id="1dcd8fd6-c13f-4e6f-bd4c-654428fa4757",
            description="List all Websets with pagination support",
            categories={BlockCategory.SEARCH},
            input_schema=ExaListWebsetsBlock.Input,
            output_schema=ExaListWebsetsBlock.Output,
        )

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())

        response = await aexa.websets.list(
            cursor=input_data.cursor,
            limit=input_data.limit,
        )

        websets_data = [
            w.model_dump(by_alias=True, exclude_none=True) for w in response.data
        ]

        yield "websets", websets_data
        yield "has_more", response.has_more
        yield "next_cursor", response.next_cursor


class ExaGetWebsetBlock(Block):
    class Input(BlockSchemaInput):
        credentials: CredentialsMetaInput = exa.credentials_field(
            description="The Exa integration requires an API Key."
        )
        webset_id: str = SchemaField(
            description="The ID or external ID of the Webset to retrieve",
            placeholder="webset-id-or-external-id",
        )

    class Output(BlockSchemaOutput):
        webset_id: str = SchemaField(description="The unique identifier for the webset")
        status: str = SchemaField(description="The status of the webset")
        external_id: Optional[str] = SchemaField(
            description="The external identifier for the webset"
        )
        searches: list[dict] = SchemaField(
            description="The searches performed on the webset"
        )
        enrichments: list[dict] = SchemaField(
            description="The enrichments applied to the webset"
        )
        monitors: list[dict] = SchemaField(description="The monitors for the webset")
        metadata: dict = SchemaField(
            description="Key-value pairs associated with the webset"
        )
        created_at: str = SchemaField(
            description="The date and time the webset was created"
        )
        updated_at: str = SchemaField(
            description="The date and time the webset was last updated"
        )

    def __init__(self):
        super().__init__(
            id="6ab8e12a-132c-41bf-b5f3-d662620fa832",
            description="Retrieve a Webset by ID or external ID",
            categories={BlockCategory.SEARCH},
            input_schema=ExaGetWebsetBlock.Input,
            output_schema=ExaGetWebsetBlock.Output,
        )

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())

        sdk_webset = await aexa.websets.get(id=input_data.webset_id)

        status_str = (
            sdk_webset.status.value
            if hasattr(sdk_webset.status, "value")
            else str(sdk_webset.status)
        )

        searches_data = [
            s.model_dump(by_alias=True, exclude_none=True)
            for s in sdk_webset.searches or []
        ]
        enrichments_data = [
            e.model_dump(by_alias=True, exclude_none=True)
            for e in sdk_webset.enrichments or []
        ]
        monitors_data = [
            m.model_dump(by_alias=True, exclude_none=True)
            for m in sdk_webset.monitors or []
        ]

        yield "webset_id", sdk_webset.id
        yield "status", status_str
        yield "external_id", sdk_webset.external_id
        yield "searches", searches_data
        yield "enrichments", enrichments_data
        yield "monitors", monitors_data
        yield "metadata", sdk_webset.metadata or {}
        yield "created_at", (
            sdk_webset.created_at.isoformat() if sdk_webset.created_at else ""
        )
        yield "updated_at", (
            sdk_webset.updated_at.isoformat() if sdk_webset.updated_at else ""
        )


class ExaDeleteWebsetBlock(Block):
    class Input(BlockSchemaInput):
        credentials: CredentialsMetaInput = exa.credentials_field(
            description="The Exa integration requires an API Key."
        )
        webset_id: str = SchemaField(
            description="The ID or external ID of the Webset to delete",
            placeholder="webset-id-or-external-id",
        )

    class Output(BlockSchemaOutput):
        webset_id: str = SchemaField(
            description="The unique identifier for the deleted webset"
        )
        external_id: Optional[str] = SchemaField(
            description="The external identifier for the deleted webset"
        )
        status: str = SchemaField(description="The status of the deleted webset")
        success: str = SchemaField(description="Whether the deletion was successful")

    def __init__(self):
        super().__init__(
            id="aa6994a2-e986-421f-8d4c-7671d3be7b7e",
            description="Delete a Webset and all its items",
            categories={BlockCategory.SEARCH},
            input_schema=ExaDeleteWebsetBlock.Input,
            output_schema=ExaDeleteWebsetBlock.Output,
        )

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())

        deleted_webset = await aexa.websets.delete(id=input_data.webset_id)

        status_str = (
            deleted_webset.status.value
            if hasattr(deleted_webset.status, "value")
            else str(deleted_webset.status)
        )

        yield "webset_id", deleted_webset.id
        yield "external_id", deleted_webset.external_id
        yield "status", status_str
        yield "success", "true"


class ExaCancelWebsetBlock(Block):
    class Input(BlockSchemaInput):
        credentials: CredentialsMetaInput = exa.credentials_field(
            description="The Exa integration requires an API Key."
        )
        webset_id: str = SchemaField(
            description="The ID or external ID of the Webset to cancel",
            placeholder="webset-id-or-external-id",
        )

    class Output(BlockSchemaOutput):
        webset_id: str = SchemaField(description="The unique identifier for the webset")
        status: str = SchemaField(
            description="The status of the webset after cancellation"
        )
        external_id: Optional[str] = SchemaField(
            description="The external identifier for the webset"
        )
        success: str = SchemaField(
            description="Whether the cancellation was successful"
        )

    def __init__(self):
        super().__init__(
            id="e40a6420-1db8-47bb-b00a-0e6aecd74176",
            description="Cancel all operations being performed on a Webset",
            categories={BlockCategory.SEARCH},
            input_schema=ExaCancelWebsetBlock.Input,
            output_schema=ExaCancelWebsetBlock.Output,
        )

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())

        canceled_webset = await aexa.websets.cancel(id=input_data.webset_id)

        status_str = (
            canceled_webset.status.value
            if hasattr(canceled_webset.status, "value")
            else str(canceled_webset.status)
        )

        yield "webset_id", canceled_webset.id
        yield "status", status_str
        yield "external_id", canceled_webset.external_id
        yield "success", "true"


# Mirrored models for Preview response stability
class PreviewCriterionModel(BaseModel):
    """Stable model for preview criteria."""

    description: str

    @classmethod
    def from_sdk(cls, sdk_criterion) -> "PreviewCriterionModel":
        """Convert SDK criterion to our model."""
        return cls(description=sdk_criterion.description)


class PreviewEnrichmentModel(BaseModel):
    """Stable model for preview enrichment."""

    description: str
    format: str
    options: List[str]

    @classmethod
    def from_sdk(cls, sdk_enrichment) -> "PreviewEnrichmentModel":
        """Convert SDK enrichment to our model."""
        format_str = (
            sdk_enrichment.format.value
            if hasattr(sdk_enrichment.format, "value")
            else str(sdk_enrichment.format)
        )

        options_list = []
        if sdk_enrichment.options:
            for opt in sdk_enrichment.options:
                opt_dict = opt.model_dump(by_alias=True)
                options_list.append(opt_dict.get("label", ""))

        return cls(
            description=sdk_enrichment.description,
            format=format_str,
            options=options_list,
        )


class PreviewSearchModel(BaseModel):
    """Stable model for preview search details."""

    entity_type: str
    entity_description: Optional[str]
    criteria: List[PreviewCriterionModel]

    @classmethod
    def from_sdk(cls, sdk_search) -> "PreviewSearchModel":
        """Convert SDK search preview to our model."""
        # Extract entity type from union
        entity_dict = sdk_search.entity.model_dump(by_alias=True)
        entity_type = entity_dict.get("type", "auto")
        entity_description = entity_dict.get("description")

        # Convert criteria
        criteria = [
            PreviewCriterionModel.from_sdk(c) for c in sdk_search.criteria or []
        ]

        return cls(
            entity_type=entity_type,
            entity_description=entity_description,
            criteria=criteria,
        )


class PreviewWebsetModel(BaseModel):
    """Stable model for preview response."""

    search: PreviewSearchModel
    enrichments: List[PreviewEnrichmentModel]

    @classmethod
    def from_sdk(cls, sdk_preview) -> "PreviewWebsetModel":
        """Convert SDK PreviewWebsetResponse to our model."""

        search = PreviewSearchModel.from_sdk(sdk_preview.search)
        enrichments = [
            PreviewEnrichmentModel.from_sdk(e) for e in sdk_preview.enrichments or []
        ]

        return cls(search=search, enrichments=enrichments)


class ExaPreviewWebsetBlock(Block):
    class Input(BlockSchemaInput):
        credentials: CredentialsMetaInput = exa.credentials_field(
            description="The Exa integration requires an API Key."
        )
        query: str = SchemaField(
            description="Your search query to preview. Use this to see how Exa will interpret your search before creating a webset.",
            placeholder="Marketing agencies based in the US, with brands worked with and city",
        )
        entity_type: Optional[SearchEntityType] = SchemaField(
            default=None,
            description="Entity type to force: 'company', 'person', 'article', 'research_paper', or 'custom'. If not provided, Exa will auto-detect.",
            advanced=True,
        )
        entity_description: Optional[str] = SchemaField(
            default=None,
            description="Description for custom entity type (required when entity_type is 'custom')",
            advanced=True,
        )

    class Output(BlockSchemaOutput):
        preview: PreviewWebsetModel = SchemaField(
            description="Full preview response with search and enrichment details"
        )
        entity_type: str = SchemaField(
            description="The detected or specified entity type"
        )
        entity_description: Optional[str] = SchemaField(
            description="Description of the entity type"
        )
        criteria: list[PreviewCriterionModel] = SchemaField(
            description="Generated search criteria that will be used"
        )
        enrichment_columns: list[PreviewEnrichmentModel] = SchemaField(
            description="Available enrichment columns that can be extracted"
        )
        interpretation: str = SchemaField(
            description="Human-readable interpretation of how the query will be processed"
        )
        suggestions: list[str] = SchemaField(
            description="Suggestions for improving the query"
        )

    def __init__(self):
        super().__init__(
            id="f8c4e2a1-9b3d-4e5f-a6c7-d8e9f0a1b2c3",
            description="Preview how a search query will be interpreted before creating a webset. Helps understand entity detection, criteria generation, and available enrichments.",
            categories={BlockCategory.SEARCH},
            input_schema=ExaPreviewWebsetBlock.Input,
            output_schema=ExaPreviewWebsetBlock.Output,
        )

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())

        payload: dict[str, Any] = {
            "query": input_data.query,
        }

        if input_data.entity_type:
            entity: dict[str, Any] = {"type": input_data.entity_type.value}
            if (
                input_data.entity_type == SearchEntityType.CUSTOM
                and input_data.entity_description
            ):
                entity["description"] = input_data.entity_description
            payload["entity"] = entity

        sdk_preview = await aexa.websets.preview(params=payload)

        preview = PreviewWebsetModel.from_sdk(sdk_preview)

        entity_type = preview.search.entity_type
        entity_description = preview.search.entity_description
        criteria = preview.search.criteria
        enrichments = preview.enrichments

        # Generate interpretation
        interpretation = f"Query will search for {entity_type}"
        if entity_description:
            interpretation += f" ({entity_description})"
        if criteria:
            interpretation += f" with {len(criteria)} criteria"
        if enrichments:
            interpretation += f" and {len(enrichments)} available enrichment columns"

        # Generate suggestions
        suggestions = []
        if not criteria:
            suggestions.append(
                "Consider adding specific criteria to narrow your search"
            )
        if not enrichments:
            suggestions.append(
                "Consider specifying what data points you want to extract"
            )

        # Yield full model first
        yield "preview", preview

        # Then yield individual fields for graph flexibility
        yield "entity_type", entity_type
        yield "entity_description", entity_description
        yield "criteria", criteria
        yield "enrichment_columns", enrichments
        yield "interpretation", interpretation
        yield "suggestions", suggestions


class ExaWebsetStatusBlock(Block):
    """Get a quick status overview of a webset without fetching all details."""

    class Input(BlockSchemaInput):
        credentials: CredentialsMetaInput = exa.credentials_field(
            description="The Exa integration requires an API Key."
        )
        webset_id: str = SchemaField(
            description="The ID or external ID of the Webset",
            placeholder="webset-id-or-external-id",
        )

    class Output(BlockSchemaOutput):
        webset_id: str = SchemaField(description="The webset identifier")
        status: str = SchemaField(
            description="Current status (idle, running, paused, etc.)"
        )
        item_count: int = SchemaField(description="Total number of items in the webset")
        search_count: int = SchemaField(description="Number of searches performed")
        enrichment_count: int = SchemaField(
            description="Number of enrichments configured"
        )
        monitor_count: int = SchemaField(description="Number of monitors configured")
        last_updated: str = SchemaField(description="When the webset was last updated")
        is_processing: bool = SchemaField(
            description="Whether any operations are currently running"
        )

    def __init__(self):
        super().__init__(
            id="47cc3cd8-840f-4ec4-8d40-fcaba75fbe1a",
            description="Get a quick status overview of a webset",
            categories={BlockCategory.SEARCH},
            input_schema=ExaWebsetStatusBlock.Input,
            output_schema=ExaWebsetStatusBlock.Output,
        )

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())

        webset = await aexa.websets.get(id=input_data.webset_id)

        status = (
            webset.status.value
            if hasattr(webset.status, "value")
            else str(webset.status)
        )
        is_processing = status in ["running", "pending"]

        # Estimate item count from search progress
        item_count = 0
        if webset.searches:
            for search in webset.searches:
                if search.progress:
                    item_count += search.progress.found

        # Count searches, enrichments, monitors
        search_count = len(webset.searches or [])
        enrichment_count = len(webset.enrichments or [])
        monitor_count = len(webset.monitors or [])

        yield "webset_id", webset.id
        yield "status", status
        yield "item_count", item_count
        yield "search_count", search_count
        yield "enrichment_count", enrichment_count
        yield "monitor_count", monitor_count
        yield "last_updated", webset.updated_at.isoformat() if webset.updated_at else ""
        yield "is_processing", is_processing


# Summary models for ExaWebsetSummaryBlock
class SearchSummaryModel(BaseModel):
    """Summary of searches in a webset."""

    total_searches: int
    completed_searches: int
    total_items_found: int
    queries: List[str]


class EnrichmentSummaryModel(BaseModel):
    """Summary of enrichments in a webset."""

    total_enrichments: int
    completed_enrichments: int
    enrichment_types: List[str]
    titles: List[str]


class MonitorSummaryModel(BaseModel):
    """Summary of monitors in a webset."""

    total_monitors: int
    active_monitors: int
    next_run: Optional[datetime] = None


class WebsetStatisticsModel(BaseModel):
    """Various statistics about a webset."""

    total_operations: int
    is_processing: bool
    has_monitors: bool
    avg_items_per_search: float


class ExaWebsetSummaryBlock(Block):
    """Get a comprehensive summary of a webset including samples and statistics."""

    class Input(BlockSchemaInput):
        credentials: CredentialsMetaInput = exa.credentials_field(
            description="The Exa integration requires an API Key."
        )
        webset_id: str = SchemaField(
            description="The ID or external ID of the Webset",
            placeholder="webset-id-or-external-id",
        )
        include_sample_items: bool = SchemaField(
            default=True,
            description="Include sample items in the summary",
        )
        sample_size: int = SchemaField(
            default=3,
            description="Number of sample items to include",
            ge=0,
            le=10,
        )
        include_search_details: bool = SchemaField(
            default=True,
            description="Include details about searches",
        )
        include_enrichment_details: bool = SchemaField(
            default=True,
            description="Include details about enrichments",
        )

    class Output(BlockSchemaOutput):
        webset_id: str = SchemaField(description="The webset identifier")
        status: str = SchemaField(description="Current status")
        entity_type: str = SchemaField(description="Type of entities in the webset")
        total_items: int = SchemaField(description="Total number of items")
        sample_items: list[Dict[str, Any]] = SchemaField(
            description="Sample items from the webset"
        )
        search_summary: SearchSummaryModel = SchemaField(
            description="Summary of searches performed"
        )
        enrichment_summary: EnrichmentSummaryModel = SchemaField(
            description="Summary of enrichments applied"
        )
        monitor_summary: MonitorSummaryModel = SchemaField(
            description="Summary of monitors configured"
        )
        statistics: WebsetStatisticsModel = SchemaField(
            description="Various statistics about the webset"
        )
        created_at: str = SchemaField(description="When the webset was created")
        updated_at: str = SchemaField(description="When the webset was last updated")

    def __init__(self):
        super().__init__(
            id="9eff1710-a49b-490e-b486-197bf8b23c61",
            description="Get a comprehensive summary of a webset with samples and statistics",
            categories={BlockCategory.SEARCH},
            input_schema=ExaWebsetSummaryBlock.Input,
            output_schema=ExaWebsetSummaryBlock.Output,
        )

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())

        webset = await aexa.websets.get(id=input_data.webset_id)

        # Extract basic info
        webset_id = webset.id
        status = (
            webset.status.value
            if hasattr(webset.status, "value")
            else str(webset.status)
        )

        # Determine entity type from searches
        entity_type = "unknown"
        searches = webset.searches or []
        if searches:
            first_search = searches[0]
            if first_search.entity:
                entity_dict = first_search.entity.model_dump(
                    by_alias=True, exclude_none=True
                )
                entity_type = entity_dict.get("type", "unknown")

        # Get sample items if requested
        sample_items_data = []
        total_items = 0

        if input_data.include_sample_items and input_data.sample_size > 0:
            items_response = await aexa.websets.items.list(
                webset_id=input_data.webset_id, limit=input_data.sample_size
            )
            sample_items_data = [
                item.model_dump(by_alias=True, exclude_none=True)
                for item in items_response.data
            ]
            total_items = len(sample_items_data)

        # Build search summary using Pydantic model
        search_summary = SearchSummaryModel(
            total_searches=0,
            completed_searches=0,
            total_items_found=0,
            queries=[],
        )
        if input_data.include_search_details and searches:
            search_summary = SearchSummaryModel(
                total_searches=len(searches),
                completed_searches=sum(
                    1
                    for s in searches
                    if (s.status.value if hasattr(s.status, "value") else str(s.status))
                    == "completed"
                ),
                total_items_found=int(
                    sum(s.progress.found if s.progress else 0 for s in searches)
                ),
                queries=[s.query for s in searches[:3]],  # First 3 queries
            )

        # Build enrichment summary using Pydantic model
        enrichment_summary = EnrichmentSummaryModel(
            total_enrichments=0,
            completed_enrichments=0,
            enrichment_types=[],
            titles=[],
        )
        enrichments = webset.enrichments or []
        if input_data.include_enrichment_details and enrichments:
            enrichment_summary = EnrichmentSummaryModel(
                total_enrichments=len(enrichments),
                completed_enrichments=sum(
                    1
                    for e in enrichments
                    if (e.status.value if hasattr(e.status, "value") else str(e.status))
                    == "completed"
                ),
                enrichment_types=list(
                    set(
                        (
                            e.format.value
                            if e.format and hasattr(e.format, "value")
                            else str(e.format) if e.format else "text"
                        )
                        for e in enrichments
                    )
                ),
                titles=[(e.title or e.description or "")[:50] for e in enrichments[:3]],
            )

        # Build monitor summary using Pydantic model
        monitors = webset.monitors or []
        next_run_dt = None
        if monitors:
            next_runs = [m.next_run_at for m in monitors if m.next_run_at]
            if next_runs:
                next_run_dt = min(next_runs)

        monitor_summary = MonitorSummaryModel(
            total_monitors=len(monitors),
            active_monitors=sum(
                1
                for m in monitors
                if (m.status.value if hasattr(m.status, "value") else str(m.status))
                == "enabled"
            ),
            next_run=next_run_dt,
        )

        # Build statistics using Pydantic model
        statistics = WebsetStatisticsModel(
            total_operations=len(searches) + len(enrichments),
            is_processing=status in ["running", "pending"],
            has_monitors=len(monitors) > 0,
            avg_items_per_search=(
                search_summary.total_items_found / len(searches) if searches else 0
            ),
        )

        yield "webset_id", webset_id
        yield "status", status
        yield "entity_type", entity_type
        yield "total_items", total_items
        yield "sample_items", sample_items_data
        yield "search_summary", search_summary
        yield "enrichment_summary", enrichment_summary
        yield "monitor_summary", monitor_summary
        yield "statistics", statistics
        yield "created_at", webset.created_at.isoformat() if webset.created_at else ""
        yield "updated_at", webset.updated_at.isoformat() if webset.updated_at else ""


class ExaWebsetReadyCheckBlock(Block):
    """Check if a webset is ready for the next operation (conditional workflow helper)."""

    class Input(BlockSchemaInput):
        credentials: CredentialsMetaInput = exa.credentials_field(
            description="The Exa integration requires an API Key."
        )
        webset_id: str = SchemaField(
            description="The ID or external ID of the Webset to check",
            placeholder="webset-id-or-external-id",
        )
        min_items: int = SchemaField(
            default=1,
            description="Minimum number of items required to be 'ready'",
            ge=0,
        )

    class Output(BlockSchemaOutput):
        is_ready: bool = SchemaField(
            description="True if webset is idle AND has minimum items"
        )
        status: str = SchemaField(description="Current webset status")
        item_count: int = SchemaField(description="Number of items in webset")
        has_searches: bool = SchemaField(
            description="Whether webset has any searches configured"
        )
        has_enrichments: bool = SchemaField(
            description="Whether webset has any enrichments"
        )
        recommendation: str = SchemaField(
            description="Suggested next action (ready_to_process, waiting_for_results, needs_search, etc.)"
        )

    def __init__(self):
        super().__init__(
            id="faf9f0f3-e659-4264-b33b-284a02166bec",
            description="Check if webset is ready for next operation - enables conditional workflow branching",
            categories={BlockCategory.SEARCH, BlockCategory.LOGIC},
            input_schema=ExaWebsetReadyCheckBlock.Input,
            output_schema=ExaWebsetReadyCheckBlock.Output,
        )

    async def run(
        self, input_data: Input, *, credentials: APIKeyCredentials, **kwargs
    ) -> BlockOutput:
        aexa = AsyncExa(api_key=credentials.api_key.get_secret_value())

        # Get webset details
        webset = await aexa.websets.get(id=input_data.webset_id)

        status = (
            webset.status.value
            if hasattr(webset.status, "value")
            else str(webset.status)
        )

        # Estimate item count from search progress
        item_count = 0
        if webset.searches:
            for search in webset.searches:
                if search.progress:
                    item_count += search.progress.found

        # Determine readiness
        is_idle = status == "idle"
        has_min_items = item_count >= input_data.min_items
        is_ready = is_idle and has_min_items

        # Check resources
        has_searches = len(webset.searches or []) > 0
        has_enrichments = len(webset.enrichments or []) > 0

        # Generate recommendation
        if not has_searches:
            recommendation = "needs_search"
        elif status in ["running", "pending"]:
            recommendation = "waiting_for_results"
        elif not has_min_items:
            recommendation = "insufficient_items"
        elif not has_enrichments:
            recommendation = "ready_to_enrich"
        else:
            recommendation = "ready_to_process"

        yield "is_ready", is_ready
        yield "status", status
        yield "item_count", item_count
        yield "has_searches", has_searches
        yield "has_enrichments", has_enrichments
        yield "recommendation", recommendation
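The recommendation branching in `ExaWebsetReadyCheckBlock.run` is pure logic over five inputs, so it can be illustrated (and unit-tested) without any Exa API calls. The sketch below is a hypothetical standalone helper mirroring that priority order; `recommend_next_action` is not part of the block's API, just an illustration of the decision table.

```python
def recommend_next_action(
    status: str,
    item_count: int,
    min_items: int,
    has_searches: bool,
    has_enrichments: bool,
) -> str:
    """Mirror of ExaWebsetReadyCheckBlock's branching, in priority order."""
    if not has_searches:
        return "needs_search"  # nothing to wait on yet
    if status in ("running", "pending"):
        return "waiting_for_results"
    if item_count < min_items:
        return "insufficient_items"
    if not has_enrichments:
        return "ready_to_enrich"
    return "ready_to_process"
```

Note the ordering matters: a running webset with zero items reports `waiting_for_results`, not `insufficient_items`, because the search may still produce items.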