Compare commits

...

44 Commits

Author SHA1 Message Date
AnmolShukla2002
c72e205c44 docs: migration of core package samples from go sdk 2026-01-19 14:49:14 +05:30
AnmolShukla2002
4aed8b5bcb docsite sample restructure for go sdk 2026-01-19 11:04:53 +05:30
AnmolShukla2002
51a1f71c59 docs: migration of Go Sdk docs 2026-01-12 15:54:14 +05:30
Anmol Shukla
4928f0be8f Merge branch 'main' into sdk-docs-migrate 2026-01-07 14:11:20 +05:30
Shobhit Singh
a4506009b9 feat(bigquery): Make credentials scope configurable (#2210)
## Description

This change addresses the case where a user wants to use custom
scopes. For instance, the default scope (bigquery) falls short for
running SQL that relies on integration with other Google products, such
as Drive, Vertex AI, and Cloud Run. With this change, the user can
configure custom scopes depending on their use case.

The custom scopes can be configured in the tools.yaml file, e.g.:

```yaml
sources:
  bigquery-source:
    kind: "bigquery"
    project: ${BIGQUERY_PROJECT}
    location: ${BIGQUERY_LOCATION:}
    useClientOAuth: ${BIGQUERY_USE_CLIENT_OAUTH:false}
    scopes:
      - "https://www.googleapis.com/auth/bigquery"
      - "https://www.googleapis.com/auth/drive"
```

If the [bigquery prebuilt
config](https://github.com/googleapis/genai-toolbox/blob/main/internal/prebuiltconfigs/tools/bigquery.yaml)
is being used, the scopes can also be set via an environment variable:

```shell
$ export BIGQUERY_SCOPES="https://www.googleapis.com/auth/bigquery,https://www.googleapis.com/auth/drive"
```
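
For orientation, here is a minimal Go sketch of how custom scopes are typically passed to a BigQuery client via the standard `google.golang.org/api/option` package. This illustrates the general pattern only, not the toolbox's actual wiring; `my-project` is a placeholder.

```go
package main

import (
	"context"
	"log"

	"cloud.google.com/go/bigquery"
	"google.golang.org/api/option"
)

func main() {
	ctx := context.Background()

	// Mirror the scopes configured in tools.yaml / BIGQUERY_SCOPES above.
	scopes := []string{
		"https://www.googleapis.com/auth/bigquery",
		"https://www.googleapis.com/auth/drive",
	}

	// option.WithScopes overrides the default scope used when the client
	// builds credentials (e.g. Application Default Credentials).
	client, err := bigquery.NewClient(ctx, "my-project", option.WithScopes(scopes...))
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()
}
```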

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there are a
> few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1942
2026-01-07 02:07:49 +00:00
Wenxin Du
17b70ccaa7 feat(tools/postgressql): Add Parameter embeddedBy config support (#2151)
Add the `embeddedBy` parameter field to support vector embedding & semantic
search.
The major change is in `internal/util/parameters/parameters.go`.

This PR only adds a vector formatter for the postgressql tool. Other tools
requiring vector formatting may not work with `embeddedBy`.

Second part of the Semantic Search support. First part:
https://github.com/googleapis/genai-toolbox/pull/2121
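
For illustration only, a hedged Go sketch of the kind of vector formatting this implies. The helper below is hypothetical (it is not the code in `parameters.go`); it renders an embedding as the `[x,y,z]` literal that Postgres' pgvector type accepts as a bind parameter.

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// formatPgVector is a hypothetical helper: it renders a float32 embedding
// as a pgvector literal such as "[0.1,0.2,0.3]" so it can be bound as a
// query parameter in semantic-search SQL.
func formatPgVector(embedding []float32) string {
	parts := make([]string, len(embedding))
	for i, v := range embedding {
		parts[i] = strconv.FormatFloat(float64(v), 'f', -1, 32)
	}
	return "[" + strings.Join(parts, ",") + "]"
}

func main() {
	fmt.Println(formatPgVector([]float32{0.1, 0.2, 0.3})) // [0.1,0.2,0.3]
}
```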
2026-01-06 17:54:43 -05:00
dependabot[bot]
001d634de1 chore(deps): bump qs, body-parser and express in /docs/en/getting-started/quickstart/js/genkit (#2263)
Bumps [qs](https://github.com/ljharb/qs),
[body-parser](https://github.com/expressjs/body-parser) and
[express](https://github.com/expressjs/express). These dependencies
needed to be updated together.
Updates `qs` from 6.13.0 to 6.14.1
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/ljharb/qs/blob/main/CHANGELOG.md">qs's
changelog</a>.</em></p>
<blockquote>
<h2><strong>6.14.1</strong></h2>
<ul>
<li>[Fix] ensure arrayLength applies to <code>[]</code> notation as
well</li>
<li>[Fix] <code>parse</code>: when a custom decoder returns
<code>null</code> for a key, ignore that key</li>
<li>[Refactor] <code>parse</code>: extract key segment splitting
helper</li>
<li>[meta] add threat model</li>
<li>[actions] add workflow permissions</li>
<li>[Tests] <code>stringify</code>: increase coverage</li>
<li>[Dev Deps] update <code>eslint</code>,
<code>@ljharb/eslint-config</code>, <code>npmignore</code>,
<code>es-value-fixtures</code>, <code>for-each</code>,
<code>object-inspect</code></li>
</ul>
<h2><strong>6.14.0</strong></h2>
<ul>
<li>[New] <code>parse</code>: add
<code>throwOnParameterLimitExceeded</code> option (<a
href="https://redirect.github.com/ljharb/qs/issues/517">#517</a>)</li>
<li>[Refactor] <code>parse</code>: use <code>utils.combine</code>
more</li>
<li>[patch] <code>parse</code>: add explicit
<code>throwOnLimitExceeded</code> default</li>
<li>[actions] use shared action; re-add finishers</li>
<li>[meta] Fix changelog formatting bug</li>
<li>[Deps] update <code>side-channel</code></li>
<li>[Dev Deps] update <code>es-value-fixtures</code>,
<code>has-bigints</code>, <code>has-proto</code>,
<code>has-symbols</code></li>
<li>[Tests] increase coverage</li>
</ul>
<h2><strong>6.13.1</strong></h2>
<ul>
<li>[Fix] <code>stringify</code>: avoid a crash when a
<code>filter</code> key is <code>null</code></li>
<li>[Fix] <code>utils.merge</code>: functions should not be stringified
into keys</li>
<li>[Fix] <code>parse</code>: avoid a crash with
interpretNumericEntities: true, comma: true, and iso charset</li>
<li>[Fix] <code>stringify</code>: ensure a non-string
<code>filter</code> does not crash</li>
<li>[Refactor] use <code>__proto__</code> syntax instead of
<code>Object.create</code> for null objects</li>
<li>[Refactor] misc cleanup</li>
<li>[Tests] <code>utils.merge</code>: add some coverage</li>
<li>[Tests] fix a test case</li>
<li>[actions] split out node 10-20, and 20+</li>
<li>[Dev Deps] update <code>es-value-fixtures</code>,
<code>mock-property</code>, <code>object-inspect</code>,
<code>tape</code></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3fa11a5f64"><code>3fa11a5</code></a>
v6.14.1</li>
<li><a
href="a62670423c"><code>a626704</code></a>
[Dev Deps] update <code>npmignore</code></li>
<li><a
href="3086902ecf"><code>3086902</code></a>
[Fix] ensure arrayLength applies to <code>[]</code> notation as
well</li>
<li><a
href="fc7930e86c"><code>fc7930e</code></a>
[Dev Deps] update <code>eslint</code>,
<code>@ljharb/eslint-config</code></li>
<li><a
href="0b06aac566"><code>0b06aac</code></a>
[Dev Deps] update <code>@ljharb/eslint-config</code></li>
<li><a
href="64951f6200"><code>64951f6</code></a>
[Refactor] <code>parse</code>: extract key segment splitting helper</li>
<li><a
href="e1bd2599cd"><code>e1bd259</code></a>
[Dev Deps] update <code>@ljharb/eslint-config</code></li>
<li><a
href="f4b3d39709"><code>f4b3d39</code></a>
[eslint] add eslint 9 optional peer dep</li>
<li><a
href="6e94d9596c"><code>6e94d95</code></a>
[Dev Deps] update <code>eslint</code>,
<code>@ljharb/eslint-config</code>, <code>npmignore</code></li>
<li><a
href="973dc3c51c"><code>973dc3c</code></a>
[actions] add workflow permissions</li>
<li>Additional commits viewable in <a
href="https://github.com/ljharb/qs/compare/v6.13.0...v6.14.1">compare
view</a></li>
</ul>
</details>
<br />

Updates `body-parser` from 1.20.3 to 1.20.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/expressjs/body-parser/releases">body-parser's
releases</a>.</em></p>
<blockquote>
<h2>1.20.4</h2>
<h2>What's Changed</h2>
<ul>
<li>Remove redundant depth check by <a
href="https://github.com/blakeembrey"><code>@​blakeembrey</code></a> in
<a
href="https://redirect.github.com/expressjs/body-parser/pull/538">expressjs/body-parser#538</a></li>
<li>ci: add support for Node.js v23 by <a
href="https://github.com/Phillip9587"><code>@​Phillip9587</code></a> in
<a
href="https://redirect.github.com/expressjs/body-parser/pull/553">expressjs/body-parser#553</a></li>
<li>ci: restore CI for 1.x branch by <a
href="https://github.com/bjohansebas"><code>@​bjohansebas</code></a> in
<a
href="https://redirect.github.com/expressjs/body-parser/pull/665">expressjs/body-parser#665</a></li>
<li>deps: qs@^6.14.0 by <a
href="https://github.com/bjohansebas"><code>@​bjohansebas</code></a> in
<a
href="https://redirect.github.com/expressjs/body-parser/pull/664">expressjs/body-parser#664</a></li>
<li>deps: use tilde notation and update certain dependencies by <a
href="https://github.com/Phillip9587"><code>@​Phillip9587</code></a> in
<a
href="https://redirect.github.com/expressjs/body-parser/pull/668">expressjs/body-parser#668</a></li>
<li>chore: remove SECURITY.md by <a
href="https://github.com/Phillip9587"><code>@​Phillip9587</code></a> in
<a
href="https://redirect.github.com/expressjs/body-parser/pull/669">expressjs/body-parser#669</a></li>
<li>ci: add CodeQL (SAST) by <a
href="https://github.com/Phillip9587"><code>@​Phillip9587</code></a> in
<a
href="https://redirect.github.com/expressjs/body-parser/pull/670">expressjs/body-parser#670</a></li>
<li>Release: 1.20.4 by <a
href="https://github.com/UlisesGascon"><code>@​UlisesGascon</code></a>
in <a
href="https://redirect.github.com/expressjs/body-parser/pull/672">expressjs/body-parser#672</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/expressjs/body-parser/compare/1.20.3...1.20.4">https://github.com/expressjs/body-parser/compare/1.20.3...1.20.4</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/expressjs/body-parser/blob/master/HISTORY.md">body-parser's
changelog</a>.</em></p>
<blockquote>
<h1>1.20.4 / 2025-12-01</h1>
<ul>
<li>deps: qs@~6.14.0</li>
<li>deps: use tilde notation for dependencies</li>
<li>deps: http-errors@~2.0.1</li>
<li>deps: raw-body@~2.5.3</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7db202cac8"><code>7db202c</code></a>
1.20.4 (<a
href="https://redirect.github.com/expressjs/body-parser/issues/672">#672</a>)</li>
<li><a
href="d8f8adb898"><code>d8f8adb</code></a>
ci: add CodeQL (SAST) (<a
href="https://redirect.github.com/expressjs/body-parser/issues/670">#670</a>)</li>
<li><a
href="6d133c19b3"><code>6d133c1</code></a>
chore: remove SECURITY.md (<a
href="https://redirect.github.com/expressjs/body-parser/issues/669">#669</a>)</li>
<li><a
href="fcd1535504"><code>fcd1535</code></a>
deps: use tilde notation and update certain dependencies (<a
href="https://redirect.github.com/expressjs/body-parser/issues/668">#668</a>)</li>
<li><a
href="ec5fa290d2"><code>ec5fa29</code></a>
deps: qs@~6.14.0 (<a
href="https://redirect.github.com/expressjs/body-parser/issues/664">#664</a>)</li>
<li><a
href="ffb95c12c7"><code>ffb95c1</code></a>
ci: restore CI for 1.x branch (<a
href="https://redirect.github.com/expressjs/body-parser/issues/665">#665</a>)</li>
<li><a
href="48a5f074a4"><code>48a5f07</code></a>
ci: add support for Node.js v23 (<a
href="https://redirect.github.com/expressjs/body-parser/issues/553">#553</a>)</li>
<li><a
href="f20f6adc71"><code>f20f6ad</code></a>
Remove redundant depth check (<a
href="https://redirect.github.com/expressjs/body-parser/issues/538">#538</a>)</li>
<li>See full diff in <a
href="https://github.com/expressjs/body-parser/compare/1.20.3...1.20.4">compare
view</a></li>
</ul>
</details>
<br />

Updates `express` from 4.21.2 to 4.22.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/expressjs/express/releases">express's
releases</a>.</em></p>
<blockquote>
<h2>v4.22.1</h2>
<h2>What's Changed</h2>
<blockquote>
<p>[!IMPORTANT]<br />
The prior release (4.22.0) included an erroneous breaking change related
to the extended query parser. There is no actual security vulnerability
associated with this behavior (CVE-2024-51999 has been rejected). The
change has been fully reverted in this release.</p>
</blockquote>
<ul>
<li>Release: 4.22.1 by <a
href="https://github.com/UlisesGascon"><code>@​UlisesGascon</code></a>
in <a
href="https://redirect.github.com/expressjs/express/pull/6934">expressjs/express#6934</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/expressjs/express/compare/4.22.0...v4.22.1">https://github.com/expressjs/express/compare/4.22.0...v4.22.1</a></p>
<h2>4.22.0</h2>
<h2>Important: Security</h2>
<ul>
<li>Security fix for <a
href="https://www.cve.org/CVERecord?id=CVE-2024-51999">CVE-2024-51999</a>
(<a
href="https://github.com/expressjs/express/security/advisories/GHSA-pj86-cfqh-vqx6">GHSA-pj86-cfqh-vqx6</a>)</li>
</ul>
<h2>What's Changed</h2>
<ul>
<li>Refactor: improve readability by <a
href="https://github.com/sazk07"><code>@​sazk07</code></a> in <a
href="https://redirect.github.com/expressjs/express/pull/6190">expressjs/express#6190</a></li>
<li>ci: add support for Node.js@23.0 by <a
href="https://github.com/UlisesGascon"><code>@​UlisesGascon</code></a>
in <a
href="https://redirect.github.com/expressjs/express/pull/6080">expressjs/express#6080</a></li>
<li>Method functions with no path should error by <a
href="https://github.com/wesleytodd"><code>@​wesleytodd</code></a> in <a
href="https://redirect.github.com/expressjs/express/pull/5957">expressjs/express#5957</a></li>
<li>ci: updated github actions ci workflow by <a
href="https://github.com/Phillip9587"><code>@​Phillip9587</code></a> in
<a
href="https://redirect.github.com/expressjs/express/pull/6323">expressjs/express#6323</a></li>
<li>ci: reorder <code>npm i</code> steps to fix ci for older node
versions by <a
href="https://github.com/Phillip9587"><code>@​Phillip9587</code></a> in
<a
href="https://redirect.github.com/expressjs/express/pull/6336">expressjs/express#6336</a></li>
<li>Backport: ci: add node.js 24 to test matrix by <a
href="https://github.com/Phillip9587"><code>@​Phillip9587</code></a> in
<a
href="https://redirect.github.com/expressjs/express/pull/6506">expressjs/express#6506</a></li>
<li>chore(4.x): wider range for query test skip by <a
href="https://github.com/jonchurch"><code>@​jonchurch</code></a> in <a
href="https://redirect.github.com/expressjs/express/pull/6513">expressjs/express#6513</a></li>
<li>use tilde notation for certain dependencies by <a
href="https://github.com/UlisesGascon"><code>@​UlisesGascon</code></a>
in <a
href="https://redirect.github.com/expressjs/express/pull/6905">expressjs/express#6905</a></li>
<li>deps: qs@6.14.0 by <a
href="https://github.com/UlisesGascon"><code>@​UlisesGascon</code></a>
in <a
href="https://redirect.github.com/expressjs/express/pull/6909">expressjs/express#6909</a></li>
<li>deps: use tilde notation for <code>qs</code> by <a
href="https://github.com/Phillip9587"><code>@​Phillip9587</code></a> in
<a
href="https://redirect.github.com/expressjs/express/pull/6919">expressjs/express#6919</a></li>
<li>Release: 4.22.0 by <a
href="https://github.com/UlisesGascon"><code>@​UlisesGascon</code></a>
in <a
href="https://redirect.github.com/expressjs/express/pull/6921">expressjs/express#6921</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/expressjs/express/compare/4.21.2...4.22.0">https://github.com/expressjs/express/compare/4.21.2...4.22.0</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/expressjs/express/blob/v4.22.1/History.md">express's
changelog</a>.</em></p>
<blockquote>
<h1>4.22.1 / 2025-12-01</h1>
<ul>
<li>Revert security fix for <a
href="https://www.cve.org/CVERecord?id=CVE-2024-51999">CVE-2024-51999</a>
(<a
href="https://github.com/expressjs/express/security/advisories/GHSA-pj86-cfqh-vqx6">GHSA-pj86-cfqh-vqx6</a>)</li>
</ul>
<h1>4.22.0 / 2025-12-01</h1>
<ul>
<li>Security fix for <a
href="https://www.cve.org/CVERecord?id=CVE-2024-51999">CVE-2024-51999</a>
(<a
href="https://github.com/expressjs/express/security/advisories/GHSA-pj86-cfqh-vqx6">GHSA-pj86-cfqh-vqx6</a>)</li>
<li>deps: use tilde notation for dependencies</li>
<li>deps: qs@6.14.0</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="12fae14531"><code>12fae14</code></a>
4.22.1</li>
<li><a
href="5ddf311af3"><code>5ddf311</code></a>
Revert &quot;sec: security patch for CVE-2024-51999&quot;</li>
<li><a
href="49744abd11"><code>49744ab</code></a>
4.22.0 (<a
href="https://redirect.github.com/expressjs/express/issues/6921">#6921</a>)</li>
<li><a
href="6e97452f60"><code>6e97452</code></a>
sec: security patch for CVE-2024-51999</li>
<li><a
href="6a23d34d65"><code>6a23d34</code></a>
deps: use tilde notation for <code>qs</code> (<a
href="https://redirect.github.com/expressjs/express/issues/6919">#6919</a>)</li>
<li><a
href="8c12cdf93b"><code>8c12cdf</code></a>
deps: qs@6.14.0 (<a
href="https://redirect.github.com/expressjs/express/issues/6909">#6909</a>)</li>
<li><a
href="7fea74fcf0"><code>7fea74f</code></a>
deps: use tilde notation for certain dependencies (<a
href="https://redirect.github.com/expressjs/express/issues/6905">#6905</a>)</li>
<li><a
href="dac7a0475a"><code>dac7a04</code></a>
chore: wider range for query test skip (<a
href="https://redirect.github.com/expressjs/express/issues/6513">#6513</a>)</li>
<li><a
href="997919b488"><code>997919b</code></a>
ci: add node.js 24 to test matrix (<a
href="https://redirect.github.com/expressjs/express/issues/6506">#6506</a>)</li>
<li><a
href="36fb59c6c7"><code>36fb59c</code></a>
fix(ci): reorder <code>npm i</code> steps to fix ci for older node
versions (<a
href="https://redirect.github.com/expressjs/express/issues/6336">#6336</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/expressjs/express/compare/4.21.2...v4.22.1">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2026-01-06 18:55:47 +00:00
Wenxin Du
268700bdbf fix(tools/looker): Looker client OAuth nil pointer error (#2231)
The original implementation initialized the auth session via direct struct
creation (`&rtl.AuthSession{}`), which leaves the source field
uninitialized and causes a nil pointer error when the SDK tries to access
that field. This is fixed by using the `NewAuthSession()`
constructor, which automatically initializes the source field.
Fix: https://github.com/googleapis/genai-toolbox/issues/2230
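
The underlying pitfall, shown with a small self-contained Go example rather than the Looker SDK itself (all names below are hypothetical): a struct literal leaves fields at their zero value, so a field the constructor is supposed to populate stays nil and panics on first use.

```go
package main

import "fmt"

// tokenSource stands in for the internal field the SDK reads on every request.
type tokenSource struct{ token string }

type authSession struct {
	source *tokenSource // nil unless a constructor sets it
}

// newAuthSession mirrors the constructor pattern: it always initializes source.
func newAuthSession() *authSession {
	return &authSession{source: &tokenSource{token: "example"}}
}

func (s *authSession) authorize() string {
	return s.source.token // panics if source is nil
}

func main() {
	bad := &authSession{} // direct struct creation: source stays nil
	_ = bad               // calling bad.authorize() here would panic

	good := newAuthSession() // constructor initializes source
	fmt.Println(good.authorize())
}
```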

---------

Co-authored-by: Dr. Strangelove <drstrangelove@google.com>
2026-01-06 17:51:24 +00:00
Dr. Strangelove
eb793398cd feat(tools/looker): add ability to set destination folder with make_look and make_dashboard. (#2245)
## Description

When running with a service account, the user has no personal folder ID.
This change allows a destination folder to be specified as part of the call to
make_dashboard and make_look. If a folder is not specified,
the user's personal folder will be used.

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #2225

---------

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2026-01-06 17:30:20 +00:00
Siddharth Ravi
cf0fc515b5 feat: add tool to list store procedure (#2156)
Adds the following tool for Postgres:
(1) list_stored_procedure: retrieves stored procedure metadata, returning the
schema name, procedure name, procedure owner, language, definition, and
description, filtered by an optional role name (procedure owner), schema
name, and limit (default 20).
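
As a hedged sketch of what such a catalog query might look like (the tool's actual SQL, parameter names, and driver are not shown here; the catalog tables below are standard PostgreSQL, and `github.com/lib/pq` plus the connection string are only assumptions for the example):

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/lib/pq" // assumed driver for this sketch; the tool uses its own source
)

// Hypothetical query mirroring the described filters: optional owner (role
// name), optional schema, and a row limit.
const listStoredProceduresSQL = `
SELECT n.nspname                         AS schema_name,
       p.proname                         AS procedure_name,
       r.rolname                         AS procedure_owner,
       l.lanname                         AS language,
       pg_get_functiondef(p.oid)         AS definition,
       obj_description(p.oid, 'pg_proc') AS description
FROM pg_proc p
JOIN pg_namespace n ON n.oid = p.pronamespace
JOIN pg_roles r     ON r.oid = p.proowner
JOIN pg_language l  ON l.oid = p.prolang
WHERE p.prokind = 'p'
  AND ($1::text IS NULL OR r.rolname = $1)
  AND ($2::text IS NULL OR n.nspname = $2)
LIMIT $3`

func main() {
	// Placeholder connection string for the sketch.
	db, err := sql.Open("postgres", "postgres://localhost/postgres?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	rows, err := db.Query(listStoredProceduresSQL, nil, nil, 20) // default limit 20
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var schema, name, owner, lang, def, desc sql.NullString
		if err := rows.Scan(&schema, &name, &owner, &lang, &def, &desc); err != nil {
			log.Fatal(err)
		}
		fmt.Println(schema.String, name.String, owner.String)
	}
}
```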

<img width="3808" height="1181" alt="image"
src="https://github.com/user-attachments/assets/43513a04-95ce-478f-a59f-3e5dafdb6b23"
/>

<img width="2654" height="1288" alt="image"
src="https://github.com/user-attachments/assets/84aca162-3779-4daa-ae2f-61620560589f"
/>


> Should include a concise description of the changes (bug or feature), its
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738
2026-01-06 08:36:45 +00:00
Wenxin Du
9c62f313ff feat: Add embeddingModel support (#2121)
First part of the implementation to support semantic search in tools.
Second part: https://github.com/googleapis/genai-toolbox/pull/2151
2026-01-05 19:34:54 -05:00
Averi Kitsch
731a32e536 feat: update CSQL MySQL prebuilt tools to use IAM (#2202)
## Description

> Should include a concise description of the changes (bug or feature), its
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there are a
> few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2026-01-05 18:30:25 +00:00
Divyansh
53885e6c0d docs: Updating dataplex docs to include new syntax for semantic search (#2165)
## Description

Dataplex.md is currently misaligned with the Dataplex backend, leading
to failed search queries.

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>

---------

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-12-31 07:28:00 +00:00
dependabot[bot]
b4346dcb8f chore(deps): bump qs from 6.14.0 to 6.14.1 in /docs/en/getting-started/quickstart/js/adk (#2250)
Bumps [qs](https://github.com/ljharb/qs) from 6.14.0 to 6.14.1.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/ljharb/qs/blob/main/CHANGELOG.md">qs's
changelog</a>.</em></p>
<blockquote>
<h2><strong>6.14.1</strong></h2>
<ul>
<li>[Fix] ensure arrayLength applies to <code>[]</code> notation as
well</li>
<li>[Fix] <code>parse</code>: when a custom decoder returns
<code>null</code> for a key, ignore that key</li>
<li>[Refactor] <code>parse</code>: extract key segment splitting
helper</li>
<li>[meta] add threat model</li>
<li>[actions] add workflow permissions</li>
<li>[Tests] <code>stringify</code>: increase coverage</li>
<li>[Dev Deps] update <code>eslint</code>,
<code>@ljharb/eslint-config</code>, <code>npmignore</code>,
<code>es-value-fixtures</code>, <code>for-each</code>,
<code>object-inspect</code></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3fa11a5f64"><code>3fa11a5</code></a>
v6.14.1</li>
<li><a
href="a62670423c"><code>a626704</code></a>
[Dev Deps] update <code>npmignore</code></li>
<li><a
href="3086902ecf"><code>3086902</code></a>
[Fix] ensure arrayLength applies to <code>[]</code> notation as
well</li>
<li><a
href="fc7930e86c"><code>fc7930e</code></a>
[Dev Deps] update <code>eslint</code>,
<code>@ljharb/eslint-config</code></li>
<li><a
href="0b06aac566"><code>0b06aac</code></a>
[Dev Deps] update <code>@ljharb/eslint-config</code></li>
<li><a
href="64951f6200"><code>64951f6</code></a>
[Refactor] <code>parse</code>: extract key segment splitting helper</li>
<li><a
href="e1bd2599cd"><code>e1bd259</code></a>
[Dev Deps] update <code>@ljharb/eslint-config</code></li>
<li><a
href="f4b3d39709"><code>f4b3d39</code></a>
[eslint] add eslint 9 optional peer dep</li>
<li><a
href="6e94d9596c"><code>6e94d95</code></a>
[Dev Deps] update <code>eslint</code>,
<code>@ljharb/eslint-config</code>, <code>npmignore</code></li>
<li><a
href="973dc3c51c"><code>973dc3c</code></a>
[actions] add workflow permissions</li>
<li>Additional commits viewable in <a
href="https://github.com/ljharb/qs/compare/v6.14.0...v6.14.1">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=qs&package-manager=npm_and_yarn&previous-version=6.14.0&new-version=6.14.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-30 22:37:37 -08:00
Yuan Teoh
0f27f956c7 refactor(sources/bigquery): move source implementation in Invoke() function to Source (#2242)
Move source-related queries from `Invoke()` function into Source.

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of roles between Tools and Sources (see the sketch after the list below).

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation
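
A hedged sketch of the shape this refactor points toward; the interface and type names below are illustrative, not the repository's actual definitions.

```go
package main

import "context"

// sqlExecutor is an illustrative interface a Source might implement so that
// generic tools can run against any compatible backend.
type sqlExecutor interface {
	RunSQL(ctx context.Context, statement string, params []any) (any, error)
}

// genericSQLTool keeps only the responsibilities listed above: resolve
// parameters, look up its source, and call the source's implementation.
type genericSQLTool struct {
	statement string
	source    sqlExecutor
}

func (t genericSQLTool) Invoke(ctx context.Context, params []any) (any, error) {
	// 1. Pre-implementation steps (e.g. template parameters) would go here.
	// 2. The source was retrieved at config time and stored on the tool.
	// 3. Delegate the actual query to the source.
	return t.source.RunSQL(ctx, t.statement, params)
}

func main() {}
```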
2025-12-31 05:43:09 +00:00
Yuan Teoh
f9df2635c6 refactor(sources/neo4j): move source implementation in Invoke() function to Source (#2241)
Move source-related queries from `Invoke()` function into Source.

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of roles between Tools and Sources.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation
2025-12-31 05:06:32 +00:00
Yuan Teoh
c1b87e209f refactor: move source implementation in Invoke() function to Source (#2240)
Move source-related queries from `Invoke()` function into Source.

The following sources are updated in this PR:
* alloydb-pg
* cloudsql-pg
* postgres

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of roles between Tools and Sources.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation
2025-12-31 04:46:18 +00:00
Yuan Teoh
55eb958c2a refactor: move source implementation in Invoke() function to Source (#2238)
Move source-related queries from `Invoke()` function into Source.

The following sources were updated in this PR:
* mssql
* cloudsql-mssql
* mysql
* cloudsql-mysql

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of roles between Tools and Sources.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation
2025-12-31 04:23:59 +00:00
Yuan Teoh
20447746e1 refactor: move source implementation in Invoke() function to Source (#2237)
Move source-related queries from `Invoke()` function into Source.

The following sources are updated in this PR:
* spanner
* sqlite
* tidb
* trino
* valkey
* yugabytedb

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of roles between Tools and Sources.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation
2025-12-31 04:00:12 +00:00
Yuan Teoh
83670dbe34 refactor: move source implementation in Invoke() function to Source (#2236)
Move source-related queries from `Invoke()` function into Source.

The following sources are updated in this PR:
* mindsdb
* oceanbase
* oracle
* redis
* singlestore
* cloudmonitoring

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of roles between Tools and Sources.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation
* Reduce the Oracle integration test coverage threshold to 20%. There is no
code change or test reduction in this PR; it is likely because the Invoke()
function was deduplicated, hence the total number of covered lines is reduced.
2025-12-30 23:34:11 +00:00
Yuan Teoh
df2f6a9f0b refactor: move source implementation in Invoke() function to Source (#2234)
Move source-related queries from `Invoke()` function into Source.

The following sources are updated in this PR:
* couchbase
* dgraph
* elasticsearch
* firebird

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of roles between Tools and Sources.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation
2025-12-30 22:45:48 +00:00
Yuan Teoh
2fa9cdb522 refactor(sources/cloudsqladmin): move source implementation in Invoke() function to Source (#2233)
Move source-related queries from `Invoke()` function into Source.

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of roles between Tools and Sources.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation


Along with these updates, this PR also resolves some comments from
Gemini:
* change `fmt.Printf()` to Debug-level logging and remove the
trailing `\n` within the log message
* move `regexp.MustCompile` to the top so that it is compiled once at the
package level and reused (see the sketch after this list); it is a relatively
expensive operation to call on every invocation
* have `fetchInstanceData()` return the `*sqladmin.DatabaseInstance`
struct directly instead of converting to a map and using map lookups. More
type-safe and efficient.
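
The regexp point follows a standard Go idiom, sketched here with a hypothetical pattern (not the one used by the tool): compile once at package initialization and reuse the compiled value on every call.

```go
package main

import (
	"fmt"
	"regexp"
)

// Compiled once when the package is initialized and reused by every call,
// instead of paying the compilation cost on each invocation.
// The pattern itself is only an example, not the tool's actual pattern.
var instanceNameRe = regexp.MustCompile(`^projects/([^/]+)/instances/([^/]+)$`)

func parseInstance(name string) (project, instance string, ok bool) {
	m := instanceNameRe.FindStringSubmatch(name)
	if m == nil {
		return "", "", false
	}
	return m[1], m[2], true
}

func main() {
	fmt.Println(parseInstance("projects/p1/instances/i1"))
}
```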


Did not move the `cloudsqlpgupgradeprecheck` tool since that invocation is
very specific to Cloud SQL for PostgreSQL.
2025-12-30 22:16:27 +00:00
Yuan Teoh
285cdcd69a refactor: move source implementation in Invoke() function to Source (#2229)
Move source-related queries from `Invoke()` function into Source.

The following sources were updated in this PR:
* bigtable
* cassandra
* clickhouse
* cloud gemini data analytics

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of roles between Tools and Sources.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation


This PR also fixes the following Gemini review recommendations:
* Bigtable `resultRow.GetByName()` now throws an error and returns false
* Clickhouselistdatabases and Clickhouselisttables now reuse the
`RunSQL()` function
2025-12-30 21:55:02 +00:00
Mend Renovate
38d127a354 chore(deps): update dependency langchain to v1.2.3 [security] (#2248)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [langchain](https://redirect.github.com/langchain-ai/langchainjs/tree/main/libs/langchain/) ([source](https://redirect.github.com/langchain-ai/langchainjs)) | [`1.0.2` → `1.2.3`](https://renovatebot.com/diffs/npm/langchain/1.0.2/1.2.3) | ![age](https://developer.mend.io/api/mc/badges/age/npm/langchain/1.2.3?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/langchain/1.0.2/1.2.3?slim=true) |

### GitHub Vulnerability Alerts

####
[CVE-2025-68665](https://redirect.github.com/langchain-ai/langchainjs/security/advisories/GHSA-r399-636x-v7f6)

## Context

A serialization injection vulnerability exists in LangChain JS's
`toJSON()` method (and, subsequently, when stringifying objects using
`JSON.stringify()`). The method did not escape objects with `'lc'` keys
when serializing free-form data in kwargs. The `'lc'` key is used
internally by LangChain to mark serialized objects. When user-controlled
data contains this key structure, it is treated as a legitimate
LangChain object during deserialization rather than plain user data.

### Attack surface

The core vulnerability was in `Serializable.toJSON()`: this method
failed to escape user-controlled objects containing `'lc'` keys within
kwargs (e.g., `additional_kwargs`, `metadata`, `response_metadata`).
When this unescaped data was later deserialized via `load()`, the
injected structures were treated as legitimate LangChain objects rather
than plain user data.

This escaping bug enabled several attack vectors:

1. **Injection via user data**: Malicious LangChain object structures
could be injected through user-controlled fields like `metadata`,
`additional_kwargs`, or `response_metadata`
2. **Secret extraction**: Injected secret structures could extract
environment variables when `secretsFromEnv` was enabled (which had no
explicit default, effectively defaulting to `true` behavior)
3. **Class instantiation via import maps**: Injected constructor
structures could instantiate any class available in the provided import
maps with attacker-controlled parameters

**Note on import maps:** Classes must be explicitly included in import
maps to be instantiatable. The core import map includes standard types
(messages, prompts, documents), and users can extend this via
`importMap` and `optionalImportsMap` options. This architecture
naturally limits the attack surface—an `allowedObjects` parameter is not
necessary because users control which classes are available through the
import maps they provide.

**Security hardening:** This patch fixes the escaping bug in `toJSON()`
and introduces new restrictive defaults in `load()`: `secretsFromEnv`
now explicitly defaults to `false`, and a `maxDepth` parameter protects
against DoS via deeply nested structures. JSDoc security warnings have
been added to all import map options.

## Who is affected?

Applications are vulnerable if they:

1. **Serialize untrusted data via `JSON.stringify()` on Serializable
objects, then deserialize with `load()`** — Trusting your own
serialization output makes you vulnerable if user-controlled data (e.g.,
from LLM responses, metadata fields, or user inputs) contains `'lc'` key
structures.
2. **Deserialize untrusted data with `load()`** — Directly deserializing
untrusted data that may contain injected `'lc'` structures.
3. **Use LangGraph checkpoints** — Checkpoint
serialization/deserialization paths may be affected.

The most common attack vector is through **LLM response fields** like
`additional_kwargs` or `response_metadata`, which can be controlled via
prompt injection and then serialized/deserialized in streaming
operations.

## Impact

Attackers who control serialized data can extract environment variable
secrets by injecting `{"lc": 1, "type": "secret", "id": ["ENV_VAR"]}` to
load environment variables during deserialization (when `secretsFromEnv:
true`). They can also instantiate classes with controlled parameters by
injecting constructor structures to instantiate any class within the
provided import maps with attacker-controlled parameters, potentially
triggering side effects such as network calls or file operations.

Key severity factors:

- Affects the serialization path—applications trusting their own
serialization output are vulnerable
- Enables secret extraction when combined with `secretsFromEnv: true`
- LLM responses in `additional_kwargs` can be controlled via prompt
injection

## Exploit example

```typescript
import { load } from "@langchain/core/load";

// Attacker injects secret structure into user-controlled data
const attackerPayload = JSON.stringify({
  user_data: {
    lc: 1,
    type: "secret",
    id: ["OPENAI_API_KEY"],
  },
});

process.env.OPENAI_API_KEY = "sk-secret-key-12345";

// With secretsFromEnv: true, the secret is extracted
const deserialized = await load(attackerPayload, { secretsFromEnv: true });

console.log(deserialized.user_data); // "sk-secret-key-12345" - SECRET LEAKED!
```

## Security hardening changes

This patch introduces the following changes to `load()`:

1. **`secretsFromEnv` default changed to `false`**: Disables automatic
secret loading from environment variables. Secrets not found in
`secretsMap` now throw an error instead of being loaded from
`process.env`. This fail-safe behavior ensures missing secrets are
caught immediately rather than silently continuing with `null`.
2. **New `maxDepth` parameter** (defaults to `50`): Protects against
denial-of-service attacks via deeply nested JSON structures that could
cause stack overflow.
3. **Escape mechanism in `toJSON()`**: User-controlled objects
containing `'lc'` keys are now wrapped in `{"__lc_escaped__": {...}}`
during serialization and unwrapped as plain data during deserialization.
4. **JSDoc security warnings**: All import map options (`importMap`,
`optionalImportsMap`, `optionalImportEntrypoints`) now include security
warnings about never populating them from user input.

## Migration guide

### No changes needed for most users

If you're deserializing standard LangChain types (messages, documents,
prompts) using the core import map, your code will work without changes:

```typescript
import { load } from "@langchain/core/load";

// Works with default settings
const obj = await load(serializedData);
```

### For secrets from environment

`secretsFromEnv` now defaults to `false`, and missing secrets throw an
error. If you need to load secrets:

```typescript
import { load } from "@langchain/core/load";

// Provide secrets explicitly (recommended)
const obj = await load(serializedData, {
  secretsMap: { OPENAI_API_KEY: process.env.OPENAI_API_KEY },
});

// Or explicitly opt-in to load from env (only use with trusted data)
const obj = await load(serializedData, { secretsFromEnv: true });
```

> **Warning:** Only enable `secretsFromEnv` if you trust the serialized
data. Untrusted data could extract any environment variable.

> **Note:** If a secret reference is encountered but not found in
`secretsMap` (and `secretsFromEnv` is `false` or the secret is not in
the environment), an error is thrown. This fail-safe behavior ensures
you're aware of missing secrets rather than silently receiving `null`
values.

### For deeply nested structures

If you have legitimate deeply nested data that exceeds the default depth
limit of 50:

```typescript
import { load } from "@langchain/core/load";

const obj = await load(serializedData, { maxDepth: 100 });
```

### For custom import maps

If you provide custom import maps, ensure they only contain trusted
modules:

```typescript
import { load } from "@langchain/core/load";
import * as myModule from "./my-trusted-module";

// GOOD - explicitly include only trusted modules
const obj = await load(serializedData, {
  importMap: { my_module: myModule },
});

// BAD - never populate from user input
const obj = await load(serializedData, {
  importMap: userProvidedImports, // DANGEROUS!
});
```

---

### Release Notes

<details>
<summary>langchain-ai/langchainjs (langchain)</summary>

###
[`v1.2.3`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/%40langchain/anthropic%401.2.3)

##### Patch Changes

- Updated dependencies
\[[`0bade90`](0bade90ed4),
[`6c40d00`](6c40d00e92)]:
-
[@&#8203;langchain/core](https://redirect.github.com/langchain/core)@&#8203;1.1.4

###
[`v1.2.2`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/%40langchain/anthropic%401.2.2)

##### Patch Changes

-
[#&#8203;9520](https://redirect.github.com/langchain-ai/langchainjs/pull/9520)
[`cc022b0`](cc022b0aab)
Thanks [@&#8203;yukukotani](https://redirect.github.com/yukukotani)! -
Includes cache creation/read tokens in input\_tokens of usage metadata

- Updated dependencies
\[[`bd2c46e`](bd2c46e09e),
[`487378b`](487378bf14),
[`138e7fb`](138e7fb628)]:
-
[@&#8203;langchain/core](https://redirect.github.com/langchain/core)@&#8203;1.1.3

###
[`v1.2.1`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/%40langchain/anthropic%401.2.1)

[Compare
Source](https://redirect.github.com/langchain-ai/langchainjs/compare/langchain@1.2.0...langchain@1.2.1)

##### Patch Changes

- Updated dependencies
\[[`833f578`](833f57834d)]:
-
[@&#8203;langchain/core](https://redirect.github.com/langchain/core)@&#8203;1.1.2

###
[`v1.2.0`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/langchain%401.2.0)

[Compare
Source](https://redirect.github.com/langchain-ai/langchainjs/compare/langchain@1.1.6...langchain@1.2.0)

##### Minor Changes

-
[#&#8203;9651](https://redirect.github.com/langchain-ai/langchainjs/pull/9651)
[`348c37c`](348c37c01a)
Thanks
[@&#8203;christian-bromann](https://redirect.github.com/christian-bromann)!
- feat(langchain): allow to set strict tag manually in providerStrategy
[#&#8203;9578](https://redirect.github.com/langchain-ai/langchainjs/issues/9578)

###
[`v1.1.6`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/langchain%401.1.6)

[Compare
Source](https://redirect.github.com/langchain-ai/langchainjs/compare/langchain@1.1.5...langchain@1.1.6)

##### Patch Changes

-
[#&#8203;9586](https://redirect.github.com/langchain-ai/langchainjs/pull/9586)
[`bc8e90f`](bc8e90f4f7)
Thanks [@&#8203;hntrl](https://redirect.github.com/hntrl)! - patch
prompts created from runs fix

-
[#&#8203;9623](https://redirect.github.com/langchain-ai/langchainjs/pull/9623)
[`ade8b8a`](ade8b8af0b)
Thanks
[@&#8203;christian-bromann](https://redirect.github.com/christian-bromann)!
- fix(langchain): properly retrieve structured output from thinking
block

-
[#&#8203;9637](https://redirect.github.com/langchain-ai/langchainjs/pull/9637)
[`88bb788`](88bb7882fa)
Thanks
[@&#8203;christian-bromann](https://redirect.github.com/christian-bromann)!
- fix(langchain): Prevent functions from being accidentally assignable
to AgentMiddleware

-
[#&#8203;8964](https://redirect.github.com/langchain-ai/langchainjs/pull/8964)
[`38ff1b5`](38ff1b55d3)
Thanks [@&#8203;jnjacobson](https://redirect.github.com/jnjacobson)! -
add support for anyOf, allOf, oneOf in openapi conversion

-
[#&#8203;9640](https://redirect.github.com/langchain-ai/langchainjs/pull/9640)
[`aa8c4f8`](aa8c4f867a)
Thanks
[@&#8203;christian-bromann](https://redirect.github.com/christian-bromann)!
- fix(langchain): prevent summarization middleware from leaking
streaming events

-
[#&#8203;9648](https://redirect.github.com/langchain-ai/langchainjs/pull/9648)
[`29a8480`](29a8480799)
Thanks
[@&#8203;christian-bromann](https://redirect.github.com/christian-bromann)!
- fix(langchain): allow to set strict tag manually in providerStrategy
[#&#8203;9578](https://redirect.github.com/langchain-ai/langchainjs/issues/9578)

-
[#&#8203;9630](https://redirect.github.com/langchain-ai/langchainjs/pull/9630)
[`a2df2d4`](a2df2d422e)
Thanks [@&#8203;nephix](https://redirect.github.com/nephix)! -
fix(summary-middleware): use summaryPrefix or fall back to default
prefix

- Updated dependencies
\[[`005c729`](005c72903b),
[`ab78246`](ab78246275),
[`8cc81c7`](8cc81c7cee),
[`f32e499`](f32e4991d0),
[`a28d83d`](a28d83d49d),
[`2e5ad70`](2e5ad70d16),
[`e456c66`](e456c661aa),
[`1cfe603`](1cfe603e97)]:
-
[@&#8203;langchain/core](https://redirect.github.com/langchain/core)@&#8203;1.1.5

###
[`v1.1.5`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/langchain%401.1.5)

[Compare
Source](https://redirect.github.com/langchain-ai/langchainjs/compare/langchain@1.1.4...langchain@1.1.5)

##### Patch Changes

- Updated dependencies
\[[`0bade90`](0bade90ed4),
[`6c40d00`](6c40d00e92)]:
-
[@&#8203;langchain/core](https://redirect.github.com/langchain/core)@&#8203;1.1.4

###
[`v1.1.4`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/%40langchain/core%401.1.4)

[Compare
Source](https://redirect.github.com/langchain-ai/langchainjs/compare/langchain@1.1.3...langchain@1.1.4)

##### Patch Changes

-
[#&#8203;9575](https://redirect.github.com/langchain-ai/langchainjs/pull/9575)
[`0bade90`](0bade90ed4)
Thanks [@&#8203;hntrl](https://redirect.github.com/hntrl)! - bin p-retry

-
[#&#8203;9574](https://redirect.github.com/langchain-ai/langchainjs/pull/9574)
[`6c40d00`](6c40d00e92)
Thanks [@&#8203;hntrl](https://redirect.github.com/hntrl)! - Revert
"fix([@&#8203;langchain/core](https://redirect.github.com/langchain/core)):
update and bundle dependencies
([#&#8203;9534](https://redirect.github.com/langchain-ai/langchainjs/issues/9534))"

###
[`v1.1.3`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/%40langchain/core%401.1.3)

[Compare
Source](https://redirect.github.com/langchain-ai/langchainjs/compare/langchain@1.1.2...langchain@1.1.3)

##### Patch Changes

-
[#&#8203;9534](https://redirect.github.com/langchain-ai/langchainjs/pull/9534)
[`bd2c46e`](bd2c46e09e)
Thanks
[@&#8203;christian-bromann](https://redirect.github.com/christian-bromann)!
-
fix([@&#8203;langchain/core](https://redirect.github.com/langchain/core)):
update and bundle `p-retry`, `ansi-styles`, `camelcase` and `decamelize`
dependencies

-
[#&#8203;9544](https://redirect.github.com/langchain-ai/langchainjs/pull/9544)
[`487378b`](487378bf14)
Thanks [@&#8203;hntrl](https://redirect.github.com/hntrl)! - fix tool
chunk concat behavior
([#&#8203;9450](https://redirect.github.com/langchain-ai/langchainjs/issues/9450))

-
[#&#8203;9505](https://redirect.github.com/langchain-ai/langchainjs/pull/9505)
[`138e7fb`](138e7fb628)
Thanks [@&#8203;chosh-dev](https://redirect.github.com/chosh-dev)! -
feat: replace btoa with toBase64Url for encoding in drawMermaidImage

###
[`v1.1.2`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/%40langchain/core%401.1.2)

[Compare
Source](https://redirect.github.com/langchain-ai/langchainjs/compare/langchain@1.1.1...langchain@1.1.2)

##### Patch Changes

-
[#&#8203;9511](https://redirect.github.com/langchain-ai/langchainjs/pull/9511)
[`833f578`](833f57834d)
Thanks [@&#8203;dqbd](https://redirect.github.com/dqbd)! - allow parsing
more partial JSON

###
[`v1.1.1`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/%40langchain/core%401.1.1)

##### Patch Changes

-
[#&#8203;9495](https://redirect.github.com/langchain-ai/langchainjs/pull/9495)
[`636b994`](636b99459b)
Thanks [@&#8203;gsriram24](https://redirect.github.com/gsriram24)! -
fix: use dynamic import for p-retry to support CommonJS environments

-
[#&#8203;9531](https://redirect.github.com/langchain-ai/langchainjs/pull/9531)
[`38f0162`](38f0162b7b)
Thanks [@&#8203;hntrl](https://redirect.github.com/hntrl)! - add
`extras` to tools

###
[`v1.1.0`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/%40langchain/anthropic%401.1.0)

##### Minor Changes

-
[#&#8203;9424](https://redirect.github.com/langchain-ai/langchainjs/pull/9424)
[`f17b2c9`](f17b2c9db0)
Thanks [@&#8203;hntrl](https://redirect.github.com/hntrl)! - add support
for `betas` param

-
[#&#8203;9424](https://redirect.github.com/langchain-ai/langchainjs/pull/9424)
[`f17b2c9`](f17b2c9db0)
Thanks [@&#8203;hntrl](https://redirect.github.com/hntrl)! - add support
for native structured output

##### Patch Changes

-
[#&#8203;9424](https://redirect.github.com/langchain-ai/langchainjs/pull/9424)
[`f17b2c9`](f17b2c9db0)
Thanks [@&#8203;hntrl](https://redirect.github.com/hntrl)! - bump sdk
version

###
[`v1.0.6`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/langchain%401.0.6)

[Compare
Source](https://redirect.github.com/langchain-ai/langchainjs/compare/langchain@1.0.5...langchain@1.0.6)

##### Patch Changes

-
[#&#8203;9434](https://redirect.github.com/langchain-ai/langchainjs/pull/9434)
[`f7cfece`](f7cfecec29)
Thanks [@&#8203;deepansh946](https://redirect.github.com/deepansh946)! -
Updated error handling behaviour of AgentNode

###
[`v1.0.5`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/langchain%401.0.5)

##### Patch Changes

-
[#&#8203;9403](https://redirect.github.com/langchain-ai/langchainjs/pull/9403)
[`944bf56`](944bf56ff0)
Thanks
[@&#8203;christian-bromann](https://redirect.github.com/christian-bromann)!
- improvements to toolEmulator middleware

-
[#&#8203;9388](https://redirect.github.com/langchain-ai/langchainjs/pull/9388)
[`831168a`](831168a545)
Thanks [@&#8203;hntrl](https://redirect.github.com/hntrl)! - use
`profile.maxInputTokens` in summarization middleware

-
[#&#8203;9393](https://redirect.github.com/langchain-ai/langchainjs/pull/9393)
[`f1e2f9e`](f1e2f9eeb3)
Thanks
[@&#8203;christian-bromann](https://redirect.github.com/christian-bromann)!
- align context editing with summarization interface

-
[#&#8203;9427](https://redirect.github.com/langchain-ai/langchainjs/pull/9427)
[`bad7aea`](bad7aea86d)
Thanks [@&#8203;dqbd](https://redirect.github.com/dqbd)! -
fix(langchain): add tool call contents and tool call ID to improve token
count approximation

-
[#&#8203;9396](https://redirect.github.com/langchain-ai/langchainjs/pull/9396)
[`ed6b581`](ed6b581e52)
Thanks
[@&#8203;christian-bromann](https://redirect.github.com/christian-bromann)!
- rename exit behavior from throw to error

###
[`v1.0.4`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/%40langchain/community%401.0.4)

##### Patch Changes

-
[#&#8203;9326](https://redirect.github.com/langchain-ai/langchainjs/pull/9326)
[`3e0cab6`](3e0cab61b3)
Thanks [@&#8203;ayanyev](https://redirect.github.com/ayanyev)! - Milvus
vector store client: ignore auto-calculated fields in collection schema
during payload validation

- Updated dependencies
\[[`415cb0b`](415cb0bfd2),
[`a2ad61e`](a2ad61e787),
[`34c472d`](34c472d129)]:
-
[@&#8203;langchain/openai](https://redirect.github.com/langchain/openai)@&#8203;1.1.2
-
[@&#8203;langchain/classic](https://redirect.github.com/langchain/classic)@&#8203;1.0.4

###
[`v1.0.3`](https://redirect.github.com/langchain-ai/langchainjs/releases/tag/%40langchain/google-gauth%401.0.3)

##### Patch Changes

- Updated dependencies \[]:
-
[@&#8203;langchain/google-common](https://redirect.github.com/langchain/google-common)@&#8203;1.0.3

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0Mi42Ni4xNCIsInVwZGF0ZWRJblZlciI6IjQyLjY2LjE0IiwidGFyZ2V0QnJhbmNoIjoibWFpbiIsImxhYmVscyI6W119-->

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-12-30 12:01:21 -08:00
dependabot[bot]
3d140a657e chore(deps): bump golang.org/x/crypto from 0.43.0 to 0.45.0 in /docs/en/getting-started/quickstart/go/adkgo (#2249)
Bumps [golang.org/x/crypto](https://github.com/golang/crypto) from
0.43.0 to 0.45.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4e0068c009"><code>4e0068c</code></a>
go.mod: update golang.org/x dependencies</li>
<li><a
href="e79546e28b"><code>e79546e</code></a>
ssh: curb GSSAPI DoS risk by limiting number of specified OIDs</li>
<li><a
href="f91f7a7c31"><code>f91f7a7</code></a>
ssh/agent: prevent panic on malformed constraint</li>
<li><a
href="2df4153a03"><code>2df4153</code></a>
acme/autocert: let automatic renewal work with short lifetime certs</li>
<li><a
href="bcf6a849ef"><code>bcf6a84</code></a>
acme: pass context to request</li>
<li><a
href="b4f2b62076"><code>b4f2b62</code></a>
ssh: fix error message on unsupported cipher</li>
<li><a
href="79ec3a51fc"><code>79ec3a5</code></a>
ssh: allow to bind to a hostname in remote forwarding</li>
<li><a
href="122a78f140"><code>122a78f</code></a>
go.mod: update golang.org/x dependencies</li>
<li><a
href="c0531f9c34"><code>c0531f9</code></a>
all: eliminate vet diagnostics</li>
<li><a
href="0997000b45"><code>0997000</code></a>
all: fix some comments</li>
<li>Additional commits viewable in <a
href="https://github.com/golang/crypto/compare/v0.43.0...v0.45.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=golang.org/x/crypto&package-manager=go_modules&previous-version=0.43.0&new-version=0.45.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-30 11:40:42 -08:00
dependabot[bot]
0714d3e126 chore(deps): bump golang.org/x/crypto from 0.43.0 to 0.45.0 in /docs/en/getting-started/quickstart/go/openAI (#2247)
Bumps [golang.org/x/crypto](https://github.com/golang/crypto) from
0.43.0 to 0.45.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4e0068c009"><code>4e0068c</code></a>
go.mod: update golang.org/x dependencies</li>
<li><a
href="e79546e28b"><code>e79546e</code></a>
ssh: curb GSSAPI DoS risk by limiting number of specified OIDs</li>
<li><a
href="f91f7a7c31"><code>f91f7a7</code></a>
ssh/agent: prevent panic on malformed constraint</li>
<li><a
href="2df4153a03"><code>2df4153</code></a>
acme/autocert: let automatic renewal work with short lifetime certs</li>
<li><a
href="bcf6a849ef"><code>bcf6a84</code></a>
acme: pass context to request</li>
<li><a
href="b4f2b62076"><code>b4f2b62</code></a>
ssh: fix error message on unsupported cipher</li>
<li><a
href="79ec3a51fc"><code>79ec3a5</code></a>
ssh: allow to bind to a hostname in remote forwarding</li>
<li><a
href="122a78f140"><code>122a78f</code></a>
go.mod: update golang.org/x dependencies</li>
<li><a
href="c0531f9c34"><code>c0531f9</code></a>
all: eliminate vet diagnostics</li>
<li><a
href="0997000b45"><code>0997000</code></a>
all: fix some comments</li>
<li>Additional commits viewable in <a
href="https://github.com/golang/crypto/compare/v0.43.0...v0.45.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=golang.org/x/crypto&package-manager=go_modules&previous-version=0.43.0&new-version=0.45.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-30 18:44:41 +00:00
dependabot[bot]
0baffff3b5 chore(deps): bump @langchain/core and @langchain/google-genai in /docs/en/getting-started/quickstart/js/langchain (#2232)
Bumps [@langchain/core](https://github.com/langchain-ai/langchainjs) to
1.1.8 and updates ancestor dependency
[@langchain/google-genai](https://github.com/langchain-ai/langchainjs).
These dependencies need to be updated together.

Updates `@langchain/core` from 1.1.0 to 1.1.8
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/langchain-ai/langchainjs/releases"><code>@​langchain/core</code>'s
releases</a>.</em></p>
<blockquote>
<h2><code>@​langchain/core</code><a
href="https://github.com/1"><code>@​1</code></a>.1.8</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9707">#9707</a>
<a
href="e5063f9c6e"><code>e5063f9</code></a>
Thanks <a href="https://github.com/hntrl"><code>@​hntrl</code></a>! -
add security hardening for <code>load</code></p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9684">#9684</a>
<a
href="89966470e8"><code>8996647</code></a>
Thanks <a
href="https://github.com/christian-bromann"><code>@​christian-bromann</code></a>!
- fix(core): document purpose of name in base message</p>
</li>
</ul>
<h2><code>@​langchain/core</code><a
href="https://github.com/1"><code>@​1</code></a>.1.6</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9668">#9668</a>
<a
href="a7b2a7db5e"><code>a7b2a7d</code></a>
Thanks <a
href="https://github.com/bracesproul"><code>@​bracesproul</code></a>! -
fix: Cannot merge two undefined objects error</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9657">#9657</a>
<a
href="a496c5fc64"><code>a496c5f</code></a>
Thanks <a href="https://github.com/dqbd"><code>@​dqbd</code></a>! -
fix(core): avoid writing to TransformStream in
EventStreamCallbackHandler when underlying ReadableStream is closed</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9658">#9658</a>
<a
href="1da1325aea"><code>1da1325</code></a>
Thanks <a href="https://github.com/dqbd"><code>@​dqbd</code></a>! -
fix(core): ensure streaming test chat models respect AbortSignal</p>
</li>
</ul>
<h2><code>@​langchain/core</code><a
href="https://github.com/1"><code>@​1</code></a>.1.5</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9641">#9641</a>
<a
href="005c72903b"><code>005c729</code></a>
Thanks <a
href="https://github.com/christian-bromann"><code>@​christian-bromann</code></a>!
- fix(community/core): various security fixes</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/7907">#7907</a>
<a
href="ab78246275"><code>ab78246</code></a>
Thanks <a
href="https://github.com/jasonphillips"><code>@​jasonphillips</code></a>!
- fix(core): handle subgraph nesting better in graph_mermaid</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9589">#9589</a>
<a
href="8cc81c7cee"><code>8cc81c7</code></a>
Thanks <a
href="https://github.com/nathannewyen"><code>@​nathannewyen</code></a>!
- test(core): add test for response_metadata in streamEvents</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9644">#9644</a>
<a
href="f32e4991d0"><code>f32e499</code></a>
Thanks <a href="https://github.com/hntrl"><code>@​hntrl</code></a>! -
add bindTools to FakeListChatModel</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9508">#9508</a>
<a
href="a28d83d49d"><code>a28d83d</code></a>
Thanks <a
href="https://github.com/shubham-021"><code>@​shubham-021</code></a>! -
Fix toFormattedString() to properly display nested objects in tool call
arguments instead of [object Object]</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9165">#9165</a>
<a
href="2e5ad70d16"><code>2e5ad70</code></a>
Thanks <a
href="https://github.com/pawel-twardziak"><code>@​pawel-twardziak</code></a>!
- fix(mcp-adapters): preserve timeout from RunnableConfig in MCP tool
calls</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9647">#9647</a>
<a
href="e456c661aa"><code>e456c66</code></a>
Thanks <a href="https://github.com/hntrl"><code>@​hntrl</code></a>! -
handle missing parent runs in tracer to prevent LangSmith 400 errors</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9597">#9597</a>
<a
href="1cfe603e97"><code>1cfe603</code></a>
Thanks <a href="https://github.com/hntrl"><code>@​hntrl</code></a>! -
use uuid7 for run ids</p>
</li>
</ul>
<h2><code>@​langchain/core</code><a
href="https://github.com/1"><code>@​1</code></a>.1.4</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9575">#9575</a>
<a
href="0bade90ed4"><code>0bade90</code></a>
Thanks <a href="https://github.com/hntrl"><code>@​hntrl</code></a>! -
bin p-retry</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9574">#9574</a>
<a
href="6c40d00e92"><code>6c40d00</code></a>
Thanks <a href="https://github.com/hntrl"><code>@​hntrl</code></a>! -
Revert &quot;fix(<code>@​langchain/core</code>): update and bundle
dependencies (<a
href="https://redirect.github.com/langchain-ai/langchainjs/issues/9534">#9534</a>)&quot;</p>
</li>
</ul>
<h2><code>@​langchain/core</code><a
href="https://github.com/1"><code>@​1</code></a>.1.3</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9534">#9534</a>
<a
href="bd2c46e09e"><code>bd2c46e</code></a>
Thanks <a
href="https://github.com/christian-bromann"><code>@​christian-bromann</code></a>!
- fix(<code>@​langchain/core</code>): update and bundle
<code>p-retry</code>, <code>ansi-styles</code>, <code>camelcase</code>
and <code>decamelize</code> dependencies</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9544">#9544</a>
<a
href="487378bf14"><code>487378b</code></a>
Thanks <a href="https://github.com/hntrl"><code>@​hntrl</code></a>! -
fix tool chunk concat behavior (<a
href="https://redirect.github.com/langchain-ai/langchainjs/issues/9450">#9450</a>)</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9505">#9505</a>
<a
href="138e7fb628"><code>138e7fb</code></a>
Thanks <a
href="https://github.com/chosh-dev"><code>@​chosh-dev</code></a>! -
feat: replace btoa with toBase64Url for encoding in drawMermaidImage</p>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="de32b32b0b"><code>de32b32</code></a>
chore: version packages (<a
href="https://redirect.github.com/langchain-ai/langchainjs/issues/9697">#9697</a>)</li>
<li><a
href="e5063f9c6e"><code>e5063f9</code></a>
fix!(core/langchain): hardening for <code>load</code> (<a
href="https://redirect.github.com/langchain-ai/langchainjs/issues/9707">#9707</a>)</li>
<li><a
href="8b3e611a6c"><code>8b3e611</code></a>
chore(turbopuffer): rollback version (<a
href="https://redirect.github.com/langchain-ai/langchainjs/issues/9698">#9698</a>)</li>
<li><a
href="89966470e8"><code>8996647</code></a>
fix(core): document purpose of name in base message (<a
href="https://redirect.github.com/langchain-ai/langchainjs/issues/9684">#9684</a>)</li>
<li><a
href="8df6264efe"><code>8df6264</code></a>
chore: version packages (<a
href="https://redirect.github.com/langchain-ai/langchainjs/issues/9676">#9676</a>)</li>
<li><a
href="df9c42b3ab"><code>df9c42b</code></a>
feat(core): usage_metadata in extra.metadata (<a
href="https://redirect.github.com/langchain-ai/langchainjs/issues/9686">#9686</a>)</li>
<li><a
href="4ea3a52f86"><code>4ea3a52</code></a>
fix(ci): use appropriate path for core PR labels (<a
href="https://redirect.github.com/langchain-ai/langchainjs/issues/9696">#9696</a>)</li>
<li><a
href="ffb24026cd"><code>ffb2402</code></a>
feat(langchain): <code>context</code> (<a
href="https://redirect.github.com/langchain-ai/langchainjs/issues/9673">#9673</a>)</li>
<li><a
href="8d2982bb94"><code>8d2982b</code></a>
feat(core): Make runnable transform trace in a single payload in
LangChainTra...</li>
<li><a
href="2b36431bab"><code>2b36431</code></a>
fix(mcp-adapters): bump <code>@​modelcontextprotocol/sdk</code> to
address CVE-2025-66414 (...</li>
<li>Additional commits viewable in <a
href="https://github.com/langchain-ai/langchainjs/compare/@langchain/aws@1.1.0...@langchain/core@1.1.8">compare
view</a></li>
</ul>
</details>
<br />

Updates `@langchain/google-genai` from 2.0.0 to 2.1.3
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/langchain-ai/langchainjs/releases"><code>@​langchain/google-genai</code>'s
releases</a>.</em></p>
<blockquote>
<h2><code>@​langchain/google-genai</code><a
href="https://github.com/2"><code>@​2</code></a>.1.3</h2>
<h3>Patch Changes</h3>
<ul>
<li>Updated dependencies [<a
href="e5063f9c6e"><code>e5063f9</code></a>,
<a
href="89966470e8"><code>8996647</code></a>]:
<ul>
<li><code>@​langchain/core</code><a
href="https://github.com/1"><code>@​1</code></a>.1.8</li>
</ul>
</li>
</ul>
<h2><code>@​langchain/google-genai</code><a
href="https://github.com/2"><code>@​2</code></a>.1.1</h2>
<h3>Patch Changes</h3>
<ul>
<li>Updated dependencies [<a
href="a7b2a7db5e"><code>a7b2a7d</code></a>,
<a
href="a496c5fc64"><code>a496c5f</code></a>,
<a
href="1da1325aea"><code>1da1325</code></a>]:
<ul>
<li><code>@​langchain/core</code><a
href="https://github.com/1"><code>@​1</code></a>.1.6</li>
</ul>
</li>
</ul>
<h2><code>@​langchain/google-genai</code><a
href="https://github.com/2"><code>@​2</code></a>.1.0</h2>
<h3>Patch Changes</h3>
<ul>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/8327">#8327</a>
<a
href="89a79097ac"><code>89a7909</code></a>
Thanks <a
href="https://github.com/caspherola"><code>@​caspherola</code></a>! -
support of adding custom headers on ChatGoogleGenerativeAI <a
href="https://redirect.github.com/langchain-ai/langchainjs/issues/6648">#6648</a></p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9584">#9584</a>
<a
href="f4ef9a1dc9"><code>f4ef9a1</code></a>
Thanks <a
href="https://github.com/encodedz"><code>@​encodedz</code></a>! - safe
access around custom content parts</p>
</li>
<li>
<p><a
href="https://redirect.github.com/langchain-ai/langchainjs/pull/9583">#9583</a>
<a
href="5b27f38581"><code>5b27f38</code></a>
Thanks <a
href="https://github.com/maslo55555"><code>@​maslo55555</code></a>! -
fix(google-genai): support custom agent names in createAgent</p>
</li>
<li>
<p>Updated dependencies [<a
href="005c72903b"><code>005c729</code></a>,
<a
href="ab78246275"><code>ab78246</code></a>,
<a
href="8cc81c7cee"><code>8cc81c7</code></a>,
<a
href="f32e4991d0"><code>f32e499</code></a>,
<a
href="a28d83d49d"><code>a28d83d</code></a>,
<a
href="2e5ad70d16"><code>2e5ad70</code></a>,
<a
href="e456c661aa"><code>e456c66</code></a>,
<a
href="1cfe603e97"><code>1cfe603</code></a>]:</p>
<ul>
<li><code>@​langchain/core</code><a
href="https://github.com/1"><code>@​1</code></a>.1.5</li>
</ul>
</li>
</ul>
<h2><code>@​langchain/google-genai</code><a
href="https://github.com/2"><code>@​2</code></a>.0.4</h2>
<h3>Patch Changes</h3>
<ul>
<li>Updated dependencies [<a
href="0bade90ed4"><code>0bade90</code></a>,
<a
href="6c40d00e92"><code>6c40d00</code></a>]:
<ul>
<li><code>@​langchain/core</code><a
href="https://github.com/1"><code>@​1</code></a>.1.4</li>
</ul>
</li>
</ul>
<h2><code>@​langchain/google-genai</code><a
href="https://github.com/2"><code>@​2</code></a>.0.3</h2>
<h3>Patch Changes</h3>
<ul>
<li>Updated dependencies [<a
href="bd2c46e09e"><code>bd2c46e</code></a>,
<a
href="487378bf14"><code>487378b</code></a>,
<a
href="138e7fb628"><code>138e7fb</code></a>]:
<ul>
<li><code>@​langchain/core</code><a
href="https://github.com/1"><code>@​1</code></a>.1.3</li>
</ul>
</li>
</ul>
<h2><code>@​langchain/google-genai</code><a
href="https://github.com/2"><code>@​2</code></a>.0.2</h2>
<h3>Patch Changes</h3>
<ul>
<li>Updated dependencies [<a
href="833f57834d"><code>833f578</code></a>]:
<ul>
<li><code>@​langchain/core</code><a
href="https://github.com/1"><code>@​1</code></a>.1.2</li>
</ul>
</li>
</ul>
<h2><code>@​langchain/google-genai</code><a
href="https://github.com/2"><code>@​2</code></a>.0.1</h2>
<h3>Patch Changes</h3>
<ul>
<li>Updated dependencies [<a
href="636b99459b"><code>636b994</code></a>,
<a
href="38f0162b7b"><code>38f0162</code></a>]:
<ul>
<li><code>@​langchain/core</code><a
href="https://github.com/1"><code>@​1</code></a>.1.1</li>
</ul>
</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/langchain-ai/langchainjs/commits/@langchain/google-genai@2.1.3">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-30 10:22:00 -08:00
Mend Renovate
f87ed05aac chore(deps): update pip (#2215)
This PR contains the following updates:

| Package | Change |
[Age](https://docs.renovatebot.com/merge-confidence/) |
[Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [google-adk](https://redirect.github.com/google/adk-python)
([changelog](https://redirect.github.com/google/adk-python/blob/main/CHANGELOG.md))
| `==1.19.0` → `==1.21.0` |
![age](https://developer.mend.io/api/mc/badges/age/pypi/google-adk/1.21.0?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/google-adk/1.19.0/1.21.0?slim=true)
|
| [google-genai](https://redirect.github.com/googleapis/python-genai) |
`==1.52.0` → `==1.56.0` |
![age](https://developer.mend.io/api/mc/badges/age/pypi/google-genai/1.56.0?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/google-genai/1.52.0/1.56.0?slim=true)
|
| [langchain](https://redirect.github.com/langchain-ai/langchain)
([source](https://redirect.github.com/langchain-ai/langchain/tree/HEAD/libs/langchain),
[changelog](https://redirect.github.com/langchain-ai/langchain/releases?q=tag%3A%22langchain%3D%3D1%22))
| `==1.1.0` → `==1.2.0` |
![age](https://developer.mend.io/api/mc/badges/age/pypi/langchain/1.2.0?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/langchain/1.1.0/1.2.0?slim=true)
|
|
[langchain-google-vertexai](https://redirect.github.com/langchain-ai/langchain-google)
([source](https://redirect.github.com/langchain-ai/langchain-google/tree/HEAD/libs/vertexai),
[changelog](https://redirect.github.com/langchain-ai/langchain-google/releases?q=%22vertexai%22))
| `==3.1.0` → `==3.2.0` |
![age](https://developer.mend.io/api/mc/badges/age/pypi/langchain-google-vertexai/3.2.0?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/langchain-google-vertexai/3.1.0/3.2.0?slim=true)
|
| [langgraph](https://redirect.github.com/langchain-ai/langgraph)
([source](https://redirect.github.com/langchain-ai/langgraph/tree/HEAD/libs/langgraph),
[changelog](https://redirect.github.com/langchain-ai/langgraph/releases))
| `==1.0.4` → `==1.0.5` |
![age](https://developer.mend.io/api/mc/badges/age/pypi/langgraph/1.0.5?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/langgraph/1.0.4/1.0.5?slim=true)
|
| [llama-index](https://redirect.github.com/run-llama/llama_index) |
`==0.14.10` → `==0.14.12` |
![age](https://developer.mend.io/api/mc/badges/age/pypi/llama-index/0.14.12?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/llama-index/0.14.10/0.14.12?slim=true)
|
| llama-index-llms-google-genai | `==0.7.3` → `==0.8.3` |
![age](https://developer.mend.io/api/mc/badges/age/pypi/llama-index-llms-google-genai/0.8.3?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/llama-index-llms-google-genai/0.7.3/0.8.3?slim=true)
|
| [pytest](https://redirect.github.com/pytest-dev/pytest)
([changelog](https://docs.pytest.org/en/stable/changelog.html)) |
`==9.0.1` → `==9.0.2` |
![age](https://developer.mend.io/api/mc/badges/age/pypi/pytest/9.0.2?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/pytest/9.0.1/9.0.2?slim=true)
|
|
[toolbox-core](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python)
([changelog](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-core/CHANGELOG.md))
| `==0.5.3` → `==0.5.4` |
![age](https://developer.mend.io/api/mc/badges/age/pypi/toolbox-core/0.5.4?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/toolbox-core/0.5.3/0.5.4?slim=true)
|
|
[toolbox-langchain](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python)
([changelog](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-langchain/CHANGELOG.md))
| `==0.5.3` → `==0.5.4` |
![age](https://developer.mend.io/api/mc/badges/age/pypi/toolbox-langchain/0.5.4?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/toolbox-langchain/0.5.3/0.5.4?slim=true)
|
|
[toolbox-llamaindex](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python)
([changelog](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-llamaindex/CHANGELOG.md))
| `==0.5.3` → `==0.5.4` |
![age](https://developer.mend.io/api/mc/badges/age/pypi/toolbox-llamaindex/0.5.4?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/toolbox-llamaindex/0.5.3/0.5.4?slim=true)
|

---

### Release Notes

<details>
<summary>google/adk-python (google-adk)</summary>

###
[`v1.21.0`](https://redirect.github.com/google/adk-python/blob/HEAD/CHANGELOG.md#1210-2025-12-11)

[Compare
Source](https://redirect.github.com/google/adk-python/compare/v1.20.0...v1.21.0)

##### Features

- **\[Interactions API Support]**
- The newly released Gemini [Interactions
API](https://ai.google.dev/gemini-api/docs/interactions) is supported in
ADK now. To use it:
  ```Python
  Agent(
    model=Gemini(
        model="gemini-3-pro-preview",
        use_interactions_api=True,
    ),
    name="...",
    description="...",
    instruction="...",
  )
  ```
see
[samples](https://redirect.github.com/google/adk-python/tree/main/contributing/samples/interactions_api)
for details

- **\[Services]**
- Add `add_session_to_memory` to `CallbackContext` and `ToolContext` to
explicitly save the current session to memory
([7b356dd](7b356ddc1b))

- **\[Plugins]**
- Add location for table in agent events in plugin
BigQueryAgentAnalytics
([507424a](507424acb9))
- Upgrade BigQueryAgentAnalyticsPlugin to v2.0 with improved
performance, multimodal support, and reliability
([7b2fe14](7b2fe14dab))

- **\[A2A]**
- Adds ADK EventActions to A2A response
([32e87f6](32e87f6381))

- **\[Tools]**
- Add `header_provider` to `OpenAPIToolset` and `RestApiTool`
([e1a7593](e1a7593ae8))
- Allow overriding connection template
([cde7f7c](cde7f7c243))
- Add SSL certificate verification configuration to OpenAPI tools using
the `verify` parameter
([9d2388a](9d2388a46f))
- Use json schema for function tool declaration when feature enabled
([cb3244b](cb3244bb58))

- **\[Models]**
- Add Gemma3Ollama model integration and a sample
([e9182e5](e9182e5eb4))

##### Bug Fixes

- Install dependencies for py 3.10
([9cccab4](9cccab4537))
- Refactor LiteLLM response schema formatting for different models
([894d8c6](894d8c6c26))
- Resolve project and credentials before creating Spanner client
([99f893a](99f893ae28))
- Avoid false positive "App name mismatch" warnings in Runner
([6388ba3](6388ba3b20))
- Update the code to work with either one event or more than one event
([4f54660](4f54660d6d))
- OpenAPI schema generation by skipping JSON schema for
judge\_model\_config
([56775af](56775afc48))
- Add tool\_name\_prefix support to OpenAPIToolset
([82e6623](82e6623fa9))
- Pass context to client interceptors
([143ad44](143ad44f8c))
- Yield event with error code when agent run raised A2AClientHTTPError
([b7ce5e1](b7ce5e17b6))
- Handle string function responses in LiteLLM conversion
([2b64715](2b64715505))
- ApigeeLLM support for Built-in tools like GoogleSearch,
BuiltInCodeExecutor when calling Gemini models through Apigee
([a9b853f](a9b853fe36))
- Extract and propagate task\_id in RemoteA2aAgent
([82bd4f3](82bd4f380b))
- Update FastAPI and Starlette to fix CVE-2025-62727 (ReDoS
vulnerability)
([c557b0a](c557b0a1f2))
- Add client id to token exchange
([f273517](f2735177f1))

##### Improvements

- Normalize multipart content for LiteLLM's ollama\_chat provider
([055dfc7](055dfc7974))
- Update adk web, fixes image not rendering, state not updating, update
drop down box width and trace icons
([df86847](df8684734b))
- Add sample agent for interaction api integration
([68d7048](68d70488b9))
- Update genAI SDK version
([f0bdcab](f0bdcaba44))
- Introduce `build_function_declaration_with_json_schema` to use
pydantic to generate json schema for FunctionTool
([51a638b](51a638b6b8))
- Update component definition for triaging agent
([ee743bd](ee743bd19a))
- Migrate Google tools to use the new feature decorator
([bab5729](bab57296d5))
- Migrate computer to use the new feature decorator
([1ae944b](1ae944b39d))
- Add Spanner execute sql query result mode using list of dictionaries
([f22bac0](f22bac0b20))
- Improve error message for missing `invocation_id` and `new_message` in
`run_async`
([de841a4](de841a4a09))

###
[`v1.20.0`](https://redirect.github.com/google/adk-python/blob/HEAD/CHANGELOG.md#1200-2025-12-01)

[Compare
Source](https://redirect.github.com/google/adk-python/compare/v1.19.0...v1.20.0)

##### Features

- **\[Core]**
- Add enum constraint to `agent_name` for `transfer_to_agent`
([4a42d0d](4a42d0d9d8))
- Add validation for unique sub-agent names
([#&#8203;3557](https://redirect.github.com/google/adk-python/issues/3557))
([2247a45](2247a45922))
- Support streaming function call arguments in progressive SSE streaming
feature
([786aaed](786aaed335))

- **\[Models]**
- Enable multi-provider support for Claude and LiteLLM
([d29261a](d29261a3dc))

- **\[Tools]**
- Create APIRegistryToolset to add tools from Cloud API registry to
agent
([ec4ccd7](ec4ccd718f))
- Add an option to disallow propagating runner plugins to AgentTool
runner
([777dba3](777dba3033))

- **\[Web]**
- Added an endpoint to list apps with details
([b57fe5f](b57fe5f459))

##### Bug Fixes

- Allow image parts in user messages for Anthropic Claude
([5453b5b](5453b5bfde))
- Mark the Content as non-empty if its first part contains text or
inline\_data or file\_data or func call/response
([631b583](631b58336d))
- Fixes double response processing issue in `base_llm_flow.py` where, in
Bidi-streaming (live) mode, the multi-agent structure causes duplicated
responses after tool calling.
([cf21ca3](cf21ca3584))
- Fix out of bounds error in \_run\_async\_impl
([8fc6128](8fc6128b62))
- Fix paths for public docs
([cd54f48](cd54f48fed))
- Ensure request bodies without explicit names are named 'body'
([084c2de](084c2de0da)),
closes
[#&#8203;2213](https://redirect.github.com/google/adk-python/issues/2213)
- Optimize Stale Agent with GraphQL and Search API to resolve 429 Quota
errors
([cb19d07](cb19d0714c))
- Update AgentTool to use Agent's description when input\_schema is
provided in FunctionDeclaration
([52674e7](52674e7fac))
- Update LiteLLM system instruction role from "developer" to "system"
([2e1f730](2e1f730c3b)),
closes
[#&#8203;3657](https://redirect.github.com/google/adk-python/issues/3657)
- Update session last update time when appending events
([a3e4ad3](a3e4ad3cd1)),
closes
[#&#8203;2721](https://redirect.github.com/google/adk-python/issues/2721)
- Update the retry\_on\_closed\_resource decorator to retry on all
errors
([a3aa077](a3aa07722a))
- Windows Path Handling and Normalize Cross-Platform Path Resolution in
AgentLoader
([a1c09b7](a1c09b724b))

##### Documentation

- Add Code Wiki badge to README
([caf23ac](caf23ac49f))

</details>

<details>
<summary>googleapis/python-genai (google-genai)</summary>

###
[`v1.56.0`](https://redirect.github.com/googleapis/python-genai/blob/HEAD/CHANGELOG.md#1560-2025-12-16)

[Compare
Source](https://redirect.github.com/googleapis/python-genai/compare/v1.55.0...v1.56.0)

##### Features

- Add minimal and medium thinking levels.
([96d644c](96d644cd52))
- Add support for Struct in ToolResult Content.
([8fd4886](8fd4886a04))
- Add ultra high resolution to the media resolution in Parts.
([356c320](356c320566))
- Add ULTRA\_HIGH MediaResolution and new ThinkingLevel enums
([336b823](336b8236c0))
- Define and use DocumentMimeType for DocumentContent
([dc7f00f](dc7f00f78b))
- Support multi speaker for Vertex AI
([ecb00c2](ecb00c2241))

##### Bug Fixes

- Api version handling for interactions.
([436ca2e](436ca2e1d5))

##### Documentation

- Add documentation for the new Interactions API (Preview).
([e28a69c](e28a69c92a))
- Update and restructure codegen\_instructions
([00422de](00422de07b))
- Update docs for 1.55
([1cc43e7](1cc43e7d06))

###
[`v1.55.0`](https://redirect.github.com/googleapis/python-genai/blob/HEAD/CHANGELOG.md#1550-2025-12-11)

[Compare
Source](https://redirect.github.com/googleapis/python-genai/compare/v1.54.0...v1.55.0)

##### Features

- Add the Interactions API
([836a3](836a33c93f))
- Add enableEnhancedCivicAnswers feature in GenerateContentConfig
([15d1ea9](15d1ea9fbb))
- Add IMAGE\_RECITATION and IMAGE\_OTHER enum values to FinishReason
([8bb4b9a](8bb4b9a8b7))
- Add voice activity detection signal.
([feae46d](feae46dd76))

##### Bug Fixes

- Replicated voice config bytes handling
([c9f8668](c9f8668cea))

##### Documentation

- Regenerate docs for 1.54.0
([8bac8d2](8bac8d2d92))

###
[`v1.54.0`](https://redirect.github.com/googleapis/python-genai/blob/HEAD/CHANGELOG.md#1540-2025-12-08)

[Compare
Source](https://redirect.github.com/googleapis/python-genai/compare/v1.53.0...v1.54.0)

##### Features

- Support ReplicatedVoiceConfig
([07c74dd](07c74dd120))

##### Bug Fixes

- Apply timeout to the total request duration in aiohttp
([a4f4205](a4f4205dd9))
- Make APIError class picklable (fixes
[#&#8203;1144](https://redirect.github.com/googleapis/python-genai/issues/1144))
([e3d5712](e3d5712d9f))

##### Documentation

- Regenerate docs for 1.53.0
([3a2b970](3a2b9702ec))

###
[`v1.53.0`](https://redirect.github.com/googleapis/python-genai/blob/HEAD/CHANGELOG.md#1530-2025-12-03)

[Compare
Source](https://redirect.github.com/googleapis/python-genai/compare/v1.52.0...v1.53.0)

##### Features

- Add empty response for tunings.cancel()
([97cc7e4](97cc7e4eaf))

##### Bug Fixes

- Convert 'citationSources' key in CitationMetadata to 'citations' when
present (fixes
[#&#8203;1222](https://redirect.github.com/googleapis/python-genai/issues/1222))
([2f28b02](2f28b02517))
- Fix google.auth.transport.requests import error in Live API
([a842721](a842721cb1))

##### Documentation

- Improve docs for google.genai.types
([5b50adc](5b50adce2a))
- Recommend using response\_json\_schema in error messages and
docstrings.
([c0b175a](c0b175a0ca))
- Updating codegen instructions to use gemini 3 pro and nano banana pro
([060f015](060f015d7e))

</details>

<details>
<summary>langchain-ai/langgraph (langgraph)</summary>

###
[`v1.0.5`](https://redirect.github.com/langchain-ai/langgraph/releases/tag/1.0.5):
langgraph==1.0.5

[Compare
Source](https://redirect.github.com/langchain-ai/langgraph/compare/1.0.4...1.0.5)

Changes since 1.0.4

- release(langgraph): bump to 1.0.5
([#&#8203;6582](https://redirect.github.com/langchain-ai/langgraph/issues/6582))
- feat(sdk-py): emit id as part of stream events
([#&#8203;6581](https://redirect.github.com/langchain-ai/langgraph/issues/6581))
- fix: update readme
([#&#8203;6570](https://redirect.github.com/langchain-ai/langgraph/issues/6570))
- release(checkpoint-postgres): 3.0.1
([#&#8203;6568](https://redirect.github.com/langchain-ai/langgraph/issues/6568))
- release(checkpoint-sqlite): 3.0.1
([#&#8203;6566](https://redirect.github.com/langchain-ai/langgraph/issues/6566))
- chore(cli): Pass through webhook configuration in dev server
([#&#8203;6557](https://redirect.github.com/langchain-ai/langgraph/issues/6557))
- feat: custom encryption at rest
([#&#8203;6482](https://redirect.github.com/langchain-ai/langgraph/issues/6482))
- chore: fix links for docs
([#&#8203;6538](https://redirect.github.com/langchain-ai/langgraph/issues/6538))
- chore: Bump lockfile
([#&#8203;6537](https://redirect.github.com/langchain-ai/langgraph/issues/6537))
- feat: Include pagination in assistants search response
([#&#8203;6526](https://redirect.github.com/langchain-ai/langgraph/issues/6526))

</details>

<details>
<summary>run-llama/llama_index (llama-index)</summary>

###
[`v0.14.12`](https://redirect.github.com/run-llama/llama_index/blob/HEAD/CHANGELOG.md#2025-12-30)

[Compare
Source](https://redirect.github.com/run-llama/llama_index/compare/v0.14.10...v0.14.12)

##### llama-index-callbacks-agentops \[0.4.1]

- Feat/async tool spec support
([#&#8203;20338](https://redirect.github.com/run-llama/llama_index/pull/20338))

##### llama-index-core \[0.14.12]

- Feat/async tool spec support
([#&#8203;20338](https://redirect.github.com/run-llama/llama_index/pull/20338))
- Improve `MockFunctionCallingLLM`
([#&#8203;20356](https://redirect.github.com/run-llama/llama_index/pull/20356))
- fix(openai): sanitize generic Pydantic model schema names
([#&#8203;20371](https://redirect.github.com/run-llama/llama_index/pull/20371))
- Element node parser
([#&#8203;20399](https://redirect.github.com/run-llama/llama_index/pull/20399))
- improve llama dev logging
([#&#8203;20411](https://redirect.github.com/run-llama/llama_index/pull/20411))
- test(node\_parser): add unit tests for Java CodeSplitter
([#&#8203;20423](https://redirect.github.com/run-llama/llama_index/pull/20423))
- fix: crash in log\_vector\_store\_query\_result when result.ids is
None
([#&#8203;20427](https://redirect.github.com/run-llama/llama_index/pull/20427))

##### llama-index-embeddings-litellm \[0.4.1]

- Add docstring to LiteLLM embedding class
([#&#8203;20336](https://redirect.github.com/run-llama/llama_index/pull/20336))

##### llama-index-embeddings-ollama \[0.8.5]

- feat(llama-index-embeddings-ollama): Add keep\_alive parameter
([#&#8203;20395](https://redirect.github.com/run-llama/llama_index/pull/20395))
- docs: improve Ollama embeddings README with comprehensive
documentation
([#&#8203;20414](https://redirect.github.com/run-llama/llama_index/pull/20414))

##### llama-index-embeddings-voyageai \[0.5.2]

- Voyage multimodal 35
([#&#8203;20398](https://redirect.github.com/run-llama/llama_index/pull/20398))

##### llama-index-graph-stores-nebula \[0.5.1]

- feat(nebula): add MENTIONS edge to property graph store
([#&#8203;20401](https://redirect.github.com/run-llama/llama_index/pull/20401))

##### llama-index-llms-aibadgr \[0.1.0]

- feat(llama-index-llms-aibadgr): Add AI Badgr OpenAI‑compatible LLM
integration
([#&#8203;20365](https://redirect.github.com/run-llama/llama_index/pull/20365))

##### llama-index-llms-anthropic \[0.10.4]

- add back haiku-3 support
([#&#8203;20408](https://redirect.github.com/run-llama/llama_index/pull/20408))

##### llama-index-llms-bedrock-converse \[0.12.3]

- fix: bedrock converse thinking block issue
([#&#8203;20355](https://redirect.github.com/run-llama/llama_index/pull/20355))

##### llama-index-llms-google-genai \[0.8.3]

- Switch use\_file\_api to Flexible file\_mode; Improve File Upload
Handling & Bump google-genai to v1.52.0
([#&#8203;20347](https://redirect.github.com/run-llama/llama_index/pull/20347))
- Fix missing role from Google-GenAI
([#&#8203;20357](https://redirect.github.com/run-llama/llama_index/pull/20357))
- Add signature index fix
([#&#8203;20362](https://redirect.github.com/run-llama/llama_index/pull/20362))
- Add positional thought signature for thoughts
([#&#8203;20418](https://redirect.github.com/run-llama/llama_index/pull/20418))

##### llama-index-llms-ollama \[0.9.1]

- feature: pydantic no longer complains if you pass 'low', 'medium', 'h…
([#&#8203;20394](https://redirect.github.com/run-llama/llama_index/pull/20394))

##### llama-index-llms-openai \[0.6.12]

- fix: Handle tools=None in OpenAIResponses.\_get\_model\_kwargs
([#&#8203;20358](https://redirect.github.com/run-llama/llama_index/pull/20358))
- feat: add support for gpt-5.2 and 5.2 pro
([#&#8203;20361](https://redirect.github.com/run-llama/llama_index/pull/20361))

##### llama-index-readers-confluence \[0.6.1]

- fix(confluence): support Python 3.14
([#&#8203;20370](https://redirect.github.com/run-llama/llama_index/pull/20370))

##### llama-index-readers-file \[0.5.6]

- Loosen constraint on `pandas` version
([#&#8203;20387](https://redirect.github.com/run-llama/llama_index/pull/20387))

##### llama-index-readers-service-now \[0.2.2]

- chore(deps): bump urllib3 from 2.5.0 to 2.6.0 in
/llama-index-integrations/readers/llama-index-readers-service-now in the
pip group across 1 directory
([#&#8203;20341](https://redirect.github.com/run-llama/llama_index/pull/20341))

##### llama-index-tools-mcp \[0.4.5]

- fix: pass timeout parameters to transport clients in BasicMCPClient
([#&#8203;20340](https://redirect.github.com/run-llama/llama_index/pull/20340))
- feature: Permit to pass a custom httpx.AsyncClient when creating a
BasicMcpClient
([#&#8203;20368](https://redirect.github.com/run-llama/llama_index/pull/20368))

##### llama-index-tools-typecast \[0.1.0]

- feat: add Typecast tool integration with text to speech features
([#&#8203;20343](https://redirect.github.com/run-llama/llama_index/pull/20343))

##### llama-index-vector-stores-azurepostgresql \[0.2.0]

- Feat/async tool spec support
([#&#8203;20338](https://redirect.github.com/run-llama/llama_index/pull/20338))

##### llama-index-vector-stores-chroma \[0.5.5]

- Fix chroma nested metadata filters
([#&#8203;20424](https://redirect.github.com/run-llama/llama_index/pull/20424))
- fix(chroma): support multimodal results
([#&#8203;20426](https://redirect.github.com/run-llama/llama_index/pull/20426))

##### llama-index-vector-stores-couchbase \[0.6.0]

- Update FTS & GSI reference docs for Couchbase vector-store
([#&#8203;20346](https://redirect.github.com/run-llama/llama_index/pull/20346))

##### llama-index-vector-stores-faiss \[0.5.2]

- fix(faiss): pass numpy array instead of int to add\_with\_ids
([#&#8203;20384](https://redirect.github.com/run-llama/llama_index/pull/20384))

##### llama-index-vector-stores-lancedb \[0.4.4]

- Feat/async tool spec support
([#&#8203;20338](https://redirect.github.com/run-llama/llama_index/pull/20338))
- fix(vector\_stores/lancedb): add missing '<' filter operator
([#&#8203;20364](https://redirect.github.com/run-llama/llama_index/pull/20364))
- fix(lancedb): fix metadata filtering logic and list value SQL
generation
([#&#8203;20374](https://redirect.github.com/run-llama/llama_index/pull/20374))

##### llama-index-vector-stores-mongodb \[0.9.0]

- Update mongo vector store to initialize without list permissions
([#&#8203;20354](https://redirect.github.com/run-llama/llama_index/pull/20354))
- add mongodb delete index
([#&#8203;20429](https://redirect.github.com/run-llama/llama_index/pull/20429))
- async mongodb atlas support
([#&#8203;20430](https://redirect.github.com/run-llama/llama_index/pull/20430))

##### llama-index-vector-stores-redis \[0.6.2]

- Redis metadata filter fix
([#&#8203;20359](https://redirect.github.com/run-llama/llama_index/pull/20359))

##### llama-index-vector-stores-vertexaivectorsearch \[0.3.3]

- feat(vertex-vector-search): Add Google Vertex AI Vector Search v2.0
support
([#&#8203;20351](https://redirect.github.com/run-llama/llama_index/pull/20351))

</details>

<details>
<summary>pytest-dev/pytest (pytest)</summary>

###
[`v9.0.2`](https://redirect.github.com/pytest-dev/pytest/releases/tag/9.0.2)

[Compare
Source](https://redirect.github.com/pytest-dev/pytest/compare/9.0.1...9.0.2)

### pytest 9.0.2 (2025-12-06)

#### Bug fixes

-
[#&#8203;13896](https://redirect.github.com/pytest-dev/pytest/issues/13896):
The terminal progress feature added in pytest 9.0.0 has been disabled by
default, except on Windows, due to compatibility issues with some
terminal emulators.

You may enable it again by passing `-p terminalprogress` (see the sketch
after this list). We may enable it by default again once compatibility
improves in the future.

Additionally, when the environment variable `TERM` is `dumb`, the escape
codes are no longer emitted, even if the plugin is enabled.

-
[#&#8203;13904](https://redirect.github.com/pytest-dev/pytest/issues/13904):
Fixed the TOML type of the `tmp_path_retention_count` settings in the
API reference from number to string.

-
[#&#8203;13946](https://redirect.github.com/pytest-dev/pytest/issues/13946):
The private `config.inicfg` attribute was changed in a breaking manner
in pytest 9.0.0.
Due to its usage in the ecosystem, it is now restored to working order
using a compatibility shim.
  It will be deprecated in pytest 9.1 and removed in pytest 10.

-
[#&#8203;13965](https://redirect.github.com/pytest-dev/pytest/issues/13965):
Fixed quadratic-time behavior when handling `unittest` subtests in
Python 3.10.
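
A minimal sketch, not part of the quoted pytest release notes: one way to
re-enable the terminal progress plugin when driving pytest programmatically,
assuming pytest >= 9.0.2 is installed and using a placeholder `tests/` path.
It is equivalent to running `pytest -p terminalprogress` on the command line
(or adding `-p terminalprogress` to `addopts`).

```python
# Re-enable the terminal progress plugin that pytest 9.0.2 turns off by
# default (except on Windows); equivalent to: pytest -p terminalprogress
import sys

import pytest

if __name__ == "__main__":
    # "tests/" is a placeholder test path; adjust to your own layout.
    sys.exit(pytest.main(["-p", "terminalprogress", "tests/"]))
```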

#### Improved documentation

-
[#&#8203;4492](https://redirect.github.com/pytest-dev/pytest/issues/4492):
The API Reference now contains cross-reference-able documentation of
`pytest's command-line flags <command-line-flags>`.

</details>

<details>
<summary>googleapis/mcp-toolbox-sdk-python (toolbox-core)</summary>

###
[`v0.5.4`](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/releases/tag/toolbox-llamaindex-v0.5.4):
toolbox-llamaindex: v0.5.4

[Compare
Source](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/compare/toolbox-core-v0.5.3...toolbox-core-v0.5.4)

##### Features

- **toolbox-llamaindex:** add protocol toggle to llamaindex clients
([#&#8203;453](https://redirect.github.com/googleapis/mcp-toolbox-sdk-python/issues/453))
([d5eece0](d5eece0d84))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0Mi41OS4wIiwidXBkYXRlZEluVmVyIjoiNDIuNjYuMTQiLCJ0YXJnZXRCcmFuY2giOiJtYWluIiwibGFiZWxzIjpbXX0=-->
2025-12-30 10:00:49 -08:00
dependabot[bot]
a35f64ef7d chore(deps): bump jws from 4.0.0 to 4.0.1 in /docs/en/getting-started/quickstart/js/adk (#2143)
Bumps [jws](https://github.com/brianloveswords/node-jws) from 4.0.0 to
4.0.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/brianloveswords/node-jws/releases">jws's
releases</a>.</em></p>
<blockquote>
<h2>v4.0.1</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 2.0.1, addressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/auth0/node-jws/blob/master/CHANGELOG.md">jws's
changelog</a>.</em></p>
<blockquote>
<h2>[4.0.1]</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 2.0.1, addressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
<h2>[3.2.3]</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 1.4.2, addressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
<h2>[3.0.0]</h2>
<h3>Changed</h3>
<ul>
<li><strong>BREAKING</strong>: <code>jwt.verify</code> now requires an
<code>algorithm</code> parameter, and
<code>jws.createVerify</code> requires an <code>algorithm</code> option.
The <code>&quot;alg&quot;</code> field
in signature headers is ignored. This mitigates a critical security flaw
in the library which would allow an attacker to generate signatures with
arbitrary contents that would be accepted by <code>jwt.verify</code>.
See
<a
href="https://auth0.com/blog/2015/03/31/critical-vulnerabilities-in-json-web-token-libraries/">https://auth0.com/blog/2015/03/31/critical-vulnerabilities-in-json-web-token-libraries/</a>
for details.</li>
</ul>
<h2><a
href="https://github.com/brianloveswords/node-jws/compare/v1.0.1...v2.0.0">2.0.0</a>
- 2015-01-30</h2>
<h3>Changed</h3>
<ul>
<li>
<p><strong>BREAKING</strong>: Default payload encoding changed from
<code>binary</code> to
<code>utf8</code>. <code>utf8</code> is a more sensible default
than <code>binary</code> because
many payloads, as far as I can tell, will contain user-facing
strings that could be in any language. ([6b6de48])</p>
</li>
<li>
<p>Code reorganization, thanks [<a
href="https://github.com/fearphage"><code>@​fearphage</code></a>]! (<a
href="https://github.com/brianloveswords/node-jws/commit/7880050">7880050</a>)</p>
</li>
</ul>
<h3>Added</h3>
<ul>
<li>Option in all relevant methods for <code>encoding</code>. For those
few users
that might be depending on a <code>binary</code> encoding of the
messages, this
is for them. ([6b6de48])</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="34c45b2c04"><code>34c45b2</code></a>
Merge commit from fork</li>
<li><a
href="49bc39b1f5"><code>49bc39b</code></a>
version 4.0.1</li>
<li><a
href="d42350ccab"><code>d42350c</code></a>
Enhance tests for HMAC streaming sign and verify</li>
<li><a
href="5cb007cf82"><code>5cb007c</code></a>
Improve secretOrKey initialization in VerifyStream</li>
<li><a
href="f9a2e1c8c6"><code>f9a2e1c</code></a>
Improve secret handling in SignStream</li>
<li><a
href="b9fb8d30e9"><code>b9fb8d3</code></a>
Merge pull request <a
href="https://redirect.github.com/brianloveswords/node-jws/issues/102">#102</a>
from auth0/SRE-57-Upload-opslevel-yaml</li>
<li><a
href="95b75ee56c"><code>95b75ee</code></a>
Upload OpsLevel YAML</li>
<li><a
href="8857ee7762"><code>8857ee7</code></a>
test: remove unused variable (<a
href="https://redirect.github.com/brianloveswords/node-jws/issues/96">#96</a>)</li>
<li>See full diff in <a
href="https://github.com/brianloveswords/node-jws/compare/v4.0.0...v4.0.1">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~julien.wollscheid">julien.wollscheid</a>, a
new releaser for jws since your current version.</p>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=jws&package-manager=npm_and_yarn&previous-version=4.0.0&new-version=4.0.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-30 17:39:07 +00:00
Mend Renovate
980600f31b chore(deps): update github actions (#2096)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/setup-node](https://redirect.github.com/actions/setup-node)
([changelog](2028fbc5c2..395ad32622))
| action | digest | `2028fbc` -> `395ad32` |
|
[lycheeverse/lychee-action](https://redirect.github.com/lycheeverse/lychee-action)
| action | pinDigest | -> `a8c4c7c` |

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0Mi4xOS45IiwidXBkYXRlZEluVmVyIjoiNDIuNTkuMCIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOltdfQ==-->
2025-12-30 06:34:59 +00:00
dependabot[bot]
f4c22b3e27 chore(deps): bump golang.org/x/crypto from 0.43.0 to 0.45.0 in /docs/en/getting-started/quickstart/go/genkit (#1999)
Bumps [golang.org/x/crypto](https://github.com/golang/crypto) from
0.43.0 to 0.45.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4e0068c009"><code>4e0068c</code></a>
go.mod: update golang.org/x dependencies</li>
<li><a
href="e79546e28b"><code>e79546e</code></a>
ssh: curb GSSAPI DoS risk by limiting number of specified OIDs</li>
<li><a
href="f91f7a7c31"><code>f91f7a7</code></a>
ssh/agent: prevent panic on malformed constraint</li>
<li><a
href="2df4153a03"><code>2df4153</code></a>
acme/autocert: let automatic renewal work with short lifetime certs</li>
<li><a
href="bcf6a849ef"><code>bcf6a84</code></a>
acme: pass context to request</li>
<li><a
href="b4f2b62076"><code>b4f2b62</code></a>
ssh: fix error message on unsupported cipher</li>
<li><a
href="79ec3a51fc"><code>79ec3a5</code></a>
ssh: allow to bind to a hostname in remote forwarding</li>
<li><a
href="122a78f140"><code>122a78f</code></a>
go.mod: update golang.org/x dependencies</li>
<li><a
href="c0531f9c34"><code>c0531f9</code></a>
all: eliminate vet diagnostics</li>
<li><a
href="0997000b45"><code>0997000</code></a>
all: fix some comments</li>
<li>Additional commits viewable in <a
href="https://github.com/golang/crypto/compare/v0.43.0...v0.45.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=golang.org/x/crypto&package-manager=go_modules&previous-version=0.43.0&new-version=0.45.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-12-30 05:15:00 +00:00
Mend Renovate
c2df6223e6 chore(deps): update github actions (major) (#1905)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/cache](https://redirect.github.com/actions/cache) | action | major | `v4` -> `v5` |
| [actions/checkout](https://redirect.github.com/actions/checkout) | action | major | `v5.0.1` -> `v6.0.1` |
| [actions/checkout](https://redirect.github.com/actions/checkout) | action | major | `v5` -> `v6` |
| [golangci/golangci-lint-action](https://redirect.github.com/golangci/golangci-lint-action) | action | major | `v8.0.0` -> `v9.2.0` |

---

### Release Notes

<details>
<summary>actions/cache (actions/cache)</summary>

### [`v5`](https://redirect.github.com/actions/cache/compare/v4...v5)

[Compare
Source](https://redirect.github.com/actions/cache/compare/v4...v5)

</details>

<details>
<summary>actions/checkout (actions/checkout)</summary>

###
[`v6.0.1`](https://redirect.github.com/actions/checkout/compare/v6.0.0...v6.0.1)

[Compare
Source](https://redirect.github.com/actions/checkout/compare/v6.0.0...v6.0.1)

###
[`v6.0.0`](https://redirect.github.com/actions/checkout/compare/v5.0.1...v6.0.0)

[Compare
Source](https://redirect.github.com/actions/checkout/compare/v5.0.1...v6.0.0)

</details>

<details>
<summary>golangci/golangci-lint-action
(golangci/golangci-lint-action)</summary>

###
[`v9.2.0`](https://redirect.github.com/golangci/golangci-lint-action/releases/tag/v9.2.0)

[Compare
Source](https://redirect.github.com/golangci/golangci-lint-action/compare/v9.1.0...v9.2.0)

<!-- Release notes generated using configuration in .github/release.yml
at v9.2.0 -->

#### What's Changed

##### Changes

- feat: add version-file option by
[@&#8203;ldez](https://redirect.github.com/ldez) in
[#&#8203;1320](https://redirect.github.com/golangci/golangci-lint-action/pull/1320)
- chore: move samples into fixtures by
[@&#8203;ldez](https://redirect.github.com/ldez) in
[#&#8203;1321](https://redirect.github.com/golangci/golangci-lint-action/pull/1321)

##### Dependencies

- build(deps-dev): bump the dev-dependencies group with 2 updates by
[@&#8203;dependabot](https://redirect.github.com/dependabot)\[bot] in
[#&#8203;1317](https://redirect.github.com/golangci/golangci-lint-action/pull/1317)
- build(deps): bump actions/checkout from 5 to 6 by
[@&#8203;dependabot](https://redirect.github.com/dependabot)\[bot] in
[#&#8203;1318](https://redirect.github.com/golangci/golangci-lint-action/pull/1318)
- build(deps-dev): bump the dev-dependencies group with 3 updates by
[@&#8203;dependabot](https://redirect.github.com/dependabot)\[bot] in
[#&#8203;1323](https://redirect.github.com/golangci/golangci-lint-action/pull/1323)
- build(deps): bump yaml from 2.8.1 to 2.8.2 in the dependencies group
by [@&#8203;dependabot](https://redirect.github.com/dependabot)\[bot] in
[#&#8203;1324](https://redirect.github.com/golangci/golangci-lint-action/pull/1324)

**Full Changelog**:
<https://github.com/golangci/golangci-lint-action/compare/v9.1.0...v9.2.0>

###
[`v9.1.0`](https://redirect.github.com/golangci/golangci-lint-action/releases/tag/v9.1.0)

[Compare
Source](https://redirect.github.com/golangci/golangci-lint-action/compare/v9.0.0...v9.1.0)

<!-- Release notes generated using configuration in .github/release.yml
at v9.1.0 -->

#### What's Changed

##### Changes

- feat: automatic module directories by
[@&#8203;ldez](https://redirect.github.com/ldez) in
[#&#8203;1315](https://redirect.github.com/golangci/golangci-lint-action/pull/1315)

##### Documentation

- docs: organize options by
[@&#8203;ldez](https://redirect.github.com/ldez) in
[#&#8203;1314](https://redirect.github.com/golangci/golangci-lint-action/pull/1314)

##### Dependencies

- build(deps-dev): bump the dev-dependencies group with 2 updates by
[@&#8203;dependabot](https://redirect.github.com/dependabot)\[bot] in
[#&#8203;1307](https://redirect.github.com/golangci/golangci-lint-action/pull/1307)
- build(deps-dev): bump js-yaml from 4.1.0 to 4.1.1 by
[@&#8203;dependabot](https://redirect.github.com/dependabot)\[bot] in
[#&#8203;1309](https://redirect.github.com/golangci/golangci-lint-action/pull/1309)
- build(deps-dev): bump the dev-dependencies group with 2 updates by
[@&#8203;dependabot](https://redirect.github.com/dependabot)\[bot] in
[#&#8203;1310](https://redirect.github.com/golangci/golangci-lint-action/pull/1310)
- build(deps): bump the dependencies group with 2 updates by
[@&#8203;dependabot](https://redirect.github.com/dependabot)\[bot] in
[#&#8203;1311](https://redirect.github.com/golangci/golangci-lint-action/pull/1311)

**Full Changelog**:
<https://github.com/golangci/golangci-lint-action/compare/v9.0.0...v9.1.0>

###
[`v9.0.0`](https://redirect.github.com/golangci/golangci-lint-action/releases/tag/v9.0.0)

[Compare
Source](https://redirect.github.com/golangci/golangci-lint-action/compare/v8.0.0...v9.0.0)

In the scope of this release, we change Nodejs runtime from node20 to
node24
(<https://github.blog/changelog/2025-09-19-deprecation-of-node-20-on-github-actions-runners/>).

#### What's Changed

##### Changes

- feat: add install-only option by
[@&#8203;ldez](https://redirect.github.com/ldez) in
[#&#8203;1305](https://redirect.github.com/golangci/golangci-lint-action/pull/1305)
- feat: support Module Plugin System by
[@&#8203;ldez](https://redirect.github.com/ldez) in
[#&#8203;1306](https://redirect.github.com/golangci/golangci-lint-action/pull/1306)

**Full Changelog**:
<https://github.com/golangci/golangci-lint-action/compare/v8.0.0...v9.0.0>

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-12-29 20:52:11 -08:00
Xie Yanbo
4d6f70b55e docs: clarify versioning in README (#2177)
Update the README.md to explicitly define MAJOR, MINOR, and PATCH
increments for post-1.0.0 versioning, enhancing clarity and readability.

## Description

> Should include a concise description of the changes (bug or feature), its
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-12-27 06:30:58 +00:00
manuka rahul
c088d4ed42 ci: add link checker workflow (#2189)
This workflow catches broken links and 404 errors by checking the
documentation links during development and before merging into the main
code base. This ensures all project documentation (README, contribution
files) remains current and functional, proactively addressing technical
debt; the equivalent local invocation is sketched below. Please note this
is a resubmission of a previous
[PR](https://github.com/googleapis/genai-toolbox/pull/1756) that was
closed due to merge conflicts.
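
The same check can be run locally with the lychee CLI (assuming it is
installed); the flags below mirror the lychee-action invocation added in
the new workflow file later in this diff:

```shell
# Run the repository link check locally with the same flags as CI.
lychee --verbose --no-progress --cache --max-cache-age 1d README.md docs/
```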

---------

Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-12-24 15:20:10 +05:30
Yuan Teoh
0202709efc refactor(sources/alloydbadmin, sources/alloydbpg): move source implementation in Invoke() function to Source (#2226)
Move source-related queries from the `Invoke()` function into the Source.

This is an effort toward generalizing tools to work with any Source that
implements a specific interface, providing a cleaner separation of
responsibilities between Tools and Sources.

A Tool's role will be limited to the following (see the sketch below):
* Resolving any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving the Source
* Calling the Source's implementation


Along with these updates, this PR also resolves some comments from
Gemini:
* update `fmt.Printf()` in `GetOperations()` to log a Debug message
instead
* update the `fmt.Printf()` on user-agent retrieval failure to return an
error. The user agent is expected to be retrieved successfully during
source initialization, so a failure to retrieve it indicates a server
error.
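
For illustration, the split looks roughly like the minimal sketch below.
The names here (`Source`, `ListOperations`, `fakeSource`) are hypothetical
and are not the actual interfaces in `internal/sources` or
`internal/tools`:

```go
package main

import (
	"context"
	"fmt"
)

// Source owns the database- or API-specific implementation.
type Source interface {
	ListOperations(ctx context.Context, params map[string]any) (string, error)
}

// Tool only resolves parameters, retrieves its Source, and delegates to it.
type Tool struct {
	source Source
}

func (t Tool) Invoke(ctx context.Context, params map[string]any) (string, error) {
	// (1) resolve any pre-invocation parameters (e.g. template parameters)
	// (2) call the Source's implementation instead of issuing queries here
	return t.source.ListOperations(ctx, params)
}

// fakeSource stands in for where the source-specific query would live.
type fakeSource struct{}

func (fakeSource) ListOperations(_ context.Context, params map[string]any) (string, error) {
	return fmt.Sprintf("operations for %v", params["cluster"]), nil
}

func main() {
	t := Tool{source: fakeSource{}}
	out, _ := t.Invoke(context.Background(), map[string]any{"cluster": "demo"})
	fmt.Println(out)
}
```

With this shape, supporting a new backend mostly means adding a new Source
implementation; Tools keep delegating through the same interface.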
2025-12-24 09:09:22 +00:00
Niraj Nandre
9695fc5eeb docs: Add Antigravity connection steps for Looker (#2192)
## Description

This PR adds a new section to the `looker_mcp.md` document that explains
how to connect Looker to Antigravity.

The new **"Connect with Antigravity"** section provides two methods for
connecting:

- **MCP Store:** A straightforward method using the built-in MCP Store
in Antigravity.
- **Custom config:** For connecting to a custom MCP server by adding a
configuration to the `mcp_config.json` file (an illustrative entry is
sketched below).

These changes will help users easily connect Looker to Antigravity.
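
For reference, a custom-config entry typically has the shape below. This
is an illustrative sketch only: the server name, binary path, flags
(`--prebuilt looker --stdio`), and environment variable names are
assumptions and may differ from the exact snippet added to
`looker_mcp.md` (JSON does not allow comments, so treat every value as a
placeholder):

```json
{
  "mcpServers": {
    "looker-toolbox": {
      "command": "./toolbox",
      "args": ["--prebuilt", "looker", "--stdio"],
      "env": {
        "LOOKER_BASE_URL": "https://your.looker.instance",
        "LOOKER_CLIENT_ID": "your-client-id",
        "LOOKER_CLIENT_SECRET": "your-client-secret"
      }
    }
  }
}
```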

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-24 07:17:42 +00:00
Srividya Reddy
5447c94ca8 test(source/postgres): fix list_database_stats integration test (#2235)
## Description
The list_database_stats test fails intermittently when run in parallel
on shared instances. Specifically, the "filter by tablespace" and "sort
by size" test cases fail because they encounter unexpected databases in
the pg_default tablespace created by concurrent test runs.
This PR narrows the scope of these test cases by filtering for specific
database names. This ensures assertions remain isolated to the current
test run regardless of other databases present in the shared
environment.

```
 go test -tags=integration tests/postgres/postgres_integration_test.go
ok      command-line-arguments  14.455s
```

> Should include a concise description of the changes (bug or feature), its
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738
2025-12-23 22:27:12 -08:00
Sri Varshitha
7053fbb195 fix(tools/alloydb-wait-for-operation): Fix connection message generation (#2228)
## Description

This PR fixes issues in the `alloydb-wait-for-operation` tool where the
connection message was not being generated correctly upon operation
completion.

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-12-23 10:32:32 -08:00
manuka rahul
5a09d38056 docs: fix broken links (#2223)
Fix broken links
2025-12-23 15:01:52 +05:30
Harsh Jha
0cb3ad9026 Merge branch 'main' into sdk-docs-migrate 2025-12-10 14:03:46 +05:30
Harsh Jha
290cba0f1e Merge branch 'main' into sdk-docs-migrate 2025-12-01 14:57:07 +05:30
Harsh Jha
047def93ef Merge branch 'main' into sdk-docs-migrate 2025-12-01 14:55:19 +05:30
Harsh Jha
875b5277e3 Merge branch 'main' into sdk-docs-migrate 2025-11-21 12:39:20 +05:30
Harsh Jha
a29f9e5484 feat: added basic template for sdks doc migrate 2025-11-17 13:57:49 +05:30
296 changed files with 6961 additions and 4580 deletions

View File

@@ -875,8 +875,8 @@ steps:
total_coverage=$(go tool cover -func=oracle_coverage.out | grep "total:" | awk '{print $3}')
echo "Oracle total coverage: $total_coverage"
coverage_numeric=$(echo "$total_coverage" | sed 's/%//')
if awk -v cov="$coverage_numeric" 'BEGIN {exit !(cov < 30)}'; then
echo "Coverage failure: $total_coverage is below 30%."
if awk -v cov="$coverage_numeric" 'BEGIN {exit !(cov < 20)}'; then
echo "Coverage failure: $total_coverage is below 20%."
exit 1
fi

View File

@@ -40,7 +40,7 @@ jobs:
group: docs-deployment
cancel-in-progress: false
steps:
- uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6
with:
fetch-depth: 0 # Fetch all history for .GitInfo and .Lastmod
@@ -51,12 +51,12 @@ jobs:
extended: true
- name: Setup Node
uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6
with:
node-version: "22"
- name: Cache dependencies
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4
uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}

View File

@@ -30,14 +30,14 @@ jobs:
steps:
- name: Checkout main branch (for latest templates and theme)
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6
with:
ref: 'main'
submodules: 'recursive'
fetch-depth: 0
- name: Checkout old content from tag into a temporary directory
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6
with:
ref: ${{ github.event.inputs.version_tag }}
path: 'old_version_source' # Checkout into a temp subdir
@@ -57,7 +57,7 @@ jobs:
with:
hugo-version: "0.145.0"
extended: true
- uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6
- uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6
with:
node-version: "22"

View File

@@ -30,7 +30,7 @@ jobs:
cancel-in-progress: false
steps:
- name: Checkout Code at Tag
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6
with:
ref: ${{ github.event.release.tag_name }}
@@ -44,7 +44,7 @@ jobs:
extended: true
- name: Setup Node
uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6
with:
node-version: "22"

View File

@@ -34,7 +34,7 @@ jobs:
group: "preview-${{ github.event.number }}"
cancel-in-progress: true
steps:
- uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6
with:
ref: versioned-gh-pages

View File

@@ -49,7 +49,7 @@ jobs:
group: "preview-${{ github.event.number }}"
cancel-in-progress: true
steps:
- uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6
with:
# Checkout the PR's HEAD commit (supports forks).
ref: ${{ github.event.pull_request.head.sha }}
@@ -62,12 +62,12 @@ jobs:
extended: true
- name: Setup Node
uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6
with:
node-version: "22"
- name: Cache dependencies
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4
uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}

View File

@@ -0,0 +1,59 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
name: Link Checker
on:
pull_request:
jobs:
link-check:
runs-on: ubuntu-latest
steps:
- name: Checkout Repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6
- name: Restore lychee cache
uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5
with:
path: .lycheecache
key: cache-lychee-${{ github.sha }}
restore-keys: cache-lychee-
- name: Link Checker
uses: lycheeverse/lychee-action@a8c4c7cb88f0c7386610c35eb25108e448569cb0 # v2
with:
args: >
--verbose
--no-progress
--cache
--max-cache-age 1d
README.md
docs/
output: /tmp/foo.txt
fail: true
jobSummary: true
debug: true
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# This step only runs if the 'lychee_check' step fails, ensuring the
# context note only appears when the developer needs to troubleshoot.
- name: Display Link Context Note on Failure
if: ${{ failure() }}
run: |
echo "## Link Resolution Note" >> $GITHUB_STEP_SUMMARY
echo "Local links and directory changes work differently on GitHub than on the docsite." >> $GITHUB_STEP_SUMMARY
echo "You must ensure fixes pass the **GitHub check** and also work with **\`hugo server\`**." >> $GITHUB_STEP_SUMMARY
echo "---" >> $GITHUB_STEP_SUMMARY

View File

@@ -55,7 +55,7 @@ jobs:
with:
go-version: "1.25"
- name: Checkout code
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
ref: ${{ github.event.pull_request.head.sha }}
repository: ${{ github.event.pull_request.head.repo.full_name }}
@@ -66,7 +66,7 @@ jobs:
run: |
go mod tidy && git diff --exit-code
- name: golangci-lint
uses: golangci/golangci-lint-action@4afd733a84b1f43292c63897423277bb7f4313a9 # v8.0.0
uses: golangci/golangci-lint-action@1e7e51e771db61008b38414a730f564565cf7c20 # v9.2.0
with:
version: latest
args: --timeout 10m

View File

@@ -29,7 +29,7 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6
- name: Wait for image in Artifact Registry
shell: bash

View File

@@ -29,7 +29,7 @@ jobs:
issues: 'write'
pull-requests: 'write'
steps:
- uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: micnncim/action-label-syncer@3abd5ab72fda571e69fffd97bd4e0033dd5f495c # v1.3.0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

View File

@@ -62,7 +62,7 @@ jobs:
go-version: "1.24"
- name: Checkout code
uses: actions/checkout@93cb6efe18208431cddfb8368fd83d5badbf9bfd # v5.0.1
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
ref: ${{ github.event.pull_request.head.sha }}
repository: ${{ github.event.pull_request.head.repo.full_name }}

45
.lycheeignore Normal file
View File

@@ -0,0 +1,45 @@
# Ignore documentation placeholders and generic example domains
^https?://([a-zA-Z0-9-]+\.)?example\.com(:\d+)?(/.*)?$
^http://example\.net
# Shields.io badges often trigger rate limits or intermittent 503s
^https://img\.shields\.io/.*
# PDF files are ignored as lychee cannot reliably parse internal PDF links
\.pdf$
# Standard mailto: protocol is not a web URL
^mailto:
# Ignore local development endpoints that won't resolve in CI/CD environments
^https?://(127\.0\.0\.1|localhost)(:\d+)?(/.*)?$
# Placeholder for Google Cloud Run service discovery
https://cloud-run-url.app/
# DGraph Cloud and private instance endpoints
https://xxx.cloud.dgraph.io/
https://cloud.dgraph.io/login
https://dgraph.io/docs
# MySQL Community downloads and main site (often protected by bot mitigation)
https://dev.mysql.com/downloads/installer/
https://www.mysql.com/
# Claude desktop download link
https://claude.ai/download
# Google Cloud Run product page
https://cloud.google.com/run
# These specific deep links are known to cause redirect loops or 403s in automated scrapers
https://dev.mysql.com/doc/refman/8.4/en/sql-prepared-statements.html
https://dev.mysql.com/doc/refman/8.4/en/user-names.html
# npmjs links can occasionally trigger rate limiting during high-frequency CI builds
https://www.npmjs.com/package/@toolbox-sdk/core
https://www.npmjs.com/package/@toolbox-sdk/adk
# Ignore social media and blog profiles to reduce external request overhead
https://medium.com/@mcp_toolbox

View File

@@ -207,6 +207,30 @@ variables for each source.
* SQLite - setup in the integration test, where we create a temporary database
file
### Link Checking and Fixing with Lychee
We use **[lychee](https://github.com/lycheeverse/lychee-action)** for repository link checks.
* To run the checker **locally**, see the [command-line usage guide](https://github.com/lycheeverse/lychee?tab=readme-ov-file#commandline-usage).
#### Fixing Broken Links
1. **Update the Link:** Correct the broken URL or update the content where it is used.
2. **Ignore the Link:** If you can't fix the link (e.g., due to **external rate-limits** or if it's a **local-only URL**), tell Lychee to **ignore** it.
* List **regular expressions** or **direct links** in the **[.lycheeignore](https://github.com/googleapis/genai-toolbox/blob/main/.lycheeignore)** file, one entry per line.
* **Always add a comment** explaining **why** the link is being skipped to prevent link rot. **Example `.lycheeignore`:**
```text
# These are email addresses, not standard web URLs, and usually cause check failures.
^mailto:.*
```
> [!NOTE]
> To avoid build failures in GitHub Actions, follow the linking pattern demonstrated here: <br>
> **Avoid:** (Works in Hugo, breaks Link Checker): `[Read more](docs/setup)` or `[Read more](docs/setup/)` <br>
> **Reason:** The link checker cannot find a file named "setup" or a directory with that name containing an index. <br>
> **Preferred:** `[Read more](docs/setup.md)` <br>
> **Reason:** The GitHub Action finds the physical file. Hugo then uses its internal logic (or render hooks) to resolve this to the correct `/docs/setup/` web URL. <br>
### Other GitHub Checks
* License header check (`.github/header-checker-lint.yml`) - Ensures files have
@@ -280,6 +304,7 @@ There are 3 GHA workflows we use to achieve document versioning:
Request a repo owner to run the preview deployment workflow on your PR. A
preview link will be automatically added as a comment to your PR.
#### Maintainers
1. **Inspect Changes:** Review the proposed changes in the PR to ensure they are

View File

@@ -1035,12 +1035,12 @@ The version will be incremented as follows:
### Post-1.0.0 Versioning
Once the project reaches a stable `1.0.0` release, the versioning will follow
the more common convention:
Once the project reaches a stable `1.0.0` release, the version number
**`MAJOR.MINOR.PATCH`** will follow the more common convention:
- **`MAJOR.MINOR.PATCH`**: Incremented for incompatible API changes.
- **`MAJOR.MINOR.PATCH`**: Incremented for new, backward-compatible functionality.
- **`MAJOR.MINOR.PATCH`**: Incremented for backward-compatible bug fixes.
- **`MAJOR`**: Incremented for incompatible API changes.
- **`MINOR`**: Incremented for new, backward-compatible functionality.
- **`PATCH`**: Incremented for backward-compatible bug fixes.
The public API that this applies to is the CLI associated with Toolbox, the
interactions with official SDKs, and the definitions in the `tools.yaml` file.

View File

@@ -33,6 +33,7 @@ import (
"github.com/fsnotify/fsnotify"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/auth"
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
"github.com/googleapis/genai-toolbox/internal/log"
"github.com/googleapis/genai-toolbox/internal/prebuiltconfigs"
"github.com/googleapis/genai-toolbox/internal/prompts"
@@ -197,6 +198,7 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistroles"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistschemas"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistsequences"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgresliststoredprocedure"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslisttables"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslisttablespaces"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslisttablestats"
@@ -385,12 +387,13 @@ func NewCommand(opts ...Option) *Command {
}
type ToolsFile struct {
Sources server.SourceConfigs `yaml:"sources"`
AuthSources server.AuthServiceConfigs `yaml:"authSources"` // Deprecated: Kept for compatibility.
AuthServices server.AuthServiceConfigs `yaml:"authServices"`
Tools server.ToolConfigs `yaml:"tools"`
Toolsets server.ToolsetConfigs `yaml:"toolsets"`
Prompts server.PromptConfigs `yaml:"prompts"`
Sources server.SourceConfigs `yaml:"sources"`
AuthSources server.AuthServiceConfigs `yaml:"authSources"` // Deprecated: Kept for compatibility.
AuthServices server.AuthServiceConfigs `yaml:"authServices"`
EmbeddingModels server.EmbeddingModelConfigs `yaml:"embeddingModels"`
Tools server.ToolConfigs `yaml:"tools"`
Toolsets server.ToolsetConfigs `yaml:"toolsets"`
Prompts server.PromptConfigs `yaml:"prompts"`
}
// parseEnv replaces environment variables ${ENV_NAME} with their values.
@@ -439,11 +442,12 @@ func parseToolsFile(ctx context.Context, raw []byte) (ToolsFile, error) {
// All resource names (sources, authServices, tools, toolsets) must be unique across all files.
func mergeToolsFiles(files ...ToolsFile) (ToolsFile, error) {
merged := ToolsFile{
Sources: make(server.SourceConfigs),
AuthServices: make(server.AuthServiceConfigs),
Tools: make(server.ToolConfigs),
Toolsets: make(server.ToolsetConfigs),
Prompts: make(server.PromptConfigs),
Sources: make(server.SourceConfigs),
AuthServices: make(server.AuthServiceConfigs),
EmbeddingModels: make(server.EmbeddingModelConfigs),
Tools: make(server.ToolConfigs),
Toolsets: make(server.ToolsetConfigs),
Prompts: make(server.PromptConfigs),
}
var conflicts []string
@@ -479,6 +483,15 @@ func mergeToolsFiles(files ...ToolsFile) (ToolsFile, error) {
}
}
// Check for conflicts and merge embeddingModels
for name, em := range file.EmbeddingModels {
if _, exists := merged.EmbeddingModels[name]; exists {
conflicts = append(conflicts, fmt.Sprintf("embedding model '%s' (file #%d)", name, fileIndex+1))
} else {
merged.EmbeddingModels[name] = em
}
}
// Check for conflicts and merge tools
for name, tool := range file.Tools {
if _, exists := merged.Tools[name]; exists {
@@ -583,14 +596,14 @@ func handleDynamicReload(ctx context.Context, toolsFile ToolsFile, s *server.Ser
panic(err)
}
sourcesMap, authServicesMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap, err := validateReloadEdits(ctx, toolsFile)
sourcesMap, authServicesMap, embeddingModelsMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap, err := validateReloadEdits(ctx, toolsFile)
if err != nil {
errMsg := fmt.Errorf("unable to validate reloaded edits: %w", err)
logger.WarnContext(ctx, errMsg.Error())
return err
}
s.ResourceMgr.SetResources(sourcesMap, authServicesMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap)
s.ResourceMgr.SetResources(sourcesMap, authServicesMap, embeddingModelsMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap)
return nil
}
@@ -598,7 +611,7 @@ func handleDynamicReload(ctx context.Context, toolsFile ToolsFile, s *server.Ser
// validateReloadEdits checks that the reloaded tools file configs can initialized without failing
func validateReloadEdits(
ctx context.Context, toolsFile ToolsFile,
) (map[string]sources.Source, map[string]auth.AuthService, map[string]tools.Tool, map[string]tools.Toolset, map[string]prompts.Prompt, map[string]prompts.Promptset, error,
) (map[string]sources.Source, map[string]auth.AuthService, map[string]embeddingmodels.EmbeddingModel, map[string]tools.Tool, map[string]tools.Toolset, map[string]prompts.Prompt, map[string]prompts.Promptset, error,
) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
@@ -616,22 +629,23 @@ func validateReloadEdits(
defer span.End()
reloadedConfig := server.ServerConfig{
Version: versionString,
SourceConfigs: toolsFile.Sources,
AuthServiceConfigs: toolsFile.AuthServices,
ToolConfigs: toolsFile.Tools,
ToolsetConfigs: toolsFile.Toolsets,
PromptConfigs: toolsFile.Prompts,
Version: versionString,
SourceConfigs: toolsFile.Sources,
AuthServiceConfigs: toolsFile.AuthServices,
EmbeddingModelConfigs: toolsFile.EmbeddingModels,
ToolConfigs: toolsFile.Tools,
ToolsetConfigs: toolsFile.Toolsets,
PromptConfigs: toolsFile.Prompts,
}
sourcesMap, authServicesMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap, err := server.InitializeConfigs(ctx, reloadedConfig)
sourcesMap, authServicesMap, embeddingModelsMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap, err := server.InitializeConfigs(ctx, reloadedConfig)
if err != nil {
errMsg := fmt.Errorf("unable to initialize reloaded configs: %w", err)
logger.WarnContext(ctx, errMsg.Error())
return nil, nil, nil, nil, nil, nil, err
return nil, nil, nil, nil, nil, nil, nil, err
}
return sourcesMap, authServicesMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap, nil
return sourcesMap, authServicesMap, embeddingModelsMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap, nil
}
// watchChanges checks for changes in the provided yaml tools file(s) or folder.

View File

@@ -32,6 +32,7 @@ import (
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/auth/google"
"github.com/googleapis/genai-toolbox/internal/embeddingmodels/gemini"
"github.com/googleapis/genai-toolbox/internal/log"
"github.com/googleapis/genai-toolbox/internal/prebuiltconfigs"
"github.com/googleapis/genai-toolbox/internal/prompts"
@@ -1503,7 +1504,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"alloydb_postgres_database_tools": tools.ToolsetConfig{
Name: "alloydb_postgres_database_tools",
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality", "list_publication_tables", "list_tablespaces", "list_pg_settings", "list_database_stats", "list_roles", "list_table_stats"},
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality", "list_publication_tables", "list_tablespaces", "list_pg_settings", "list_database_stats", "list_roles", "list_table_stats", "list_stored_procedure"},
},
},
},
@@ -1533,7 +1534,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"cloud_sql_postgres_database_tools": tools.ToolsetConfig{
Name: "cloud_sql_postgres_database_tools",
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality", "list_publication_tables", "list_tablespaces", "list_pg_settings", "list_database_stats", "list_roles", "list_table_stats"},
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality", "list_publication_tables", "list_tablespaces", "list_pg_settings", "list_database_stats", "list_roles", "list_table_stats", "list_stored_procedure"},
},
},
},
@@ -1633,7 +1634,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"postgres_database_tools": tools.ToolsetConfig{
Name: "postgres_database_tools",
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality", "list_publication_tables", "list_tablespaces", "list_pg_settings", "list_database_stats", "list_roles", "list_table_stats"},
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality", "list_publication_tables", "list_tablespaces", "list_pg_settings", "list_database_stats", "list_roles", "list_table_stats", "list_stored_procedure"},
},
},
},
@@ -1830,9 +1831,10 @@ func TestFileLoadingErrors(t *testing.T) {
func TestMergeToolsFiles(t *testing.T) {
file1 := ToolsFile{
Sources: server.SourceConfigs{"source1": httpsrc.Config{Name: "source1"}},
Tools: server.ToolConfigs{"tool1": http.Config{Name: "tool1"}},
Toolsets: server.ToolsetConfigs{"set1": tools.ToolsetConfig{Name: "set1"}},
Sources: server.SourceConfigs{"source1": httpsrc.Config{Name: "source1"}},
Tools: server.ToolConfigs{"tool1": http.Config{Name: "tool1"}},
Toolsets: server.ToolsetConfigs{"set1": tools.ToolsetConfig{Name: "set1"}},
EmbeddingModels: server.EmbeddingModelConfigs{"model1": gemini.Config{Name: "gemini-text"}},
}
file2 := ToolsFile{
AuthServices: server.AuthServiceConfigs{"auth1": google.Config{Name: "auth1"}},
@@ -1854,11 +1856,12 @@ func TestMergeToolsFiles(t *testing.T) {
name: "merge two distinct files",
files: []ToolsFile{file1, file2},
want: ToolsFile{
Sources: server.SourceConfigs{"source1": httpsrc.Config{Name: "source1"}},
AuthServices: server.AuthServiceConfigs{"auth1": google.Config{Name: "auth1"}},
Tools: server.ToolConfigs{"tool1": http.Config{Name: "tool1"}, "tool2": http.Config{Name: "tool2"}},
Toolsets: server.ToolsetConfigs{"set1": tools.ToolsetConfig{Name: "set1"}, "set2": tools.ToolsetConfig{Name: "set2"}},
Prompts: server.PromptConfigs{},
Sources: server.SourceConfigs{"source1": httpsrc.Config{Name: "source1"}},
AuthServices: server.AuthServiceConfigs{"auth1": google.Config{Name: "auth1"}},
Tools: server.ToolConfigs{"tool1": http.Config{Name: "tool1"}, "tool2": http.Config{Name: "tool2"}},
Toolsets: server.ToolsetConfigs{"set1": tools.ToolsetConfig{Name: "set1"}, "set2": tools.ToolsetConfig{Name: "set2"}},
Prompts: server.PromptConfigs{},
EmbeddingModels: server.EmbeddingModelConfigs{"model1": gemini.Config{Name: "gemini-text"}},
},
wantErr: false,
},
@@ -1871,22 +1874,24 @@ func TestMergeToolsFiles(t *testing.T) {
name: "merge single file",
files: []ToolsFile{file1},
want: ToolsFile{
Sources: file1.Sources,
AuthServices: make(server.AuthServiceConfigs),
Tools: file1.Tools,
Toolsets: file1.Toolsets,
Prompts: server.PromptConfigs{},
Sources: file1.Sources,
AuthServices: make(server.AuthServiceConfigs),
EmbeddingModels: server.EmbeddingModelConfigs{"model1": gemini.Config{Name: "gemini-text"}},
Tools: file1.Tools,
Toolsets: file1.Toolsets,
Prompts: server.PromptConfigs{},
},
},
{
name: "merge empty list",
files: []ToolsFile{},
want: ToolsFile{
Sources: make(server.SourceConfigs),
AuthServices: make(server.AuthServiceConfigs),
Tools: make(server.ToolConfigs),
Toolsets: make(server.ToolsetConfigs),
Prompts: server.PromptConfigs{},
Sources: make(server.SourceConfigs),
AuthServices: make(server.AuthServiceConfigs),
EmbeddingModels: make(server.EmbeddingModelConfigs),
Tools: make(server.ToolConfigs),
Toolsets: make(server.ToolsetConfigs),
Prompts: server.PromptConfigs{},
},
},
}

View File

@@ -68,6 +68,7 @@ The BigQuery MCP server is configured using environment variables.
export BIGQUERY_PROJECT="<your-gcp-project-id>"
export BIGQUERY_LOCATION="<your-dataset-location>" # Optional
export BIGQUERY_USE_CLIENT_OAUTH="true" # Optional
export BIGQUERY_SCOPES="<comma-separated-scopes>" # Optional
```
Add the following configuration to your MCP client (e.g., `settings.json` for Gemini CLI, `mcp_config.json` for Antigravity):

View File

@@ -28,11 +28,11 @@ require (
go.opentelemetry.io/otel/metric v1.38.0 // indirect
go.opentelemetry.io/otel/sdk v1.38.0 // indirect
go.opentelemetry.io/otel/trace v1.38.0 // indirect
golang.org/x/crypto v0.43.0 // indirect
golang.org/x/net v0.46.0 // indirect
golang.org/x/crypto v0.45.0 // indirect
golang.org/x/net v0.47.0 // indirect
golang.org/x/oauth2 v0.32.0 // indirect
golang.org/x/sys v0.37.0 // indirect
golang.org/x/text v0.30.0 // indirect
golang.org/x/sys v0.38.0 // indirect
golang.org/x/text v0.31.0 // indirect
google.golang.org/api v0.255.0 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20251029180050-ab9386a59fda // indirect
google.golang.org/grpc v1.76.0 // indirect

View File

@@ -88,18 +88,18 @@ go.opentelemetry.io/otel/trace v1.38.0 h1:Fxk5bKrDZJUH+AMyyIXGcFAPah0oRcT+LuNtJr
go.opentelemetry.io/otel/trace v1.38.0/go.mod h1:j1P9ivuFsTceSWe1oY+EeW3sc+Pp42sO++GHkg4wwhs=
go.uber.org/goleak v1.3.0 h1:2K3zAYmnTNqV73imy9J1T3WC+gmCePx2hEGkimedGto=
go.uber.org/goleak v1.3.0/go.mod h1:CoHD4mav9JJNrW/WLlf7HGZPjdw8EucARQHekz1X6bE=
golang.org/x/crypto v0.43.0 h1:dduJYIi3A3KOfdGOHX8AVZ/jGiyPa3IbBozJ5kNuE04=
golang.org/x/crypto v0.43.0/go.mod h1:BFbav4mRNlXJL4wNeejLpWxB7wMbc79PdRGhWKncxR0=
golang.org/x/net v0.46.0 h1:giFlY12I07fugqwPuWJi68oOnpfqFnJIJzaIIm2JVV4=
golang.org/x/net v0.46.0/go.mod h1:Q9BGdFy1y4nkUwiLvT5qtyhAnEHgnQ/zd8PfU6nc210=
golang.org/x/crypto v0.45.0 h1:jMBrvKuj23MTlT0bQEOBcAE0mjg8mK9RXFhRH6nyF3Q=
golang.org/x/crypto v0.45.0/go.mod h1:XTGrrkGJve7CYK7J8PEww4aY7gM3qMCElcJQ8n8JdX4=
golang.org/x/net v0.47.0 h1:Mx+4dIFzqraBXUugkia1OOvlD6LemFo1ALMHjrXDOhY=
golang.org/x/net v0.47.0/go.mod h1:/jNxtkgq5yWUGYkaZGqo27cfGZ1c5Nen03aYrrKpVRU=
golang.org/x/oauth2 v0.32.0 h1:jsCblLleRMDrxMN29H3z/k1KliIvpLgCkE6R8FXXNgY=
golang.org/x/oauth2 v0.32.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
golang.org/x/sync v0.17.0 h1:l60nONMj9l5drqw6jlhIELNv9I0A4OFgRsG9k2oT9Ug=
golang.org/x/sync v0.17.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.37.0 h1:fdNQudmxPjkdUTPnLn5mdQv7Zwvbvpaxqs831goi9kQ=
golang.org/x/sys v0.37.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.30.0 h1:yznKA/E9zq54KzlzBEAWn1NXSQ8DIp/NYMy88xJjl4k=
golang.org/x/text v0.30.0/go.mod h1:yDdHFIX9t+tORqspjENWgzaCVXgk0yYnYuSZ8UzzBVM=
golang.org/x/sync v0.18.0 h1:kr88TuHDroi+UVf+0hZnirlk8o8T+4MrK6mr60WkH/I=
golang.org/x/sync v0.18.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.38.0 h1:3yZWxaJjBmCWXqhN1qh02AkOnCQ1poK6oF+a7xWL6Gc=
golang.org/x/sys v0.38.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.31.0 h1:aC8ghyu4JhP8VojJ2lEHBnochRno1sgL6nEi9WGFGMM=
golang.org/x/text v0.31.0/go.mod h1:tKRAlv61yKIjGGHX/4tP1LTbc13YSec1pxVEWXzfoeM=
golang.org/x/time v0.14.0 h1:MRx4UaLrDotUKUdCIqzPC48t1Y9hANFKIRpNx+Te8PI=
golang.org/x/time v0.14.0/go.mod h1:eL/Oa2bBBK0TkX57Fyni+NgnyQQN4LitPmob2Hjnqw4=
gonum.org/v1/gonum v0.16.0 h1:5+ul4Swaf3ESvrOnidPp4GZbzf0mxVQpDCYUQE7OJfk=

View File

@@ -39,11 +39,11 @@ require (
go.opentelemetry.io/otel/metric v1.38.0 // indirect
go.opentelemetry.io/otel/sdk v1.38.0 // indirect
go.opentelemetry.io/otel/trace v1.38.0 // indirect
golang.org/x/crypto v0.43.0 // indirect
golang.org/x/net v0.46.0 // indirect
golang.org/x/crypto v0.45.0 // indirect
golang.org/x/net v0.47.0 // indirect
golang.org/x/oauth2 v0.32.0 // indirect
golang.org/x/sys v0.37.0 // indirect
golang.org/x/text v0.30.0 // indirect
golang.org/x/sys v0.38.0 // indirect
golang.org/x/text v0.31.0 // indirect
google.golang.org/api v0.255.0 // indirect
google.golang.org/genai v1.34.0 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20251029180050-ab9386a59fda // indirect

View File

@@ -123,18 +123,18 @@ go.opentelemetry.io/otel/trace v1.38.0 h1:Fxk5bKrDZJUH+AMyyIXGcFAPah0oRcT+LuNtJr
go.opentelemetry.io/otel/trace v1.38.0/go.mod h1:j1P9ivuFsTceSWe1oY+EeW3sc+Pp42sO++GHkg4wwhs=
go.uber.org/goleak v1.3.0 h1:2K3zAYmnTNqV73imy9J1T3WC+gmCePx2hEGkimedGto=
go.uber.org/goleak v1.3.0/go.mod h1:CoHD4mav9JJNrW/WLlf7HGZPjdw8EucARQHekz1X6bE=
golang.org/x/crypto v0.43.0 h1:dduJYIi3A3KOfdGOHX8AVZ/jGiyPa3IbBozJ5kNuE04=
golang.org/x/crypto v0.43.0/go.mod h1:BFbav4mRNlXJL4wNeejLpWxB7wMbc79PdRGhWKncxR0=
golang.org/x/net v0.46.0 h1:giFlY12I07fugqwPuWJi68oOnpfqFnJIJzaIIm2JVV4=
golang.org/x/net v0.46.0/go.mod h1:Q9BGdFy1y4nkUwiLvT5qtyhAnEHgnQ/zd8PfU6nc210=
golang.org/x/crypto v0.45.0 h1:jMBrvKuj23MTlT0bQEOBcAE0mjg8mK9RXFhRH6nyF3Q=
golang.org/x/crypto v0.45.0/go.mod h1:XTGrrkGJve7CYK7J8PEww4aY7gM3qMCElcJQ8n8JdX4=
golang.org/x/net v0.47.0 h1:Mx+4dIFzqraBXUugkia1OOvlD6LemFo1ALMHjrXDOhY=
golang.org/x/net v0.47.0/go.mod h1:/jNxtkgq5yWUGYkaZGqo27cfGZ1c5Nen03aYrrKpVRU=
golang.org/x/oauth2 v0.32.0 h1:jsCblLleRMDrxMN29H3z/k1KliIvpLgCkE6R8FXXNgY=
golang.org/x/oauth2 v0.32.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
golang.org/x/sync v0.17.0 h1:l60nONMj9l5drqw6jlhIELNv9I0A4OFgRsG9k2oT9Ug=
golang.org/x/sync v0.17.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.37.0 h1:fdNQudmxPjkdUTPnLn5mdQv7Zwvbvpaxqs831goi9kQ=
golang.org/x/sys v0.37.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.30.0 h1:yznKA/E9zq54KzlzBEAWn1NXSQ8DIp/NYMy88xJjl4k=
golang.org/x/text v0.30.0/go.mod h1:yDdHFIX9t+tORqspjENWgzaCVXgk0yYnYuSZ8UzzBVM=
golang.org/x/sync v0.18.0 h1:kr88TuHDroi+UVf+0hZnirlk8o8T+4MrK6mr60WkH/I=
golang.org/x/sync v0.18.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.38.0 h1:3yZWxaJjBmCWXqhN1qh02AkOnCQ1poK6oF+a7xWL6Gc=
golang.org/x/sys v0.38.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.31.0 h1:aC8ghyu4JhP8VojJ2lEHBnochRno1sgL6nEi9WGFGMM=
golang.org/x/text v0.31.0/go.mod h1:tKRAlv61yKIjGGHX/4tP1LTbc13YSec1pxVEWXzfoeM=
golang.org/x/time v0.14.0 h1:MRx4UaLrDotUKUdCIqzPC48t1Y9hANFKIRpNx+Te8PI=
golang.org/x/time v0.14.0/go.mod h1:eL/Oa2bBBK0TkX57Fyni+NgnyQQN4LitPmob2Hjnqw4=
gonum.org/v1/gonum v0.16.0 h1:5+ul4Swaf3ESvrOnidPp4GZbzf0mxVQpDCYUQE7OJfk=

View File

@@ -26,11 +26,11 @@ require (
go.opentelemetry.io/otel v1.38.0 // indirect
go.opentelemetry.io/otel/metric v1.38.0 // indirect
go.opentelemetry.io/otel/trace v1.38.0 // indirect
golang.org/x/crypto v0.43.0 // indirect
golang.org/x/net v0.46.0 // indirect
golang.org/x/crypto v0.45.0 // indirect
golang.org/x/net v0.47.0 // indirect
golang.org/x/oauth2 v0.32.0 // indirect
golang.org/x/sys v0.37.0 // indirect
golang.org/x/text v0.30.0 // indirect
golang.org/x/sys v0.38.0 // indirect
golang.org/x/text v0.31.0 // indirect
google.golang.org/api v0.255.0 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20251029180050-ab9386a59fda // indirect
google.golang.org/grpc v1.76.0 // indirect

View File

@@ -94,18 +94,18 @@ go.opentelemetry.io/otel/sdk/metric v1.38.0 h1:aSH66iL0aZqo//xXzQLYozmWrXxyFkBJ6
go.opentelemetry.io/otel/sdk/metric v1.38.0/go.mod h1:dg9PBnW9XdQ1Hd6ZnRz689CbtrUp0wMMs9iPcgT9EZA=
go.opentelemetry.io/otel/trace v1.38.0 h1:Fxk5bKrDZJUH+AMyyIXGcFAPah0oRcT+LuNtJrmcNLE=
go.opentelemetry.io/otel/trace v1.38.0/go.mod h1:j1P9ivuFsTceSWe1oY+EeW3sc+Pp42sO++GHkg4wwhs=
golang.org/x/crypto v0.43.0 h1:dduJYIi3A3KOfdGOHX8AVZ/jGiyPa3IbBozJ5kNuE04=
golang.org/x/crypto v0.43.0/go.mod h1:BFbav4mRNlXJL4wNeejLpWxB7wMbc79PdRGhWKncxR0=
golang.org/x/net v0.46.0 h1:giFlY12I07fugqwPuWJi68oOnpfqFnJIJzaIIm2JVV4=
golang.org/x/net v0.46.0/go.mod h1:Q9BGdFy1y4nkUwiLvT5qtyhAnEHgnQ/zd8PfU6nc210=
golang.org/x/crypto v0.45.0 h1:jMBrvKuj23MTlT0bQEOBcAE0mjg8mK9RXFhRH6nyF3Q=
golang.org/x/crypto v0.45.0/go.mod h1:XTGrrkGJve7CYK7J8PEww4aY7gM3qMCElcJQ8n8JdX4=
golang.org/x/net v0.47.0 h1:Mx+4dIFzqraBXUugkia1OOvlD6LemFo1ALMHjrXDOhY=
golang.org/x/net v0.47.0/go.mod h1:/jNxtkgq5yWUGYkaZGqo27cfGZ1c5Nen03aYrrKpVRU=
golang.org/x/oauth2 v0.32.0 h1:jsCblLleRMDrxMN29H3z/k1KliIvpLgCkE6R8FXXNgY=
golang.org/x/oauth2 v0.32.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
golang.org/x/sync v0.17.0 h1:l60nONMj9l5drqw6jlhIELNv9I0A4OFgRsG9k2oT9Ug=
golang.org/x/sync v0.17.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.37.0 h1:fdNQudmxPjkdUTPnLn5mdQv7Zwvbvpaxqs831goi9kQ=
golang.org/x/sys v0.37.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.30.0 h1:yznKA/E9zq54KzlzBEAWn1NXSQ8DIp/NYMy88xJjl4k=
golang.org/x/text v0.30.0/go.mod h1:yDdHFIX9t+tORqspjENWgzaCVXgk0yYnYuSZ8UzzBVM=
golang.org/x/sync v0.18.0 h1:kr88TuHDroi+UVf+0hZnirlk8o8T+4MrK6mr60WkH/I=
golang.org/x/sync v0.18.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.38.0 h1:3yZWxaJjBmCWXqhN1qh02AkOnCQ1poK6oF+a7xWL6Gc=
golang.org/x/sys v0.38.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.31.0 h1:aC8ghyu4JhP8VojJ2lEHBnochRno1sgL6nEi9WGFGMM=
golang.org/x/text v0.31.0/go.mod h1:tKRAlv61yKIjGGHX/4tP1LTbc13YSec1pxVEWXzfoeM=
golang.org/x/time v0.14.0 h1:MRx4UaLrDotUKUdCIqzPC48t1Y9hANFKIRpNx+Te8PI=
golang.org/x/time v0.14.0/go.mod h1:eL/Oa2bBBK0TkX57Fyni+NgnyQQN4LitPmob2Hjnqw4=
gonum.org/v1/gonum v0.16.0 h1:5+ul4Swaf3ESvrOnidPp4GZbzf0mxVQpDCYUQE7OJfk=

View File

@@ -18,7 +18,6 @@
"resolved": "https://registry.npmjs.org/@google-cloud/paginator/-/paginator-5.0.2.tgz",
"integrity": "sha512-DJS3s0OVH4zFDB1PzjxAsHqJT6sKVbRwwML0ZBP9PbU7Yebtu/7SWMRzvO2J3nUi9pRNITCfu4LJeooM2w4pjg==",
"license": "Apache-2.0",
"peer": true,
"dependencies": {
"arrify": "^2.0.0",
"extend": "^3.0.2"
@@ -32,7 +31,6 @@
"resolved": "https://registry.npmjs.org/@google-cloud/projectify/-/projectify-4.0.0.tgz",
"integrity": "sha512-MmaX6HeSvyPbWGwFq7mXdo0uQZLGBYCwziiLIGq5JVX+/bdI3SAq6bP98trV5eTWfLuvsMcIC1YJOF2vfteLFA==",
"license": "Apache-2.0",
"peer": true,
"engines": {
"node": ">=14.0.0"
}
@@ -42,7 +40,6 @@
"resolved": "https://registry.npmjs.org/@google-cloud/promisify/-/promisify-4.0.0.tgz",
"integrity": "sha512-Orxzlfb9c67A15cq2JQEyVc7wEsmFBmHjZWZYQMUyJ1qivXyMwdyNOs9odi79hze+2zqdTtu1E19IM/FtqZ10g==",
"license": "Apache-2.0",
"peer": true,
"engines": {
"node": ">=14"
}
@@ -52,7 +49,6 @@
"resolved": "https://registry.npmjs.org/@google-cloud/storage/-/storage-7.18.0.tgz",
"integrity": "sha512-r3ZwDMiz4nwW6R922Z1pwpePxyRwE5GdevYX63hRmAQUkUQJcBH/79EnQPDv5cOv1mFBgevdNWQfi3tie3dHrQ==",
"license": "Apache-2.0",
"peer": true,
"dependencies": {
"@google-cloud/paginator": "^5.0.0",
"@google-cloud/projectify": "^4.0.0",
@@ -79,7 +75,6 @@
"resolved": "https://registry.npmjs.org/uuid/-/uuid-8.3.2.tgz",
"integrity": "sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg==",
"license": "MIT",
"peer": true,
"bin": {
"uuid": "dist/bin/uuid"
}
@@ -102,6 +97,7 @@
"resolved": "https://registry.npmjs.org/@google/genai/-/genai-1.14.0.tgz",
"integrity": "sha512-jirYprAAJU1svjwSDVCzyVq+FrJpJd5CSxR/g2Ga/gZ0ZYZpcWjMS75KJl9y71K1mDN+tcx6s21CzCbB2R840g==",
"license": "Apache-2.0",
"peer": true,
"dependencies": {
"google-auth-library": "^9.14.2",
"ws": "^8.18.0"
@@ -140,6 +136,7 @@
"resolved": "https://registry.npmjs.org/@modelcontextprotocol/sdk/-/sdk-1.17.5.tgz",
"integrity": "sha512-QakrKIGniGuRVfWBdMsDea/dx1PNE739QJ7gCM41s9q+qaCYTHCdsIBXQVVXry3mfWAiaM9kT22Hyz53Uw8mfg==",
"license": "MIT",
"peer": true,
"dependencies": {
"ajv": "^6.12.6",
"content-type": "^1.0.5",
@@ -302,7 +299,6 @@
"resolved": "https://registry.npmjs.org/@tootallnate/once/-/once-2.0.0.tgz",
"integrity": "sha512-XCuKFP5PS55gnMVu3dty8KPatLqUoy/ZYzDzAGCQ8JNFCkLXzmI7vNHCR+XpbZaMWQK/vQubr7PkYq8g470J/A==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">= 10"
}
@@ -311,15 +307,13 @@
"version": "0.12.5",
"resolved": "https://registry.npmjs.org/@types/caseless/-/caseless-0.12.5.tgz",
"integrity": "sha512-hWtVTC2q7hc7xZ/RLbxapMvDMgUnDvKvMOpKal4DrMyfGBUfB1oKaZlIRr6mJL+If3bAP6sV/QneGzF6tJjZDg==",
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/@types/node": {
"version": "24.10.1",
"resolved": "https://registry.npmjs.org/@types/node/-/node-24.10.1.tgz",
"integrity": "sha512-GNWcUTRBgIRJD5zj+Tq0fKOJ5XZajIiBroOF0yvj2bSU1WvNdYS/dn9UxwsujGW4JX06dnHyjV2y9rRaybH0iQ==",
"license": "MIT",
"peer": true,
"dependencies": {
"undici-types": "~7.16.0"
}
@@ -329,7 +323,6 @@
"resolved": "https://registry.npmjs.org/@types/request/-/request-2.48.13.tgz",
"integrity": "sha512-FGJ6udDNUCjd19pp0Q3iTiDkwhYup7J8hpMW9c4k53NrccQFFWKRho6hvtPPEhnXWKvukfwAlB6DbDz4yhH5Gg==",
"license": "MIT",
"peer": true,
"dependencies": {
"@types/caseless": "*",
"@types/node": "*",
@@ -342,7 +335,6 @@
"resolved": "https://registry.npmjs.org/form-data/-/form-data-2.5.5.tgz",
"integrity": "sha512-jqdObeR2rxZZbPSGL+3VckHMYtu+f9//KXBsVny6JSX/pa38Fy+bGjuG8eW/H6USNQWhLi8Num++cU2yOCNz4A==",
"license": "MIT",
"peer": true,
"dependencies": {
"asynckit": "^0.4.0",
"combined-stream": "^1.0.8",
@@ -360,7 +352,6 @@
"resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
"integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">= 0.6"
}
@@ -370,7 +361,6 @@
"resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
"integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
"license": "MIT",
"peer": true,
"dependencies": {
"mime-db": "1.52.0"
},
@@ -382,15 +372,13 @@
"version": "4.0.5",
"resolved": "https://registry.npmjs.org/@types/tough-cookie/-/tough-cookie-4.0.5.tgz",
"integrity": "sha512-/Ad8+nIOV7Rl++6f1BdKxFSMgmoqEoYbHRpPcx3JEfv8VRsQe9Z4mCXeJBzxs7mbHY/XOZZuXlRNfhpVPbs6ZA==",
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/abort-controller": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/abort-controller/-/abort-controller-3.0.0.tgz",
"integrity": "sha512-h8lQ8tacZYnR3vNQTgibj+tODHI5/+l06Au2Pcriv/Gmet0eaj4TwWH41sO9wnHDiQsEj19q0drzdWdeAHtweg==",
"license": "MIT",
"peer": true,
"dependencies": {
"event-target-shim": "^5.0.0"
},
@@ -465,7 +453,6 @@
"resolved": "https://registry.npmjs.org/arrify/-/arrify-2.0.1.tgz",
"integrity": "sha512-3duEwti880xqi4eAMN8AyR4a0ByT90zoYdLlevfrvU43vb0YZwZVfxOgxWrLXXXpyugL0hNZc9G6BiB5B3nUug==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">=8"
}
@@ -475,7 +462,6 @@
"resolved": "https://registry.npmjs.org/async-retry/-/async-retry-1.3.3.tgz",
"integrity": "sha512-wfr/jstw9xNi/0teMHrRW7dsz3Lt5ARhYNZ2ewpadnhaIp5mbALhOAP+EAdsC7t4Z6wqsDVv9+W6gm1Dk9mEyw==",
"license": "MIT",
"peer": true,
"dependencies": {
"retry": "0.13.1"
}
@@ -768,7 +754,6 @@
"resolved": "https://registry.npmjs.org/duplexify/-/duplexify-4.1.3.tgz",
"integrity": "sha512-M3BmBhwJRZsSx38lZyhE53Csddgzl5R7xGJNk7CVddZD6CcmwMCH8J+7AprIrQKH7TonKxaCjcv27Qmf+sQ+oA==",
"license": "MIT",
"peer": true,
"dependencies": {
"end-of-stream": "^1.4.1",
"inherits": "^2.0.3",
@@ -817,7 +802,6 @@
"resolved": "https://registry.npmjs.org/end-of-stream/-/end-of-stream-1.4.5.tgz",
"integrity": "sha512-ooEGc6HP26xXq/N+GCGOT0JKCLDGrq2bQUZrQ7gyrJiZANJ/8YDTxTpQBXGMn+WbIQXNVpyWymm7KYVICQnyOg==",
"license": "MIT",
"peer": true,
"dependencies": {
"once": "^1.4.0"
}
@@ -887,7 +871,6 @@
"resolved": "https://registry.npmjs.org/event-target-shim/-/event-target-shim-5.0.1.tgz",
"integrity": "sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">=6"
}
@@ -918,6 +901,7 @@
"resolved": "https://registry.npmjs.org/express/-/express-5.1.0.tgz",
"integrity": "sha512-DT9ck5YIRU+8GYzzU5kT3eHGA5iL+1Zd0EutOmTE9Dtk+Tvuzd23VBU+ec7HPNSTxXYO55gPV/hq4pSBJDjFpA==",
"license": "MIT",
"peer": true,
"dependencies": {
"accepts": "^2.0.0",
"body-parser": "^2.2.0",
@@ -999,7 +983,6 @@
}
],
"license": "MIT",
"peer": true,
"dependencies": {
"strnum": "^1.1.1"
},
@@ -1350,8 +1333,7 @@
"url": "https://patreon.com/mdevils"
}
],
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/http-errors": {
"version": "2.0.0",
@@ -1383,7 +1365,6 @@
"resolved": "https://registry.npmjs.org/http-proxy-agent/-/http-proxy-agent-5.0.0.tgz",
"integrity": "sha512-n2hY8YdoRE1i7r6M0w9DIw5GgZN0G25P8zLCRQ8rjXtTU3vsNFBI/vWK/UIeE6g5MUUz6avwAPXmL6Fy9D/90w==",
"license": "MIT",
"peer": true,
"dependencies": {
"@tootallnate/once": "2",
"agent-base": "6",
@@ -1398,7 +1379,6 @@
"resolved": "https://registry.npmjs.org/agent-base/-/agent-base-6.0.2.tgz",
"integrity": "sha512-RZNwNclF7+MS/8bDg70amg32dyeZGZxiDuQmZxKLAlQjr3jGyLx+4Kkk58UO7D2QdgFIQCovuSuZESne6RG6XQ==",
"license": "MIT",
"peer": true,
"dependencies": {
"debug": "4"
},
@@ -1525,12 +1505,12 @@
}
},
"node_modules/jws": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/jws/-/jws-4.0.0.tgz",
"integrity": "sha512-KDncfTmOZoOMTFG4mBlG0qUIOlc03fmzH+ru6RgYVZhPkyiy/92Owlt/8UEN+a4TXR1FQetfIpJE8ApdvdVxTg==",
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/jws/-/jws-4.0.1.tgz",
"integrity": "sha512-EKI/M/yqPncGUUh44xz0PxSidXFr/+r0pA70+gIYhjv+et7yxM+s29Y+VGDkovRofQem0fs7Uvf4+YmAdyRduA==",
"license": "MIT",
"dependencies": {
"jwa": "^2.0.0",
"jwa": "^2.0.1",
"safe-buffer": "^5.0.1"
}
},
@@ -1575,7 +1555,6 @@
"resolved": "https://registry.npmjs.org/mime/-/mime-3.0.0.tgz",
"integrity": "sha512-jSCU7/VB1loIWBZe14aEYHU/+1UMEHoaO7qxCOVJOw9GgH72VAWppxNcjU+x9a2k3GSIBXNKxXQFqRvvZ7vr3A==",
"license": "MIT",
"peer": true,
"bin": {
"mime": "cli.js"
},
@@ -1736,7 +1715,6 @@
"resolved": "https://registry.npmjs.org/p-limit/-/p-limit-3.1.0.tgz",
"integrity": "sha512-TYOanM3wGwNGsZN2cVTYPArw454xnXj5qmWF1bEoAc4+cU/ol7GVh7odevjp1FNHduHc3KZMcFduxU5Xc6uJRQ==",
"license": "MIT",
"peer": true,
"dependencies": {
"yocto-queue": "^0.1.0"
},
@@ -1835,9 +1813,9 @@
}
},
"node_modules/qs": {
"version": "6.14.0",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.14.0.tgz",
"integrity": "sha512-YWWTjgABSKcvs/nWBi9PycY/JiPJqOD4JA6o9Sej2AtvSGarXxKC3OQSk4pAarbdQlKAh5D4FCQkJNkW+GAn3w==",
"version": "6.14.1",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.14.1.tgz",
"integrity": "sha512-4EK3+xJl8Ts67nLYNwqw/dsFVnCf+qR7RgXSK9jEEm9unao3njwMDdmsdvoKBKHzxd7tCYz5e5M+SnMjdtXGQQ==",
"license": "BSD-3-Clause",
"dependencies": {
"side-channel": "^1.1.0"
@@ -1878,7 +1856,6 @@
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.2.tgz",
"integrity": "sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA==",
"license": "MIT",
"peer": true,
"dependencies": {
"inherits": "^2.0.3",
"string_decoder": "^1.1.1",
@@ -1893,7 +1870,6 @@
"resolved": "https://registry.npmjs.org/retry/-/retry-0.13.1.tgz",
"integrity": "sha512-XQBQ3I8W1Cge0Seh+6gjj03LbmRFWuoszgK9ooCpwYIrhhoO80pfq4cUkU5DkknwfOfFteRwlZ56PYOGYyFWdg==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">= 4"
}
@@ -1903,7 +1879,6 @@
"resolved": "https://registry.npmjs.org/retry-request/-/retry-request-7.0.2.tgz",
"integrity": "sha512-dUOvLMJ0/JJYEn8NrpOaGNE7X3vpI5XlZS/u0ANjqtcZVKnIxP7IgCFwrKTxENw29emmwug53awKtaMm4i9g5w==",
"license": "MIT",
"peer": true,
"dependencies": {
"@types/request": "^2.48.8",
"extend": "^3.0.2",
@@ -2132,7 +2107,6 @@
"resolved": "https://registry.npmjs.org/stream-events/-/stream-events-1.0.5.tgz",
"integrity": "sha512-E1GUzBSgvct8Jsb3v2X15pjzN1tYebtbLaMg+eBOUOAxgbLoSbT2NS91ckc5lJD1KfLjId+jXJRgo0qnV5Nerg==",
"license": "MIT",
"peer": true,
"dependencies": {
"stubs": "^3.0.0"
}
@@ -2141,15 +2115,13 @@
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/stream-shift/-/stream-shift-1.0.3.tgz",
"integrity": "sha512-76ORR0DO1o1hlKwTbi/DM3EXWGf3ZJYO8cXX5RJwnul2DEg2oyoZyjLNoQM8WsvZiFKCRfC1O0J7iCvie3RZmQ==",
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/string_decoder": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz",
"integrity": "sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA==",
"license": "MIT",
"peer": true,
"dependencies": {
"safe-buffer": "~5.2.0"
}
@@ -2260,22 +2232,19 @@
"url": "https://github.com/sponsors/NaturalIntelligence"
}
],
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/stubs": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/stubs/-/stubs-3.0.0.tgz",
"integrity": "sha512-PdHt7hHUJKxvTCgbKX9C1V/ftOcjJQgz8BZwNfV5c4B6dcGqlpelTbJ999jBGZ2jYiPAwcX5dP6oBwVlBlUbxw==",
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/teeny-request": {
"version": "9.0.0",
"resolved": "https://registry.npmjs.org/teeny-request/-/teeny-request-9.0.0.tgz",
"integrity": "sha512-resvxdc6Mgb7YEThw6G6bExlXKkv6+YbuzGg9xuXxSgxJF7Ozs+o8Y9+2R3sArdWdW8nOokoQb1yrpFB0pQK2g==",
"license": "Apache-2.0",
"peer": true,
"dependencies": {
"http-proxy-agent": "^5.0.0",
"https-proxy-agent": "^5.0.0",
@@ -2292,7 +2261,6 @@
"resolved": "https://registry.npmjs.org/agent-base/-/agent-base-6.0.2.tgz",
"integrity": "sha512-RZNwNclF7+MS/8bDg70amg32dyeZGZxiDuQmZxKLAlQjr3jGyLx+4Kkk58UO7D2QdgFIQCovuSuZESne6RG6XQ==",
"license": "MIT",
"peer": true,
"dependencies": {
"debug": "4"
},
@@ -2305,7 +2273,6 @@
"resolved": "https://registry.npmjs.org/https-proxy-agent/-/https-proxy-agent-5.0.1.tgz",
"integrity": "sha512-dFcAjpTQFgoLMzC2VwU+C/CbS7uRL0lWmxDITmqm7C+7F0Odmj6s9l6alZc6AELXhrnggM2CeWSXHGOdX2YtwA==",
"license": "MIT",
"peer": true,
"dependencies": {
"agent-base": "6",
"debug": "4"
@@ -2347,8 +2314,7 @@
"version": "7.16.0",
"resolved": "https://registry.npmjs.org/undici-types/-/undici-types-7.16.0.tgz",
"integrity": "sha512-Zz+aZWSj8LE6zoxD+xrjh4VfkIG8Ya6LvYkZqtUQGJPZjYl53ypCaUwWqo7eI0x66KBGeRo+mlBEkMSeSZ38Nw==",
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/unpipe": {
"version": "1.0.0",
@@ -2372,8 +2338,7 @@
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz",
"integrity": "sha512-EPD5q1uXyFxJpCrLnCc1nHnq3gOa6DZBocAIiI2TaSCA7VCJ1UJDMagCzIkXNsUYfD1daK//LTEQ8xiIbrHtcw==",
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/uuid": {
"version": "9.0.1",
@@ -2560,7 +2525,6 @@
"resolved": "https://registry.npmjs.org/yocto-queue/-/yocto-queue-0.1.0.tgz",
"integrity": "sha512-rVksvsnNCdJ/ohGc6xgPwyN8eheCxsiLM8mxuE/t/mOVqJewPuO1miLpTHQiRgTKCLexL4MeAFVagts7HmNZ2Q==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">=10"
},
@@ -2573,6 +2537,7 @@
"resolved": "https://registry.npmjs.org/zod/-/zod-3.25.76.tgz",
"integrity": "sha512-gzUt/qt81nXsFGKIFcC3YnfEAx5NkunCfnDlvuBSSFS02bcXu4Lmea0AFIUwbLWxWPx3d9p8S5QoaujKcNQxcQ==",
"license": "MIT",
"peer": true,
"funding": {
"url": "https://github.com/sponsors/colinhacks"
}

View File

@@ -3376,22 +3376,23 @@
}
},
"node_modules/body-parser": {
"version": "1.20.3",
"resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.20.3.tgz",
"integrity": "sha512-7rAxByjUMqQ3/bHJy7D6OGXvx/MMc4IqBn/X0fcM1QUcAItpZrBEYhWGem+tzXH90c+G01ypMcYJBO9Y30203g==",
"version": "1.20.4",
"resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.20.4.tgz",
"integrity": "sha512-ZTgYYLMOXY9qKU/57FAo8F+HA2dGX7bqGc71txDRC1rS4frdFI5R7NhluHxH6M0YItAP0sHB4uqAOcYKxO6uGA==",
"license": "MIT",
"dependencies": {
"bytes": "3.1.2",
"bytes": "~3.1.2",
"content-type": "~1.0.5",
"debug": "2.6.9",
"depd": "2.0.0",
"destroy": "1.2.0",
"http-errors": "2.0.0",
"iconv-lite": "0.4.24",
"on-finished": "2.4.1",
"qs": "6.13.0",
"raw-body": "2.5.2",
"destroy": "~1.2.0",
"http-errors": "~2.0.1",
"iconv-lite": "~0.4.24",
"on-finished": "~2.4.1",
"qs": "~6.14.0",
"raw-body": "~2.5.3",
"type-is": "~1.6.18",
"unpipe": "1.0.0"
"unpipe": "~1.0.0"
},
"engines": {
"node": ">= 0.8",
@@ -3406,11 +3407,40 @@
"ms": "2.0.0"
}
},
"node_modules/body-parser/node_modules/http-errors": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/http-errors/-/http-errors-2.0.1.tgz",
"integrity": "sha512-4FbRdAX+bSdmo4AUFuS0WNiPz8NgFt+r8ThgNWmlrjQjt1Q7ZR9+zTlce2859x4KSXrwIsaeTqDoKQmtP8pLmQ==",
"license": "MIT",
"dependencies": {
"depd": "~2.0.0",
"inherits": "~2.0.4",
"setprototypeof": "~1.2.0",
"statuses": "~2.0.2",
"toidentifier": "~1.0.1"
},
"engines": {
"node": ">= 0.8"
},
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/express"
}
},
"node_modules/body-parser/node_modules/ms": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
"integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A=="
},
"node_modules/body-parser/node_modules/statuses": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/statuses/-/statuses-2.0.2.tgz",
"integrity": "sha512-DvEy55V3DB7uknRo+4iOGT5fP1slR8wQohVdknigZPMpMstaKJQWhwiYBACJE3Ul2pTnATihhBYnRhZQHGBiRw==",
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"node_modules/buffer-equal-constant-time": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/buffer-equal-constant-time/-/buffer-equal-constant-time-1.0.1.tgz",
@@ -3434,6 +3464,7 @@
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.2.tgz",
"integrity": "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg==",
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
@@ -3830,38 +3861,39 @@
}
},
"node_modules/express": {
"version": "4.21.2",
"resolved": "https://registry.npmjs.org/express/-/express-4.21.2.tgz",
"integrity": "sha512-28HqgMZAmih1Czt9ny7qr6ek2qddF4FclbMzwhCREB6OFfH+rXAnuNCwo1/wFvrtbgsQDb4kSbX9de9lFbrXnA==",
"version": "4.22.1",
"resolved": "https://registry.npmjs.org/express/-/express-4.22.1.tgz",
"integrity": "sha512-F2X8g9P1X7uCPZMA3MVf9wcTqlyNp7IhH5qPCI0izhaOIYXaW9L535tGA3qmjRzpH+bZczqq7hVKxTR4NWnu+g==",
"license": "MIT",
"dependencies": {
"accepts": "~1.3.8",
"array-flatten": "1.1.1",
"body-parser": "1.20.3",
"content-disposition": "0.5.4",
"body-parser": "~1.20.3",
"content-disposition": "~0.5.4",
"content-type": "~1.0.4",
"cookie": "0.7.1",
"cookie-signature": "1.0.6",
"cookie": "~0.7.1",
"cookie-signature": "~1.0.6",
"debug": "2.6.9",
"depd": "2.0.0",
"encodeurl": "~2.0.0",
"escape-html": "~1.0.3",
"etag": "~1.8.1",
"finalhandler": "1.3.1",
"fresh": "0.5.2",
"http-errors": "2.0.0",
"finalhandler": "~1.3.1",
"fresh": "~0.5.2",
"http-errors": "~2.0.0",
"merge-descriptors": "1.0.3",
"methods": "~1.1.2",
"on-finished": "2.4.1",
"on-finished": "~2.4.1",
"parseurl": "~1.3.3",
"path-to-regexp": "0.1.12",
"path-to-regexp": "~0.1.12",
"proxy-addr": "~2.0.7",
"qs": "6.13.0",
"qs": "~6.14.0",
"range-parser": "~1.2.1",
"safe-buffer": "5.2.1",
"send": "0.19.0",
"serve-static": "1.16.2",
"send": "~0.19.0",
"serve-static": "~1.16.2",
"setprototypeof": "1.2.0",
"statuses": "2.0.1",
"statuses": "~2.0.1",
"type-is": "~1.6.18",
"utils-merge": "1.0.1",
"vary": "~1.1.2"
@@ -4904,6 +4936,7 @@
"version": "0.4.24",
"resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.4.24.tgz",
"integrity": "sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==",
"license": "MIT",
"dependencies": {
"safer-buffer": ">= 2.1.2 < 3"
},
@@ -5661,11 +5694,12 @@
}
},
"node_modules/qs": {
"version": "6.13.0",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.13.0.tgz",
"integrity": "sha512-+38qI9SOr8tfZ4QmJNplMUxqjbe7LKvvZgWdExBOmd+egZTtjLB67Gu0HRX3u/XOq7UU2Nx6nsjvS16Z9uwfpg==",
"version": "6.14.1",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.14.1.tgz",
"integrity": "sha512-4EK3+xJl8Ts67nLYNwqw/dsFVnCf+qR7RgXSK9jEEm9unao3njwMDdmsdvoKBKHzxd7tCYz5e5M+SnMjdtXGQQ==",
"license": "BSD-3-Clause",
"dependencies": {
"side-channel": "^1.0.6"
"side-channel": "^1.1.0"
},
"engines": {
"node": ">=0.6"
@@ -5683,19 +5717,49 @@
}
},
"node_modules/raw-body": {
"version": "2.5.2",
"resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.5.2.tgz",
"integrity": "sha512-8zGqypfENjCIqGhgXToC8aB2r7YrBX+AQAfIPs/Mlk+BtPTztOvTS01NRW/3Eh60J+a48lt8qsCzirQ6loCVfA==",
"version": "2.5.3",
"resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.5.3.tgz",
"integrity": "sha512-s4VSOf6yN0rvbRZGxs8Om5CWj6seneMwK3oDb4lWDH0UPhWcxwOWw5+qk24bxq87szX1ydrwylIOp2uG1ojUpA==",
"license": "MIT",
"dependencies": {
"bytes": "3.1.2",
"http-errors": "2.0.0",
"iconv-lite": "0.4.24",
"unpipe": "1.0.0"
"bytes": "~3.1.2",
"http-errors": "~2.0.1",
"iconv-lite": "~0.4.24",
"unpipe": "~1.0.0"
},
"engines": {
"node": ">= 0.8"
}
},
"node_modules/raw-body/node_modules/http-errors": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/http-errors/-/http-errors-2.0.1.tgz",
"integrity": "sha512-4FbRdAX+bSdmo4AUFuS0WNiPz8NgFt+r8ThgNWmlrjQjt1Q7ZR9+zTlce2859x4KSXrwIsaeTqDoKQmtP8pLmQ==",
"license": "MIT",
"dependencies": {
"depd": "~2.0.0",
"inherits": "~2.0.4",
"setprototypeof": "~1.2.0",
"statuses": "~2.0.2",
"toidentifier": "~1.0.1"
},
"engines": {
"node": ">= 0.8"
},
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/express"
}
},
"node_modules/raw-body/node_modules/statuses": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/statuses/-/statuses-2.0.2.tgz",
"integrity": "sha512-DvEy55V3DB7uknRo+4iOGT5fP1slR8wQohVdknigZPMpMstaKJQWhwiYBACJE3Ul2pTnATihhBYnRhZQHGBiRw==",
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"node_modules/readable-stream": {
"version": "3.6.2",
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.2.tgz",
@@ -5813,7 +5877,8 @@
"node_modules/safer-buffer": {
"version": "2.1.2",
"resolved": "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz",
"integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg=="
"integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==",
"license": "MIT"
},
"node_modules/semver": {
"version": "7.7.2",

View File

@@ -45,9 +45,9 @@
}
},
"node_modules/@langchain/core": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/@langchain/core/-/core-1.1.0.tgz",
"integrity": "sha512-yJ6JHcU9psjnQbzRFkXjIdNTA+3074dA+2pHdH8ewvQCSleSk6JcjkCMIb5+NASjeMoi1ZuntlLKVsNqF38YxA==",
"version": "1.1.8",
"resolved": "https://registry.npmjs.org/@langchain/core/-/core-1.1.8.tgz",
"integrity": "sha512-kIUidOgc0ZdyXo4Ahn9Zas+OayqOfk4ZoKPi7XaDipNSWSApc2+QK5BVcjvwtzxstsNOrmXJiJWEN6WPF/MvAw==",
"license": "MIT",
"peer": true,
"dependencies": {
@@ -56,10 +56,9 @@
"camelcase": "6",
"decamelize": "1.2.0",
"js-tiktoken": "^1.0.12",
"langsmith": "^0.3.64",
"langsmith": ">=0.4.0 <1.0.0",
"mustache": "^4.2.0",
"p-queue": "^6.6.2",
"p-retry": "^7.0.0",
"uuid": "^10.0.0",
"zod": "^3.25.76 || ^4"
},
@@ -67,25 +66,10 @@
"node": ">=20"
}
},
"node_modules/@langchain/core/node_modules/p-retry": {
"version": "7.1.0",
"resolved": "https://registry.npmjs.org/p-retry/-/p-retry-7.1.0.tgz",
"integrity": "sha512-xL4PiFRQa/f9L9ZvR4/gUCRNus4N8YX80ku8kv9Jqz+ZokkiZLM0bcvX0gm1F3PDi9SPRsww1BDsTWgE6Y1GLQ==",
"license": "MIT",
"dependencies": {
"is-network-error": "^1.1.0"
},
"engines": {
"node": ">=20"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/@langchain/google-genai": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/@langchain/google-genai/-/google-genai-2.0.0.tgz",
"integrity": "sha512-PaAWkogQdF+Y2bhhXWXUrC2nO7sTgWLtobBbZl/0V8Aa1F/KG2wrMECie3S17bAdFu/6VmQOuFFrlgSMwQC5KA==",
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/@langchain/google-genai/-/google-genai-2.1.3.tgz",
"integrity": "sha512-ZdlFK/N10GyU6ATzkM01Sk1rlHBoy36Q/MawGD1SyXdD2lQxZxuQZjFWewj6uzWQ2Nnjj70EvU/kmmHVPn6sfQ==",
"license": "MIT",
"dependencies": {
"@google/generative-ai": "^0.24.0",
@@ -95,7 +79,7 @@
"node": ">=20"
},
"peerDependencies": {
"@langchain/core": "1.1.0"
"@langchain/core": "1.1.8"
}
},
"node_modules/@langchain/google-genai/node_modules/uuid": {
@@ -814,18 +798,6 @@
"node": ">=8"
}
},
"node_modules/is-network-error": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/is-network-error/-/is-network-error-1.3.0.tgz",
"integrity": "sha512-6oIwpsgRfnDiyEDLMay/GqCl3HoAtH5+RUKW29gYkL0QA+ipzpDLA16yQs7/RHCSu+BwgbJaOUqa4A99qNVQVw==",
"license": "MIT",
"engines": {
"node": ">=16"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/isexe": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/isexe/-/isexe-2.0.0.tgz",
@@ -882,13 +854,14 @@
}
},
"node_modules/langchain": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/langchain/-/langchain-1.0.2.tgz",
"integrity": "sha512-He/xvjVl8DHESvdaW6Dpyba72OaLCAfS2CyOm1aWrlJ4C38dKXyTIxphtld8hiii6MWX7qMSmu2EaUwWBx2STg==",
"version": "1.2.3",
"resolved": "https://registry.npmjs.org/langchain/-/langchain-1.2.3.tgz",
"integrity": "sha512-3k986xJuqg4az53JxV5LnGlOzIXF1d9Kq6Y9s7XjitvzhpsbFuTDV5/kiF4cx3pkNGyw0mUXC4tLz9RxucO0hw==",
"license": "MIT",
"dependencies": {
"@langchain/langgraph": "^1.0.0",
"@langchain/langgraph-checkpoint": "^1.0.0",
"langsmith": "~0.3.74",
"langsmith": ">=0.4.0 <1.0.0",
"uuid": "^10.0.0",
"zod": "^3.25.76 || ^4"
},
@@ -896,19 +869,19 @@
"node": ">=20"
},
"peerDependencies": {
"@langchain/core": "^1.0.0"
"@langchain/core": "1.1.8"
}
},
"node_modules/langsmith": {
"version": "0.3.77",
"resolved": "https://registry.npmjs.org/langsmith/-/langsmith-0.3.77.tgz",
"integrity": "sha512-wbS/9IX/hOAsOEOtPj8kCS8H0tFHaelwQ97gTONRtIfoPPLd9MMUmhk0KQB5DdsGAI5abg966+f0dZ/B+YRRzg==",
"version": "0.4.3",
"resolved": "https://registry.npmjs.org/langsmith/-/langsmith-0.4.3.tgz",
"integrity": "sha512-vuBAagBZulXj0rpZhUTxmHhrYIBk53z8e2Q8ty4OHVkahN4ul7Im3OZxD9jsXZB0EuncK1xRYtY8J3BW4vj1zw==",
"license": "MIT",
"dependencies": {
"@types/uuid": "^10.0.0",
"chalk": "^4.1.2",
"console-table-printer": "^2.12.1",
"p-queue": "^6.6.2",
"p-retry": "4",
"semver": "^7.6.3",
"uuid": "^10.0.0"
},

View File

@@ -1,3 +1,3 @@
google-adk==1.19.0
toolbox-core==0.5.3
pytest==9.0.1
google-adk==1.21.0
toolbox-core==0.5.4
pytest==9.0.2

View File

@@ -1,3 +1,3 @@
google-genai==1.52.0
toolbox-core==0.5.3
pytest==9.0.1
google-genai==1.56.0
toolbox-core==0.5.4
pytest==9.0.2

View File

@@ -1,5 +1,5 @@
langchain==1.1.0
langchain-google-vertexai==3.1.0
langgraph==1.0.4
toolbox-langchain==0.5.3
pytest==9.0.1
langchain==1.2.0
langchain-google-vertexai==3.2.0
langgraph==1.0.5
toolbox-langchain==0.5.4
pytest==9.0.2

View File

@@ -1,4 +1,4 @@
llama-index==0.14.10
llama-index-llms-google-genai==0.7.3
toolbox-llamaindex==0.5.3
pytest==9.0.1
llama-index==0.14.12
llama-index-llms-google-genai==0.8.3
toolbox-llamaindex==0.5.4
pytest==9.0.2

View File

@@ -18,6 +18,7 @@ to expose your developer assistant tools to a Looker instance:
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Antigravity][antigravity]
[toolbox]: https://github.com/googleapis/genai-toolbox
[gemini-cli]: #configure-your-mcp-client
@@ -27,6 +28,7 @@ to expose your developer assistant tools to a Looker instance:
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[antigravity]: #connect-with-antigravity
## Set up Looker
@@ -38,6 +40,55 @@ to expose your developer assistant tools to a Looker instance:
listening at a different port, and you will need to use
`https://looker.example.com:19999` instead.
## Connect with Antigravity
You can connect Looker to Antigravity in the following ways:
* Using the MCP Store
* Using a custom configuration
{{< notice note >}}
You don't need to download the MCP Toolbox binary to use these methods.
{{< /notice >}}
{{< tabpane text=true >}}
{{% tab header="MCP Store" lang="en" %}}
The most straightforward way to connect to Looker in Antigravity is by using the built-in MCP Store.
1. Open Antigravity and open the editor's agent panel.
1. Click the **"..."** icon at the top of the panel and select **MCP Servers**.
1. Locate **Looker** in the list of available servers and click Install.
1. Follow the on-screen prompts to securely link your accounts where applicable.
After you install Looker in the MCP Store, resources and tools from the server are automatically available to the editor.
{{% /tab %}}
{{% tab header="Custom config" lang="en" %}}
To connect to a custom MCP server, follow these steps:
1. Open Antigravity and navigate to the MCP store using the **"..."** drop-down at the top of the editor's agent panel.
1. To open the **mcp_config.json** file, click **MCP Servers** and then click **Manage MCP Servers > View raw config**.
1. Add the following configuration, replace the environment variables with your values, and save.
```json
{
"mcpServers": {
"looker": {
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "looker", "--stdio"],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "your-client-id",
"LOOKER_CLIENT_SECRET": "your-client-secret"
}
}
}
}
```
{{% /tab %}}
{{< /tabpane >}}
## Install MCP Toolbox
1. Download the latest version of Toolbox as a binary. Select the [correct
@@ -290,7 +341,7 @@ assistant to list models, explores, dimensions, and measures. Run a
query, retrieve the SQL for a query, and run a saved Look.
The full tool list is available in the [Prebuilt Tools
Reference](../../reference/prebuilt-tools/#looker).
Reference](../../reference/prebuilt-tools.md#looker).
The following tools are available to the LLM:

View File

@@ -105,6 +105,8 @@ See [Usage Examples](../reference/cli.md#examples).
* `BIGQUERY_LOCATION`: (Optional) The dataset location.
* `BIGQUERY_USE_CLIENT_OAUTH`: (Optional) If `true`, forwards the client's
OAuth access token for authentication. Defaults to `false`.
* `BIGQUERY_SCOPES`: (Optional) A comma-separated list of OAuth scopes to
use for authentication.
* **Permissions:**
* **BigQuery User** (`roles/bigquery.user`) to execute queries and view
metadata.

View File

@@ -0,0 +1,84 @@
---
title: "EmbeddingModels"
type: docs
weight: 2
description: >
EmbeddingModels represent services that transform text into vector embeddings for semantic search.
---
EmbeddingModels represent services that generate vector representations of text
data. In the MCP Toolbox, these models enable **Semantic Queries**,
allowing [Tools](../tools/) to automatically convert human-readable text into
numerical vectors before using them in a query.
This is primarily used in two scenarios:
- **Vector Ingestion**: Converting a text parameter into a vector string during
an `INSERT` operation.
- **Semantic Search**: Converting a natural language query into a vector to
perform similarity searches.
## Example
The following configuration defines an embedding model and applies it to
specific tool parameters.
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your API keys into the configuration file.
{{< /notice >}}
### Step 1 - Define an Embedding Model
Define an embedding model in the `embeddingModels` section:
```yaml
embeddingModels:
gemini-model: # Name of the embedding model
kind: gemini
model: gemini-embedding-001
apiKey: ${GOOGLE_API_KEY}
dimension: 768
```
### Step 2 - Embed Tool Parameters
Using the defined embedding model, embed your query parameters via the
`embeddedBy` field. Only string-typed parameters can be embedded:
```yaml
tools:
# Vector ingestion tool
insert_embedding:
kind: postgres-sql
source: my-pg-instance
statement: |
INSERT INTO documents (content, embedding)
VALUES ($1, $2);
parameters:
- name: content
type: string
- name: vector_string
type: string
description: The text to be vectorized and stored.
embeddedBy: gemini-model # refers to the name of a defined embedding model
# Semantic search tool
search_embedding:
kind: postgres-sql
source: my-pg-instance
statement: |
SELECT id, content, embedding <-> $1 AS distance
FROM documents
ORDER BY distance LIMIT 1
parameters:
- name: semantic_search_string
type: string
description: The search query that will be converted to a vector.
embeddedBy: gemini-model # refers to the name of a defined embedding model
```
## Kinds of Embedding Models

View File

@@ -0,0 +1,73 @@
---
title: "Gemini Embedding"
type: docs
weight: 1
description: >
Use Google's Gemini models to generate high-performance text embeddings for vector databases.
---
## About
Google Gemini provides state-of-the-art embedding models that convert text into
high-dimensional vectors.
### Authentication
Toolbox uses your [Application Default Credentials
(ADC)][adc] to authorize with the
Gemini API client.
Optionally, you can use an [API key][api-key]; you can obtain one from
[Google AI Studio][ai-studio].
We recommend using an API key for testing and using application default
credentials for production.
[adc]: https://cloud.google.com/docs/authentication#adc
[api-key]: https://ai.google.dev/gemini-api/docs/api-key#api-keys
[ai-studio]: https://aistudio.google.com/app/apikey
## Behavior
### Automatic Vectorization
When a tool parameter is configured with `embeddedBy: <your-gemini-model-name>`,
the Toolbox intercepts the raw text input from the client and sends it to the
Gemini API. The resulting numerical array is then formatted before being passed
to your database source.
### Dimension Matching
The `dimension` field must match the expected size of your database column
(e.g., a `vector(768)` column in PostgreSQL). This setting is only supported by
newer models released since 2024; you cannot set it when using the earlier model
(`models/embedding-001`). Check out [available Gemini models][modellist] for more
information.
[modellist]:
https://docs.cloud.google.com/vertex-ai/generative-ai/docs/embeddings/get-text-embeddings#supported-models
## Example
```yaml
embeddingModels:
gemini-model:
kind: gemini
model: gemini-embedding-001
apiKey: ${GOOGLE_API_KEY}
dimension: 768
```
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------|
| kind | string | true | Must be `gemini`. |
| model | string | true | The Gemini model ID to use (e.g., `gemini-embedding-001`). |
| apiKey | string | false | Your API Key from Google AI Studio. |
| dimension | integer | false | The number of dimensions in the output vector (e.g., `768`). |

View File

@@ -94,7 +94,10 @@ cluster][alloydb-free-trial].
instance.
- [`postgres-list-roles`](../tools/postgres/postgres-list-roles.md)
Lists all the user-created roles in PostgreSQL database..
Lists all the user-created roles in PostgreSQL database.
- [`postgres-list-stored-procedure`](../tools/postgres/postgres-list-stored-procedure.md)
Lists all the stored procedures in the PostgreSQL database.
### Pre-built Configurations

View File

@@ -94,6 +94,13 @@ intend to run. Common roles include `roles/bigquery.user` (which includes
permissions to run jobs and read data) or `roles/bigquery.dataViewer`.
Follow this [guide][set-adc] to set up your ADC.
If you are running on Google Compute Engine (GCE) or Google Kubernetes Engine
(GKE), you might need to explicitly set the access scopes for the service
account. While you can configure scopes when creating the VM or node pool, you
can also specify them in the source configuration using the `scopes` field.
Common scopes include `https://www.googleapis.com/auth/bigquery` or
`https://www.googleapis.com/auth/cloud-platform`.
### Authentication via User's OAuth Access Token
If the `useClientOAuth` parameter is set to `true`, Toolbox will instead use the
@@ -124,6 +131,9 @@ sources:
# - "my_dataset_1"
# - "other_project.my_dataset_2"
# impersonateServiceAccount: "service-account@project-id.iam.gserviceaccount.com" # Optional: Service account to impersonate
# scopes: # Optional: List of OAuth scopes to request.
# - "https://www.googleapis.com/auth/bigquery"
# - "https://www.googleapis.com/auth/drive.readonly"
```
Initialize a BigQuery source that uses the client's access token:
@@ -140,6 +150,9 @@ sources:
# - "my_dataset_1"
# - "other_project.my_dataset_2"
# impersonateServiceAccount: "service-account@project-id.iam.gserviceaccount.com" # Optional: Service account to impersonate
# scopes: # Optional: List of OAuth scopes to request.
# - "https://www.googleapis.com/auth/bigquery"
# - "https://www.googleapis.com/auth/drive.readonly"
```
## Reference
@@ -152,4 +165,5 @@ sources:
| writeMode | string | false | Controls the write behavior for tools. `allowed` (default): All queries are permitted. `blocked`: Only `SELECT` statements are allowed for the `bigquery-execute-sql` tool. `protected`: Enables session-based execution where all tools associated with this source instance share the same [BigQuery session](https://cloud.google.com/bigquery/docs/sessions-intro). This allows for stateful operations using temporary tables (e.g., `CREATE TEMP TABLE`). For `bigquery-execute-sql`, `SELECT` statements can be used on all tables, but write operations are restricted to the session's temporary dataset. For tools like `bigquery-sql`, `bigquery-forecast`, and `bigquery-analyze-contribution`, the `writeMode` restrictions do not apply, but they will operate within the shared session. **Note:** The `protected` mode cannot be used with `useClientOAuth: true`. It is also not recommended for multi-user server environments, as all users would share the same session. A session is terminated automatically after 24 hours of inactivity or after 7 days, whichever comes first. A new session is created on the next request, and any temporary data from the previous session will be lost. |
| allowedDatasets | []string | false | An optional list of dataset IDs that tools using this source are allowed to access. If provided, any tool operation attempting to access a dataset not in this list will be rejected. To enforce this, two types of operations are also disallowed: 1) Dataset-level operations (e.g., `CREATE SCHEMA`), and 2) operations where table access cannot be statically analyzed (e.g., `EXECUTE IMMEDIATE`, `CREATE PROCEDURE`). If a single dataset is provided, it will be treated as the default for prebuilt tools. |
| useClientOAuth | bool | false | If true, forwards the client's OAuth access token from the "Authorization" header to downstream queries. **Note:** This cannot be used with `writeMode: protected`. |
| scopes | []string | false | A list of OAuth 2.0 scopes to use for the credentials. If not provided, default scopes are used. |
| impersonateServiceAccount | string | false | Service account email to impersonate when making BigQuery and Dataplex API calls. The authenticated principal must have the `roles/iam.serviceAccountTokenCreator` role on the target service account. [Learn More](https://cloud.google.com/iam/docs/service-account-impersonation) |

View File

@@ -91,7 +91,10 @@ to a database by following these instructions][csql-pg-quickstart].
instance.
- [`postgres-list-roles`](../tools/postgres/postgres-list-roles.md)
Lists all the user-created roles in PostgreSQL database..
Lists all the user-created roles in PostgreSQL database.
- [`postgres-list-stored-procedure`](../tools/postgres/postgres-list-stored-procedure.md)
Lists all the stored procedures in the PostgreSQL database.
### Pre-built Configurations

View File

@@ -229,22 +229,38 @@ Finds resources that were created within, before, or after a given date or time.
### Aspect Search
To search for entries based on their attached aspects, use the following query syntax.
aspect:x Matches x as a substring of the full path to the aspect type of an aspect that is attached to the entry, in the format projectid.location.ASPECT_TYPE_ID
aspect=x Matches x as the full path to the aspect type of an aspect that is attached to the entry, in the format projectid.location.ASPECT_TYPE_ID
aspect:xOPERATORvalue
Searches for aspect field values. Matches x as a substring of the full path to the aspect type and field name of an aspect that is attached to the entry, in the format projectid.location.ASPECT_TYPE_ID.FIELD_NAME
`has:x`
Matches `x` as a substring of the full path to the aspect type of an aspect that is attached to the entry, in the format `projectid.location.ASPECT_TYPE_ID`
The list of supported {OPERATOR}s depends on the type of field in the aspect, as follows:
- String: = (exact match) and : (substring)
- All number types: =, :, <, >, <=, >=, =>, =<
- Enum: =
- Datetime: same as for numbers, but the values to compare are treated as datetimes instead of numbers
- Boolean: =
`has=x`
Matches `x` as the full path to the aspect type of an aspect that is attached to the entry, in the format `projectid.location.ASPECT_TYPE_ID`
Only top-level fields of the aspect are searchable. For example, all of the following queries match entries where the value of the is-enrolled field in the employee-info aspect type is true. Other entries that match on the substring are also returned.
- aspect:example-project.us-central1.employee-info.is-enrolled=true
- aspect:example-project.us-central1.employee=true
- aspect:employee=true
`xOPERATORvalue`
Searches for aspect field values. Matches x as a substring of the full path to the aspect type and field name of an aspect that is attached to the entry, in the format `projectid.location.ASPECT_TYPE_ID.FIELD_NAME`
The list of supported operators depends on the type of field in the aspect, as follows:
* **String**: `=` (exact match)
* **All number types**: `=`, `:`, `<`, `>`, `<=`, `>=`, `=>`, `=<`
* **Enum**: `=` (exact match only)
* **Datetime**: same as for numbers, but the values to compare are treated as datetimes instead of numbers
* **Boolean**: `=`
Only top-level fields of the aspect are searchable.
* Syntax for system aspect types:
* `ASPECT_TYPE_ID.FIELD_NAME`
* `dataplex-types.ASPECT_TYPE_ID.FIELD_NAME`
* `dataplex-types.LOCATION.ASPECT_TYPE_ID.FIELD_NAME`
For example, the following queries match entries where the value of the `type` field in the `bigquery-dataset` aspect is `default`:
* `bigquery-dataset.type=default`
* `dataplex-types.bigquery-dataset.type=default`
* `dataplex-types.global.bigquery-dataset.type=default`
* Syntax for custom aspect types:
* If the aspect is created in the global region: `PROJECT_ID.ASPECT_TYPE_ID.FIELD_NAME`
* If the aspect is created in a specific region: `PROJECT_ID.REGION.ASPECT_TYPE_ID.FIELD_NAME`
For example, the following queries match entries where the value of the `is-enrolled` field in the `employee-info` aspect is `true`.
* `example-project.us-central1.employee-info.is-enrolled=true`
* `example-project.employee-info.is-enrolled=true`
Example:
You can use the following filters:
@@ -258,6 +274,25 @@ Logical AND and logical OR are supported. For example, foo OR bar.
You can negate a predicate with a - (hyphen) or NOT prefix. For example, -name:foo returns resources with names that don't match the predicate foo.
Logical operators are case-sensitive. `OR` and `AND` are acceptable whereas `or` and `and` are not.
### Abbreviated syntax
An abbreviated search syntax is also available, using `|` (vertical bar) for `OR` operators and `,` (comma) for `AND` operators.
For example, to search for entries inside one of many projects using the `OR` operator, you can use the following abbreviated syntax:
`projectid:(id1|id2|id3|id4)`
The same search without using abbreviated syntax looks like the following:
`projectid:id1 OR projectid:id2 OR projectid:id3 OR projectid:id4`
To search for entries with matching column names, use the following:
* **AND**: `column:(name1,name2,name3)`
* **OR**: `column:(name1|name2|name3)`
This abbreviated syntax works for the qualified predicates except for `label` in keyword search.
### Request
1. Always try to rewrite the prompt using search syntax.

View File

@@ -85,7 +85,10 @@ reputation for reliability, feature robustness, and performance.
server.
- [`postgres-list-roles`](../tools/postgres/postgres-list-roles.md)
Lists all the user-created roles in PostgreSQL database..
Lists all the user-created roles in PostgreSQL database.
- [`postgres-list-stored-procedure`](../tools/postgres/postgres-list-stored-procedure.md)
Lists all the stored procedures in the PostgreSQL database.
### Pre-built Configurations

View File

@@ -18,9 +18,11 @@ It's compatible with the following sources:
- [looker](../../sources/looker.md)
`looker-make-dashboard` takes one parameter:
`looker-make-dashboard` takes three parameters:
1. the `title`
2. the `description`
3. an optional `folder` id. If not provided, the user's default folder will be used.
## Example

View File

@@ -18,7 +18,7 @@ It's compatible with the following sources:
- [looker](../../sources/looker.md)
`looker-make-look` takes eleven parameters:
`looker-make-look` takes twelve parameters:
1. the `model`
2. the `explore`
@@ -31,6 +31,7 @@ It's compatible with the following sources:
9. an optional `vis_config`
10. the `title`
11. an optional `description`
12. an optional `folder` id. If not provided, the user's default folder will be used.
## Example

View File

@@ -169,5 +169,5 @@ tools:
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](../#template-parameters) | false | List of [templateParameters](../#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

View File

@@ -0,0 +1,141 @@
---
title: "postgres-list-stored-procedure"
type: docs
weight: 1
description: >
The "postgres-list-stored-procedure" tool retrieves metadata for stored procedures in PostgreSQL, including procedure definitions, owners, languages, and descriptions.
aliases:
- /resources/tools/postgres-list-stored-procedure
---
## About
The `postgres-list-stored-procedure` tool queries PostgreSQL system catalogs (`pg_proc`, `pg_namespace`, `pg_roles`, and `pg_language`) to retrieve comprehensive metadata about stored procedures in the database. It filters for procedures (`prokind = 'p'`) and provides the full procedure definition along with ownership and language information.
Compatible sources:
- [alloydb-postgres](../../sources/alloydb-pg.md)
- [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
- [postgres](../../sources/postgres.md)
The tool returns a JSON array where each element represents a stored procedure with its schema, name, owner, language, complete definition, and optional description. Results are sorted by schema name and procedure name, with a default limit of 20 procedures.
## Parameters
| parameter | type | required | default | description |
|--------------|---------|----------|---------|-------------|
| role_name | string | false | null | Optional: The owner name to filter stored procedures by (supports partial matching) |
| schema_name | string | false | null | Optional: The schema name to filter stored procedures by (supports partial matching) |
| limit | integer | false | 20 | Optional: The maximum number of stored procedures to return |
## Example
```yaml
tools:
list_stored_procedure:
kind: postgres-list-stored-procedure
source: postgres-source
description: "Retrieves stored procedure metadata including definitions and owners."
```
### Example Requests
**List all stored procedures (default limit 20):**
```json
{}
```
**Filter by specific owner (role):**
```json
{
"role_name": "app_user"
}
```
**Filter by schema:**
```json
{
"schema_name": "public"
}
```
**Filter by owner and schema with custom limit:**
```json
{
"role_name": "postgres",
"schema_name": "public",
"limit": 50
}
```
**Filter by partial schema name:**
```json
{
"schema_name": "audit"
}
```
### Example Response
```json
[
{
"schema_name": "public",
"name": "process_payment",
"owner": "postgres",
"language": "plpgsql",
"definition": "CREATE OR REPLACE PROCEDURE public.process_payment(p_order_id integer, p_amount numeric)\n LANGUAGE plpgsql\nAS $procedure$\nBEGIN\n UPDATE orders SET status = 'paid', amount = p_amount WHERE id = p_order_id;\n INSERT INTO payment_log (order_id, amount, timestamp) VALUES (p_order_id, p_amount, now());\n COMMIT;\nEND\n$procedure$",
"description": "Processes payment for an order and logs the transaction"
},
{
"schema_name": "public",
"name": "cleanup_old_records",
"owner": "postgres",
"language": "plpgsql",
"definition": "CREATE OR REPLACE PROCEDURE public.cleanup_old_records(p_days_old integer)\n LANGUAGE plpgsql\nAS $procedure$\nDECLARE\n v_deleted integer;\nBEGIN\n DELETE FROM audit_logs WHERE created_at < now() - (p_days_old || ' days')::interval;\n GET DIAGNOSTICS v_deleted = ROW_COUNT;\n RAISE NOTICE 'Deleted % records', v_deleted;\nEND\n$procedure$",
"description": "Removes audit log records older than specified days"
},
{
"schema_name": "audit",
"name": "audit_table_changes",
"owner": "app_user",
"language": "plpgsql",
"definition": "CREATE OR REPLACE PROCEDURE audit.audit_table_changes()\n LANGUAGE plpgsql\nAS $procedure$\nBEGIN\n INSERT INTO audit.change_log (table_name, operation, changed_at) VALUES (TG_TABLE_NAME, TG_OP, now());\nEND\n$procedure$",
"description": null
}
]
```
## Output Fields Reference
| field | type | description |
|-------------|---------|-------------|
| schema_name | string | Name of the schema containing the stored procedure. |
| name | string | Name of the stored procedure. |
| owner | string | PostgreSQL role/user who owns the stored procedure. |
| language | string | Programming language in which the procedure is written (e.g., plpgsql, sql, c). |
| definition | string | Complete SQL definition of the stored procedure, including the CREATE PROCEDURE statement. |
| description | string | Optional description or comment for the procedure (may be null if no comment is set). |
## Use Cases
- **Code review and auditing**: Export procedure definitions for version control or compliance audits.
- **Documentation generation**: Automatically extract procedure metadata and descriptions for documentation.
- **Permission auditing**: Identify procedures owned by specific users or in specific schemas.
- **Migration planning**: Retrieve all procedure definitions when planning database migrations.
- **Dependency analysis**: Review procedure definitions to understand dependencies and call chains.
- **Security assessment**: Audit which roles own and can modify stored procedures.
## Performance Considerations
- The tool filters at the database level using LIKE pattern matching, so partial matches are supported.
- Procedure definitions can be large; consider using the `limit` parameter for large databases with many procedures.
- Results are ordered by schema name and procedure name for consistent output.
- The default limit of 20 procedures is suitable for most use cases; increase as needed.
## Notes
- Only stored **procedures** are returned; functions and other callable objects are excluded via the `prokind = 'p'` filter.
- Filtering uses `LIKE` pattern matching, so filter values support partial matches (e.g., `role_name: "app"` will match "app_user", "app_admin", etc.).
- The `definition` field contains the complete, runnable CREATE PROCEDURE statement.
- The `description` field is populated from comments set via PostgreSQL's COMMENT command and may be null.

View File

@@ -0,0 +1,25 @@
---
title: "JS SDK"
type: docs
weight: 7
description: >
JS SDKs to connect to the MCP Toolbox server.
---
## Overview
The MCP Toolbox service provides a centralized way to manage and expose tools
(like API connectors, database query tools, etc.) for use by GenAI applications.
These JS SDKs act as clients for that service. They handle the communication needed to:
* Fetch tool definitions from your running Toolbox instance.
* Provide convenient JS objects or functions representing those tools.
* Invoke the tools (calling the underlying APIs/services configured in Toolbox).
* Handle authentication and parameter binding as needed.
By using these SDKs, you can easily leverage your Toolbox-managed tools directly
within your JS applications or AI orchestration frameworks.
[GitHub](https://github.com/googleapis/mcp-toolbox-sdk-js)

View File

@@ -1,15 +0,0 @@
---
title: "Go SDK"
weight: 2
description: Go lang client SDK
icon: fa-brands fa-golang
manualLink: "https://github.com/googleapis/mcp-toolbox-sdk-go"
manualLinkTarget: _blank
---
<html>
<head>
<link rel="canonical" href="https://github.com/googleapis/mcp-toolbox-sdk-go"/>
<meta http-equiv="refresh" content="0;url=https://github.com/googleapis/mcp-toolbox-sdk-go"/>
</head>
</html>

View File

@@ -0,0 +1,114 @@
---
title: "Go SDK"
type: docs
weight: 7
description: >
Go SDKs to connect to the MCP Toolbox server.
---
## Overview
![MCP Toolbox
Logo](https://raw.githubusercontent.com/googleapis/genai-toolbox/main/logo.png)
# MCP Toolbox SDKs for Go
[![License: Apache
2.0](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
[![Docs](https://img.shields.io/badge/Docs-MCP_Toolbox-blue)](https://googleapis.github.io/genai-toolbox/)
[![Discord](https://img.shields.io/badge/Discord-%235865F2.svg?style=flat&logo=discord&logoColor=white)](https://discord.gg/Dmm69peqjh)
[![Medium](https://img.shields.io/badge/Medium-12100E?style=flat&logo=medium&logoColor=white)](https://medium.com/@mcp_toolbox)
[![Go Report Card](https://goreportcard.com/badge/github.com/googleapis/mcp-toolbox-sdk-go)](https://goreportcard.com/report/github.com/googleapis/mcp-toolbox-sdk-go)
[![Module Version](https://img.shields.io/github/v/release/googleapis/mcp-toolbox-sdk-go)](https://img.shields.io/github/v/release/googleapis/mcp-toolbox-sdk-go)
[![Go Version](https://img.shields.io/github/go-mod/go-version/googleapis/mcp-toolbox-sdk-go)](https://img.shields.io/github/go-mod/go-version/googleapis/mcp-toolbox-sdk-go)
This repository contains the Go SDK designed to seamlessly integrate the
functionalities of the [MCP
Toolbox](https://github.com/googleapis/genai-toolbox) into your Gen AI
applications. The SDK allows you to load tools defined in Toolbox and use them
as standard Go tools within popular orchestration frameworks
or your custom code.
This simplifies the process of incorporating external functionalities (like
Databases or APIs) managed by Toolbox into your GenAI applications.
<!-- TOC -->
- [Overview](#overview)
- [Which Package Should I Use?](#which-package-should-i-use)
- [Available Packages](#available-packages)
- [Getting Started](#getting-started)
<!-- /TOC -->
## Overview
The MCP Toolbox service provides a centralized way to manage and expose tools
(like API connectors, database query tools, etc.) for use by GenAI applications.
The Go SDK acts as a client for that service. It handles the communication needed to:
* Fetch tool definitions from your running Toolbox instance.
* Provide convenient Go structs representing those tools.
* Invoke the tools (calling the underlying APIs/services configured in Toolbox).
* Handle authentication and parameter binding as needed.
By using the SDK, you can easily leverage your Toolbox-managed tools directly
within your Go applications or AI orchestration frameworks.
## Which Package Should I Use?
Choosing the right package depends on how you are building your application:
- [`core`](https://github.com/googleapis/mcp-toolbox-sdk-go/tree/main/core):
This is a framework-agnostic way to connect tools to popular frameworks
like Google GenAI, LangChain, etc.
- [`tbadk`](https://github.com/googleapis/mcp-toolbox-sdk-go/tree/main/tbadk):
This package provides a way to connect tools to ADK Go.
- [`tbgenkit`](https://github.com/googleapis/mcp-toolbox-sdk-go/tree/main/tbgenkit):
This package provides functionality to convert a tool fetched using the core package
into a Genkit Go-compatible tool.
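For reference, here is a minimal sketch of the import paths for the three packages. The paths are assumed from the repository layout shown in the table below; the `core` path is the one used throughout the examples on this page.

```go
import (
	"github.com/googleapis/mcp-toolbox-sdk-go/core"     // framework-agnostic client
	"github.com/googleapis/mcp-toolbox-sdk-go/tbadk"    // ADK Go integration
	"github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit" // Genkit Go conversion helpers
)
```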
## Available Packages
This repository hosts the following Go packages. See the package-specific
README for detailed installation and usage instructions:
| Package | Target Use Case | Integration | Path | Details (README) |
| :------ | :----------| :---------- | :---------------------- | :---------- |
| `core` | Framework-agnostic / Custom applications | Use directly / Custom | `core/` | 📄 [View README](https://github.com/googleapis/mcp-toolbox-sdk-go/blob/main/core/README.md) |
| `tbadk` | ADK Go | Use directly | `tbadk/` | 📄 [View README](https://github.com/googleapis/mcp-toolbox-sdk-go/blob/main/tbadk/README.md) |
| `tbgenkit` | Genkit Go | Along with core | `tbgenkit/` | 📄 [View README](https://github.com/googleapis/mcp-toolbox-sdk-go/blob/main/tbgenkit/README.md) |
## Getting Started
To get started using Toolbox tools with an application, follow these general steps:
1. **Set up and Run the Toolbox Service:**
Before using the SDKs, you need the MCP Toolbox server running. Follow
the instructions here: [**Toolbox Getting Started
Guide**](https://github.com/googleapis/genai-toolbox?tab=readme-ov-file#getting-started)
2. **Install the Appropriate SDK:**
Choose the package based on your needs (see "[Which Package Should I Use?](#which-package-should-i-use)" above)
Use this command to install the SDK module:
```bash
# For the core, framework-agnostic SDK
go get github.com/googleapis/mcp-toolbox-sdk-go
```
3. **Use the SDK:**
Consult the README for your chosen package (linked in the "[Available
Packages](#available-packages)" section above) for detailed instructions on
how to connect the client, load tool definitions, invoke tools, configure
authentication/binding, and integrate them into your application or
framework.
[GitHub](https://github.com/googleapis/mcp-toolbox-sdk-go)

View File

@@ -0,0 +1,998 @@
---
title: "Core Package"
linkTitle: "Core"
type: docs
weight: 1
---
![MCP Toolbox Logo](https://raw.githubusercontent.com/googleapis/genai-toolbox/main/logo.png)
# MCP Toolbox Core SDK
[![License: Apache 2.0](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
This SDK allows you to seamlessly integrate the functionalities of
[Toolbox](https://github.com/googleapis/genai-toolbox) into your GenAI
applications, letting you load and use tools defined in the service as standard
Go structs.
This simplifies integrating external functionalities (like APIs, databases, or
custom logic) managed by the Toolbox into your workflows, especially those
involving Large Language Models (LLMs).
<!-- TOC ignore:true -->
<!-- TOC -->
- [MCP Toolbox Core SDK](#mcp-toolbox-core-sdk)
- [Installation](#installation)
- [Quickstart](#quickstart)
- [Usage](#usage)
- [Transport Protocols](#transport-protocols)
- [Supported Protocols](#supported-protocols)
- [Example](#example)
- [Loading Tools](#loading-tools)
- [Load a toolset](#load-a-toolset)
- [Load a single tool](#load-a-single-tool)
- [Invoking Tools](#invoking-tools)
- [Client to Server Authentication](#client-to-server-authentication)
- [When is Client-to-Server Authentication Needed?](#when-is-client-to-server-authentication-needed)
- [How it works](#how-it-works)
- [Configuration](#configuration)
- [Authenticating with Google Cloud Servers](#authenticating-with-google-cloud-servers)
- [Step by Step Guide for Cloud Run](#step-by-step-guide-for-cloud-run)
- [Authenticating Tools](#authenticating-tools)
- [When is Authentication Needed?](#when-is-authentication-needed)
- [Supported Authentication Mechanisms](#supported-authentication-mechanisms)
- [Step 1: Configure Tools in Toolbox Service](#step-1-configure-tools-in-toolbox-service)
- [Step 2: Configure SDK Client](#step-2-configure-sdk-client)
- [Provide an ID Token Retriever Function](#provide-an-id-token-retriever-function)
- [Option A: Add Default Authentication to a Client](#option-a-add-default-authentication-to-a-client)
- [Option B: Add Authentication to a Loaded Tool](#option-b-add-authentication-to-a-loaded-tool)
- [Option C: Add Authentication While Loading Tools](#option-c-add-authentication-while-loading-tools)
- [Complete Authentication Example](#complete-authentication-example)
- [Binding Parameter Values](#binding-parameter-values)
- [Why Bind Parameters?](#why-bind-parameters)
- [Option A: Add Default Bound Parameters to a Client](#option-a-add-default-bound-parameters-to-a-client)
- [Option B: Binding Parameters to a Loaded Tool](#option-b-binding-parameters-to-a-loaded-tool)
- [Option C: Binding Parameters While Loading Tools](#option-c-binding-parameters-while-loading-tools)
- [Binding Dynamic Values](#binding-dynamic-values)
- [Using with Orchestration Frameworks](#using-with-orchestration-frameworks)
- [Contributing](#contributing)
- [License](#license)
- [Support](#support)
<!-- /TOC -->
## Installation
```bash
go get github.com/googleapis/mcp-toolbox-sdk-go
```
This SDK is supported on Go version 1.24.4 and higher.
> [!NOTE]
>
> - While the SDK itself is synchronous, you can execute its functions within goroutines to achieve asynchronous behavior.
## Quickstart
Here's a minimal example to get you started. Ensure your Toolbox service is
running and accessible.
```go
package main
import (
"context"
"fmt"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
)
func quickstart() string {
ctx := context.Background()
inputs := map[string]any{"location": "London"}
client, err := core.NewToolboxClient("http://localhost:5000")
if err != nil {
return fmt.Sprintln("Could not start Toolbox Client", err)
}
tool, err := client.LoadTool("get_weather", ctx)
if err != nil {
return fmt.Sprintln("Could not load Toolbox Tool", err)
}
result, err := tool.Invoke(ctx, inputs)
if err != nil {
return fmt.Sprintln("Could not invoke tool", err)
}
return fmt.Sprintln(result)
}
func main() {
fmt.Println(quickstart())
}
```
## Usage
Import and initialize a Toolbox client, pointing it to the URL of your running
Toolbox service.
```go
import "github.com/googleapis/mcp-toolbox-sdk-go/core"
client, err := core.NewToolboxClient("http://localhost:5000")
```
All interactions for loading and invoking tools happen through this client.
> [!NOTE]
> For advanced use cases, you can provide an external custom `http.Client`
> during initialization (e.g., `core.NewToolboxClient(URL, core.WithHTTPClient(myClient))`). If you
> provide your own session, you are responsible for managing its lifecycle;
> `ToolboxClient` *will not* close it.
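For example, a minimal sketch of supplying your own client with a timeout (this assumes `core.WithHTTPClient` accepts a standard `*http.Client`; check the package documentation for the exact signature):

```go
import (
	"net/http"
	"time"

	"github.com/googleapis/mcp-toolbox-sdk-go/core"
)

// You own this http.Client's lifecycle; ToolboxClient will not close it.
httpClient := &http.Client{Timeout: 30 * time.Second}

client, err := core.NewToolboxClient(
	"http://localhost:5000",
	core.WithHTTPClient(httpClient),
)
```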
> [!IMPORTANT]
> Closing the `ToolboxClient` also closes the underlying network session shared by
> all tools loaded from that client. As a result, any tool instances you have
> loaded will cease to function and will return an error if you attempt to invoke
> them after the client is closed.
## Transport Protocols
The SDK supports multiple transport protocols for communicating with the Toolbox server. By default, the client uses the latest supported version of the **Model Context Protocol (MCP)**.
You can explicitly select a protocol using the `core.WithProtocol` option during client initialization. This is useful if you need to use the native Toolbox HTTP protocol or pin the client to a specific legacy version of MCP.
> [!NOTE]
> * **Native Toolbox Transport**: This uses the service's native **REST over HTTP** API.
> * **MCP Transports**: These options use the **Model Context Protocol over HTTP**.
### Supported Protocols
| Constant | Description |
| :--- | :--- |
| `core.MCP` | **(Default)** Alias for the latest supported MCP version (currently `v2025-06-18`). |
| `core.Toolbox` | The native Toolbox HTTP protocol. |
| `core.MCPv20250618` | MCP Protocol version 2025-06-18. |
| `core.MCPv20250326` | MCP Protocol version 2025-03-26. |
| `core.MCPv20241105` | MCP Protocol version 2024-11-05. |
### Example
If you wish to use the native Toolbox protocol:
```go
import "github.com/googleapis/mcp-toolbox-sdk-go/core"
client, err := core.NewToolboxClient(
"http://localhost:5000",
core.WithProtocol(core.Toolbox),
)
```
If you want to pin the MCP Version 2025-03-26:
```go
import "github.com/googleapis/mcp-toolbox-sdk-go/core"
client, err := core.NewToolboxClient(
"http://localhost:5000",
core.WithProtocol(core.MCPv20250326),
)
```
## Loading Tools
You can load tools individually or in groups (toolsets) as defined in your
Toolbox service configuration. Loading a toolset is convenient when working with
multiple related functions, while loading a single tool offers more granular
control.
### Load a toolset
A toolset is a collection of related tools. You can load all tools in a toolset
or a specific one:
```go
// Load default toolset by providing an empty string as the name
tools, err := client.LoadToolset("", ctx)
// Load a specific toolset
tools, err := client.LoadToolset("my-toolset", ctx)
```
### Load a single tool
Loads a specific tool by its unique name. This provides fine-grained control.
```go
tool, err = client.LoadTool("my-tool", ctx)
```
## Invoking Tools
Once loaded, tools behave like ordinary Go structs. You invoke them with the `Invoke`
method, passing arguments that correspond to the parameters defined in the tool's
configuration within the Toolbox service.
```go
tool, err = client.LoadTool("my-tool", ctx)
inputs := map[string]any{"location": "London"}
result, err := tool.Invoke(ctx, inputs)
```
> [!TIP]
> For a more comprehensive guide on setting up the Toolbox service itself, which
> you'll need running to use this SDK, please refer to the [Toolbox Quickstart
> Guide](https://googleapis.github.io/genai-toolbox/getting-started/local_quickstart).
## Client to Server Authentication
This section describes how to authenticate the ToolboxClient itself when
connecting to a Toolbox server instance that requires authentication. This is
crucial for securing your Toolbox server endpoint, especially when deployed on
platforms like Cloud Run, GKE, or any environment where unauthenticated access is restricted.
This client-to-server authentication ensures that the Toolbox server can verify
the identity of the client making the request before any tool is loaded or
called. It is different from [Authenticating Tools](#authenticating-tools),
which deals with providing credentials for specific tools within an already
connected Toolbox session.
### When is Client-to-Server Authentication Needed?
You'll need this type of authentication if your Toolbox server is configured to
deny unauthenticated requests. For example:
- Your Toolbox server is deployed on Cloud Run and configured to "Require authentication."
- Your server is behind an Identity-Aware Proxy (IAP) or a similar
authentication layer.
- You have custom authentication middleware on your self-hosted Toolbox server.
Without proper client authentication in these scenarios, attempts to connect or
make calls (like `LoadTool`) will likely fail with `Unauthorized` errors.
### How it works
The `ToolboxClient` allows you to specify TokenSources that dynamically generate HTTP headers for
every request sent to the Toolbox server. The most common use case is to add an
Authorization header with a bearer token (e.g., a Google ID token).
These header-generating functions are called just before each request, ensuring
that fresh credentials or header values can be used.
### Configuration
You can configure these dynamic headers as seen below:
```go
import "github.com/googleapis/mcp-toolbox-sdk-go/core"
tokenProvider := func() string {
return "header3_value"
}
staticTokenSource := oauth2.StaticTokenSource(&oauth2.Token{AccessToken: "header2_value"})
dynamicTokenSource := core.NewCustomTokenSource(tokenProvider)
client, err := core.NewToolboxClient(
"toolbox-url",
core.WithClientHeaderString("header1", "header1_value"),
core.WithClientHeaderTokenSource("header2", staticTokenSource),
core.WithClientHeaderTokenSource("header3", dynamicTokenSource),
)
```
### Authenticating with Google Cloud Servers
For Toolbox servers hosted on Google Cloud (e.g., Cloud Run) and requiring
`Google ID token` authentication, the helper module
[auth_methods](https://github.com/googleapis/mcp-toolbox-sdk-go/blob/main/core/auth.go) provides utility functions.
### Step by Step Guide for Cloud Run
1. **Configure Permissions**: [Grant](https://cloud.google.com/run/docs/securing/managing-access#service-add-principals) the `roles/run.invoker` IAM role on the Cloud
Run service to the principal. This could be your `user account email` or a
`service account`.
2. **Configure Credentials**
- Local Development: Set up
[ADC](https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment).
- Google Cloud Environments: When running within Google Cloud (e.g., Compute
Engine, GKE, another Cloud Run service, Cloud Functions), ADC is typically
configured automatically, using the environment's default service account.
3. **Connect to the Toolbox Server**
```go
import "github.com/googleapis/mcp-toolbox-sdk-go/core"
import "context"
ctx := context.Background()
token, err := core.GetGoogleIDToken(ctx, URL)
client, err := core.NewToolboxClient(
  URL,
  core.WithClientHeaderString("Authorization", token),
)
// Now, you can use the client as usual.
```
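The snippet above fetches a single ID token when the client is created. For long-running services where that token could expire, one possible sketch (built only from the options shown earlier, not a dedicated helper) is to refresh it on every request via a custom token source:
```go
// Sketch: fetch a fresh Google ID token for each request.
tokenProvider := func() string {
  token, err := core.GetGoogleIDToken(context.Background(), URL)
  if err != nil {
    return "" // In real code, surface or log this error.
  }
  return token
}

client, err := core.NewToolboxClient(
  URL,
  core.WithClientHeaderTokenSource("Authorization", core.NewCustomTokenSource(tokenProvider)),
)
```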
## Authenticating Tools
> [!WARNING]
> **Always use HTTPS** to connect your application with the Toolbox service,
> especially in **production environments** or whenever the communication
> involves **sensitive data** (including scenarios where tools require
> authentication tokens). Using plain HTTP lacks encryption and exposes your
> application and data to significant security risks, such as eavesdropping and
> tampering.
Tools can be configured within the Toolbox service to require authentication,
ensuring only authorized users or applications can invoke them, especially when
accessing sensitive data.
### When is Authentication Needed?
Authentication is configured per-tool within the Toolbox service itself. If a
tool you intend to use is marked as requiring authentication in the service, you
must configure the SDK client to provide the necessary credentials (currently
OAuth2 tokens) when invoking that specific tool.
### Supported Authentication Mechanisms
The Toolbox service enables secure tool usage through **Authenticated Parameters**.
For detailed information on how these mechanisms work within the Toolbox service and how to configure them, please refer to [Toolbox Service Documentation - Authenticated Parameters](https://googleapis.github.io/genai-toolbox/resources/tools/#authenticated-parameters).
### Step 1: Configure Tools in Toolbox Service
First, ensure the target tool(s) are configured correctly in the Toolbox service
to require authentication. Refer to the [Toolbox Service Documentation -
Authenticated
Parameters](https://googleapis.github.io/genai-toolbox/resources/tools/#authenticated-parameters)
for instructions.
### Step 2: Configure SDK Client
Your application needs a way to obtain the required OAuth2 token for the
authenticated user. The SDK requires you to provide a function capable of
retrieving this token *when the tool is invoked*.
#### Provide an ID Token Retriever Function
You must provide the SDK with a function that returns the
necessary token when called. The implementation depends on your application's
authentication flow (e.g., retrieving a stored token, initiating an OAuth flow).
> [!IMPORTANT]
> The name used when registering the getter function with the SDK (e.g.,
> `"my_api_token"`) must exactly match the `name` of the corresponding
> `authServices` defined in the tool's configuration within the Toolbox service.
```go
func getAuthToken() string {
  // ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
  // This example just returns a placeholder. Replace with your actual token retrieval.
  return "YOUR_ID_TOKEN" // Placeholder
}
```
> [!TIP]
> Your token retriever function is invoked every time an authenticated parameter
> requires a token for a tool call. Consider implementing caching logic within
> this function to avoid redundant token fetching or generation, especially for
> tokens with longer validity periods or if the retrieval process is
> resource-intensive.
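For example, a minimal caching sketch (it uses the standard `time` package; the expiry window and placeholder token are illustrative assumptions, not SDK requirements):
```go
// Cache the token and refresh it only when the cached copy is near expiry.
var (
  cachedToken  string
  tokenExpires time.Time
)

func getAuthToken() string {
  if cachedToken != "" && time.Now().Before(tokenExpires) {
    return cachedToken
  }
  // ... fetch a fresh ID token here (e.g., via your OAuth flow) ...
  cachedToken = "YOUR_ID_TOKEN"                   // Placeholder.
  tokenExpires = time.Now().Add(50 * time.Minute) // Assumes roughly a 1-hour token lifetime.
  return cachedToken
}
```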
#### Option A: Add Default Authentication to a Client
You can add default tool-level authentication to a client.
Every tool or toolset loaded by the client will include the auth token.
```go
ctx := context.Background()
client, err := core.NewToolboxClient("http://127.0.0.1:5000",
  core.WithDefaultToolOptions(
    core.WithAuthTokenString("my-auth-1", "auth-value"),
  ),
)
AuthTool, err := client.LoadTool("my-tool", ctx)
```
#### Option B: Add Authentication to a Loaded Tool
You can add the token retriever function to a tool object *after* it has been
loaded. This modifies the specific tool instance.
```go
ctx := context.Background()
client, err := core.NewToolboxClient("http://127.0.0.1:5000")
tool, err := client.LoadTool("my-tool", ctx)
AuthTool, err := tool.ToolFrom(
  core.WithAuthTokenSource("my-auth", headerTokenSource),
  core.WithAuthTokenString("my-auth-1", "value"),
)
```
#### Option C: Add Authentication While Loading Tools
You can provide the token retriever(s) directly during the `LoadTool` or
`LoadToolset` calls. This applies the authentication configuration only to the
tools loaded in that specific call, without modifying the original tool objects
if they were loaded previously.
```go
AuthTool, err := client.LoadTool("my-tool", ctx, core.WithAuthTokenString("my-auth-1", "value"))
// or
AuthTools, err := client.LoadToolset(
"my-toolset",
ctx,
core.WithAuthTokenString("my-auth-1", "value"),
)
```
> [!NOTE]
> Adding auth tokens during loading only affects the tools loaded within that
> call.
### Complete Authentication Example
```go
import "github.com/googleapis/mcp-toolbox-sdk-go/core"
import "fmt"
func getAuthToken() string {
// ... Logic to retrieve ID token (e.g., from local storage, OAuth flow)
// This example just returns a placeholder. Replace with your actual token retrieval.
return "YOUR_ID_TOKEN" // Placeholder
}
func main() {
ctx := context.Background()
inputs := map[string]any{"input": "some input"}
dynamicTokenSource := core.NewCustomTokenSource(getAuthToken)
client, err := core.NewToolboxClient("http://127.0.0.1:5000")
tool, err := client.LoadTool("my-tool", ctx)
AuthTool, err := tool.ToolFrom(core.WithAuthTokenSource("my_auth", dynamicTokenSource))
result, err := AuthTool.Invoke(ctx, inputs)
fmt.Println(result)
}
```
> [!NOTE]
> An auth token getter for a specific name (e.g., "GOOGLE_ID") will replace any
> client header with the same name followed by "_token" (e.g.,
> "GOOGLE_ID_token").
## Binding Parameter Values
The SDK allows you to pre-set, or "bind", values for specific tool parameters
before the tool is invoked or even passed to an LLM. These bound values are
fixed and will not be requested or modified by the LLM during tool use.
### Why Bind Parameters?
- **Protecting sensitive information:** API keys, secrets, etc.
- **Enforcing consistency:** Ensuring specific values for certain parameters.
- **Pre-filling known data:** Providing defaults or context.
> [!IMPORTANT]
> The parameter names used for binding (e.g., `"api_key"`) must exactly match the
> parameter names defined in the tool's configuration within the Toolbox
> service.
> [!NOTE]
> You do not need to modify the tool's configuration in the Toolbox service to
> bind parameter values using the SDK.
#### Option A: Add Default Bound Parameters to a Client
You can add default tool-level bound parameters to a client. Every tool or toolset
loaded by the client will have the bound parameter.
```go
ctx := context.Background()
client, err := core.NewToolboxClient("http://127.0.0.1:5000",
  core.WithDefaultToolOptions(
    core.WithBindParamString("param1", "value"),
  ),
)
boundTool, err := client.LoadTool("my-tool", ctx)
```
#### Option B: Binding Parameters to a Loaded Tool
Bind values to a tool object *after* it has been loaded. This modifies the
specific tool instance.
```go
client, err := core.NewToolboxClient("http://127.0.0.1:5000")
tool, err := client.LoadTool("my-tool", ctx)
boundTool, err := tool.ToolFrom(
  core.WithBindParamString("param1", "value"),
  core.WithBindParamString("param2", "value"),
)
```
#### Option C: Binding Parameters While Loading Tools
Specify bound parameters directly when loading tools. This applies the binding
only to the tools loaded in that specific call.
```go
boundTool, err := client.LoadTool("my-tool", ctx, core.WithBindParamString("param", "value"))
// OR
boundTool, err := client.LoadToolset("", ctx, core.WithBindParamString("param", "value"))
```
> [!NOTE]
> Bound values during loading only affect the tools loaded in that call.
### Binding Dynamic Values
Instead of a static value, you can bind a parameter to a synchronous or
asynchronous function. This function will be called *each time* the tool is
invoked to dynamically determine the parameter's value at runtime.
Functions with the return type (data_type, error) can be provided.
```go
getDynamicValue := func() (string, error) { return "req-123", nil }
dynamicBoundTool, err := tool.ToolFrom(core.WithBindParamStringFunc("param", getDynamicValue))
```
> [!IMPORTANT]
> You don't need to modify tool configurations to bind parameter values.
## Using with Orchestration Frameworks
To see how the MCP Toolbox Go SDK works with orchestration frameworks, check out the end-to-end examples in the [/samples/](https://github.com/googleapis/mcp-toolbox-sdk-go/tree/main/core/samples) folder.
Use the [tbgenkit package](https://github.com/googleapis/mcp-toolbox-sdk-go/tree/main/tbgenkit) to convert Toolbox Tools into Genkit compatible tools.
## Contributing
Contributions are welcome! Please refer to the [DEVELOPER.md](https://github.com/googleapis/mcp-toolbox-sdk-go/blob/main/DEVELOPER.md)
file for guidelines on how to set up a development environment and run tests.
## License
This project is licensed under the Apache License 2.0. See the
[LICENSE](https://github.com/googleapis/mcp-toolbox-sdk-go/blob/main/LICENSE) file for details.
## Support
If you encounter issues or have questions, check the existing [GitHub Issues](https://github.com/googleapis/genai-toolbox/issues) for the main Toolbox project.
## Samples for Reference
These samples demonstrate how to integrate the MCP Toolbox Go Core SDK with popular orchestration frameworks.
{{< tabpane persist=header >}}
{{< tab header="Google GenAI" lang="go" >}}
// This sample demonstrates integration with the standard Google GenAI framework.
package main
import (
"context"
"encoding/json"
"fmt"
"log"
"os"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"google.golang.org/genai"
)
// ConvertToGenaiTool translates a ToolboxTool into the genai.FunctionDeclaration format.
func ConvertToGenaiTool(toolboxTool *core.ToolboxTool) *genai.Tool {
inputschema, err := toolboxTool.InputSchema()
if err != nil {
return &genai.Tool{}
}
var schema *genai.Schema
_ = json.Unmarshal(inputschema, &schema)
// First, create the function declaration.
funcDeclaration := &genai.FunctionDeclaration{
Name: toolboxTool.Name(),
Description: toolboxTool.Description(),
Parameters: schema,
}
// Then, wrap the function declaration in a genai.Tool struct.
return &genai.Tool{
FunctionDeclarations: []*genai.FunctionDeclaration{funcDeclaration},
}
}
// printResponse extracts and prints the relevant parts of the model's response.
func printResponse(resp *genai.GenerateContentResponse) {
for _, cand := range resp.Candidates {
if cand.Content != nil {
for _, part := range cand.Content.Parts {
fmt.Println(part.Text)
}
}
}
}
func main() {
// Setup
ctx := context.Background()
apiKey := os.Getenv("GOOGLE_API_KEY")
toolboxURL := "http://localhost:5000"
// Initialize the Google GenAI client using the explicit ClientConfig.
client, err := genai.NewClient(ctx, &genai.ClientConfig{
APIKey: apiKey,
})
if err != nil {
log.Fatalf("Failed to create Google GenAI client: %v", err)
}
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tools using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
genAITools := make([]*genai.Tool, len(tools))
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
for i, tool := range tools {
// Convert the tools into usable format
genAITools[i] = ConvertToGenaiTool(tool)
// Add tool to a map for lookup later
toolsMap[tool.Name()] = tool
}
// Set up the generative model with the available tool.
modelName := "gemini-2.0-flash"
query := "Find hotels in Basel with Basel in it's name and share the names with me"
// Create the initial content prompt for the model.
contents := []*genai.Content{
genai.NewContentFromText(query, genai.RoleUser),
}
config := &genai.GenerateContentConfig{
Tools: genAITools,
ToolConfig: &genai.ToolConfig{
FunctionCallingConfig: &genai.FunctionCallingConfig{
Mode: genai.FunctionCallingConfigModeAny,
},
},
}
genContentResp, err := client.Models.GenerateContent(ctx, modelName, contents, config)
if err != nil {
log.Fatalf("Error calling GenerateContent: %v", err)
}
printResponse(genContentResp)
functionCalls := genContentResp.FunctionCalls()
if len(functionCalls) == 0 {
log.Println("No function call returned by the AI. The model likely answered directly.")
return
}
// Process the first function call (the example assumes one for simplicity).
fc := functionCalls[0]
log.Printf("--- Gemini requested function call: %s ---\n", fc.Name)
log.Printf("--- Arguments: %+v ---\n", fc.Args)
var toolResultString string
if fc.Name == "search-hotels-by-name" {
tool := toolsMap["search-hotels-by-name"]
toolResult, err := tool.Invoke(ctx, fc.Args)
if err != nil {
log.Fatalf("Failed to execute tool '%s': %v", fc.Name, err)
}
toolResultString = fmt.Sprintf("%v", toolResult)
} else {
log.Println("LLM did not request our tool")
}
resultContents := []*genai.Content{
genai.NewContentFromText("The tool returned this result, share it with the user based on their previous queries: "+toolResultString, genai.RoleUser),
}
finalResponse, err := client.Models.GenerateContent(ctx, modelName, resultContents, &genai.GenerateContentConfig{})
if err != nil {
log.Fatalf("Error calling GenerateContent (with function result): %v", err)
}
log.Println("=== Final Response from Model (after processing function result) ===")
printResponse(finalResponse)
}
{{< /tab >}}
{{< tab header="LangChain Go" lang="go" >}}
// This sample demonstrates how to use Toolbox tools as function definitions in LangChain Go.
package main
import (
"context"
"encoding/json"
"fmt"
"log"
"os"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/tmc/langchaingo/llms"
"github.com/tmc/langchaingo/llms/googleai"
)
// ConvertToLangchainTool converts a generic core.ToolboxTool into a LangChainGo llms.Tool.
func ConvertToLangchainTool(toolboxTool *core.ToolboxTool) llms.Tool {
// Fetch the tool's input schema
inputschema, err := toolboxTool.InputSchema()
if err != nil {
return llms.Tool{}
}
var paramsSchema map[string]any
_ = json.Unmarshal(inputschema, &paramsSchema)
// Convert into LangChain's llms.Tool
return llms.Tool{
Type: "function",
Function: &llms.FunctionDefinition{
Name: toolboxTool.Name(),
Description: toolboxTool.Description(),
Parameters: paramsSchema,
},
}
}
func main() {
genaiKey := os.Getenv("GOOGLE_API_KEY")
toolboxURL := "http://localhost:5000"
ctx := context.Background()
// Initialize the Google AI client (LLM).
llm, err := googleai.New(ctx, googleai.WithAPIKey(genaiKey), googleai.WithDefaultModel("gemini-1.5-flash"))
if err != nil {
log.Fatalf("Failed to create Google AI client: %v", err)
}
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tools using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
langchainTools := make([]llms.Tool, len(tools))
for i, tool := range tools {
// Convert the loaded ToolboxTools into the format LangChainGo requires.
langchainTools[i] = ConvertToLangchainTool(tool)
// Add tool to a map for lookup later
toolsMap[tool.Name()] = tool
}
// Start the conversation history.
messageHistory := []llms.MessageContent{
llms.TextParts(llms.ChatMessageTypeHuman, "Find hotels in Basel with Basel in its name."),
}
// Make the first call to the LLM, making it aware of the tool.
resp, err := llm.GenerateContent(ctx, messageHistory, llms.WithTools(langchainTools))
if err != nil {
log.Fatalf("LLM call failed: %v", err)
}
// Add the model's response (which should be a tool call) to the history.
respChoice := resp.Choices[0]
assistantResponse := llms.TextParts(llms.ChatMessageTypeAI, respChoice.Content)
for _, tc := range respChoice.ToolCalls {
assistantResponse.Parts = append(assistantResponse.Parts, tc)
}
messageHistory = append(messageHistory, assistantResponse)
// Process each tool call requested by the model.
for _, tc := range respChoice.ToolCalls {
toolName := tc.FunctionCall.Name
switch tc.FunctionCall.Name {
case "search-hotels-by-name":
var args map[string]any
if err := json.Unmarshal([]byte(tc.FunctionCall.Arguments), &args); err != nil {
log.Fatalf("Failed to unmarshal arguments for tool '%s': %v", toolName, err)
}
tool := toolsMap["search-hotels-by-name"]
toolResult, err := tool.Invoke(ctx, args)
if err != nil {
log.Fatalf("Failed to execute tool '%s': %v", toolName, err)
}
// Create the tool call response message and add it to the history.
toolResponse := llms.MessageContent{
Role: llms.ChatMessageTypeTool,
Parts: []llms.ContentPart{
llms.ToolCallResponse{
Name: toolName,
Content: fmt.Sprintf("%v", toolResult),
},
},
}
messageHistory = append(messageHistory, toolResponse)
default:
log.Fatalf("got unexpected function call: %v", tc.FunctionCall.Name)
}
}
// Final LLM Call for Natural Language Response
log.Println("Sending tool response back to LLM for a final answer...")
// Call the LLM again with the updated history, which now includes the tool's result.
finalResp, err := llm.GenerateContent(ctx, messageHistory)
if err != nil {
log.Fatalf("Final LLM call failed: %v", err)
}
// Display the Result
fmt.Println("\n======================================")
fmt.Println("Final Response from LLM:")
fmt.Println(finalResp.Choices[0].Content)
fmt.Println("======================================")
}
{{< /tab >}}
{{< tab header="OpenAI Go" lang="go" >}}
// This sample demonstrates integration with the OpenAI Go client.
package main
import (
"context"
"encoding/json"
"fmt"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
openai "github.com/openai/openai-go"
)
// ConvertToOpenAITool converts a ToolboxTool into the openai-go library's Tool format.
func ConvertToOpenAITool(toolboxTool *core.ToolboxTool) openai.ChatCompletionToolParam {
// Get the input schema
jsonSchemaBytes, err := toolboxTool.InputSchema()
if err != nil {
return openai.ChatCompletionToolParam{}
}
// Unmarshal the JSON bytes into FunctionParameters
var paramsSchema openai.FunctionParameters
if err := json.Unmarshal(jsonSchemaBytes, &paramsSchema); err != nil {
return openai.ChatCompletionToolParam{}
}
// Create and return the final tool parameter struct.
return openai.ChatCompletionToolParam{
Function: openai.FunctionDefinitionParam{
Name: toolboxTool.Name(),
Description: openai.String(toolboxTool.Description()),
Parameters: paramsSchema,
},
}
}
func main() {
// Setup
ctx := context.Background()
toolboxURL := "http://localhost:5000"
openAIClient := openai.NewClient()
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tools using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tool : %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
openAITools := make([]openai.ChatCompletionToolParam, len(tools))
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
for i, tool := range tools {
// Convert the Toolbox tool into the OpenAI tool parameter format.
openAITools[i] = ConvertToOpenAITool(tool)
// Add tool to a map for lookup later
toolsMap[tool.Name()] = tool
}
question := "Find hotels in Basel with Basel in its name"
params := openai.ChatCompletionNewParams{
Messages: []openai.ChatCompletionMessageParamUnion{
openai.UserMessage(question),
},
Tools: openAITools,
Seed: openai.Int(0),
Model: openai.ChatModelGPT4o,
}
// Make initial chat completion request
completion, err := openAIClient.Chat.Completions.New(ctx, params)
if err != nil {
panic(err)
}
toolCalls := completion.Choices[0].Message.ToolCalls
// Return early if there are no tool calls
if len(toolCalls) == 0 {
fmt.Printf("No function call")
return
}
// If there was a function call, continue the conversation
params.Messages = append(params.Messages, completion.Choices[0].Message.ToParam())
for _, toolCall := range toolCalls {
if toolCall.Function.Name == "search-hotels-by-name" {
// Extract the location from the function call arguments
var args map[string]interface{}
tool := toolsMap["search-hotels-by-name"]
err := json.Unmarshal([]byte(toolCall.Function.Arguments), &args)
if err != nil {
panic(err)
}
result, err := tool.Invoke(ctx, args)
if err != nil {
log.Fatal("Could not invoke tool", err)
}
params.Messages = append(params.Messages, openai.ToolMessage(result.(string), toolCall.ID))
}
}
completion, err = openAIClient.Chat.Completions.New(ctx, params)
if err != nil {
panic(err)
}
fmt.Println(completion.Choices[0].Message.Content)
}
{{< /tab >}}
{{< /tabpane >}}


@@ -1,15 +0,0 @@
---
title: "JS SDK"
weight: 2
description: Javascript client SDK
icon: fa-brands fa-node-js
manualLink: "https://github.com/googleapis/mcp-toolbox-sdk-js"
manualLinkTarget: _blank
---
<html>
<head>
<link rel="canonical" href="https://github.com/googleapis/mcp-toolbox-sdk-js"/>
<meta http-equiv="refresh" content="0;url=https://github.com/googleapis/mcp-toolbox-sdk-js"/>
</head>
</html>


@@ -1,15 +0,0 @@
---
title: "Python SDK"
weight: 2
description: Python client SDK
icon: fa-brands fa-python
manualLink: "https://github.com/googleapis/mcp-toolbox-sdk-python"
manualLinkTarget: _blank
---
<html>
<head>
<link rel="canonical" href="https://github.com/googleapis/mcp-toolbox-sdk-python"/>
<meta http-equiv="refresh" content="0;url=https://github.com/googleapis/mcp-toolbox-sdk-python"/>
</head>
</html>


@@ -0,0 +1,25 @@
---
title: "Python SDK"
type: docs
weight: 7
description: >
Python SDKs to connect to the MCP Toolbox server.
---
## Overview
The MCP Toolbox service provides a centralized way to manage and expose tools
(like API connectors, database query tools, etc.) for use by GenAI applications.
These Python SDKs act as clients for that service. They handle the communication needed to:
* Fetch tool definitions from your running Toolbox instance.
* Provide convenient Python objects or functions representing those tools.
* Invoke the tools (calling the underlying APIs/services configured in Toolbox).
* Handle authentication and parameter binding as needed.
By using these SDKs, you can easily leverage your Toolbox-managed tools directly
within your Python applications or AI orchestration frameworks.
[Github](https://github.com/googleapis/mcp-toolbox-sdk-python)

go.mod

@@ -37,7 +37,6 @@ require (
github.com/google/go-cmp v0.7.0
github.com/google/uuid v1.6.0
github.com/jackc/pgx/v5 v5.7.6
github.com/json-iterator/go v1.1.12
github.com/looker-open-source/sdk-codegen/go v0.25.21
github.com/microsoft/go-mssqldb v1.9.3
github.com/nakagami/firebirdsql v0.9.15
@@ -60,6 +59,7 @@ require (
go.opentelemetry.io/otel/trace v1.38.0
golang.org/x/oauth2 v0.33.0
google.golang.org/api v0.256.0
google.golang.org/genai v1.37.0
google.golang.org/genproto v0.0.0-20251022142026-3a174f9686a8
google.golang.org/protobuf v1.36.10
modernc.org/sqlite v1.40.0
@@ -138,6 +138,7 @@ require (
github.com/jcmturner/goidentity/v6 v6.0.1 // indirect
github.com/jcmturner/gokrb5/v8 v8.4.4 // indirect
github.com/jcmturner/rpc/v2 v2.0.3 // indirect
github.com/json-iterator/go v1.1.12 // indirect
github.com/kardianos/osext v0.0.0-20190222173326-2bc1f35cddc0 // indirect
github.com/klauspost/compress v1.18.0 // indirect
github.com/klauspost/cpuid/v2 v2.2.11 // indirect

go.sum

@@ -1869,6 +1869,8 @@ google.golang.org/appengine v1.6.1/go.mod h1:i06prIuMbXzDqacNJfV5OdTW448YApPu5ww
google.golang.org/appengine v1.6.5/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCIDZVag1xfc=
google.golang.org/appengine v1.6.6/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCIDZVag1xfc=
google.golang.org/appengine v1.6.7/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCIDZVag1xfc=
google.golang.org/genai v1.37.0 h1:dgp71k1wQ+/+APdZrN3LFgAGnVnr5IdTF1Oj0Dg+BQc=
google.golang.org/genai v1.37.0/go.mod h1:A3kkl0nyBjyFlNjgxIwKq70julKbIxpSxqKO5gw/gmk=
google.golang.org/genproto v0.0.0-20180817151627-c66870c02cf8/go.mod h1:JiN7NxoALGmiZfu7CAH4rXhgtRTLTxftemlI0sWmxmc=
google.golang.org/genproto v0.0.0-20190307195333-5fe7a883aa19/go.mod h1:VzzqZJRnGkLBvHegQrXjBqPurQTc5/KpmUdxsrq26oE=
google.golang.org/genproto v0.0.0-20190418145605-e7d98fc518a7/go.mod h1:VzzqZJRnGkLBvHegQrXjBqPurQTc5/KpmUdxsrq26oE=


@@ -0,0 +1,59 @@
// Copyright 2026 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package embeddingmodels
import (
"context"
"strconv"
"strings"
)
// EmbeddingModelConfig is the interface for configuring embedding models.
type EmbeddingModelConfig interface {
EmbeddingModelConfigKind() string
Initialize(context.Context) (EmbeddingModel, error)
}
type EmbeddingModel interface {
EmbeddingModelKind() string
ToConfig() EmbeddingModelConfig
EmbedParameters(context.Context, []string) ([][]float32, error)
}
type VectorFormatter func(vectorFloats []float32) any
// FormatVectorForPgvector converts a slice of floats into a PostgreSQL vector literal string: '[x, y, z]'
func FormatVectorForPgvector(vectorFloats []float32) any {
if len(vectorFloats) == 0 {
return "[]"
}
// Pre-allocate the builder.
var b strings.Builder
b.Grow(len(vectorFloats) * 10)
b.WriteByte('[')
for i, f := range vectorFloats {
if i > 0 {
b.WriteString(", ")
}
b.Write(strconv.AppendFloat(nil, float64(f), 'g', -1, 32))
}
b.WriteByte(']')
return b.String()
}
var _ VectorFormatter = FormatVectorForPgvector


@@ -0,0 +1,122 @@
// Copyright 2026 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package gemini
import (
"context"
"fmt"
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
"github.com/googleapis/genai-toolbox/internal/util"
"google.golang.org/genai"
)
const EmbeddingModelKind string = "gemini"
// validate interface
var _ embeddingmodels.EmbeddingModelConfig = Config{}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Model string `yaml:"model" validate:"required"`
ApiKey string `yaml:"apiKey"`
Dimension int32 `yaml:"dimension"`
}
// Returns the embedding model kind
func (cfg Config) EmbeddingModelConfigKind() string {
return EmbeddingModelKind
}
// Initialize a Gemini embedding model
func (cfg Config) Initialize(ctx context.Context) (embeddingmodels.EmbeddingModel, error) {
// Get client configs
configs := &genai.ClientConfig{}
if cfg.ApiKey != "" {
configs.APIKey = cfg.ApiKey
}
// Create new Gemini API client
client, err := genai.NewClient(ctx, configs)
if err != nil {
return nil, fmt.Errorf("unable to create Gemini API client: %w", err)
}
m := &EmbeddingModel{
Config: cfg,
Client: client,
}
return m, nil
}
var _ embeddingmodels.EmbeddingModel = EmbeddingModel{}
type EmbeddingModel struct {
Client *genai.Client
Config
}
// Returns the embedding model kind
func (m EmbeddingModel) EmbeddingModelKind() string {
return EmbeddingModelKind
}
func (m EmbeddingModel) ToConfig() embeddingmodels.EmbeddingModelConfig {
return m.Config
}
func (m EmbeddingModel) EmbedParameters(ctx context.Context, parameters []string) ([][]float32, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, fmt.Errorf("unable to get logger from ctx: %s", err)
}
contents := convertStringsToContents(parameters)
embedConfig := &genai.EmbedContentConfig{
TaskType: "SEMANTIC_SIMILARITY",
}
if m.Dimension > 0 {
embedConfig.OutputDimensionality = genai.Ptr(m.Dimension)
}
result, err := m.Client.Models.EmbedContent(ctx, m.Model, contents, embedConfig)
if err != nil {
logger.ErrorContext(ctx, "Error calling EmbedContent for model %s: %v", m.Model, err)
return nil, err
}
embeddings := make([][]float32, 0, len(result.Embeddings))
for _, embedding := range result.Embeddings {
embeddings = append(embeddings, embedding.Values)
}
logger.InfoContext(ctx, "Successfully embedded %d text parameters using model %s", len(parameters), m.Model)
return embeddings, nil
}
// convertStringsToContents takes a slice of strings and converts it into a slice of *genai.Content objects.
func convertStringsToContents(texts []string) []*genai.Content {
contents := make([]*genai.Content, 0, len(texts))
for _, text := range texts {
content := genai.NewContentFromText(text, "")
contents = append(contents, content)
}
return contents
}


@@ -0,0 +1,130 @@
// Copyright 2026 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package gemini_test
import (
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
"github.com/googleapis/genai-toolbox/internal/embeddingmodels/gemini"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
)
func TestParseFromYamlGemini(t *testing.T) {
tcs := []struct {
desc string
in string
want server.EmbeddingModelConfigs
}{
{
desc: "basic example",
in: `
embeddingModels:
my-gemini-model:
kind: gemini
model: text-embedding-004
`,
want: map[string]embeddingmodels.EmbeddingModelConfig{
"my-gemini-model": gemini.Config{
Name: "my-gemini-model",
Kind: gemini.EmbeddingModelKind,
Model: "text-embedding-004",
},
},
},
{
desc: "full example with optional fields",
in: `
embeddingModels:
complex-gemini:
kind: gemini
model: text-embedding-004
apiKey: "test-api-key"
dimension: 768
`,
want: map[string]embeddingmodels.EmbeddingModelConfig{
"complex-gemini": gemini.Config{
Name: "complex-gemini",
Kind: gemini.EmbeddingModelKind,
Model: "text-embedding-004",
ApiKey: "test-api-key",
Dimension: 768,
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Models server.EmbeddingModelConfigs `yaml:"embeddingModels"`
}{}
// Parse contents
err := yaml.Unmarshal(testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if !cmp.Equal(tc.want, got.Models) {
t.Fatalf("incorrect parse: %v", cmp.Diff(tc.want, got.Models))
}
})
}
}
func TestFailParseFromYamlGemini(t *testing.T) {
tcs := []struct {
desc string
in string
err string
}{
{
desc: "missing required model field",
in: `
embeddingModels:
bad-model:
kind: gemini
`,
// The error prefix intentionally omits the specific model name to match the parser's actual output
err: "unable to parse as \"gemini\": Key: 'Config.Model' Error:Field validation for 'Model' failed on the 'required' tag",
},
{
desc: "unknown field",
in: `
embeddingModels:
bad-field:
kind: gemini
model: text-embedding-004
invalid_param: true
`,
// Matches the line-oriented format of the parser's error output
err: "unable to parse as \"gemini\": [1:1] unknown field \"invalid_param\"\n> 1 | invalid_param: true\n ^\n 2 | kind: gemini\n 3 | model: text-embedding-004",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Models server.EmbeddingModelConfigs `yaml:"embeddingModels"`
}{}
err := yaml.Unmarshal(testutils.FormatYaml(tc.in), &got)
if err == nil {
t.Fatalf("expect parsing to fail")
}
if err.Error() != tc.err {
t.Fatalf("unexpected error:\ngot: %q\nwant: %q", err.Error(), tc.err)
}
})
}
}


@@ -224,6 +224,10 @@ tools:
kind: postgres-list-roles
source: alloydb-pg-source
list_stored_procedure:
kind: postgres-list-stored-procedure
source: alloydb-pg-source
toolsets:
alloydb_postgres_database_tools:
- execute_sql
@@ -254,3 +258,4 @@ toolsets:
- list_database_stats
- list_roles
- list_table_stats
- list_stored_procedure


@@ -18,6 +18,7 @@ sources:
project: ${BIGQUERY_PROJECT}
location: ${BIGQUERY_LOCATION:}
useClientOAuth: ${BIGQUERY_USE_CLIENT_OAUTH:false}
scopes: ${BIGQUERY_SCOPES:}
tools:
analyze_contribution:


@@ -19,8 +19,8 @@ sources:
region: ${CLOUD_SQL_MYSQL_REGION}
instance: ${CLOUD_SQL_MYSQL_INSTANCE}
database: ${CLOUD_SQL_MYSQL_DATABASE}
user: ${CLOUD_SQL_MYSQL_USER}
password: ${CLOUD_SQL_MYSQL_PASSWORD}
user: ${CLOUD_SQL_MYSQL_USER:}
password: ${CLOUD_SQL_MYSQL_PASSWORD:}
ipType: ${CLOUD_SQL_MYSQL_IP_TYPE:PUBLIC}
tools:
execute_sql:


@@ -226,6 +226,10 @@ tools:
kind: postgres-list-roles
source: cloudsql-pg-source
list_stored_procedure:
kind: postgres-list-stored-procedure
source: cloudsql-pg-source
toolsets:
cloud_sql_postgres_database_tools:
- execute_sql
@@ -256,3 +260,4 @@ toolsets:
- list_database_stats
- list_roles
- list_table_stats
- list_stored_procedure


@@ -225,6 +225,10 @@ tools:
kind: postgres-list-roles
source: postgresql-source
list_stored_procedure:
kind: postgres-list-stored-procedure
source: postgresql-source
toolsets:
postgres_database_tools:
- execute_sql
@@ -255,3 +259,4 @@ toolsets:
- list_database_stats
- list_roles
- list_table_stats
- list_stored_procedure


@@ -246,6 +246,14 @@ func toolInvokeHandler(s *Server, w http.ResponseWriter, r *http.Request) {
}
s.logger.DebugContext(ctx, fmt.Sprintf("invocation params: %s", params))
params, err = tool.EmbedParams(ctx, params, s.ResourceMgr.GetEmbeddingModelMap())
if err != nil {
err = fmt.Errorf("error embedding parameters: %w", err)
s.logger.DebugContext(ctx, err.Error())
_ = render.Render(w, r, newErrResponse(err, http.StatusBadRequest))
return
}
res, err := tool.Invoke(ctx, s.ResourceMgr, params, accessToken)
// Determine what error to return to the users.


@@ -24,6 +24,7 @@ import (
"testing"
"github.com/go-chi/chi/v5"
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
"github.com/googleapis/genai-toolbox/internal/log"
"github.com/googleapis/genai-toolbox/internal/prompts"
"github.com/googleapis/genai-toolbox/internal/server/resources"
@@ -64,6 +65,10 @@ func (t MockTool) ParseParams(data map[string]any, claimsMap map[string]map[stri
return parameters.ParseParams(t.Params, data, claimsMap)
}
func (t MockTool) EmbedParams(ctx context.Context, paramValues parameters.ParamValues, embeddingModelsMap map[string]embeddingmodels.EmbeddingModel) (parameters.ParamValues, error) {
return parameters.EmbedParams(ctx, t.Params, paramValues, embeddingModelsMap, nil)
}
func (t MockTool) Manifest() tools.Manifest {
pMs := make([]parameters.ParameterManifest, 0, len(t.Params))
for _, p := range t.Params {
@@ -276,7 +281,7 @@ func setUpServer(t *testing.T, router string, tools map[string]tools.Tool, tools
sseManager := newSseManager(ctx)
resourceManager := resources.NewResourceManager(nil, nil, tools, toolsets, prompts, promptsets)
resourceManager := resources.NewResourceManager(nil, nil, nil, tools, toolsets, prompts, promptsets)
server := Server{
version: fakeVersionString,


@@ -21,6 +21,8 @@ import (
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/auth"
"github.com/googleapis/genai-toolbox/internal/auth/google"
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
"github.com/googleapis/genai-toolbox/internal/embeddingmodels/gemini"
"github.com/googleapis/genai-toolbox/internal/prompts"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/tools"
@@ -38,6 +40,8 @@ type ServerConfig struct {
SourceConfigs SourceConfigs
// AuthServiceConfigs defines what sources of authentication are available for tools.
AuthServiceConfigs AuthServiceConfigs
// EmbeddingModelConfigs defines the embedding models used to embed parameters.
EmbeddingModelConfigs EmbeddingModelConfigs
// ToolConfigs defines what tools are available.
ToolConfigs ToolConfigs
// ToolsetConfigs defines what tools are available.
@@ -205,6 +209,50 @@ func (c *AuthServiceConfigs) UnmarshalYAML(ctx context.Context, unmarshal func(i
return nil
}
// EmbeddingModelConfigs is a type used to allow unmarshal of the embedding model config map
type EmbeddingModelConfigs map[string]embeddingmodels.EmbeddingModelConfig
// validate interface
var _ yaml.InterfaceUnmarshalerContext = &EmbeddingModelConfigs{}
func (c *EmbeddingModelConfigs) UnmarshalYAML(ctx context.Context, unmarshal func(interface{}) error) error {
*c = make(EmbeddingModelConfigs)
// Parse the 'kind' fields for each embedding model
var raw map[string]util.DelayedUnmarshaler
if err := unmarshal(&raw); err != nil {
return err
}
for name, u := range raw {
// Unmarshal to a general type to ensure all fields are captured
var v map[string]any
if err := u.Unmarshal(&v); err != nil {
return fmt.Errorf("unable to unmarshal embedding model %q: %w", name, err)
}
kind, ok := v["kind"]
if !ok {
return fmt.Errorf("missing 'kind' field for embedding model %q", name)
}
dec, err := util.NewStrictDecoder(v)
if err != nil {
return fmt.Errorf("error creating decoder: %w", err)
}
switch kind {
case gemini.EmbeddingModelKind:
actual := gemini.Config{Name: name}
if err := dec.DecodeContext(ctx, &actual); err != nil {
return fmt.Errorf("unable to parse as %q: %w", kind, err)
}
(*c)[name] = actual
default:
return fmt.Errorf("%q is not a valid kind of embedding model", kind)
}
}
return nil
}
// ToolConfigs is a type used to allow unmarshal of the tool configs
type ToolConfigs map[string]tools.ToolConfig


@@ -1107,7 +1107,7 @@ func TestStdioSession(t *testing.T) {
sseManager := newSseManager(ctx)
resourceManager := resources.NewResourceManager(nil, nil, toolsMap, toolsets, promptsMap, promptsets)
resourceManager := resources.NewResourceManager(nil, nil, nil, toolsMap, toolsets, promptsMap, promptsets)
server := &Server{
version: fakeVersionString,


@@ -18,6 +18,7 @@ import (
"sync"
"github.com/googleapis/genai-toolbox/internal/auth"
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
"github.com/googleapis/genai-toolbox/internal/prompts"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/tools"
@@ -25,30 +26,33 @@ import (
// ResourceManager contains available resources for the server. Should be initialized with NewResourceManager().
type ResourceManager struct {
mu sync.RWMutex
sources map[string]sources.Source
authServices map[string]auth.AuthService
tools map[string]tools.Tool
toolsets map[string]tools.Toolset
prompts map[string]prompts.Prompt
promptsets map[string]prompts.Promptset
mu sync.RWMutex
sources map[string]sources.Source
authServices map[string]auth.AuthService
embeddingModels map[string]embeddingmodels.EmbeddingModel
tools map[string]tools.Tool
toolsets map[string]tools.Toolset
prompts map[string]prompts.Prompt
promptsets map[string]prompts.Promptset
}
func NewResourceManager(
sourcesMap map[string]sources.Source,
authServicesMap map[string]auth.AuthService,
embeddingModelsMap map[string]embeddingmodels.EmbeddingModel,
toolsMap map[string]tools.Tool, toolsetsMap map[string]tools.Toolset,
promptsMap map[string]prompts.Prompt, promptsetsMap map[string]prompts.Promptset,
) *ResourceManager {
resourceMgr := &ResourceManager{
mu: sync.RWMutex{},
sources: sourcesMap,
authServices: authServicesMap,
tools: toolsMap,
toolsets: toolsetsMap,
prompts: promptsMap,
promptsets: promptsetsMap,
mu: sync.RWMutex{},
sources: sourcesMap,
authServices: authServicesMap,
embeddingModels: embeddingModelsMap,
tools: toolsMap,
toolsets: toolsetsMap,
prompts: promptsMap,
promptsets: promptsetsMap,
}
return resourceMgr
@@ -68,6 +72,13 @@ func (r *ResourceManager) GetAuthService(authServiceName string) (auth.AuthServi
return authService, ok
}
func (r *ResourceManager) GetEmbeddingModel(embeddingModelName string) (embeddingmodels.EmbeddingModel, bool) {
r.mu.RLock()
defer r.mu.RUnlock()
model, ok := r.embeddingModels[embeddingModelName]
return model, ok
}
func (r *ResourceManager) GetTool(toolName string) (tools.Tool, bool) {
r.mu.RLock()
defer r.mu.RUnlock()
@@ -96,11 +107,12 @@ func (r *ResourceManager) GetPromptset(promptsetName string) (prompts.Promptset,
return promptset, ok
}
func (r *ResourceManager) SetResources(sourcesMap map[string]sources.Source, authServicesMap map[string]auth.AuthService, toolsMap map[string]tools.Tool, toolsetsMap map[string]tools.Toolset, promptsMap map[string]prompts.Prompt, promptsetsMap map[string]prompts.Promptset) {
func (r *ResourceManager) SetResources(sourcesMap map[string]sources.Source, authServicesMap map[string]auth.AuthService, embeddingModelsMap map[string]embeddingmodels.EmbeddingModel, toolsMap map[string]tools.Tool, toolsetsMap map[string]tools.Toolset, promptsMap map[string]prompts.Prompt, promptsetsMap map[string]prompts.Promptset) {
r.mu.Lock()
defer r.mu.Unlock()
r.sources = sourcesMap
r.authServices = authServicesMap
r.embeddingModels = embeddingModelsMap
r.tools = toolsMap
r.toolsets = toolsetsMap
r.prompts = promptsMap
@@ -117,6 +129,16 @@ func (r *ResourceManager) GetAuthServiceMap() map[string]auth.AuthService {
return copiedMap
}
func (r *ResourceManager) GetEmbeddingModelMap() map[string]embeddingmodels.EmbeddingModel {
r.mu.RLock()
defer r.mu.RUnlock()
copiedMap := make(map[string]embeddingmodels.EmbeddingModel, len(r.embeddingModels))
for k, v := range r.embeddingModels {
copiedMap[k] = v
}
return copiedMap
}
func (r *ResourceManager) GetToolsMap() map[string]tools.Tool {
r.mu.RLock()
defer r.mu.RUnlock()


@@ -19,6 +19,7 @@ import (
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/auth"
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
"github.com/googleapis/genai-toolbox/internal/prompts"
"github.com/googleapis/genai-toolbox/internal/server/resources"
"github.com/googleapis/genai-toolbox/internal/sources"
@@ -36,6 +37,7 @@ func TestUpdateServer(t *testing.T) {
},
}
newAuth := map[string]auth.AuthService{"example-auth": nil}
newEmbeddingModels := map[string]embeddingmodels.EmbeddingModel{"example-model": nil}
newTools := map[string]tools.Tool{"example-tool": nil}
newToolsets := map[string]tools.Toolset{
"example-toolset": {
@@ -54,7 +56,7 @@ func TestUpdateServer(t *testing.T) {
Prompts: []*prompts.Prompt{},
},
}
resMgr := resources.NewResourceManager(newSources, newAuth, newTools, newToolsets, newPrompts, newPromptsets)
resMgr := resources.NewResourceManager(newSources, newAuth, newEmbeddingModels, newTools, newToolsets, newPrompts, newPromptsets)
gotSource, _ := resMgr.GetSource("example-source")
if diff := cmp.Diff(gotSource, newSources["example-source"]); diff != "" {
@@ -95,7 +97,7 @@ func TestUpdateServer(t *testing.T) {
},
}
resMgr.SetResources(updateSource, newAuth, newTools, newToolsets, newPrompts, newPromptsets)
resMgr.SetResources(updateSource, newAuth, newEmbeddingModels, newTools, newToolsets, newPrompts, newPromptsets)
gotSource, _ = resMgr.GetSource("example-source2")
if diff := cmp.Diff(gotSource, updateSource["example-source2"]); diff != "" {
t.Errorf("error updating server, sources (-want +got):\n%s", diff)


@@ -30,6 +30,7 @@ import (
"github.com/go-chi/cors"
"github.com/go-chi/httplog/v2"
"github.com/googleapis/genai-toolbox/internal/auth"
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
"github.com/googleapis/genai-toolbox/internal/log"
"github.com/googleapis/genai-toolbox/internal/prompts"
"github.com/googleapis/genai-toolbox/internal/server/resources"
@@ -56,6 +57,7 @@ type Server struct {
func InitializeConfigs(ctx context.Context, cfg ServerConfig) (
map[string]sources.Source,
map[string]auth.AuthService,
map[string]embeddingmodels.EmbeddingModel,
map[string]tools.Tool,
map[string]tools.Toolset,
map[string]prompts.Prompt,
@@ -91,7 +93,7 @@ func InitializeConfigs(ctx context.Context, cfg ServerConfig) (
return s, nil
}()
if err != nil {
return nil, nil, nil, nil, nil, nil, err
return nil, nil, nil, nil, nil, nil, nil, err
}
sourcesMap[name] = s
}
@@ -119,7 +121,7 @@ func InitializeConfigs(ctx context.Context, cfg ServerConfig) (
return a, nil
}()
if err != nil {
return nil, nil, nil, nil, nil, nil, err
return nil, nil, nil, nil, nil, nil, nil, err
}
authServicesMap[name] = a
}
@@ -129,6 +131,34 @@ func InitializeConfigs(ctx context.Context, cfg ServerConfig) (
}
l.InfoContext(ctx, fmt.Sprintf("Initialized %d authServices: %s", len(authServicesMap), strings.Join(authServiceNames, ", ")))
// Initialize and validate embedding models from configs.
embeddingModelsMap := make(map[string]embeddingmodels.EmbeddingModel)
for name, ec := range cfg.EmbeddingModelConfigs {
em, err := func() (embeddingmodels.EmbeddingModel, error) {
_, span := instrumentation.Tracer.Start(
ctx,
"toolbox/server/embeddingmodel/init",
trace.WithAttributes(attribute.String("model_kind", ec.EmbeddingModelConfigKind())),
trace.WithAttributes(attribute.String("model_name", name)),
)
defer span.End()
em, err := ec.Initialize(ctx)
if err != nil {
return nil, fmt.Errorf("unable to initialize embedding model %q: %w", name, err)
}
return em, nil
}()
if err != nil {
return nil, nil, nil, nil, nil, nil, nil, err
}
embeddingModelsMap[name] = em
}
embeddingModelNames := make([]string, 0, len(embeddingModelsMap))
for name := range embeddingModelsMap {
embeddingModelNames = append(embeddingModelNames, name)
}
l.InfoContext(ctx, fmt.Sprintf("Initialized %d embeddingModels: %s", len(embeddingModelsMap), strings.Join(embeddingModelNames, ", ")))
// initialize and validate the tools from configs
toolsMap := make(map[string]tools.Tool)
for name, tc := range cfg.ToolConfigs {
@@ -147,7 +177,7 @@ func InitializeConfigs(ctx context.Context, cfg ServerConfig) (
return t, nil
}()
if err != nil {
return nil, nil, nil, nil, nil, nil, err
return nil, nil, nil, nil, nil, nil, nil, err
}
toolsMap[name] = t
}
@@ -184,7 +214,7 @@ func InitializeConfigs(ctx context.Context, cfg ServerConfig) (
return t, err
}()
if err != nil {
return nil, nil, nil, nil, nil, nil, err
return nil, nil, nil, nil, nil, nil, nil, err
}
toolsetsMap[name] = t
}
@@ -216,7 +246,7 @@ func InitializeConfigs(ctx context.Context, cfg ServerConfig) (
return p, nil
}()
if err != nil {
return nil, nil, nil, nil, nil, nil, err
return nil, nil, nil, nil, nil, nil, nil, err
}
promptsMap[name] = p
}
@@ -253,7 +283,7 @@ func InitializeConfigs(ctx context.Context, cfg ServerConfig) (
return p, err
}()
if err != nil {
return nil, nil, nil, nil, nil, nil, err
return nil, nil, nil, nil, nil, nil, nil, err
}
promptsetsMap[name] = p
}
@@ -267,7 +297,7 @@ func InitializeConfigs(ctx context.Context, cfg ServerConfig) (
}
l.InfoContext(ctx, fmt.Sprintf("Initialized %d promptsets: %s", len(promptsetsMap), strings.Join(promptsetNames, ", ")))
return sourcesMap, authServicesMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap, nil
return sourcesMap, authServicesMap, embeddingModelsMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap, nil
}
// NewServer returns a Server object based on provided Config.
@@ -320,7 +350,7 @@ func NewServer(ctx context.Context, cfg ServerConfig) (*Server, error) {
httpLogger := httplog.NewLogger("httplog", httpOpts)
r.Use(httplog.RequestLogger(httpLogger))
sourcesMap, authServicesMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap, err := InitializeConfigs(ctx, cfg)
sourcesMap, authServicesMap, embeddingModelsMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap, err := InitializeConfigs(ctx, cfg)
if err != nil {
return nil, fmt.Errorf("unable to initialize configs: %w", err)
}
@@ -330,7 +360,7 @@ func NewServer(ctx context.Context, cfg ServerConfig) (*Server, error) {
sseManager := newSseManager(ctx)
resourceManager := resources.NewResourceManager(sourcesMap, authServicesMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap)
resourceManager := resources.NewResourceManager(sourcesMap, authServicesMap, embeddingModelsMap, toolsMap, toolsetsMap, promptsMap, promptsetsMap)
s := &Server{
version: cfg.Version,


@@ -25,6 +25,7 @@ import (
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/auth"
"github.com/googleapis/genai-toolbox/internal/embeddingmodels"
"github.com/googleapis/genai-toolbox/internal/log"
"github.com/googleapis/genai-toolbox/internal/prompts"
"github.com/googleapis/genai-toolbox/internal/server"
@@ -144,6 +145,7 @@ func TestUpdateServer(t *testing.T) {
},
}
newAuth := map[string]auth.AuthService{"example-auth": nil}
newEmbeddingModels := map[string]embeddingmodels.EmbeddingModel{"example-model": nil}
newTools := map[string]tools.Tool{"example-tool": nil}
newToolsets := map[string]tools.Toolset{
"example-toolset": {
@@ -162,7 +164,7 @@ func TestUpdateServer(t *testing.T) {
Prompts: []*prompts.Prompt{},
},
}
s.ResourceMgr.SetResources(newSources, newAuth, newTools, newToolsets, newPrompts, newPromptsets)
s.ResourceMgr.SetResources(newSources, newAuth, newEmbeddingModels, newTools, newToolsets, newPrompts, newPromptsets)
if err != nil {
t.Errorf("error updating server: %s", err)
}


@@ -15,8 +15,12 @@ package alloydbadmin
import (
"context"
"encoding/json"
"fmt"
"html/template"
"net/http"
"strings"
"time"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
@@ -61,7 +65,7 @@ func (r Config) SourceConfigKind() string {
func (r Config) Initialize(ctx context.Context, tracer trace.Tracer) (sources.Source, error) {
ua, err := util.UserAgentFromContext(ctx)
if err != nil {
fmt.Printf("Error in User Agent retrieval: %s", err)
return nil, fmt.Errorf("error in User Agent retrieval: %s", err)
}
var client *http.Client
@@ -114,7 +118,7 @@ func (s *Source) GetDefaultProject() string {
return s.DefaultProject
}
func (s *Source) GetService(ctx context.Context, accessToken string) (*alloydbrestapi.Service, error) {
func (s *Source) getService(ctx context.Context, accessToken string) (*alloydbrestapi.Service, error) {
if s.UseClientOAuth {
token := &oauth2.Token{AccessToken: accessToken}
client := oauth2.NewClient(ctx, oauth2.StaticTokenSource(token))
@@ -130,3 +134,287 @@ func (s *Source) GetService(ctx context.Context, accessToken string) (*alloydbre
func (s *Source) UseClientAuthorization() bool {
return s.UseClientOAuth
}
func (s *Source) CreateCluster(ctx context.Context, project, location, network, user, password, cluster, accessToken string) (any, error) {
// Build the request body using the type-safe Cluster struct.
clusterBody := &alloydbrestapi.Cluster{
NetworkConfig: &alloydbrestapi.NetworkConfig{
Network: fmt.Sprintf("projects/%s/global/networks/%s", project, network),
},
InitialUser: &alloydbrestapi.UserPassword{
User: user,
Password: password,
},
}
service, err := s.getService(ctx, accessToken)
if err != nil {
return nil, err
}
urlString := fmt.Sprintf("projects/%s/locations/%s", project, location)
// The Create API returns a long-running operation.
resp, err := service.Projects.Locations.Clusters.Create(urlString, clusterBody).ClusterId(cluster).Do()
if err != nil {
return nil, fmt.Errorf("error creating AlloyDB cluster: %w", err)
}
return resp, nil
}
func (s *Source) CreateInstance(ctx context.Context, project, location, cluster, instanceID, instanceType, displayName string, nodeCount int, accessToken string) (any, error) {
// Build the request body using the type-safe Instance struct.
instance := &alloydbrestapi.Instance{
InstanceType: instanceType,
NetworkConfig: &alloydbrestapi.InstanceNetworkConfig{
EnablePublicIp: true,
},
DatabaseFlags: map[string]string{
"password.enforce_complexity": "on",
},
}
if displayName != "" {
instance.DisplayName = displayName
}
if instanceType == "READ_POOL" {
instance.ReadPoolConfig = &alloydbrestapi.ReadPoolConfig{
NodeCount: int64(nodeCount),
}
}
service, err := s.getService(ctx, accessToken)
if err != nil {
return nil, err
}
urlString := fmt.Sprintf("projects/%s/locations/%s/clusters/%s", project, location, cluster)
// The Create API returns a long-running operation.
resp, err := service.Projects.Locations.Clusters.Instances.Create(urlString, instance).InstanceId(instanceID).Do()
if err != nil {
return nil, fmt.Errorf("error creating AlloyDB instance: %w", err)
}
return resp, nil
}
func (s *Source) CreateUser(ctx context.Context, userType, password string, roles []string, accessToken, project, location, cluster, userID string) (any, error) {
// Build the request body using the type-safe User struct.
user := &alloydbrestapi.User{
UserType: userType,
}
if userType == "ALLOYDB_BUILT_IN" {
user.Password = password
}
if len(roles) > 0 {
user.DatabaseRoles = roles
}
service, err := s.getService(ctx, accessToken)
if err != nil {
return nil, err
}
urlString := fmt.Sprintf("projects/%s/locations/%s/clusters/%s", project, location, cluster)
// The Create API returns a long-running operation.
resp, err := service.Projects.Locations.Clusters.Users.Create(urlString, user).UserId(userID).Do()
if err != nil {
return nil, fmt.Errorf("error creating AlloyDB user: %w", err)
}
return resp, nil
}
func (s *Source) GetCluster(ctx context.Context, project, location, cluster, accessToken string) (any, error) {
service, err := s.getService(ctx, accessToken)
if err != nil {
return nil, err
}
urlString := fmt.Sprintf("projects/%s/locations/%s/clusters/%s", project, location, cluster)
resp, err := service.Projects.Locations.Clusters.Get(urlString).Do()
if err != nil {
return nil, fmt.Errorf("error getting AlloyDB cluster: %w", err)
}
return resp, nil
}
func (s *Source) GetInstance(ctx context.Context, project, location, cluster, instance, accessToken string) (any, error) {
service, err := s.getService(ctx, accessToken)
if err != nil {
return nil, err
}
urlString := fmt.Sprintf("projects/%s/locations/%s/clusters/%s/instances/%s", project, location, cluster, instance)
resp, err := service.Projects.Locations.Clusters.Instances.Get(urlString).Do()
if err != nil {
return nil, fmt.Errorf("error getting AlloyDB instance: %w", err)
}
return resp, nil
}
func (s *Source) GetUsers(ctx context.Context, project, location, cluster, user, accessToken string) (any, error) {
service, err := s.getService(ctx, accessToken)
if err != nil {
return nil, err
}
urlString := fmt.Sprintf("projects/%s/locations/%s/clusters/%s/users/%s", project, location, cluster, user)
resp, err := service.Projects.Locations.Clusters.Users.Get(urlString).Do()
if err != nil {
return nil, fmt.Errorf("error getting AlloyDB user: %w", err)
}
return resp, nil
}
func (s *Source) ListCluster(ctx context.Context, project, location, accessToken string) (any, error) {
service, err := s.getService(ctx, accessToken)
if err != nil {
return nil, err
}
urlString := fmt.Sprintf("projects/%s/locations/%s", project, location)
resp, err := service.Projects.Locations.Clusters.List(urlString).Do()
if err != nil {
return nil, fmt.Errorf("error listing AlloyDB clusters: %w", err)
}
return resp, nil
}
func (s *Source) ListInstance(ctx context.Context, project, location, cluster, accessToken string) (any, error) {
service, err := s.getService(ctx, accessToken)
if err != nil {
return nil, err
}
urlString := fmt.Sprintf("projects/%s/locations/%s/clusters/%s", project, location, cluster)
resp, err := service.Projects.Locations.Clusters.Instances.List(urlString).Do()
if err != nil {
return nil, fmt.Errorf("error listing AlloyDB instances: %w", err)
}
return resp, nil
}
func (s *Source) ListUsers(ctx context.Context, project, location, cluster, accessToken string) (any, error) {
service, err := s.getService(ctx, accessToken)
if err != nil {
return nil, err
}
urlString := fmt.Sprintf("projects/%s/locations/%s/clusters/%s", project, location, cluster)
resp, err := service.Projects.Locations.Clusters.Users.List(urlString).Do()
if err != nil {
return nil, fmt.Errorf("error listing AlloyDB users: %w", err)
}
return resp, nil
}
func (s *Source) GetOperations(ctx context.Context, project, location, operation, connectionMessageTemplate string, delay time.Duration, accessToken string) (any, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, err
}
service, err := s.getService(ctx, accessToken)
if err != nil {
return nil, err
}
name := fmt.Sprintf("projects/%s/locations/%s/operations/%s", project, location, operation)
op, err := service.Projects.Locations.Operations.Get(name).Do()
if err != nil {
logger.DebugContext(ctx, fmt.Sprintf("error getting operation: %s, retrying in %v\n", err, delay))
} else {
if op.Done {
if op.Error != nil {
var errorBytes []byte
errorBytes, err = json.Marshal(op.Error)
if err != nil {
return nil, fmt.Errorf("operation finished with error but could not marshal error object: %w", err)
}
return nil, fmt.Errorf("operation finished with error: %s", string(errorBytes))
}
var opBytes []byte
opBytes, err = op.MarshalJSON()
if err != nil {
return nil, fmt.Errorf("could not marshal operation: %w", err)
}
if op.Response != nil {
var responseData map[string]any
if err := json.Unmarshal(op.Response, &responseData); err == nil && responseData != nil {
if msg, ok := generateAlloyDBConnectionMessage(responseData, connectionMessageTemplate); ok {
return msg, nil
}
}
}
return string(opBytes), nil
}
logger.DebugContext(ctx, fmt.Sprintf("Operation not complete, retrying in %v\n", delay))
}
return nil, nil
}
func generateAlloyDBConnectionMessage(responseData map[string]any, connectionMessageTemplate string) (string, bool) {
resourceName, ok := responseData["name"].(string)
if !ok {
return "", false
}
parts := strings.Split(resourceName, "/")
var project, region, cluster, instance string
// Expected format: projects/{project}/locations/{location}/clusters/{cluster}
// or projects/{project}/locations/{location}/clusters/{cluster}/instances/{instance}
if len(parts) < 6 || parts[0] != "projects" || parts[2] != "locations" || parts[4] != "clusters" {
return "", false
}
project = parts[1]
region = parts[3]
cluster = parts[5]
if len(parts) >= 8 && parts[6] == "instances" {
instance = parts[7]
} else {
return "", false
}
tmpl, err := template.New("alloydb-connection").Parse(connectionMessageTemplate)
if err != nil {
// This should not happen with a static template
return fmt.Sprintf("template parsing error: %v", err), false
}
data := struct {
Project string
Region string
Cluster string
Instance string
}{
Project: project,
Region: region,
Cluster: cluster,
Instance: instance,
}
var b strings.Builder
if err := tmpl.Execute(&b, data); err != nil {
return fmt.Sprintf("template execution error: %v", err), false
}
return b.String(), true
}
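
For illustration, a minimal, self-contained sketch (not part of this diff) of the data flow in `generateAlloyDBConnectionMessage`: the operation's resource name is split into project/region/cluster/instance and rendered through a caller-supplied template. The template string below is hypothetical; the real one comes from the calling tool's configuration.

```go
package main

import (
	"fmt"
	"strings"
	"text/template"
)

func main() {
	// Example resource name in the format the function expects.
	resourceName := "projects/my-project/locations/us-central1/clusters/my-cluster/instances/my-instance"
	parts := strings.Split(resourceName, "/")

	data := struct{ Project, Region, Cluster, Instance string }{
		Project: parts[1], Region: parts[3], Cluster: parts[5], Instance: parts[7],
	}

	// Hypothetical connection-message template supplied by the caller.
	const tmplStr = "Instance {{.Instance}} in cluster {{.Cluster}} ({{.Project}}, {{.Region}}) is ready."

	tmpl := template.Must(template.New("alloydb-connection").Parse(tmplStr))
	var b strings.Builder
	_ = tmpl.Execute(&b, data)
	fmt.Println(b.String())
}
```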

View File

@@ -24,6 +24,7 @@ import (
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/googleapis/genai-toolbox/internal/util/orderedmap"
"github.com/jackc/pgx/v5/pgxpool"
"go.opentelemetry.io/otel/trace"
)
@@ -101,6 +102,33 @@ func (s *Source) PostgresPool() *pgxpool.Pool {
return s.Pool
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
results, err := s.Pool.Query(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
fields := results.FieldDescriptions()
var out []any
for results.Next() {
v, err := results.Values()
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
row := orderedmap.Row{}
for i, f := range fields {
row.Add(f.Name, v[i])
}
out = append(out, row)
}
// this will catch actual query execution errors
if err := results.Err(); err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
return out, nil
}
func getOpts(ipType, userAgent string, useIAM bool) ([]alloydbconn.Option, error) {
opts := []alloydbconn.Option{alloydbconn.WithUserAgent(userAgent)}
switch strings.ToLower(ipType) {

View File

@@ -17,7 +17,9 @@ package bigquery
import (
"context"
"fmt"
"math/big"
"net/http"
"reflect"
"strings"
"sync"
"time"
@@ -26,18 +28,24 @@ import (
dataplexapi "cloud.google.com/go/dataplex/apiv1"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/googleapis/genai-toolbox/internal/util/orderedmap"
"go.opentelemetry.io/otel/trace"
"golang.org/x/oauth2"
"golang.org/x/oauth2/google"
bigqueryrestapi "google.golang.org/api/bigquery/v2"
"google.golang.org/api/googleapi"
"google.golang.org/api/impersonate"
"google.golang.org/api/iterator"
"google.golang.org/api/option"
)
const SourceKind string = "bigquery"
// CloudPlatformScope is a broad scope for Google Cloud Platform services.
const CloudPlatformScope = "https://www.googleapis.com/auth/cloud-platform"
const (
// No write operations are allowed.
WriteModeBlocked string = "blocked"
@@ -72,14 +80,42 @@ func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (sources
type Config struct {
// BigQuery configs
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Project string `yaml:"project" validate:"required"`
Location string `yaml:"location"`
WriteMode string `yaml:"writeMode"`
AllowedDatasets []string `yaml:"allowedDatasets"`
UseClientOAuth bool `yaml:"useClientOAuth"`
ImpersonateServiceAccount string `yaml:"impersonateServiceAccount"`
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Project string `yaml:"project" validate:"required"`
Location string `yaml:"location"`
WriteMode string `yaml:"writeMode"`
AllowedDatasets StringOrStringSlice `yaml:"allowedDatasets"`
UseClientOAuth bool `yaml:"useClientOAuth"`
ImpersonateServiceAccount string `yaml:"impersonateServiceAccount"`
Scopes StringOrStringSlice `yaml:"scopes"`
}
// StringOrStringSlice is a custom type that can unmarshal both a single string
// (which it splits by comma) and a sequence of strings into a string slice.
type StringOrStringSlice []string
// UnmarshalYAML implements the yaml.Unmarshaler interface.
func (s *StringOrStringSlice) UnmarshalYAML(unmarshal func(any) error) error {
var v any
if err := unmarshal(&v); err != nil {
return err
}
switch val := v.(type) {
case string:
*s = strings.Split(val, ",")
return nil
case []any:
for _, item := range val {
if str, ok := item.(string); ok {
*s = append(*s, str)
} else {
return fmt.Errorf("element in sequence is not a string: %v", item)
}
}
return nil
}
return fmt.Errorf("cannot unmarshal %T into StringOrStringSlice", v)
}
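
For illustration, a short sketch (not part of this diff) of how the new `StringOrStringSlice` type accepts either a comma-separated string or a YAML sequence. It assumes it sits alongside the type in the `bigquery` package and uses goccy/go-yaml, which the source already imports; the function name is only for the example.

```go
package bigquery

import (
	"fmt"

	"github.com/goccy/go-yaml"
)

func demoScopeDecoding() {
	type cfg struct {
		Scopes StringOrStringSlice `yaml:"scopes"`
	}

	// Comma-separated string form (e.g. the value of a single env var).
	var fromString cfg
	_ = yaml.Unmarshal([]byte(`scopes: "https://www.googleapis.com/auth/bigquery,https://www.googleapis.com/auth/drive"`), &fromString)

	// YAML sequence form.
	var fromList cfg
	_ = yaml.Unmarshal([]byte("scopes:\n  - https://www.googleapis.com/auth/bigquery\n  - https://www.googleapis.com/auth/drive"), &fromList)

	fmt.Println(len(fromString.Scopes), len(fromList.Scopes)) // both decode to two scopes
}
```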
func (r Config) SourceConfigKind() string {
@@ -128,7 +164,7 @@ func (r Config) Initialize(ctx context.Context, tracer trace.Tracer) (sources.So
} else {
// Initializes a BigQuery Google SQL source
client, restService, tokenSource, err = initBigQueryConnection(ctx, tracer, r.Name, r.Project, r.Location, r.ImpersonateServiceAccount)
client, restService, tokenSource, err = initBigQueryConnection(ctx, tracer, r.Name, r.Project, r.Location, r.ImpersonateServiceAccount, r.Scopes)
if err != nil {
return nil, fmt.Errorf("error creating client from ADC: %w", err)
}
@@ -391,19 +427,26 @@ func (s *Source) BigQueryTokenSource() oauth2.TokenSource {
return s.TokenSource
}
func (s *Source) BigQueryTokenSourceWithScope(ctx context.Context, scope string) (oauth2.TokenSource, error) {
func (s *Source) BigQueryTokenSourceWithScope(ctx context.Context, scopes []string) (oauth2.TokenSource, error) {
if len(scopes) == 0 {
scopes = s.Scopes
if len(scopes) == 0 {
scopes = []string{CloudPlatformScope}
}
}
if s.ImpersonateServiceAccount != "" {
// Create impersonated credentials token source with the requested scope
// Create impersonated credentials token source with the requested scopes
ts, err := impersonate.CredentialsTokenSource(ctx, impersonate.CredentialsConfig{
TargetPrincipal: s.ImpersonateServiceAccount,
Scopes: []string{scope},
Scopes: scopes,
})
if err != nil {
return nil, fmt.Errorf("failed to create impersonated credentials for %q with scope %q: %w", s.ImpersonateServiceAccount, scope, err)
return nil, fmt.Errorf("failed to create impersonated credentials for %q with scopes %v: %w", s.ImpersonateServiceAccount, scopes, err)
}
return ts, nil
}
return google.DefaultTokenSource(ctx, scope)
return google.DefaultTokenSource(ctx, scopes...)
}
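
For illustration, a hypothetical helper (not in this diff) that condenses the scope-resolution order implemented by `BigQueryTokenSourceWithScope` above: explicitly requested scopes win, then the source-level `scopes` config, then the cloud-platform default.

```go
package main

import "fmt"

const cloudPlatformScope = "https://www.googleapis.com/auth/cloud-platform"

// resolveScopes is a hypothetical summary of the fallback order above.
func resolveScopes(requested, configured []string) []string {
	if len(requested) > 0 {
		return requested
	}
	if len(configured) > 0 {
		return configured
	}
	return []string{cloudPlatformScope}
}

func main() {
	fmt.Println(resolveScopes(nil, nil))                                 // cloud-platform default
	fmt.Println(resolveScopes(nil, []string{"scope-a"}))                 // source-level config
	fmt.Println(resolveScopes([]string{"scope-b"}, []string{"scope-a"})) // explicit request wins
}
```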
func (s *Source) GetMaxQueryResultRows() int {
@@ -449,7 +492,7 @@ func (s *Source) lazyInitDataplexClient(ctx context.Context, tracer trace.Tracer
return func() (*dataplexapi.CatalogClient, DataplexClientCreator, error) {
once.Do(func() {
c, cc, e := initDataplexConnection(ctx, tracer, s.Name, s.Project, s.UseClientOAuth, s.ImpersonateServiceAccount)
c, cc, e := initDataplexConnection(ctx, tracer, s.Name, s.Project, s.UseClientOAuth, s.ImpersonateServiceAccount, s.Scopes)
if e != nil {
err = fmt.Errorf("failed to initialize dataplex client: %w", e)
return
@@ -483,6 +526,131 @@ func (s *Source) lazyInitDataplexClient(ctx context.Context, tracer trace.Tracer
}
}
func (s *Source) RetrieveClientAndService(accessToken tools.AccessToken) (*bigqueryapi.Client, *bigqueryrestapi.Service, error) {
bqClient := s.BigQueryClient()
restService := s.BigQueryRestService()
// Initialize new client if using user OAuth token
if s.UseClientAuthorization() {
tokenStr, err := accessToken.ParseBearerToken()
if err != nil {
return nil, nil, fmt.Errorf("error parsing access token: %w", err)
}
bqClient, restService, err = s.BigQueryClientCreator()(tokenStr, true)
if err != nil {
return nil, nil, fmt.Errorf("error creating client from OAuth access token: %w", err)
}
}
return bqClient, restService, nil
}
func (s *Source) RunSQL(ctx context.Context, bqClient *bigqueryapi.Client, statement, statementType string, params []bigqueryapi.QueryParameter, connProps []*bigqueryapi.ConnectionProperty) (any, error) {
query := bqClient.Query(statement)
query.Location = bqClient.Location
if params != nil {
query.Parameters = params
}
if connProps != nil {
query.ConnectionProperties = connProps
}
// This block handles SELECT statements, which return a row set.
// We iterate through the results, convert each row into a map of
// column names to values, and return the collection of rows.
job, err := query.Run(ctx)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
it, err := job.Read(ctx)
if err != nil {
return nil, fmt.Errorf("unable to read query results: %w", err)
}
var out []any
for {
var val []bigqueryapi.Value
err = it.Next(&val)
if err == iterator.Done {
break
}
if err != nil {
return nil, fmt.Errorf("unable to iterate through query results: %w", err)
}
schema := it.Schema
row := orderedmap.Row{}
for i, field := range schema {
row.Add(field.Name, NormalizeValue(val[i]))
}
out = append(out, row)
}
// If the query returned any rows, return them directly.
if len(out) > 0 {
return out, nil
}
// This handles the standard case for a SELECT query that successfully
// executes but returns zero rows.
if statementType == "SELECT" {
return "The query returned 0 rows.", nil
}
// This is the fallback for a successful query that doesn't return content.
// In most cases, this will be for DML/DDL statements like INSERT, UPDATE, CREATE, etc.
// However, it is also possible that this was a query that was expected to return rows
// but returned none, a case that we cannot distinguish here.
return "Query executed successfully and returned no content.", nil
}
// NormalizeValue converts BigQuery specific types to standard JSON-compatible types.
// Specifically, it handles *big.Rat (used for NUMERIC/BIGNUMERIC) by converting
// them to decimal strings with up to 38 digits of precision, trimming trailing zeros.
// It recursively handles slices (arrays) and maps (structs) using reflection.
func NormalizeValue(v any) any {
if v == nil {
return nil
}
// Handle *big.Rat specifically.
if rat, ok := v.(*big.Rat); ok {
// Convert big.Rat to a decimal string.
// Use a precision of 38 digits (enough for BIGNUMERIC and NUMERIC)
// and trim trailing zeros to match BigQuery's behavior.
s := rat.FloatString(38)
if strings.Contains(s, ".") {
s = strings.TrimRight(s, "0")
s = strings.TrimRight(s, ".")
}
return s
}
// Use reflection for slices and maps to handle various underlying types.
rv := reflect.ValueOf(v)
switch rv.Kind() {
case reflect.Slice, reflect.Array:
// Preserve []byte as is, so json.Marshal encodes it as Base64 string (BigQuery BYTES behavior).
if rv.Type().Elem().Kind() == reflect.Uint8 {
return v
}
newSlice := make([]any, rv.Len())
for i := 0; i < rv.Len(); i++ {
newSlice[i] = NormalizeValue(rv.Index(i).Interface())
}
return newSlice
case reflect.Map:
// Ensure keys are strings to produce a JSON-compatible map.
if rv.Type().Key().Kind() != reflect.String {
return v
}
newMap := make(map[string]any, rv.Len())
iter := rv.MapRange()
for iter.Next() {
newMap[iter.Key().String()] = NormalizeValue(iter.Value().Interface())
}
return newMap
}
return v
}
func initBigQueryConnection(
ctx context.Context,
tracer trace.Tracer,
@@ -490,6 +658,7 @@ func initBigQueryConnection(
project string,
location string,
impersonateServiceAccount string,
scopes []string,
) (*bigqueryapi.Client, *bigqueryrestapi.Service, oauth2.TokenSource, error) {
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)
defer span.End()
@@ -502,12 +671,21 @@ func initBigQueryConnection(
var tokenSource oauth2.TokenSource
var opts []option.ClientOption
var credScopes []string
if len(scopes) > 0 {
credScopes = scopes
} else if impersonateServiceAccount != "" {
credScopes = []string{CloudPlatformScope}
} else {
credScopes = []string{bigqueryapi.Scope}
}
if impersonateServiceAccount != "" {
// Create impersonated credentials token source with cloud-platform scope
// Create impersonated credentials token source
// This broader scope is needed for tools like conversational analytics
cloudPlatformTokenSource, err := impersonate.CredentialsTokenSource(ctx, impersonate.CredentialsConfig{
TargetPrincipal: impersonateServiceAccount,
Scopes: []string{"https://www.googleapis.com/auth/cloud-platform"},
Scopes: credScopes,
})
if err != nil {
return nil, nil, nil, fmt.Errorf("failed to create impersonated credentials for %q: %w", impersonateServiceAccount, err)
@@ -519,9 +697,9 @@ func initBigQueryConnection(
}
} else {
// Use default credentials
cred, err := google.FindDefaultCredentials(ctx, bigqueryapi.Scope)
cred, err := google.FindDefaultCredentials(ctx, credScopes...)
if err != nil {
return nil, nil, nil, fmt.Errorf("failed to find default Google Cloud credentials with scope %q: %w", bigqueryapi.Scope, err)
return nil, nil, nil, fmt.Errorf("failed to find default Google Cloud credentials with scopes %v: %w", credScopes, err)
}
tokenSource = cred.TokenSource
opts = []option.ClientOption{
@@ -612,6 +790,7 @@ func initDataplexConnection(
project string,
useClientOAuth bool,
impersonateServiceAccount string,
scopes []string,
) (*dataplexapi.CatalogClient, DataplexClientCreator, error) {
var client *dataplexapi.CatalogClient
var clientCreator DataplexClientCreator
@@ -630,11 +809,16 @@ func initDataplexConnection(
} else {
var opts []option.ClientOption
credScopes := scopes
if len(credScopes) == 0 {
credScopes = []string{CloudPlatformScope}
}
if impersonateServiceAccount != "" {
// Create impersonated credentials token source
ts, err := impersonate.CredentialsTokenSource(ctx, impersonate.CredentialsConfig{
TargetPrincipal: impersonateServiceAccount,
Scopes: []string{"https://www.googleapis.com/auth/cloud-platform"},
Scopes: credScopes,
})
if err != nil {
return nil, nil, fmt.Errorf("failed to create impersonated credentials for %q: %w", impersonateServiceAccount, err)
@@ -645,7 +829,7 @@ func initDataplexConnection(
}
} else {
// Use default credentials
cred, err := google.FindDefaultCredentials(ctx)
cred, err := google.FindDefaultCredentials(ctx, credScopes...)
if err != nil {
return nil, nil, fmt.Errorf("failed to find default Google Cloud credentials: %w", err)
}

View File

@@ -15,6 +15,8 @@
package bigquery_test
import (
"math/big"
"reflect"
"testing"
yaml "github.com/goccy/go-yaml"
@@ -130,6 +132,28 @@ func TestParseFromYamlBigQuery(t *testing.T) {
},
},
},
{
desc: "with custom scopes example",
in: `
sources:
my-instance:
kind: bigquery
project: my-project
location: us
scopes:
- https://www.googleapis.com/auth/bigquery
- https://www.googleapis.com/auth/cloud-platform
`,
want: server.SourceConfigs{
"my-instance": bigquery.Config{
Name: "my-instance",
Kind: bigquery.SourceKind,
Project: "my-project",
Location: "us",
Scopes: []string{"https://www.googleapis.com/auth/bigquery", "https://www.googleapis.com/auth/cloud-platform"},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
@@ -195,3 +219,105 @@ func TestFailParseFromYaml(t *testing.T) {
})
}
}
func TestNormalizeValue(t *testing.T) {
tests := []struct {
name string
input any
expected any
}{
{
name: "big.Rat 1/3 (NUMERIC scale 9)",
input: new(big.Rat).SetFrac64(1, 3), // 0.33333333333...
expected: "0.33333333333333333333333333333333333333", // FloatString(38)
},
{
name: "big.Rat 19/2 (9.5)",
input: new(big.Rat).SetFrac64(19, 2),
expected: "9.5",
},
{
name: "big.Rat 12341/10 (1234.1)",
input: new(big.Rat).SetFrac64(12341, 10),
expected: "1234.1",
},
{
name: "big.Rat 10/1 (10)",
input: new(big.Rat).SetFrac64(10, 1),
expected: "10",
},
{
name: "string",
input: "hello",
expected: "hello",
},
{
name: "int",
input: 123,
expected: 123,
},
{
name: "nested slice of big.Rat",
input: []any{
new(big.Rat).SetFrac64(19, 2),
new(big.Rat).SetFrac64(1, 4),
},
expected: []any{"9.5", "0.25"},
},
{
name: "nested map of big.Rat",
input: map[string]any{
"val1": new(big.Rat).SetFrac64(19, 2),
"val2": new(big.Rat).SetFrac64(1, 2),
},
expected: map[string]any{
"val1": "9.5",
"val2": "0.5",
},
},
{
name: "complex nested structure",
input: map[string]any{
"list": []any{
map[string]any{
"rat": new(big.Rat).SetFrac64(3, 2),
},
},
},
expected: map[string]any{
"list": []any{
map[string]any{
"rat": "1.5",
},
},
},
},
{
name: "slice of *big.Rat",
input: []*big.Rat{
new(big.Rat).SetFrac64(19, 2),
new(big.Rat).SetFrac64(1, 4),
},
expected: []any{"9.5", "0.25"},
},
{
name: "slice of strings",
input: []string{"a", "b"},
expected: []any{"a", "b"},
},
{
name: "byte slice (BYTES)",
input: []byte("hello"),
expected: []byte("hello"),
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got := bigquery.NormalizeValue(tt.input)
if !reflect.DeepEqual(got, tt.expected) {
t.Errorf("NormalizeValue() = %v, want %v", got, tt.expected)
}
})
}
}

View File

@@ -22,6 +22,7 @@ import (
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/googleapis/genai-toolbox/internal/util/parameters"
"go.opentelemetry.io/otel/trace"
"google.golang.org/api/option"
)
@@ -88,6 +89,94 @@ func (s *Source) BigtableClient() *bigtable.Client {
return s.Client
}
func getBigtableType(paramType string) (bigtable.SQLType, error) {
switch paramType {
case "boolean":
return bigtable.BoolSQLType{}, nil
case "string":
return bigtable.StringSQLType{}, nil
case "integer":
return bigtable.Int64SQLType{}, nil
case "float":
return bigtable.Float64SQLType{}, nil
case "array":
return bigtable.ArraySQLType{}, nil
default:
return nil, fmt.Errorf("unknow param type %s", paramType)
}
}
func getMapParamsType(tparams parameters.Parameters) (map[string]bigtable.SQLType, error) {
btParamTypes := make(map[string]bigtable.SQLType)
for _, p := range tparams {
if p.GetType() == "array" {
itemType, err := getBigtableType(p.Manifest().Items.Type)
if err != nil {
return nil, err
}
btParamTypes[p.GetName()] = bigtable.ArraySQLType{
ElemType: itemType,
}
continue
}
paramType, err := getBigtableType(p.GetType())
if err != nil {
return nil, err
}
btParamTypes[p.GetName()] = paramType
}
return btParamTypes, nil
}
func (s *Source) RunSQL(ctx context.Context, statement string, configParam parameters.Parameters, params parameters.ParamValues) (any, error) {
mapParamsType, err := getMapParamsType(configParam)
if err != nil {
return nil, fmt.Errorf("fail to get map params: %w", err)
}
ps, err := s.BigtableClient().PrepareStatement(
ctx,
statement,
mapParamsType,
)
if err != nil {
return nil, fmt.Errorf("unable to prepare statement: %w", err)
}
bs, err := ps.Bind(params.AsMap())
if err != nil {
return nil, fmt.Errorf("unable to bind: %w", err)
}
var out []any
var rowErr error
err = bs.Execute(ctx, func(resultRow bigtable.ResultRow) bool {
vMap := make(map[string]any)
cols := resultRow.Metadata.Columns
for _, c := range cols {
var columnValue any
if err = resultRow.GetByName(c.Name, &columnValue); err != nil {
rowErr = err
return false
}
vMap[c.Name] = columnValue
}
out = append(out, vMap)
return true
})
if err != nil {
return nil, fmt.Errorf("unable to execute client: %w", err)
}
if rowErr != nil {
return nil, fmt.Errorf("error processing row: %w", rowErr)
}
return out, nil
}
func initBigtableClient(ctx context.Context, tracer trace.Tracer, name, project, instance string) (*bigtable.Client, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)

View File

@@ -21,6 +21,7 @@ import (
gocql "github.com/apache/cassandra-gocql-driver/v2"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util/parameters"
"go.opentelemetry.io/otel/trace"
)
@@ -89,10 +90,32 @@ func (s *Source) ToConfig() sources.SourceConfig {
}
// SourceKind implements sources.Source.
func (s Source) SourceKind() string {
func (s *Source) SourceKind() string {
return SourceKind
}
func (s *Source) RunSQL(ctx context.Context, statement string, params parameters.ParamValues) (any, error) {
sliceParams := params.AsSlice()
iter := s.CassandraSession().Query(statement, sliceParams...).IterContext(ctx)
// Create a slice to store the out
var out []map[string]interface{}
// Scan results into a map and append to the slice
for {
row := make(map[string]interface{}) // Create a new map for each row
if !iter.MapScan(row) {
break // No more rows
}
out = append(out, row)
}
if err := iter.Close(); err != nil {
return nil, fmt.Errorf("unable to parse rows: %w", err)
}
return out, nil
}
var _ sources.Source = &Source{}
func initCassandraSession(ctx context.Context, tracer trace.Tracer, c Config) (*gocql.Session, error) {

View File

@@ -24,6 +24,7 @@ import (
_ "github.com/ClickHouse/clickhouse-go/v2"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util/parameters"
"go.opentelemetry.io/otel/trace"
)
@@ -99,6 +100,69 @@ func (s *Source) ClickHousePool() *sql.DB {
return s.Pool
}
func (s *Source) RunSQL(ctx context.Context, statement string, params parameters.ParamValues) (any, error) {
var sliceParams []any
if params != nil {
sliceParams = params.AsSlice()
}
results, err := s.ClickHousePool().QueryContext(ctx, statement, sliceParams...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
cols, err := results.Columns()
if err != nil {
return nil, fmt.Errorf("unable to retrieve rows column name: %w", err)
}
// create an array of values for each column, which can be re-used to scan each row
rawValues := make([]any, len(cols))
values := make([]any, len(cols))
for i := range rawValues {
values[i] = &rawValues[i]
}
colTypes, err := results.ColumnTypes()
if err != nil {
return nil, fmt.Errorf("unable to get column types: %w", err)
}
var out []any
for results.Next() {
err := results.Scan(values...)
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
vMap := make(map[string]any)
for i, name := range cols {
// ClickHouse driver may return specific types that need handling
switch colTypes[i].DatabaseTypeName() {
case "String", "FixedString":
if rawValues[i] != nil {
// Handle potential []byte to string conversion if needed
if b, ok := rawValues[i].([]byte); ok {
vMap[name] = string(b)
} else {
vMap[name] = rawValues[i]
}
} else {
vMap[name] = nil
}
default:
vMap[name] = rawValues[i]
}
}
out = append(out, vMap)
}
if err := results.Err(); err != nil {
return nil, fmt.Errorf("errors encountered by results.Scan: %w", err)
}
return out, nil
}
func validateConfig(protocol string) error {
validProtocols := map[string]bool{"http": true, "https": true}

View File

@@ -14,8 +14,11 @@
package cloudgda
import (
"bytes"
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"github.com/goccy/go-yaml"
@@ -131,3 +134,43 @@ func (s *Source) GetClient(ctx context.Context, accessToken string) (*http.Clien
func (s *Source) UseClientAuthorization() bool {
return s.UseClientOAuth
}
func (s *Source) RunQuery(ctx context.Context, tokenStr string, bodyBytes []byte) (any, error) {
// The API endpoint itself always uses the "global" location.
apiLocation := "global"
apiParent := fmt.Sprintf("projects/%s/locations/%s", s.GetProjectID(), apiLocation)
apiURL := fmt.Sprintf("%s/v1beta/%s:queryData", s.GetBaseURL(), apiParent)
client, err := s.GetClient(ctx, tokenStr)
if err != nil {
return nil, fmt.Errorf("failed to get HTTP client: %w", err)
}
req, err := http.NewRequestWithContext(ctx, http.MethodPost, apiURL, bytes.NewBuffer(bodyBytes))
if err != nil {
return nil, fmt.Errorf("failed to create request: %w", err)
}
req.Header.Set("Content-Type", "application/json")
resp, err := client.Do(req)
if err != nil {
return nil, fmt.Errorf("failed to execute request: %w", err)
}
defer resp.Body.Close()
respBody, err := io.ReadAll(resp.Body)
if err != nil {
return nil, fmt.Errorf("failed to read response body: %w", err)
}
if resp.StatusCode != http.StatusOK {
return nil, fmt.Errorf("API request failed with status %d: %s", resp.StatusCode, string(respBody))
}
var result map[string]any
if err := json.Unmarshal(respBody, &result); err != nil {
return nil, fmt.Errorf("failed to unmarshal response: %w", err)
}
return result, nil
}

View File

@@ -15,7 +15,9 @@ package cloudmonitoring
import (
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"github.com/goccy/go-yaml"
@@ -131,3 +133,44 @@ func (s *Source) GetClient(ctx context.Context, accessToken string) (*http.Clien
func (s *Source) UseClientAuthorization() bool {
return s.UseClientOAuth
}
func (s *Source) RunQuery(projectID, query string) (any, error) {
url := fmt.Sprintf("%s/v1/projects/%s/location/global/prometheus/api/v1/query", s.BaseURL(), projectID)
req, err := http.NewRequest(http.MethodGet, url, nil)
if err != nil {
return nil, err
}
q := req.URL.Query()
q.Add("query", query)
req.URL.RawQuery = q.Encode()
req.Header.Set("User-Agent", s.UserAgent())
resp, err := s.Client().Do(req)
if err != nil {
return nil, err
}
defer resp.Body.Close()
body, err := io.ReadAll(resp.Body)
if err != nil {
return nil, fmt.Errorf("failed to read response body: %w", err)
}
if resp.StatusCode != http.StatusOK {
return nil, fmt.Errorf("request failed: %s, body: %s", resp.Status, string(body))
}
if len(body) == 0 {
return nil, nil
}
var result map[string]any
if err := json.Unmarshal(body, &result); err != nil {
return nil, fmt.Errorf("failed to unmarshal json: %w, body: %s", err, string(body))
}
return result, nil
}

View File

@@ -15,10 +15,16 @@ package cloudsqladmin
import (
"context"
"encoding/json"
"fmt"
"net/http"
"regexp"
"strings"
"text/template"
"time"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/log"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util"
"go.opentelemetry.io/otel/trace"
@@ -30,6 +36,8 @@ import (
const SourceKind string = "cloud-sql-admin"
var targetLinkRegex = regexp.MustCompile(`/projects/([^/]+)/instances/([^/]+)/databases/([^/]+)`)
// validate interface
var _ sources.SourceConfig = Config{}
@@ -130,3 +138,304 @@ func (s *Source) GetService(ctx context.Context, accessToken string) (*sqladmin.
func (s *Source) UseClientAuthorization() bool {
return s.UseClientOAuth
}
func (s *Source) CloneInstance(ctx context.Context, project, sourceInstanceName, destinationInstanceName, pointInTime, preferredZone, preferredSecondaryZone, accessToken string) (any, error) {
cloneContext := &sqladmin.CloneContext{
DestinationInstanceName: destinationInstanceName,
}
if pointInTime != "" {
cloneContext.PointInTime = pointInTime
}
if preferredZone != "" {
cloneContext.PreferredZone = preferredZone
}
if preferredSecondaryZone != "" {
cloneContext.PreferredSecondaryZone = preferredSecondaryZone
}
rb := &sqladmin.InstancesCloneRequest{
CloneContext: cloneContext,
}
service, err := s.GetService(ctx, accessToken)
if err != nil {
return nil, err
}
resp, err := service.Instances.Clone(project, sourceInstanceName, rb).Do()
if err != nil {
return nil, fmt.Errorf("error cloning instance: %w", err)
}
return resp, nil
}
func (s *Source) CreateDatabase(ctx context.Context, name, project, instance, accessToken string) (any, error) {
database := sqladmin.Database{
Name: name,
Project: project,
Instance: instance,
}
service, err := s.GetService(ctx, accessToken)
if err != nil {
return nil, err
}
resp, err := service.Databases.Insert(project, instance, &database).Do()
if err != nil {
return nil, fmt.Errorf("error creating database: %w", err)
}
return resp, nil
}
func (s *Source) CreateUsers(ctx context.Context, project, instance, name, password string, iamUser bool, accessToken string) (any, error) {
service, err := s.GetService(ctx, accessToken)
if err != nil {
return nil, err
}
user := sqladmin.User{
Name: name,
}
if iamUser {
user.Type = "CLOUD_IAM_USER"
} else {
user.Type = "BUILT_IN"
if password == "" {
return nil, fmt.Errorf("missing 'password' parameter for non-IAM user")
}
user.Password = password
}
resp, err := service.Users.Insert(project, instance, &user).Do()
if err != nil {
return nil, fmt.Errorf("error creating user: %w", err)
}
return resp, nil
}
func (s *Source) GetInstance(ctx context.Context, projectId, instanceId, accessToken string) (any, error) {
service, err := s.GetService(ctx, accessToken)
if err != nil {
return nil, err
}
resp, err := service.Instances.Get(projectId, instanceId).Do()
if err != nil {
return nil, fmt.Errorf("error getting instance: %w", err)
}
return resp, nil
}
func (s *Source) ListDatabase(ctx context.Context, project, instance, accessToken string) (any, error) {
service, err := s.GetService(ctx, accessToken)
if err != nil {
return nil, err
}
resp, err := service.Databases.List(project, instance).Do()
if err != nil {
return nil, fmt.Errorf("error listing databases: %w", err)
}
if resp.Items == nil {
return []any{}, nil
}
type databaseInfo struct {
Name string `json:"name"`
Charset string `json:"charset"`
Collation string `json:"collation"`
}
var databases []databaseInfo
for _, item := range resp.Items {
databases = append(databases, databaseInfo{
Name: item.Name,
Charset: item.Charset,
Collation: item.Collation,
})
}
return databases, nil
}
func (s *Source) ListInstance(ctx context.Context, project, accessToken string) (any, error) {
service, err := s.GetService(ctx, accessToken)
if err != nil {
return nil, err
}
resp, err := service.Instances.List(project).Do()
if err != nil {
return nil, fmt.Errorf("error listing instances: %w", err)
}
if resp.Items == nil {
return []any{}, nil
}
type instanceInfo struct {
Name string `json:"name"`
InstanceType string `json:"instanceType"`
}
var instances []instanceInfo
for _, item := range resp.Items {
instances = append(instances, instanceInfo{
Name: item.Name,
InstanceType: item.InstanceType,
})
}
return instances, nil
}
func (s *Source) CreateInstance(ctx context.Context, project, name, dbVersion, rootPassword string, settings sqladmin.Settings, accessToken string) (any, error) {
instance := sqladmin.DatabaseInstance{
Name: name,
DatabaseVersion: dbVersion,
RootPassword: rootPassword,
Settings: &settings,
Project: project,
}
service, err := s.GetService(ctx, accessToken)
if err != nil {
return nil, err
}
resp, err := service.Instances.Insert(project, &instance).Do()
if err != nil {
return nil, fmt.Errorf("error creating instance: %w", err)
}
return resp, nil
}
func (s *Source) GetWaitForOperations(ctx context.Context, service *sqladmin.Service, project, operation, connectionMessageTemplate string, delay time.Duration) (any, error) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
return nil, err
}
op, err := service.Operations.Get(project, operation).Do()
if err != nil {
logger.DebugContext(ctx, fmt.Sprintf("error getting operation: %s, retrying in %v", err, delay))
} else {
if op.Status == "DONE" {
if op.Error != nil {
var errorBytes []byte
errorBytes, err = json.Marshal(op.Error)
if err != nil {
return nil, fmt.Errorf("operation finished with error but could not marshal error object: %w", err)
}
return nil, fmt.Errorf("operation finished with error: %s", string(errorBytes))
}
var opBytes []byte
opBytes, err = op.MarshalJSON()
if err != nil {
return nil, fmt.Errorf("could not marshal operation: %w", err)
}
var data map[string]any
if err := json.Unmarshal(opBytes, &data); err != nil {
return nil, fmt.Errorf("could not unmarshal operation: %w", err)
}
if msg, ok := generateCloudSQLConnectionMessage(ctx, s, logger, data, connectionMessageTemplate); ok {
return msg, nil
}
return string(opBytes), nil
}
logger.DebugContext(ctx, fmt.Sprintf("operation not complete, retrying in %v", delay))
}
return nil, nil
}
func generateCloudSQLConnectionMessage(ctx context.Context, source *Source, logger log.Logger, opResponse map[string]any, connectionMessageTemplate string) (string, bool) {
operationType, ok := opResponse["operationType"].(string)
if !ok || operationType != "CREATE_DATABASE" {
return "", false
}
targetLink, ok := opResponse["targetLink"].(string)
if !ok {
return "", false
}
matches := targetLinkRegex.FindStringSubmatch(targetLink)
if len(matches) < 4 {
return "", false
}
project := matches[1]
instance := matches[2]
database := matches[3]
dbInstance, err := fetchInstanceData(ctx, source, project, instance)
if err != nil {
logger.DebugContext(ctx, fmt.Sprintf("error fetching instance data: %v", err))
return "", false
}
region := dbInstance.Region
if region == "" {
return "", false
}
databaseVersion := dbInstance.DatabaseVersion
if databaseVersion == "" {
return "", false
}
var dbType string
if strings.Contains(databaseVersion, "POSTGRES") {
dbType = "postgres"
} else if strings.Contains(databaseVersion, "MYSQL") {
dbType = "mysql"
} else if strings.Contains(databaseVersion, "SQLSERVER") {
dbType = "mssql"
} else {
return "", false
}
tmpl, err := template.New("cloud-sql-connection").Parse(connectionMessageTemplate)
if err != nil {
return fmt.Sprintf("template parsing error: %v", err), false
}
data := struct {
Project string
Region string
Instance string
DBType string
DBTypeUpper string
Database string
}{
Project: project,
Region: region,
Instance: instance,
DBType: dbType,
DBTypeUpper: strings.ToUpper(dbType),
Database: database,
}
var b strings.Builder
if err := tmpl.Execute(&b, data); err != nil {
return fmt.Sprintf("template execution error: %v", err), false
}
return b.String(), true
}
func fetchInstanceData(ctx context.Context, source *Source, project, instance string) (*sqladmin.DatabaseInstance, error) {
service, err := source.GetService(ctx, "")
if err != nil {
return nil, err
}
resp, err := service.Instances.Get(project, instance).Do()
if err != nil {
return nil, fmt.Errorf("error getting instance: %w", err)
}
return resp, nil
}

View File

@@ -25,6 +25,7 @@ import (
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/googleapis/genai-toolbox/internal/util/orderedmap"
"go.opentelemetry.io/otel/trace"
)
@@ -107,6 +108,48 @@ func (s *Source) MSSQLDB() *sql.DB {
return s.Db
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
results, err := s.MSSQLDB().QueryContext(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
cols, err := results.Columns()
// If Columns() errors, it might be a DDL/DML without an OUTPUT clause.
// We proceed, and results.Err() will catch actual query execution errors.
// 'out' will remain nil if cols is empty or err is not nil here.
var out []any
if err == nil && len(cols) > 0 {
// create an array of values for each column, which can be re-used to scan each row
rawValues := make([]any, len(cols))
values := make([]any, len(cols))
for i := range rawValues {
values[i] = &rawValues[i]
}
for results.Next() {
scanErr := results.Scan(values...)
if scanErr != nil {
return nil, fmt.Errorf("unable to parse row: %w", scanErr)
}
row := orderedmap.Row{}
for i, name := range cols {
row.Add(name, rawValues[i])
}
out = append(out, row)
}
}
// Check for errors from iterating over rows or from the query execution itself.
// results.Close() is handled by defer.
if err := results.Err(); err != nil {
return nil, fmt.Errorf("errors encountered during query execution or row processing: %w", err)
}
return out, nil
}
func initCloudSQLMssqlConnection(ctx context.Context, tracer trace.Tracer, name, project, region, instance, ipType, user, pass, dbname string) (*sql.DB, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)

View File

@@ -24,7 +24,9 @@ import (
"cloud.google.com/go/cloudsqlconn/mysql/mysql"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlcommon"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/googleapis/genai-toolbox/internal/util/orderedmap"
"go.opentelemetry.io/otel/trace"
)
@@ -100,6 +102,60 @@ func (s *Source) MySQLPool() *sql.DB {
return s.Pool
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
results, err := s.MySQLPool().QueryContext(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
cols, err := results.Columns()
if err != nil {
return nil, fmt.Errorf("unable to retrieve rows column name: %w", err)
}
// create an array of values for each column, which can be re-used to scan each row
rawValues := make([]any, len(cols))
values := make([]any, len(cols))
for i := range rawValues {
values[i] = &rawValues[i]
}
colTypes, err := results.ColumnTypes()
if err != nil {
return nil, fmt.Errorf("unable to get column types: %w", err)
}
var out []any
for results.Next() {
err := results.Scan(values...)
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
row := orderedmap.Row{}
for i, name := range cols {
val := rawValues[i]
if val == nil {
row.Add(name, nil)
continue
}
convertedValue, err := mysqlcommon.ConvertToType(colTypes[i], val)
if err != nil {
return nil, fmt.Errorf("errors encountered when converting values: %w", err)
}
row.Add(name, convertedValue)
}
out = append(out, row)
}
if err := results.Err(); err != nil {
return nil, fmt.Errorf("errors encountered during row iteration: %w", err)
}
return out, nil
}
func getConnectionConfig(ctx context.Context, user, pass string) (string, string, bool, error) {
useIAM := true

View File

@@ -23,6 +23,7 @@ import (
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/googleapis/genai-toolbox/internal/util/orderedmap"
"github.com/jackc/pgx/v5/pgxpool"
"go.opentelemetry.io/otel/trace"
)
@@ -99,6 +100,33 @@ func (s *Source) PostgresPool() *pgxpool.Pool {
return s.Pool
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
results, err := s.PostgresPool().Query(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
fields := results.FieldDescriptions()
var out []any
for results.Next() {
values, err := results.Values()
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
row := orderedmap.Row{}
for i, f := range fields {
row.Add(f.Name, values[i])
}
out = append(out, row)
}
// this will catch actual query execution errors
if err := results.Err(); err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
return out, nil
}
func getConnectionConfig(ctx context.Context, user, pass, dbname string) (string, bool, error) {
userAgent, err := util.UserAgentFromContext(ctx)
if err != nil {

View File

@@ -17,6 +17,7 @@ package couchbase
import (
"context"
"crypto/tls"
"encoding/json"
"fmt"
"os"
@@ -24,6 +25,7 @@ import (
tlsutil "github.com/couchbase/tools-common/http/tls"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util/parameters"
"go.opentelemetry.io/otel/trace"
)
@@ -110,6 +112,27 @@ func (s *Source) CouchbaseQueryScanConsistency() uint {
return s.QueryScanConsistency
}
func (s *Source) RunSQL(statement string, params parameters.ParamValues) (any, error) {
results, err := s.CouchbaseScope().Query(statement, &gocb.QueryOptions{
ScanConsistency: gocb.QueryScanConsistency(s.CouchbaseQueryScanConsistency()),
NamedParameters: params.AsMap(),
})
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
var out []any
for results.Next() {
var result json.RawMessage
err := results.Row(&result)
if err != nil {
return nil, fmt.Errorf("error processing row: %w", err)
}
out = append(out, result)
}
return out, nil
}
func (r Config) createCouchbaseOptions() (gocb.ClusterOptions, error) {
cbOpts := gocb.ClusterOptions{}

View File

@@ -26,6 +26,7 @@ import (
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util/parameters"
"go.opentelemetry.io/otel/trace"
)
@@ -114,6 +115,28 @@ func (s *Source) DgraphClient() *DgraphClient {
return s.Client
}
func (s *Source) RunSQL(statement string, params parameters.ParamValues, isQuery bool, timeout string) (any, error) {
paramsMap := params.AsMapWithDollarPrefix()
resp, err := s.DgraphClient().ExecuteQuery(statement, paramsMap, isQuery, timeout)
if err != nil {
return nil, err
}
if err := checkError(resp); err != nil {
return nil, err
}
var result struct {
Data map[string]interface{} `json:"data"`
}
if err := json.Unmarshal(resp, &result); err != nil {
return nil, fmt.Errorf("error parsing JSON: %v", err)
}
return result.Data, nil
}
func initDgraphHttpClient(ctx context.Context, tracer trace.Tracer, r Config) (*DgraphClient, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, r.Name)
@@ -285,7 +308,7 @@ func (hc *DgraphClient) doLogin(creds map[string]interface{}) error {
return err
}
if err := CheckError(resp); err != nil {
if err := checkError(resp); err != nil {
return err
}
@@ -370,7 +393,7 @@ func getUrl(baseUrl, resource string, params url.Values) (string, error) {
return u.String(), nil
}
func CheckError(resp []byte) error {
func checkError(resp []byte) error {
var errResp struct {
Errors []struct {
Message string `json:"message"`

View File

@@ -15,7 +15,9 @@
package elasticsearch
import (
"bytes"
"context"
"encoding/json"
"fmt"
"net/http"
@@ -149,3 +151,80 @@ func (s *Source) ToConfig() sources.SourceConfig {
func (s *Source) ElasticsearchClient() EsClient {
return s.Client
}
type EsqlColumn struct {
Name string `json:"name"`
Type string `json:"type"`
}
type EsqlResult struct {
Columns []EsqlColumn `json:"columns"`
Values [][]any `json:"values"`
}
func (s *Source) RunSQL(ctx context.Context, format, query string, params []map[string]any) (any, error) {
bodyStruct := struct {
Query string `json:"query"`
Params []map[string]any `json:"params,omitempty"`
}{
Query: query,
Params: params,
}
body, err := json.Marshal(bodyStruct)
if err != nil {
return nil, fmt.Errorf("failed to marshal query body: %w", err)
}
res, err := esapi.EsqlQueryRequest{
Body: bytes.NewReader(body),
Format: format,
FilterPath: []string{"columns", "values"},
Instrument: s.ElasticsearchClient().InstrumentationEnabled(),
}.Do(ctx, s.ElasticsearchClient())
if err != nil {
return nil, err
}
defer res.Body.Close()
if res.IsError() {
// Try to extract error message from response
var esErr json.RawMessage
err = util.DecodeJSON(res.Body, &esErr)
if err != nil {
return nil, fmt.Errorf("elasticsearch error: status %s", res.Status())
}
return esErr, nil
}
var result EsqlResult
err = util.DecodeJSON(res.Body, &result)
if err != nil {
return nil, fmt.Errorf("failed to decode response body: %w", err)
}
output := EsqlToMap(result)
return output, nil
}
// EsqlToMap converts the esqlResult to a slice of maps.
func EsqlToMap(result EsqlResult) []map[string]any {
output := make([]map[string]any, 0, len(result.Values))
for _, value := range result.Values {
row := make(map[string]any)
if value == nil {
output = append(output, row)
continue
}
for i, col := range result.Columns {
if i < len(value) {
row[col.Name] = value[i]
} else {
row[col.Name] = nil
}
}
output = append(output, row)
}
return output
}

View File

@@ -15,6 +15,7 @@
package elasticsearch_test
import (
"reflect"
"testing"
yaml "github.com/goccy/go-yaml"
@@ -64,3 +65,155 @@ func TestParseFromYamlElasticsearch(t *testing.T) {
})
}
}
func TestTool_esqlToMap(t1 *testing.T) {
tests := []struct {
name string
result elasticsearch.EsqlResult
want []map[string]any
}{
{
name: "simple case with two rows",
result: elasticsearch.EsqlResult{
Columns: []elasticsearch.EsqlColumn{
{Name: "first_name", Type: "text"},
{Name: "last_name", Type: "text"},
},
Values: [][]any{
{"John", "Doe"},
{"Jane", "Smith"},
},
},
want: []map[string]any{
{"first_name": "John", "last_name": "Doe"},
{"first_name": "Jane", "last_name": "Smith"},
},
},
{
name: "different data types",
result: elasticsearch.EsqlResult{
Columns: []elasticsearch.EsqlColumn{
{Name: "id", Type: "integer"},
{Name: "active", Type: "boolean"},
{Name: "score", Type: "float"},
},
Values: [][]any{
{1, true, 95.5},
{2, false, 88.0},
},
},
want: []map[string]any{
{"id": 1, "active": true, "score": 95.5},
{"id": 2, "active": false, "score": 88.0},
},
},
{
name: "no rows",
result: elasticsearch.EsqlResult{
Columns: []elasticsearch.EsqlColumn{
{Name: "id", Type: "integer"},
{Name: "name", Type: "text"},
},
Values: [][]any{},
},
want: []map[string]any{},
},
{
name: "null values",
result: elasticsearch.EsqlResult{
Columns: []elasticsearch.EsqlColumn{
{Name: "id", Type: "integer"},
{Name: "name", Type: "text"},
},
Values: [][]any{
{1, nil},
{2, "Alice"},
},
},
want: []map[string]any{
{"id": 1, "name": nil},
{"id": 2, "name": "Alice"},
},
},
{
name: "missing values in a row",
result: elasticsearch.EsqlResult{
Columns: []elasticsearch.EsqlColumn{
{Name: "id", Type: "integer"},
{Name: "name", Type: "text"},
{Name: "age", Type: "integer"},
},
Values: [][]any{
{1, "Bob"},
{2, "Charlie", 30},
},
},
want: []map[string]any{
{"id": 1, "name": "Bob", "age": nil},
{"id": 2, "name": "Charlie", "age": 30},
},
},
{
name: "all null row",
result: elasticsearch.EsqlResult{
Columns: []elasticsearch.EsqlColumn{
{Name: "id", Type: "integer"},
{Name: "name", Type: "text"},
},
Values: [][]any{
nil,
},
},
want: []map[string]any{
{},
},
},
{
name: "empty columns",
result: elasticsearch.EsqlResult{
Columns: []elasticsearch.EsqlColumn{},
Values: [][]any{
{},
{},
},
},
want: []map[string]any{
{},
{},
},
},
{
name: "more values than columns",
result: elasticsearch.EsqlResult{
Columns: []elasticsearch.EsqlColumn{
{Name: "id", Type: "integer"},
},
Values: [][]any{
{1, "extra"},
},
},
want: []map[string]any{
{"id": 1},
},
},
{
name: "no columns but with values",
result: elasticsearch.EsqlResult{
Columns: []elasticsearch.EsqlColumn{},
Values: [][]any{
{1, "data"},
},
},
want: []map[string]any{
{},
},
},
}
for _, tt := range tests {
t1.Run(tt.name, func(t1 *testing.T) {
if got := elasticsearch.EsqlToMap(tt.result); !reflect.DeepEqual(got, tt.want) {
t1.Errorf("esqlToMap() = %v, want %v", got, tt.want)
}
})
}
}

View File

@@ -96,6 +96,53 @@ func (s *Source) FirebirdDB() *sql.DB {
return s.Db
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
rows, err := s.FirebirdDB().QueryContext(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer rows.Close()
cols, err := rows.Columns()
if err != nil {
return nil, fmt.Errorf("unable to get columns: %w", err)
}
values := make([]any, len(cols))
scanArgs := make([]any, len(values))
for i := range values {
scanArgs[i] = &values[i]
}
var out []any
for rows.Next() {
err = rows.Scan(scanArgs...)
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
vMap := make(map[string]any)
for i, col := range cols {
if b, ok := values[i].([]byte); ok {
vMap[col] = string(b)
} else {
vMap[col] = values[i]
}
}
out = append(out, vMap)
}
if err := rows.Err(); err != nil {
return nil, fmt.Errorf("error iterating rows: %w", err)
}
// In most cases, DML/DDL statements like INSERT, UPDATE, CREATE, etc. might return no rows
// However, it is also possible that this was a query that was expected to return rows
// but returned none, a case that we cannot distinguish here.
return out, nil
}
func initFirebirdConnectionPool(ctx context.Context, tracer trace.Tracer, name, host, port, user, pass, dbname string) (*sql.DB, error) {
_, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)
defer span.End()

View File

@@ -23,6 +23,7 @@ import (
_ "github.com/go-sql-driver/mysql"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlcommon"
"go.opentelemetry.io/otel/trace"
)
@@ -101,6 +102,61 @@ func (s *Source) MySQLPool() *sql.DB {
return s.Pool
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
// MindsDB now supports MySQL prepared statements natively
results, err := s.MindsDBPool().QueryContext(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
cols, err := results.Columns()
if err != nil {
return nil, fmt.Errorf("unable to retrieve rows column name: %w", err)
}
// create an array of values for each column, which can be re-used to scan each row
rawValues := make([]any, len(cols))
values := make([]any, len(cols))
for i := range rawValues {
values[i] = &rawValues[i]
}
defer results.Close()
colTypes, err := results.ColumnTypes()
if err != nil {
return nil, fmt.Errorf("unable to get column types: %w", err)
}
var out []any
for results.Next() {
err := results.Scan(values...)
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
vMap := make(map[string]any)
for i, name := range cols {
val := rawValues[i]
if val == nil {
vMap[name] = nil
continue
}
// MindsDB uses mysql driver
vMap[name], err = mysqlcommon.ConvertToType(colTypes[i], val)
if err != nil {
return nil, fmt.Errorf("errors encountered when converting values: %w", err)
}
}
out = append(out, vMap)
}
if err := results.Err(); err != nil {
return nil, fmt.Errorf("errors encountered during row iteration: %w", err)
}
return out, nil
}
func initMindsDBConnectionPool(ctx context.Context, tracer trace.Tracer, name, host, port, user, pass, dbname, queryTimeout string) (*sql.DB, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)


@@ -23,6 +23,7 @@ import (
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/googleapis/genai-toolbox/internal/util/orderedmap"
_ "github.com/microsoft/go-mssqldb"
"go.opentelemetry.io/otel/trace"
)
@@ -104,6 +105,48 @@ func (s *Source) MSSQLDB() *sql.DB {
return s.Db
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
results, err := s.MSSQLDB().QueryContext(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
cols, err := results.Columns()
// If Columns() errors, it might be a DDL/DML without an OUTPUT clause.
// We proceed, and results.Err() will catch actual query execution errors.
// 'out' will remain nil if cols is empty or err is not nil here.
var out []any
if err == nil && len(cols) > 0 {
// create an array of values for each column, which can be re-used to scan each row
rawValues := make([]any, len(cols))
values := make([]any, len(cols))
for i := range rawValues {
values[i] = &rawValues[i]
}
for results.Next() {
scanErr := results.Scan(values...)
if scanErr != nil {
return nil, fmt.Errorf("unable to parse row: %w", scanErr)
}
row := orderedmap.Row{}
for i, name := range cols {
row.Add(name, rawValues[i])
}
out = append(out, row)
}
}
// Check for errors from iterating over rows or from the query execution itself.
// results.Close() is handled by defer.
if err := results.Err(); err != nil {
return nil, fmt.Errorf("errors encountered during query execution or row processing: %w", err)
}
return out, nil
}
func initMssqlConnection(
ctx context.Context,
tracer trace.Tracer,

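A hedged sketch of the behavior described in the comments above (not from this change): without an OUTPUT clause a DML statement yields no columns, so `RunSQL` returns a nil slice, while adding OUTPUT surfaces the affected rows. The table, columns, and `sqlRunner` interface continue the earlier sketch; go-mssqldb accepts positional `@p1`-style placeholders.

```go
// updateAndReturn continues the earlier sketch (same package and sqlRunner interface).
func updateAndReturn(ctx context.Context, src sqlRunner) (any, error) {
	// With OUTPUT the updated rows come back as rows; without it, RunSQL returns nil.
	return src.RunSQL(ctx,
		"UPDATE users SET name = @p1 OUTPUT INSERTED.id, INSERTED.name WHERE id = @p2",
		[]any{"Ada", int64(1)})
}
```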

@@ -24,7 +24,9 @@ import (
_ "github.com/go-sql-driver/mysql"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlcommon"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/googleapis/genai-toolbox/internal/util/orderedmap"
"go.opentelemetry.io/otel/trace"
)
@@ -100,6 +102,60 @@ func (s *Source) MySQLPool() *sql.DB {
return s.Pool
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
results, err := s.MySQLPool().QueryContext(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
cols, err := results.Columns()
if err != nil {
return nil, fmt.Errorf("unable to retrieve rows column name: %w", err)
}
// create an array of values for each column, which can be re-used to scan each row
rawValues := make([]any, len(cols))
values := make([]any, len(cols))
for i := range rawValues {
values[i] = &rawValues[i]
}
colTypes, err := results.ColumnTypes()
if err != nil {
return nil, fmt.Errorf("unable to get column types: %w", err)
}
var out []any
for results.Next() {
err := results.Scan(values...)
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
row := orderedmap.Row{}
for i, name := range cols {
val := rawValues[i]
if val == nil {
row.Add(name, nil)
continue
}
convertedValue, err := mysqlcommon.ConvertToType(colTypes[i], val)
if err != nil {
return nil, fmt.Errorf("errors encountered when converting values: %w", err)
}
row.Add(name, convertedValue)
}
out = append(out, row)
}
if err := results.Err(); err != nil {
return nil, fmt.Errorf("errors encountered during row iteration: %w", err)
}
return out, nil
}
func initMySQLConnectionPool(ctx context.Context, tracer trace.Tracer, name, host, port, user, pass, dbname, queryTimeout string, queryParams map[string]string) (*sql.DB, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)


@@ -20,14 +20,19 @@ import (
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jexecutecypher/classifier"
"github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jschema/helpers"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/neo4j/neo4j-go-driver/v5/neo4j"
neo4jconf "github.com/neo4j/neo4j-go-driver/v5/neo4j/config"
"go.opentelemetry.io/otel/trace"
)
const SourceKind string = "neo4j"
var sourceClassifier *classifier.QueryClassifier = classifier.NewQueryClassifier()
// validate interface
var _ sources.SourceConfig = Config{}
@@ -102,6 +107,79 @@ func (s *Source) Neo4jDatabase() string {
return s.Database
}
func (s *Source) RunQuery(ctx context.Context, cypherStr string, params map[string]any, readOnly, dryRun bool) (any, error) {
// validate the cypher query before executing
cf := sourceClassifier.Classify(cypherStr)
if cf.Error != nil {
return nil, cf.Error
}
if cf.Type == classifier.WriteQuery && readOnly {
return nil, fmt.Errorf("this tool is read-only and cannot execute write queries")
}
if dryRun {
// Add EXPLAIN to the beginning of the query to validate it without executing
cypherStr = "EXPLAIN " + cypherStr
}
config := neo4j.ExecuteQueryWithDatabase(s.Neo4jDatabase())
results, err := neo4j.ExecuteQuery[*neo4j.EagerResult](ctx, s.Neo4jDriver(), cypherStr, params,
neo4j.EagerResultTransformer, config)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
// If dry run, return the summary information only
if dryRun {
summary := results.Summary
plan := summary.Plan()
execPlan := map[string]any{
"queryType": cf.Type.String(),
"statementType": summary.StatementType(),
"operator": plan.Operator(),
"arguments": plan.Arguments(),
"identifiers": plan.Identifiers(),
"childrenCount": len(plan.Children()),
}
if len(plan.Children()) > 0 {
execPlan["children"] = addPlanChildren(plan)
}
return []map[string]any{execPlan}, nil
}
var out []map[string]any
keys := results.Keys
records := results.Records
for _, record := range records {
vMap := make(map[string]any)
for col, value := range record.Values {
vMap[keys[col]] = helpers.ConvertValue(value)
}
out = append(out, vMap)
}
return out, nil
}
// Recursive function to add plan children
func addPlanChildren(p neo4j.Plan) []map[string]any {
var children []map[string]any
for _, child := range p.Children() {
childMap := map[string]any{
"operator": child.Operator(),
"arguments": child.Arguments(),
"identifiers": child.Identifiers(),
"children_count": len(child.Children()),
}
if len(child.Children()) > 0 {
childMap["children"] = addPlanChildren(child)
}
children = append(children, childMap)
}
return children
}
func initNeo4jDriver(ctx context.Context, tracer trace.Tracer, uri, user, password, name string) (neo4j.DriverWithContext, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)

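A hedged sketch of the dry-run path added above (not from this change): with `dryRun=true` the Cypher is prefixed with EXPLAIN, so the query is planned but not executed, and the result is a single-element slice describing the plan (queryType, operator, identifiers, children). The node label and parameter are hypothetical.

```go
// cypherRunner mirrors the RunQuery signature added above (sketch only).
type cypherRunner interface {
	RunQuery(ctx context.Context, cypher string, params map[string]any, readOnly, dryRun bool) (any, error)
}

func explainMatch(ctx context.Context, src cypherRunner) (any, error) {
	params := map[string]any{"name": "Ada"} // hypothetical parameter
	// readOnly=true rejects write queries up front; dryRun=true returns only the plan.
	return src.RunQuery(ctx, "MATCH (p:Person {name: $name}) RETURN p", params, true, true)
}
```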

@@ -23,6 +23,7 @@ import (
_ "github.com/go-sql-driver/mysql"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlcommon"
"go.opentelemetry.io/otel/trace"
)
@@ -97,6 +98,60 @@ func (s *Source) OceanBasePool() *sql.DB {
return s.Pool
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
results, err := s.OceanBasePool().QueryContext(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
cols, err := results.Columns()
if err != nil {
return nil, fmt.Errorf("unable to retrieve rows column name: %w", err)
}
// create an array of values for each column, which can be re-used to scan each row
rawValues := make([]any, len(cols))
values := make([]any, len(cols))
for i := range rawValues {
values[i] = &rawValues[i]
}
defer results.Close()
colTypes, err := results.ColumnTypes()
if err != nil {
return nil, fmt.Errorf("unable to get column types: %w", err)
}
var out []any
for results.Next() {
err := results.Scan(values...)
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
vMap := make(map[string]any)
for i, name := range cols {
val := rawValues[i]
if val == nil {
vMap[name] = nil
continue
}
// oceanbase uses mysql driver
vMap[name], err = mysqlcommon.ConvertToType(colTypes[i], val)
if err != nil {
return nil, fmt.Errorf("errors encountered when converting values: %w", err)
}
}
out = append(out, vMap)
}
if err := results.Err(); err != nil {
return nil, fmt.Errorf("errors encountered during row iteration: %w", err)
}
return out, nil
}
func initOceanBaseConnectionPool(ctx context.Context, tracer trace.Tracer, name, host, port, user, pass, dbname, queryTimeout string) (*sql.DB, error) {
_, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)
defer span.End()


@@ -4,6 +4,7 @@ package oracle
import (
"context"
"database/sql"
"encoding/json"
"fmt"
"os"
"strings"
@@ -135,6 +136,107 @@ func (s *Source) OracleDB() *sql.DB {
return s.DB
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
rows, err := s.OracleDB().QueryContext(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer rows.Close()
// If Columns() or ColumnTypes() fails, this is likely a DDL/DML statement that
// returns no rows. We proceed, and rows.Err() will catch actual query execution errors.
// 'out' will remain nil if no columns are returned.
cols, _ := rows.Columns()
// Get Column types
colTypes, err := rows.ColumnTypes()
if err != nil {
if err := rows.Err(); err != nil {
return nil, fmt.Errorf("query execution error: %w", err)
}
return []any{}, nil
}
var out []any
for rows.Next() {
values := make([]any, len(cols))
for i, colType := range colTypes {
switch strings.ToUpper(colType.DatabaseTypeName()) {
case "NUMBER", "FLOAT", "BINARY_FLOAT", "BINARY_DOUBLE":
if _, scale, ok := colType.DecimalSize(); ok && scale == 0 {
// Scale is 0, treat it as an integer.
values[i] = new(sql.NullInt64)
} else {
// Scale is non-zero or unknown, treat
// it as a float.
values[i] = new(sql.NullFloat64)
}
case "DATE", "TIMESTAMP", "TIMESTAMP WITH TIME ZONE", "TIMESTAMP WITH LOCAL TIME ZONE":
values[i] = new(sql.NullTime)
case "JSON":
values[i] = new(sql.RawBytes)
default:
values[i] = new(sql.NullString)
}
}
if err := rows.Scan(values...); err != nil {
return nil, fmt.Errorf("unable to scan row: %w", err)
}
vMap := make(map[string]any)
for i, col := range cols {
receiver := values[i]
switch v := receiver.(type) {
case *sql.NullInt64:
if v.Valid {
vMap[col] = v.Int64
} else {
vMap[col] = nil
}
case *sql.NullFloat64:
if v.Valid {
vMap[col] = v.Float64
} else {
vMap[col] = nil
}
case *sql.NullString:
if v.Valid {
vMap[col] = v.String
} else {
vMap[col] = nil
}
case *sql.NullTime:
if v.Valid {
vMap[col] = v.Time
} else {
vMap[col] = nil
}
case *sql.RawBytes:
if *v != nil {
var unmarshaledData any
if err := json.Unmarshal(*v, &unmarshaledData); err != nil {
return nil, fmt.Errorf("unable to unmarshal json data for column %s", col)
}
vMap[col] = unmarshaledData
} else {
vMap[col] = nil
}
default:
return nil, fmt.Errorf("unexpected receiver type: %T", v)
}
}
out = append(out, vMap)
}
if err := rows.Err(); err != nil {
return nil, fmt.Errorf("errors encountered during query execution or row processing: %w", err)
}
return out, nil
}
func initOracleConnection(ctx context.Context, tracer trace.Tracer, config Config) (*sql.DB, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, config.Name)

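A hedged illustration of the receiver selection above (not from this change): the scale reported by `ColumnType.DecimalSize` and the database type name determine how each column is scanned and, in turn, the Go type that ends up in the row map. The table definition below is hypothetical, and the sketch reuses the earlier `sqlRunner` interface.

```go
// Hypothetical DDL: CREATE TABLE items (id NUMBER(10,0), price NUMBER(10,2), doc JSON, created DATE)
// Expected value types in each row map, per the switch above:
//   id      -> int64      (scale 0  -> sql.NullInt64)
//   price   -> float64    (scale 2  -> sql.NullFloat64)
//   doc     -> any        (sql.RawBytes, then json.Unmarshal)
//   created -> time.Time  (sql.NullTime)
func queryItemsOracle(ctx context.Context, src sqlRunner) (any, error) {
	return src.RunSQL(ctx, "SELECT id, price, doc, created FROM items", nil)
}
```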

@@ -23,6 +23,7 @@ import (
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/googleapis/genai-toolbox/internal/util/orderedmap"
"github.com/jackc/pgx/v5/pgxpool"
"go.opentelemetry.io/otel/trace"
)
@@ -98,6 +99,33 @@ func (s *Source) PostgresPool() *pgxpool.Pool {
return s.Pool
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
results, err := s.PostgresPool().Query(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
fields := results.FieldDescriptions()
var out []any
for results.Next() {
values, err := results.Values()
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
row := orderedmap.Row{}
for i, f := range fields {
row.Add(f.Name, values[i])
}
out = append(out, row)
}
// this will catch actual query execution errors
if err := results.Err(); err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
return out, nil
}
func initPostgresConnectionPool(ctx context.Context, tracer trace.Tracer, name, host, port, user, pass, dbname string, queryParams map[string]string) (*pgxpool.Pool, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)

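For the PostgreSQL variant, two details differ from the sketches above (not from this change): pgx expects numbered `$1` placeholders rather than `?`, and rows are collected into `orderedmap.Row`, which, as assumed here, preserves the SELECT column order when the result is serialized.

```go
// queryUsersPostgres continues the earlier sketch; "users" remains a hypothetical table.
func queryUsersPostgres(ctx context.Context, src sqlRunner) (any, error) {
	// pgx uses numbered placeholders rather than '?'.
	return src.RunSQL(ctx, "SELECT id, name FROM users WHERE id = $1", []any{int64(1)})
}
```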

@@ -152,3 +152,50 @@ func (s *Source) ToConfig() sources.SourceConfig {
func (s *Source) RedisClient() RedisClient {
return s.Client
}
func (s *Source) RunCommand(ctx context.Context, cmds [][]any) (any, error) {
// Execute commands
responses := make([]*redis.Cmd, len(cmds))
for i, cmd := range cmds {
responses[i] = s.RedisClient().Do(ctx, cmd...)
}
// Parse responses
out := make([]any, len(cmds))
for i, resp := range responses {
if err := resp.Err(); err != nil {
// Record this command's error in the output and continue with the remaining commands
errString := fmt.Sprintf("error from executing command at index %d: %s", i, err)
out[i] = errString
continue
}
val, err := resp.Result()
if err != nil {
return nil, fmt.Errorf("error getting result: %s", err)
}
out[i] = convertRedisResult(val)
}
return out, nil
}
// convertRedisResult recursively converts redis results (map[any]any) to be
// JSON-marshallable (map[string]any).
// It converts map[any]any to map[string]any and handles nested structures.
func convertRedisResult(v any) any {
switch val := v.(type) {
case map[any]any:
m := make(map[string]any)
for k, v := range val {
m[fmt.Sprint(k)] = convertRedisResult(v)
}
return m
case []any:
s := make([]any, len(val))
for i, v := range val {
s[i] = convertRedisResult(v)
}
return s
default:
return v
}
}
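A hedged sketch of the Redis `RunCommand` added above (not from this change): each inner slice is one command, per-command failures are recorded in the output rather than aborting the batch, and map replies are converted to `map[string]any` so they can be marshaled to JSON. Keys and values are hypothetical.

```go
// redisRunner mirrors the RunCommand signature added above (sketch only).
type redisRunner interface {
	RunCommand(ctx context.Context, cmds [][]any) (any, error)
}

func setAndRead(ctx context.Context, src redisRunner) (any, error) {
	cmds := [][]any{
		{"SET", "greeting", "hello"},
		{"HGETALL", "user:1"},
	}
	// Result is a []any with one entry per command; a failed command yields an error string.
	return src.RunCommand(ctx, cmds)
}
```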


@@ -25,6 +25,7 @@ import (
_ "github.com/go-sql-driver/mysql"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlcommon"
"go.opentelemetry.io/otel/trace"
)
@@ -106,6 +107,59 @@ func (s *Source) SingleStorePool() *sql.DB {
return s.Pool
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
results, err := s.SingleStorePool().QueryContext(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
cols, err := results.Columns()
if err != nil {
return nil, fmt.Errorf("unable to retrieve rows column name: %w", err)
}
// create an array of values for each column, which can be re-used to scan each row
rawValues := make([]any, len(cols))
values := make([]any, len(cols))
for i := range rawValues {
values[i] = &rawValues[i]
}
defer results.Close()
colTypes, err := results.ColumnTypes()
if err != nil {
return nil, fmt.Errorf("unable to get column types: %w", err)
}
var out []any
for results.Next() {
err := results.Scan(values...)
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
vMap := make(map[string]any)
for i, name := range cols {
val := rawValues[i]
if val == nil {
vMap[name] = nil
continue
}
vMap[name], err = mysqlcommon.ConvertToType(colTypes[i], val)
if err != nil {
return nil, fmt.Errorf("errors encountered when converting values: %w", err)
}
}
out = append(out, vMap)
}
if err := results.Err(); err != nil {
return nil, fmt.Errorf("errors encountered during row iteration: %w", err)
}
return out, nil
}
func initSingleStoreConnectionPool(ctx context.Context, tracer trace.Tracer, name, host, port, user, pass, dbname, queryTimeout string) (*sql.DB, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)


@@ -16,13 +16,16 @@ package spanner
import (
"context"
"encoding/json"
"fmt"
"cloud.google.com/go/spanner"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/googleapis/genai-toolbox/internal/util/orderedmap"
"go.opentelemetry.io/otel/trace"
"google.golang.org/api/iterator"
)
const SourceKind string = "spanner"
@@ -93,6 +96,79 @@ func (s *Source) DatabaseDialect() string {
return s.Dialect.String()
}
// processRows iterates over the spanner.RowIterator and converts each row to a map[string]any.
func processRows(iter *spanner.RowIterator) ([]any, error) {
var out []any
defer iter.Stop()
for {
row, err := iter.Next()
if err == iterator.Done {
break
}
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
rowMap := orderedmap.Row{}
cols := row.ColumnNames()
for i, c := range cols {
if c == "object_details" { // for list graphs or list tables
val := row.ColumnValue(i)
if val == nil { // ColumnValue returns the Cloud Spanner Value of column i, or nil for invalid column.
rowMap.Add(c, nil)
} else {
jsonString, ok := val.AsInterface().(string)
if !ok {
return nil, fmt.Errorf("column 'object_details' is not a string, but %T", val.AsInterface())
}
var details map[string]any
if err := json.Unmarshal([]byte(jsonString), &details); err != nil {
return nil, fmt.Errorf("unable to unmarshal JSON: %w", err)
}
rowMap.Add(c, details)
}
} else {
rowMap.Add(c, row.ColumnValue(i))
}
}
out = append(out, rowMap)
}
return out, nil
}
func (s *Source) RunSQL(ctx context.Context, readOnly bool, statement string, params map[string]any) (any, error) {
var results []any
var err error
var opErr error
stmt := spanner.Statement{
SQL: statement,
}
if params != nil {
stmt.Params = params
}
if readOnly {
iter := s.SpannerClient().Single().Query(ctx, stmt)
results, opErr = processRows(iter)
} else {
_, opErr = s.SpannerClient().ReadWriteTransaction(ctx, func(ctx context.Context, txn *spanner.ReadWriteTransaction) error {
iter := txn.Query(ctx, stmt)
results, err = processRows(iter)
if err != nil {
return err
}
return nil
})
}
if opErr != nil {
return nil, fmt.Errorf("unable to execute client: %w", opErr)
}
return results, nil
}
func initSpannerClient(ctx context.Context, tracer trace.Tracer, name, project, instance, dbname string) (*spanner.Client, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)

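A hedged sketch of the Spanner variant above (not from this change): it takes named parameters (`@name` syntax) and a `readOnly` flag that chooses between a single-use read-only query and a read-write transaction. Table and column names are hypothetical.

```go
// spannerRunner mirrors the Spanner RunSQL signature added above (sketch only).
type spannerRunner interface {
	RunSQL(ctx context.Context, readOnly bool, statement string, params map[string]any) (any, error)
}

func querySingers(ctx context.Context, src spannerRunner) (any, error) {
	// readOnly=true runs the statement in a single-use read-only transaction.
	return src.RunSQL(ctx, true,
		"SELECT SingerId, FirstName FROM Singers WHERE SingerId = @id",
		map[string]any{"id": int64(1)})
}
```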

@@ -17,10 +17,12 @@ package sqlite
import (
"context"
"database/sql"
"encoding/json"
"fmt"
"github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/util/orderedmap"
"go.opentelemetry.io/otel/trace"
_ "modernc.org/sqlite" // Pure Go SQLite driver
)
@@ -91,6 +93,66 @@ func (s *Source) SQLiteDB() *sql.DB {
return s.Db
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
// Execute the SQL query with parameters
rows, err := s.SQLiteDB().QueryContext(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer rows.Close()
// Get column names
cols, err := rows.Columns()
if err != nil {
return nil, fmt.Errorf("unable to get column names: %w", err)
}
// The sqlite driver does not support ColumnTypes, so we can't get the
// underlying database type of the columns. We'll have to rely on the
// generic `any` type and then handle the JSON data separately.
rawValues := make([]any, len(cols))
values := make([]any, len(cols))
for i := range rawValues {
values[i] = &rawValues[i]
}
// Prepare the result slice
var out []any
for rows.Next() {
if err := rows.Scan(values...); err != nil {
return nil, fmt.Errorf("unable to scan row: %w", err)
}
// Create a map for this row
row := orderedmap.Row{}
for i, name := range cols {
val := rawValues[i]
// Handle nil values
if val == nil {
row.Add(name, nil)
continue
}
// Handle JSON data
if jsonString, ok := val.(string); ok {
var unmarshaledData any
if json.Unmarshal([]byte(jsonString), &unmarshaledData) == nil {
row.Add(name, unmarshaledData)
continue
}
}
// Store the value in the map
row.Add(name, val)
}
out = append(out, row)
}
if err = rows.Err(); err != nil {
return nil, fmt.Errorf("error iterating rows: %w", err)
}
return out, nil
}
func initSQLiteConnection(ctx context.Context, tracer trace.Tracer, name, dbPath string) (*sql.DB, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)

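A hedged example of the JSON handling above (not from this change): because the pure-Go SQLite driver exposes no column types, any string value that parses as JSON is unmarshaled, so a TEXT column holding a JSON document comes back as a nested structure rather than a raw string. The table and data are hypothetical, and the sketch reuses the earlier `sqlRunner` interface.

```go
// queryConfigSQLite continues the earlier sketch; "config" is a hypothetical table.
func queryConfigSQLite(ctx context.Context, src sqlRunner) (any, error) {
	// If the "settings" column stores a JSON document, each row's value is the
	// unmarshaled structure; otherwise the raw string is returned unchanged.
	return src.RunSQL(ctx, "SELECT name, settings FROM config WHERE name = ?", []any{"default"})
}
```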

@@ -17,6 +17,7 @@ package tidb
import (
"context"
"database/sql"
"encoding/json"
"fmt"
"regexp"
@@ -104,6 +105,79 @@ func (s *Source) TiDBPool() *sql.DB {
return s.Pool
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
results, err := s.TiDBPool().QueryContext(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
cols, err := results.Columns()
if err != nil {
return nil, fmt.Errorf("unable to retrieve rows column name: %w", err)
}
// create an array of values for each column, which can be re-used to scan each row
rawValues := make([]any, len(cols))
values := make([]any, len(cols))
for i := range rawValues {
values[i] = &rawValues[i]
}
defer results.Close()
colTypes, err := results.ColumnTypes()
if err != nil {
return nil, fmt.Errorf("unable to get column types: %w", err)
}
var out []any
for results.Next() {
err := results.Scan(values...)
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
vMap := make(map[string]any)
for i, name := range cols {
val := rawValues[i]
if val == nil {
vMap[name] = nil
continue
}
// the mysql driver returns []uint8 for "TEXT", "VARCHAR", and "NVARCHAR" columns,
// so we cast it back to string
switch colTypes[i].DatabaseTypeName() {
case "JSON":
// unmarshal JSON data before storing to prevent double
// marshaling
byteVal, ok := val.([]byte)
if !ok {
return nil, fmt.Errorf("expected []byte for JSON column, but got %T", val)
}
var unmarshaledData any
if err := json.Unmarshal(byteVal, &unmarshaledData); err != nil {
return nil, fmt.Errorf("unable to unmarshal json data %s", val)
}
vMap[name] = unmarshaledData
case "TEXT", "VARCHAR", "NVARCHAR":
byteVal, ok := val.([]byte)
if !ok {
return nil, fmt.Errorf("expected []byte for text-like column, but got %T", val)
}
vMap[name] = string(byteVal)
default:
vMap[name] = val
}
}
out = append(out, vMap)
}
if err := results.Err(); err != nil {
return nil, fmt.Errorf("errors encountered during row iteration: %w", err)
}
return out, nil
}
func IsTiDBCloudHost(host string) bool {
pattern := `gateway\d{2}\.(.+)\.(prod|dev|staging)\.(.+)\.tidbcloud\.com`
match, err := regexp.MatchString(pattern, host)


@@ -102,6 +102,56 @@ func (s *Source) TrinoDB() *sql.DB {
return s.Pool
}
func (s *Source) RunSQL(ctx context.Context, statement string, params []any) (any, error) {
results, err := s.TrinoDB().QueryContext(ctx, statement, params...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
cols, err := results.Columns()
if err != nil {
return nil, fmt.Errorf("unable to retrieve column names: %w", err)
}
// create an array of values for each column, which can be re-used to scan each row
rawValues := make([]any, len(cols))
values := make([]any, len(cols))
for i := range rawValues {
values[i] = &rawValues[i]
}
var out []any
for results.Next() {
err := results.Scan(values...)
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
vMap := make(map[string]any)
for i, name := range cols {
val := rawValues[i]
if val == nil {
vMap[name] = nil
continue
}
// Convert byte arrays to strings for text fields
if b, ok := val.([]byte); ok {
vMap[name] = string(b)
} else {
vMap[name] = val
}
}
out = append(out, vMap)
}
if err := results.Err(); err != nil {
return nil, fmt.Errorf("errors encountered during row iteration: %w", err)
}
return out, nil
}
func initTrinoConnectionPool(ctx context.Context, tracer trace.Tracer, name, host, port, user, password, catalog, schema, queryTimeout, accessToken string, kerberosEnabled, sslEnabled bool) (*sql.DB, error) {
//nolint:all // Reassigned ctx
ctx, span := sources.InitConnectionSpan(ctx, tracer, SourceKind, name)


@@ -125,3 +125,37 @@ func (s *Source) ToConfig() sources.SourceConfig {
func (s *Source) ValkeyClient() valkey.Client {
return s.Client
}
func (s *Source) RunCommand(ctx context.Context, cmds [][]string) (any, error) {
// Build commands
builtCmds := make(valkey.Commands, len(cmds))
for i, cmd := range cmds {
builtCmds[i] = s.ValkeyClient().B().Arbitrary(cmd...).Build()
}
if len(builtCmds) == 0 {
return nil, fmt.Errorf("no valid commands were built to execute")
}
// Execute commands
responses := s.ValkeyClient().DoMulti(ctx, builtCmds...)
// Parse responses
out := make([]any, len(cmds))
for i, resp := range responses {
if err := resp.Error(); err != nil {
// Store error message in the output for this command
out[i] = fmt.Sprintf("error from executing command at index %d: %s", i, err)
continue
}
val, err := resp.ToAny()
if err != nil {
out[i] = fmt.Sprintf("error parsing response: %s", err)
continue
}
out[i] = val
}
return out, nil
}

Some files were not shown because too many files have changed in this diff.