Compare commits

..

66 Commits

Author SHA1 Message Date
Twisha Bansal
0dfcf24859 fix import 2026-01-28 18:32:04 +05:30
Twisha Bansal
be0b7fc96e logic fix 2026-01-28 18:30:55 +05:30
Twisha Bansal
d7016d2251 Update docs/en/samples/pre_post_processing/python/langchain/agent.py
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2026-01-28 18:27:48 +05:30
Twisha Bansal
d44283ffcf license header 2026-01-28 18:21:04 +05:30
Twisha Bansal
69e3f2eb24 lint 2026-01-28 18:20:40 +05:30
Twisha Bansal
c724bea786 gemini code review 2026-01-28 18:19:53 +05:30
Twisha Bansal
4bc684d3ed remove not needed files 2026-01-28 18:16:19 +05:30
Twisha Bansal
9434450a65 docs: add pre/post processing docs for langchain python 2026-01-28 18:14:05 +05:30
Anubhav Dhawan
362ed8df41 docs: migrate to toolbox-adk and simplified ToolboxToolset (#2211)
Updates all quickstart guides and samples to use the new `toolbox-adk`
package instead of the legacy `toolbox-core`. Also updates
`ToolboxToolset` usage to rely on the simplified constructor (implicit
authentication) and ensures correct dependency installation.

> [!NOTE]
> The integration tests are failing because the `google-adk` package is
> not released yet with the newer changes from `toolbox-adk`. This is
> expected behavior until the [package update](cl/853799009) is released.
2026-01-28 11:26:13 +05:30
Yuan Teoh
293c1d6889 feat!: update configuration file v2 (#2369)
This PR introduces a significant update to the Toolbox configuration
file format, which is one of the primary **breaking changes** required
for the implementation of the Advanced Control Plane.

# Summary of Changes
The configuration schema has been updated to enforce resource isolation
and facilitate atomic, incremental updates.
* Resource Isolation: Resource definitions are now separated into
individual blocks, using a distinct structure for each resource type
(Source, Tool, Toolset, etc.). This improves readability, management,
and auditing of configuration files.
* Field Name Modification: Internal field names have been modified to
align with declarative methodologies. Specifically, the configuration
now separates kind (general resource type, e.g., Source) from type
(specific implementation, e.g., Postgres).

# User Impact
Existing tools.yaml configuration files are now in an outdated format.
Users must eventually update their files to the new YAML format.

# Mitigation & Compatibility
Backward compatibility is maintained during this transition to ensure no
immediate user action is required for existing files.
* Immediate Backward Compatibility: The source code includes a
pre-processing layer that automatically detects outdated configuration
files (v1 format) and converts them to the new v2 format under the hood.
* [COMING SOON] Migration Support: The new toolbox migrate subcommand
will be introduced to allow users to automatically convert their old
configuration files to the latest format.

# Example
Example for config file v2:
```
kind: sources
name: my-pg-instance
type: cloud-sql-postgres
project: my-project
region: my-region
instance: my-instance
database: my_db
user: my_user
password: my_pass
---
kind: authServices
name: my-google-auth
type: google
clientId: testing-id
---
kind: tools
name: example_tool
type: postgres-sql
source: my-pg-instance
description: some description
statement: SELECT * FROM SQL_STATEMENT;
parameters:
- name: country
  type: string
  description: some description
---
kind: tools
name: example_tool_2
type: postgres-sql
source: my-pg-instance
description: returning the number one
statement: SELECT 1;
---
kind: toolsets
name: example_toolset
tools:
- example_tool
```

---------

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2026-01-27 16:58:43 -08:00
Wenxin Du
cf477b529a refactor: Add GetParameters() to Tools interface (#2374)
A first step toward refactoring the `ParseParam()` method of the Tool
interface.
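
For illustration, a rough Go sketch of what such an accessor on the interface might look like (the `Parameter` stand-in and the reduced `Tool` interface below are assumptions, not the actual Toolbox definitions):

```
package example

import "context"

// Illustrative only: the real Tool interface has more methods, and the
// Parameter type here is a stand-in. The sketch just shows a
// GetParameters() accessor that lets shared code read a tool's parameter
// definitions without each tool re-implementing the parsing.
type Parameter interface {
	GetName() string
}

type Tool interface {
	GetParameters() []Parameter
	Invoke(ctx context.Context, params map[string]any) (any, error)
}
```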
2026-01-27 17:27:11 -05:00
Averi Kitsch
cdc4d0d304 docs: add github trending badge (#2368)
## Description

> Should include a concise description of the changes (bug or feature),
it's
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #<issue_number_goes_here>
2026-01-27 12:12:03 -08:00
Twisha Bansal
3aa1b79c13 docs: fix flag name (#2372)
## Description

> Should include a concise description of the changes (bug or feature),
it's
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #<issue_number_goes_here>
2026-01-27 22:22:08 +05:30
dependabot[bot]
941ed689b4 chore(deps): bump jws from 3.2.2 to 3.2.3 in /docs/en/getting-started/quickstart/js/genkit (#2125)
Bumps [jws](https://github.com/brianloveswords/node-jws) from 3.2.2 to
3.2.3.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/brianloveswords/node-jws/releases">jws's
releases</a>.</em></p>
<blockquote>
<h2>v3.2.3</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 1.4.2, addressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/auth0/node-jws/blob/master/CHANGELOG.md">jws's
changelog</a>.</em></p>
<blockquote>
<h2>[3.2.3]</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 1.4.2, addressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
<h2>[3.0.0]</h2>
<h3>Changed</h3>
<ul>
<li><strong>BREAKING</strong>: <code>jwt.verify</code> now requires an
<code>algorithm</code> parameter, and
<code>jws.createVerify</code> requires an <code>algorithm</code> option.
The <code>&quot;alg&quot;</code> field
signature headers is ignored. This mitigates a critical security flaw
in the library which would allow an attacker to generate signatures with
arbitrary contents that would be accepted by <code>jwt.verify</code>.
See
<a
href="https://auth0.com/blog/2015/03/31/critical-vulnerabilities-in-json-web-token-libraries/">https://auth0.com/blog/2015/03/31/critical-vulnerabilities-in-json-web-token-libraries/</a>
for details.</li>
</ul>
<h2><a
href="https://github.com/brianloveswords/node-jws/compare/v1.0.1...v2.0.0">2.0.0</a>
- 2015-01-30</h2>
<h3>Changed</h3>
<ul>
<li>
<p><strong>BREAKING</strong>: Default payload encoding changed from
<code>binary</code> to
<code>utf8</code>. <code>utf8</code> is a more sensible default
than <code>binary</code> because
many payloads, as far as I can tell, will contain user-facing
strings that could be in any language. (<!-- raw HTML omitted --><a
href="https://github.com/brianloveswords/node-jws/commit/6b6de48">6b6de48</a><!--
raw HTML omitted -->)</p>
</li>
<li>
<p>Code reorganization, thanks <a
href="https://github.com/fearphage"><code>@​fearphage</code></a>! (<!--
raw HTML omitted --><a
href="https://github.com/brianloveswords/node-jws/commit/7880050">7880050</a><!--
raw HTML omitted -->)</p>
</li>
</ul>
<h3>Added</h3>
<ul>
<li>Option in all relevant methods for <code>encoding</code>. For those
few users
that might be depending on a <code>binary</code> encoding of the
messages, this
is for them. (<!-- raw HTML omitted --><a
href="https://github.com/brianloveswords/node-jws/commit/6b6de48">6b6de48</a><!--
raw HTML omitted -->)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4f6e73f24d"><code>4f6e73f</code></a>
Merge commit from fork</li>
<li><a
href="bd0fea57f3"><code>bd0fea5</code></a>
version 3.2.3</li>
<li><a
href="7c3b4b4110"><code>7c3b4b4</code></a>
Enhance tests for HMAC streaming sign and verify</li>
<li><a
href="a9b8ed999d"><code>a9b8ed9</code></a>
Improve secretOrKey initialization in VerifyStream</li>
<li><a
href="6707fde62c"><code>6707fde</code></a>
Improve secret handling in SignStream</li>
<li>See full diff in <a
href="https://github.com/brianloveswords/node-jws/compare/v3.2.2...v3.2.3">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~julien.wollscheid">julien.wollscheid</a>, a
new releaser for jws since your current version.</p>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=jws&package-manager=npm_and_yarn&previous-version=3.2.2&new-version=3.2.3)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

> **Note**
> Automatic rebases have been disabled on this pull request as it has
been open for over 30 days.

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
Co-authored-by: Twisha Bansal <twishabansal07@gmail.com>
2026-01-27 14:28:31 +05:30
dependabot[bot]
f298c8f444 chore(deps): bump lodash from 4.17.21 to 4.17.23 in /docs/en/getting-started/quickstart/js/llamaindex (#2354)
Bumps [lodash](https://github.com/lodash/lodash) from 4.17.21 to
4.17.23.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="dec55b7a3b"><code>dec55b7</code></a>
Bump main to v4.17.23 (<a
href="https://redirect.github.com/lodash/lodash/issues/6088">#6088</a>)</li>
<li><a
href="19c9251b36"><code>19c9251</code></a>
fix: setCacheHas JSDoc return type should be boolean (<a
href="https://redirect.github.com/lodash/lodash/issues/6071">#6071</a>)</li>
<li><a
href="b5e672995a"><code>b5e6729</code></a>
jsdoc: Add -0 and BigInt zeros to _.compact falsey values list (<a
href="https://redirect.github.com/lodash/lodash/issues/6062">#6062</a>)</li>
<li><a
href="edadd45214"><code>edadd45</code></a>
Prevent prototype pollution on baseUnset function</li>
<li><a
href="4879a7a7d0"><code>4879a7a</code></a>
doc: fix autoLink function, conversion of source links (<a
href="https://redirect.github.com/lodash/lodash/issues/6056">#6056</a>)</li>
<li><a
href="9648f692b0"><code>9648f69</code></a>
chore: remove <code>yarn.lock</code> file (<a
href="https://redirect.github.com/lodash/lodash/issues/6053">#6053</a>)</li>
<li><a
href="dfa407db0b"><code>dfa407d</code></a>
ci: remove legacy configuration files (<a
href="https://redirect.github.com/lodash/lodash/issues/6052">#6052</a>)</li>
<li><a
href="156e1965ae"><code>156e196</code></a>
feat: add renovate setup (<a
href="https://redirect.github.com/lodash/lodash/issues/6039">#6039</a>)</li>
<li><a
href="933e1061b8"><code>933e106</code></a>
ci: add pipeline for Bun (<a
href="https://redirect.github.com/lodash/lodash/issues/6023">#6023</a>)</li>
<li><a
href="072a807ff7"><code>072a807</code></a>
docs: update links related to Open JS Foundation (<a
href="https://redirect.github.com/lodash/lodash/issues/5968">#5968</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/lodash/lodash/compare/4.17.21...4.17.23">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=lodash&package-manager=npm_and_yarn&previous-version=4.17.21&new-version=4.17.23)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2026-01-27 12:02:55 +05:30
Mend Renovate
f6474739e3 chore(deps): update github actions (#2298)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/cache](https://redirect.github.com/actions/cache) ([changelog](9255dc7a25..8b402f58fb)) | action | digest | `9255dc7` → `8b402f5` |
| [actions/checkout](https://redirect.github.com/actions/checkout) | action | patch | `v6.0.1` → `v6.0.2` |
| [actions/setup-go](https://redirect.github.com/actions/setup-go) | action | minor | `v6.1.0` → `v6.2.0` |
| [actions/setup-node](https://redirect.github.com/actions/setup-node) ([changelog](395ad32622..6044e13b5d)) | action | digest | `395ad32` → `6044e13` |

---

### Release Notes

<details>
<summary>actions/checkout (actions/checkout)</summary>

###
[`v6.0.2`](https://redirect.github.com/actions/checkout/blob/HEAD/CHANGELOG.md#v602)

[Compare
Source](https://redirect.github.com/actions/checkout/compare/v6.0.1...v6.0.2)

- Fix tag handling: preserve annotations and explicit fetch-tags by
[@&#8203;ericsciple](https://redirect.github.com/ericsciple) in
[#&#8203;2356](https://redirect.github.com/actions/checkout/pull/2356)

</details>

<details>
<summary>actions/setup-go (actions/setup-go)</summary>

###
[`v6.2.0`](https://redirect.github.com/actions/setup-go/releases/tag/v6.2.0)

[Compare
Source](https://redirect.github.com/actions/setup-go/compare/v6.1.0...v6.2.0)

##### What's Changed

##### Enhancements

- Example for restore-only cache in documentation by
[@&#8203;aparnajyothi-y](https://redirect.github.com/aparnajyothi-y) in
[#&#8203;696](https://redirect.github.com/actions/setup-go/pull/696)
- Update Node.js version in action.yml by
[@&#8203;ccoVeille](https://redirect.github.com/ccoVeille) in
[#&#8203;691](https://redirect.github.com/actions/setup-go/pull/691)
- Documentation update of actions/checkout by
[@&#8203;deining](https://redirect.github.com/deining) in
[#&#8203;683](https://redirect.github.com/actions/setup-go/pull/683)

##### Dependency updates

- Upgrade js-yaml from 3.14.1 to 3.14.2 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[#&#8203;682](https://redirect.github.com/actions/setup-go/pull/682)
- Upgrade
[@&#8203;actions/cache](https://redirect.github.com/actions/cache) to v5
by [@&#8203;salmanmkc](https://redirect.github.com/salmanmkc) in
[#&#8203;695](https://redirect.github.com/actions/setup-go/pull/695)
- Upgrade actions/checkout from 5 to 6 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[#&#8203;686](https://redirect.github.com/actions/setup-go/pull/686)
- Upgrade qs from 6.14.0 to 6.14.1 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[#&#8203;703](https://redirect.github.com/actions/setup-go/pull/703)

##### New Contributors

- [@&#8203;ccoVeille](https://redirect.github.com/ccoVeille) made their
first contribution in
[#&#8203;691](https://redirect.github.com/actions/setup-go/pull/691)
- [@&#8203;deining](https://redirect.github.com/deining) made their
first contribution in
[#&#8203;683](https://redirect.github.com/actions/setup-go/pull/683)

**Full Changelog**:
<https://github.com/actions/setup-go/compare/v6...v6.2.0>

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0Mi43NC41IiwidXBkYXRlZEluVmVyIjoiNDIuOTIuMSIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOltdfQ==-->

Co-authored-by: Averi Kitsch <akitsch@google.com>
2026-01-23 20:41:19 +00:00
mahlevanshika
1d7c498116 fix(dataplex): Capture GCP HTTP errors in MCP Toolbox (#2347)
### Description

fix: Surface Dataplex API errors in MCP results

This change addresses issue
https://github.com/googleapis/genai-toolbox/issues/2203, where Dataplex
API errors, such as '403 Forbidden' (Permission Denied), were not being
properly surfaced in the MCP tool results. Previously, these critical
API errors would manifest as generic "connection interrupted" messages,
significantly hindering developer debugging and trust in the Toolbox.

The fix enhances the error handling within the `dataplexsearchentries`
and `dataplexsearchaspecttypes` tools. When an error occurs during the
iteration of Dataplex API results, the system now:

* Utilizes `google.golang.org/grpc/status.FromError` to attempt to convert
the returned error into a gRPC status. This is crucial because Google
Cloud client libraries often return errors compatible with gRPC.
* If the error is a gRPC status, the canonical error code (e.g.,
`codes.PermissionDenied`) and the associated error message are extracted.

This ensures that users receive clear, actionable error feedback,
allowing for quicker diagnosis and resolution of issues like missing IAM
permissions. This aligns with best practices for API error surfacing,
improving the usability and reliability of the Dataplex tools within the
GenAI Toolbox.
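
A minimal Go sketch of the error-surfacing pattern described above (the function name and message prefix are illustrative, not the exact Toolbox code):

```
package example

import (
	"fmt"

	"google.golang.org/grpc/status"
)

// surfaceErr converts an error returned by a Google Cloud client into a
// readable message. If the error carries a gRPC status, the canonical
// code (e.g. PermissionDenied) and message are included; otherwise the
// original error text is returned unchanged.
func surfaceErr(err error) string {
	if err == nil {
		return ""
	}
	if s, ok := status.FromError(err); ok {
		return fmt.Sprintf("unable to search entries: %s: %s", s.Code(), s.Message())
	}
	return err.Error()
}
```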

Fixes https://github.com/googleapis/genai-toolbox/issues/2203



## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #<issue_number_goes_here>

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2026-01-23 19:53:59 +00:00
Yuan Teoh
9294ce39c8 ci: add oceanbase link to lycheeignore (#2360)
oceanbase's link is hitting a `Rejected status code (this depends on your
"accept" configuration): Too Many Requests` error. It might have
temporarily blocked Lychee due to too many requests being sent in a
short time. Adding the link to lycheeignore to unblock the GHA failure.
2026-01-23 18:40:33 +00:00
release-please[bot]
86bf7bf8d0 chore(main): release 0.26.0 (#2286)
🤖 I have created a release *beep* *boop*
---


##
[0.26.0](https://github.com/googleapis/genai-toolbox/compare/v0.25.0...v0.26.0)
(2026-01-22)


### ⚠ BREAKING CHANGES

* Validate tool naming
([#2305](https://github.com/googleapis/genai-toolbox/issues/2305))
([5054212](5054212fa4))
* **tools/cloudgda:** Update description and parameter name for cloudgda
tool ([#2288](https://github.com/googleapis/genai-toolbox/issues/2288))
([6b02591](6b02591703))

### Features

* Add new `user-agent-metadata` flag
([#2302](https://github.com/googleapis/genai-toolbox/issues/2302))
([adc9589](adc9589766))
* Add remaining flag to Toolbox server in MCP registry
([#2272](https://github.com/googleapis/genai-toolbox/issues/2272))
([5e0999e](5e0999ebf5))
* **embeddingModel:** Add embedding model to MCP handler
([#2310](https://github.com/googleapis/genai-toolbox/issues/2310))
([e4f60e5](e4f60e5633))
* **sources/bigquery:** Make maximum rows returned from queries
configurable
([#2262](https://github.com/googleapis/genai-toolbox/issues/2262))
([4abf0c3](4abf0c39e7))
* **prebuilt/cloud-sql:** Add create backup tool for Cloud SQL
([#2141](https://github.com/googleapis/genai-toolbox/issues/2141))
([8e0fb03](8e0fb03483))
* **prebuilt/cloud-sql:** Add restore backup tool for Cloud SQL
([#2171](https://github.com/googleapis/genai-toolbox/issues/2171))
([00c3e6d](00c3e6d8cb))
* Support combining multiple prebuilt configurations
([#2295](https://github.com/googleapis/genai-toolbox/issues/2295))
([e535b37](e535b372ea))
* Support MCP specs version 2025-11-25
([#2303](https://github.com/googleapis/genai-toolbox/issues/2303))
([4d23a3b](4d23a3bbf2))
* **tools:** Add `valueFromParam` support to Tool config
([#2333](https://github.com/googleapis/genai-toolbox/issues/2333))
([15101b1](15101b1edb))


### Bug Fixes

* **tools/cloudhealthcare:** Add check for client authorization before
retrieving token string
([#2327](https://github.com/googleapis/genai-toolbox/issues/2327))
([c25a233](c25a2330fe))

---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2026-01-22 16:22:50 -08:00
Yuan Teoh
ad2893d809 chore: release 0.26.0 (#2355)
Release-As: 0.26.0
2026-01-22 23:40:47 +00:00
dishaprakash
e535b372ea feat: Support combining multiple prebuilt configurations (#2295)
## Description
This PR introduces support for merging multiple prebuilt configurations.
To ensure compatibility, the following restrictions apply:

- No Naming Collisions: Configurations cannot share duplicate names for
any resources (Tools, Sources, Toolsets, Auth Services, etc.).
- Shared Environment Variables: If multiple sources rely on the same
environment variable, they must share the same value; unique values for
the same variable are not supported.

## Usage Examples

### Successful Initialization

You can load multiple prebuilt configurations by either repeating the
`--prebuilt` flag or by providing a comma-separated list.

**Option 1:** Multiple Flags
```
./toolbox --prebuilt alloydb-postgres --prebuilt alloydb-postgres-admin
```

**Option 2:** Comma-Separated Values
```
./toolbox --prebuilt alloydb-postgres,alloydb-postgres-admin
```

### Initialization Failure (Resource Conflict)

If two or more configurations define a resource with the same name (such
as a Tool or Source), the server will fail to start and display a
conflict error.

```
./toolbox --prebuilt alloydb-postgres --prebuilt cloud-sql-mysql
2026-01-13T11:14:50.758121799Z INFO "Using prebuilt tool configurations for: alloydb-postgres, cloud-sql-mysql" 
2026-01-13T11:14:50.764578167Z ERROR "resource conflicts detected:\n  - tool 'execute_sql' (file #2)\n  - tool 'list_active_queries' (file #2)\n  - tool 'get_query_plan' (file #2)\n  - tool 'list_tables' (file #2)\n\nPlease ensure each source, authService, tool, toolset and prompt has a unique name across all files" 
```

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #1855

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2026-01-22 23:00:17 +00:00
Yuan Teoh
5054212fa4 feat!: validate tool naming (#2305)
Adds naming validation for Tools. This validation follows the MCP
SEP986 [naming
guidance](1b1eb60ec4/docs/specification/draft/server/tools.mdx (tool-names)).

Name will be checked before MCP initialization (where specs version is
confirmed). Hence, we will be implementing this across all versions and
endpoints.

This will be a breaking change for users that currently use other
special characters in names (other than `_`, `-`, `.`).
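
For illustration, a minimal sketch of that kind of check, assuming names are limited to alphanumerics plus `_`, `-`, and `.` (the exact rule and error message live in the Toolbox code):

```
package example

import (
	"fmt"
	"regexp"
)

// Assumed pattern per the naming guidance: alphanumerics plus '_', '-'
// and '.' only. The real validation may also enforce a length limit.
var validToolName = regexp.MustCompile(`^[A-Za-z0-9_.-]+$`)

func checkToolName(name string) error {
	if !validToolName.MatchString(name) {
		return fmt.Errorf("invalid tool name %q: only letters, digits, '_', '-' and '.' are allowed", name)
	}
	return nil
}
```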
2026-01-22 21:52:37 +00:00
Wenxin Du
93ca4578da fix(looker): upgrade go sdk to v0.25.22 (#2350) 2026-01-22 13:29:20 -08:00
Yuan Teoh
ec936aed03 chore: update mcp registry title (#2311)
Update title to reflect the full name of Toolbox.
2026-01-22 11:46:51 -08:00
Yuan Teoh
fe69272c84 docs(sources/dgraph): add best effort maintenance notes (#2319)
Update note to state that dgraph is currently under best effort
maintenance.

ref #2318
2026-01-22 10:58:51 -08:00
Wenxin Du
15101b1edb feat(tools): Add valueFromParam support to Tool config (#2333)
This PR introduces a new configuration field `valueFromParam` to the tool
definitions. This feature allows a parameter to automatically inherit
its value from another sibling parameter, mainly to streamline the
configuration of vector insertion tools.

Parameters utilizing `valueFromParam` are excluded from the Tool and MCP
manifests. This means the LLM does not see these parameters and is not
required to generate them. The value is resolved internally by the
Toolbox during execution.
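
As a rough illustration of the described behavior (all names below are hypothetical, not the actual Toolbox implementation), the internal resolution might look something like this:

```
package example

// resolveValueFromParam copies values between sibling parameters before a
// tool is invoked. Keys of valueFrom are hidden parameters; the values
// name the sibling parameter whose value they inherit. Hypothetical
// sketch only.
func resolveValueFromParam(params map[string]any, valueFrom map[string]string) {
	for target, source := range valueFrom {
		if v, ok := params[source]; ok {
			params[target] = v // the LLM never generates target; Toolbox fills it in
		}
	}
}
```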
2026-01-21 16:35:27 -08:00
Wenxin Du
e4f60e5633 fix(embeddingModel): add embedding model to MCP handler (#2310)
- Add embedding model to mcp handlers
- Add integration tests
2026-01-21 00:20:11 +00:00
vaibhavba-google
d7af21bdde tests(cloudhealthcare): use t.Cleanup() instead of defer (#2332)
## Description

Use t.Cleanup() to register cleanup of FHIR and DICOM stores immediately
after creation. This fixes the uncleaned FHIR/DICOM stores that remain
in the project (in the earlier implementation, teardown did not get
triggered if the test failed).
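
A minimal sketch of the pattern, with a generic resource standing in for the FHIR/DICOM stores that the real tests create via the Healthcare API:

```
package example

import "testing"

func TestStoreLifecycle(t *testing.T) {
	// Create the resource, then immediately register its teardown.
	// t.Cleanup runs after the test finishes even when a later step
	// fails, so the store is not left behind the way a teardown call
	// placed at the end of the test body could be.
	store := map[string]string{"id": "toolbox-test-store"} // stands in for a FHIR/DICOM store
	t.Cleanup(func() {
		delete(store, "id") // the real tests delete the store via the API here
	})

	// ... exercise the tool against the store ...
	if store["id"] == "" {
		t.Fatal("store was not created")
	}
}
```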

🛠️ Fixes #1986

---------

Co-authored-by: Yuan Teoh <yuanteoh@google.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2026-01-20 14:58:33 -08:00
Yuan Teoh
adc9589766 feat: add new user-agent-metadata flag (#2302)
## Description

Add a new `--user-agent-metadata` flag that allows users to append
additional user agent metadata. The flag takes in a []string and will
concatenate it with `.`.

```
go run . --user-agent-metadata=foo
```
 produces `0.25.0+dev.darwin.arm64+foo` user agent string

```
go run . --user-agent-metadata=foo,bar
```
produces `0.25.0+dev.darwin.arm64+foo+bar` user agent string

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #<issue_number_goes_here>
2026-01-20 19:23:50 +00:00
Yuan Teoh
c25a2330fe fix: add check for client authorization before retrieving token string (#2327)
Previous refactoring (#2273) accidentally removed the authorization
checks prior to token retrieval. This issue went unnoticed because the
integration tests were disabled. I am re-adding the necessary checks.
2026-01-20 18:57:11 +00:00
Juexin Wang
6e09b08c6a docs(tools/cloudgda): update cloud gda datasource references note (#2326)
## Description

Update the GDA source document to clarify that only `AlloyDbReference`,
`SpannerReference`, and `CloudSqlReference` are supported.

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #2324

---------

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2026-01-16 18:57:06 +00:00
Wenxin Du
1f15a111f1 docs: fix redis array sample (#2301)
The Redis tool code sample is missing the "items" field for the array
parameter, causing confusion.
fix: https://github.com/googleapis/genai-toolbox/issues/2293
2026-01-16 17:08:47 +00:00
Twisha Bansal
dfddeb528d docs: update cloud run connection docs (#2320)
## Description

Partially fixes
https://github.com/googleapis/mcp-toolbox-sdk-python/issues/496

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #<issue_number_goes_here>
2026-01-16 10:05:05 +05:30
Eric Wang
00c3e6d8cb feat(prebuilt/cloud-sql): Add restore backup tool for cloud sql (#2171)
## Description

This pull request adds a new tool, cloud-sql-restore-backup, which
enables restoring a backup onto a Cloud SQL instance from the toolbox
using the Cloud SQL Admin API. The tool supports restoring standard,
project level, and BackupDR backups.

Tested:
<img width="3758" height="532" alt="image"
src="https://github.com/user-attachments/assets/d1d61af7-d96e-417c-898c-65b876de4c5e"
/>


## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #2170

Co-authored-by: Averi Kitsch <akitsch@google.com>
2026-01-16 00:16:46 +00:00
Yuan Teoh
d00b6fdf18 chore: update host validation error to 403 (#2306)
Update error code from 400 to 403 according to MCP
[updates](https://github.com/modelcontextprotocol/modelcontextprotocol/pull/1439)
for invalid origin header.

Also updated hostCheck to only check host, not port.

To test, run Toolbox with the following (also works with a port number, e.g.
`--allowed-hosts=127.0.0.1:5000`):
```
go run . --allowed-hosts=127.0.0.1 
```

Test with the following:
```
// curl successfully
curl -H "Host: 127.0.0.1:5000" http://127.0.0.1:5000

// curl successfully
curl -H "Host: 127.0.0.1:3000" http://127.0.0.1:5000

// will show Invalid Host Header error
curl -H "Host: attacker:5000" http://127.0.0.1:5000
```
2026-01-15 21:09:40 +00:00
Yuan Teoh
4d23a3bbf2 feat: add new v20251125 version (#2303)
Add new `v20251125` specs for MCP.
https://modelcontextprotocol.io/specification/2025-11-25
2026-01-15 20:14:11 +00:00
Yuan Teoh
5e0999ebf5 feat: add remaining toolbox server flag (#2272)
Add remaining CLI flags for the server published on official mcp
registry.

ref: https://googleapis.github.io/genai-toolbox/reference/cli/

_note: the mcp registry does not support shorthand flags (there is no option
to define an alternate name). The only way is to define them as
separate named arguments, but that may not work well since both would try
to set the same underlying value._
2026-01-15 19:30:40 +00:00
Juexin Wang
6b02591703 refactor(tools/cloudgda)!: update description and parameter name for cloudgda tool (#2288)
- Refactors the `cloud-gemini-data-analytics-query` tool to update its
default description with detailed tool and usage guidance.
- Appends the default description to the tools.yaml description regardless
of whether a tools.yaml description exists, since this guidance will
always be useful to the agent on how to use the tool.
- Renames the input parameter from `prompt` to `query` for better
consistency.
2026-01-14 23:54:43 +00:00
Eric Wang
8e0fb03483 feat(prebuilt/cloud-sql): Add create backup tool for Cloud SQL (#2141)
## Description

This pull request adds a new tool, cloud-sql-create-backup, which
enables taking a backup on a Cloud SQL instance from the toolbox using
the Cloud SQL Admin API. The tool supports optionally supplying a
location or description for the backup.

Tested:
<img width="1561" height="425" alt="image"
src="https://github.com/user-attachments/assets/c8984b07-5450-470a-9ac6-df16943e25e9"
/>


## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #2140

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2026-01-14 22:55:11 +00:00
Wenxin Du
2817dd1e5d docs: Add PR guidelines in 'CONTRIBUTING.md' (#2304)
Add PR guidelines - some best practices
2026-01-14 19:08:56 +00:00
Giuseppe Villani
68a218407e docs: add quickstart guide for MCP with Neo4j (#1774)
## Description

Samples for MCP with Neo4j for this page:
https://googleapis.github.io/genai-toolbox/samples/

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involve a breaking change

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2026-01-14 01:57:10 +00:00
Sahaja Reddy Pabbathi Reddy
d69792d843 chore: update dataplex aspecttypes integration tests (#2193)
## Description

Addresses an issue where Dataplex AspectTypes created during integration
tests were not consistently deleted. This accumulation led to exceeding
the AspectType quota.
To prevent this, the test setup now includes a step to list and delete
all existing AspectTypes within the test project and location *before*
attempting to create any new ones. This ensures a clean state for each
test run and avoids hitting the quota.


## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #2057

Co-authored-by: Averi Kitsch <akitsch@google.com>
2026-01-13 08:24:33 -08:00
Yuan Teoh
647b04d3a7 docs(tools/alloydbainl): only require psv installation when needed (#2283)
Update existing docs to install PSV extensions **only** when needed.
This tool can be used without installing PSV if NLConfigParam is not
used.

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2026-01-12 16:57:59 -08:00
Yuan Teoh
030df9766f refactor(sources/looker): move source implementation in Invoke() function into Source (#2278)
Move source-related queries from `Invoke()` function into Source.

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of the roles for Tools vs Source.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation
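
A hypothetical Go sketch of that split (the interface and type names below are illustrative, not the actual Toolbox code):

```
package example

import (
	"context"
	"fmt"
)

// The Source owns the actual query execution; the Tool only resolves
// parameters, retrieves the Source, and delegates to it.
type sqlExecutor interface {
	Execute(ctx context.Context, statement string, params map[string]any) (any, error)
}

type exampleTool struct {
	sourceName string
	statement  string
	source     any
}

func (t *exampleTool) Invoke(ctx context.Context, params map[string]any) (any, error) {
	src, ok := t.source.(sqlExecutor) // retrieve the Source and check the interface
	if !ok {
		return nil, fmt.Errorf("source %q does not implement the required interface", t.sourceName)
	}
	// pre-implementation steps (e.g. template parameters) would be resolved here
	return src.Execute(ctx, t.statement, params)
}
```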
2026-01-12 23:53:33 +00:00
Yuan Teoh
5dbf207162 refactor(sources/mongodb): move source implementation in Invoke() function into Source (#2277)
Move source-related queries from `Invoke()` function into Source.

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of the roles for Tools vs Source.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation
2026-01-12 15:25:39 -08:00
洪鈞閔 ( jasper )
9c3720e31d docs(README.md): fix container section version 0.11.0 => 0.24.0 (#2251)
## Description

> Should include a concise description of the changes (bug or feature),
it's
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #<issue_number_goes_here>

Co-authored-by: Averi Kitsch <akitsch@google.com>
2026-01-12 20:17:15 +00:00
Yuan Teoh
3cd3c39d66 refactor(sources/firestore): move source implementation in Invoke() function into Source (#2275)
Move source-related queries from `Invoke()` function into Source.

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of the roles for Tools vs Source.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation
2026-01-12 18:43:35 +00:00
Yuan Teoh
0691a6f715 refactor: move source implementation in Invoke() function to Source (#2274)
Move source-related queries from `Invoke()` function into Source.

This PR addresses the following sources:
* dataplex
* http
* serverlessspark

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of the roles for Tools vs Source.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation
2026-01-12 18:16:32 +00:00
Yuan Teoh
467b96a23b refactor(sources/cloudhealthcare): move source implementation in Invoke() function to Source (#2273)
Move source-related queries from `Invoke()` function into Source.

This is an effort to generalize tools to work with any Source that
implements a specific interface. This will provide a better segregation
of the roles for Tools vs Source.

Tool's role will be limited to the following:
* Resolve any pre-implementation steps or parameters (e.g. template
parameters)
* Retrieving Source
* Calling the source's implementation
2026-01-12 17:51:58 +00:00
Shobhit Singh
4abf0c39e7 feat(bigquery): make maximum rows returned from queries configurable (#2262)
This change allows the agent developer to control the maximum number of
rows returned from tools running BigQuery SQL queries. Using this feature,
the agent developer can limit how large an output is presented to the LLM in
an agentic user journey.
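
A rough Go sketch of the idea using the standard BigQuery client (the configuration field name and the wiring into Toolbox are not shown; only the row-capping loop is illustrated):

```
package example

import (
	"context"

	"cloud.google.com/go/bigquery"
	"google.golang.org/api/iterator"
)

// readRows reads query results and stops once a configured maximum is
// reached, so an unbounded result set is never handed to the LLM.
func readRows(ctx context.Context, q *bigquery.Query, maxRows int) ([][]bigquery.Value, error) {
	it, err := q.Read(ctx)
	if err != nil {
		return nil, err
	}
	var out [][]bigquery.Value
	for len(out) < maxRows {
		var row []bigquery.Value
		err := it.Next(&row)
		if err == iterator.Done {
			break
		}
		if err != nil {
			return nil, err
		}
		out = append(out, row)
	}
	return out, nil
}
```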

## Description

> Should include a concise description of the changes (bug or feature),
it's
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue
https://github.com/googleapis/genai-toolbox/issues/2261
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #2261
2026-01-09 20:43:46 +00:00
Yuan Teoh
dd7b9de623 docs: add issue and pr triaging and SLO (#2257)
## Description

update docs to reflect triaging workflow and SLO

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #<issue_number_goes_here>
2026-01-09 19:21:41 +00:00
release-please[bot]
41b518b955 chore(main): release 0.25.0 (#2218)
🤖 I have created a release *beep* *boop*
---


##
[0.25.0](https://github.com/googleapis/genai-toolbox/compare/v0.24.0...v0.25.0)
(2026-01-08)


### Features

* Add `embeddingModel` support
([#2121](https://github.com/googleapis/genai-toolbox/issues/2121))
([9c62f31](9c62f313ff))
* Add `allowed-hosts` flag
([#2254](https://github.com/googleapis/genai-toolbox/issues/2254))
([17b41f6](17b41f6453))
* Add parameter default value to manifest
([#2264](https://github.com/googleapis/genai-toolbox/issues/2264))
([9d1feca](9d1feca108))
* **snowflake:** Add Snowflake Source and Tools
([#858](https://github.com/googleapis/genai-toolbox/issues/858))
([b706b5b](b706b5bc68))
* **prebuilt/cloud-sql-mysql:** Update CSQL MySQL prebuilt tools to use
IAM ([#2202](https://github.com/googleapis/genai-toolbox/issues/2202))
([731a32e](731a32e536))
* **sources/bigquery:** Make credentials scope configurable
([#2210](https://github.com/googleapis/genai-toolbox/issues/2210))
([a450600](a4506009b9))
* **sources/trino:** Add ssl verification options and fix docs example
([#2155](https://github.com/googleapis/genai-toolbox/issues/2155))
([4a4cf1e](4a4cf1e712))
* **tools/looker:** Add ability to set destination folder with
`make_look` and `make_dashboard`.
([#2245](https://github.com/googleapis/genai-toolbox/issues/2245))
([eb79339](eb793398cd))
* **tools/postgressql:** Add tool to list store procedure
([#2156](https://github.com/googleapis/genai-toolbox/issues/2156))
([cf0fc51](cf0fc515b5))
* **tools/postgressql:** Add Parameter `embeddedBy` config support
([#2151](https://github.com/googleapis/genai-toolbox/issues/2151))
([17b70cc](17b70ccaa7))


### Bug Fixes

* **server:** Add `embeddingModel` config initialization
([#2281](https://github.com/googleapis/genai-toolbox/issues/2281))
([a779975](a7799757c9))
* **sources/cloudgda:** Add import for cloudgda source
([#2217](https://github.com/googleapis/genai-toolbox/issues/2217))
([7daa411](7daa4111f4))
* **tools/alloydb-wait-for-operation:** Fix connection message
generation
([#2228](https://github.com/googleapis/genai-toolbox/issues/2228))
([7053fbb](7053fbb195))
* **tools/alloydbainl:** Only add psv when NL Config Param is defined
([#2265](https://github.com/googleapis/genai-toolbox/issues/2265))
([ef8f3b0](ef8f3b02f2))
* **tools/looker:** Looker client OAuth nil pointer error
([#2231](https://github.com/googleapis/genai-toolbox/issues/2231))
([268700b](268700bdbf))

---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2026-01-08 14:56:45 -08:00
Yuan Teoh
6e8a9eb8ec chore: update hugo for release (#2282)
Update hugo version for release v0.25.0
2026-01-08 22:18:59 +00:00
Yuan Teoh
ef8f3b02f2 fix(tools/alloydbainl): only add psv when NL Config Param is defined (#2265)
## Description

PSV should only be required when it is needed. Currently, we
require PSV whenever a user uses the AlloyDB AI NL tool. This is due to
the statement that we use to execute the NL query.

This PR modifies the statement query to only utilize `param_names` and
`param_values` when needed.

Manually tested with a DB that does not have PSV installed.

🛠️ Fixes #1970
2026-01-08 21:52:05 +00:00
Yuan Teoh
351b007fe3 chore: update mcp registry schema version (#2266) 2026-01-08 13:30:35 -08:00
Yuan Teoh
9d1feca108 feat: add default value to manifest (#2264)
## Description

Add default value to manifest (for both native endpoint and mcp
endpoint).

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #1602
2026-01-08 20:53:45 +00:00
Yuan Teoh
17b41f6453 feat: add allowed-hosts flag (#2254)
## Description

The previously added `allowed-origins` flag (for CORS) is not sufficient for
preventing DNS rebinding attacks. We'll have to check host headers as well.

To test, run Toolbox with the following:
```
go run . --allowed-hosts=127.0.0.1:5000
```

Test with the following:
```
// curl successfully
curl -H "Host: 127.0.0.1:5000" http://127.0.0.1:5000

// will show Invalid Host Header error
curl -H "Host: attacker:5000" http://127.0.0.1:5000
```

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #<issue_number_goes_here>
2026-01-08 19:42:54 +00:00
Wenxin Du
a7799757c9 fix(server): Add embeddingModel config initialization (#2281)
Embedding models were only loaded on hot reload because they were not
initialized properly.
2026-01-08 13:58:39 -05:00
Twisha Bansal
d961e373e1 docs: link medium blogs to toolbox docsite (#2269)
## Description

Adds a section in the navbar that links to the toolbox medium blog: 
<img width="492" height="822" alt="87F2yTQdcbpMHs3"
src="https://github.com/user-attachments/assets/74d8b552-1e8f-449c-8b09-4f86218d2817"
/>


## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involve a breaking change

🛠️ Fixes #<issue_number_goes_here>

---------

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2026-01-08 10:49:55 +00:00
Mend Renovate
bcb40a720d chore(deps): update pip (#2270)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [google-genai](https://redirect.github.com/googleapis/python-genai) | `==1.56.0` → `==1.57.0` | ![age](https://developer.mend.io/api/mc/badges/age/pypi/google-genai/1.57.0?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/google-genai/1.56.0/1.57.0?slim=true) |
| [langchain](https://redirect.github.com/langchain-ai/langchain) ([source](https://redirect.github.com/langchain-ai/langchain/tree/HEAD/libs/langchain), [changelog](https://redirect.github.com/langchain-ai/langchain/releases?q=tag%3A%22langchain%3D%3D1%22)) | `==1.2.1` → `==1.2.2` | ![age](https://developer.mend.io/api/mc/badges/age/pypi/langchain/1.2.2?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/langchain/1.2.1/1.2.2?slim=true) |

---

### Release Notes

<details>
<summary>googleapis/python-genai (google-genai)</summary>

### [`v1.57.0`](https://redirect.github.com/googleapis/python-genai/blob/HEAD/CHANGELOG.md#1570-2026-01-07)

[Compare Source](https://redirect.github.com/googleapis/python-genai/compare/v1.56.0...v1.57.0)

##### Features

- \[Python] add RegisterFiles so gcs files can be used with genai.
([68fa075](68fa075429))
- Add gemini-3-pro-preview support for local tokenizer
([48f8256](48f8256202))
- Add PersonGeneration to ImageConfig for Vertex Gempix
([c66e0ce](c66e0ce16b))

##### Bug Fixes

- Remove validation for empty text parts on Chat, this will support
keeping the history in chat when the API yields back such a part.
([215c852](215c852465))

##### Documentation

- Regenerate docs for 1.56.0
([b4c063e](b4c063e7f2))
- Update `codegen_instructions.md` for Gemini 3 Flash
([22500b5](22500b5ef9))
- Update Virtual Try-On model id in samples and docstrings
([5bf4d62](5bf4d625f3))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0Mi42OS4xIiwidXBkYXRlZEluVmVyIjoiNDIuNjkuMSIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOltdfQ==-->

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2026-01-07 21:50:31 -08:00
gRedHeadphone
4a4cf1e712 feat(sources/trino): add ssl verification options and fix docs example (#2155)
## Description

Adds options such as `disableSslVerification`, `sslCert`, and `sslCertPath` to
the Trino source. Also fixes the `trino-sql` docs example for parameters.
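Purely as illustration, here is a hedged sketch of how these options might appear on a Trino source in the v2 `tools.yaml` format. The three option names come from the description above; every other field name and value is an assumption, not this PR's actual docs example:

```yaml
# Hypothetical sketch: only disableSslVerification, sslCert, and sslCertPath
# are named by this PR; the remaining fields and values are placeholders.
kind: sources
name: my-trino-source
type: trino
host: trino.example.com
port: "8443"
user: trino_user
catalog: my_catalog
schema: my_schema
sslCertPath: /etc/ssl/certs/trino-ca.pem   # path to a CA certificate bundle
disableSslVerification: false              # set to true only for local testing
```

In practice you would likely set either `sslCert` (inline certificate) or `sslCertPath` (a path on disk), and leave `disableSslVerification` at `false` outside local testing.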

## PR Checklist

- [x] Make sure you reviewed
  [CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a
  [bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1910

---------
2026-01-08 01:19:23 +00:00
igor-elbert
b706b5bc68 feat(snowflake): add Snowflake Source and Tools (#858)
Initial version supporting Snowflake: it connects and executes arbitrary
SQL. A rudimentary Python example is provided as well.
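
As a rough illustration only, a hedged sketch of what a Snowflake source and tool could look like in the v2 `tools.yaml` format. The field names mirror the environment variables used by the integration test config later in this compare (account, user, password, database, schema) and the tool type mirrors the new `snowflakesql` package, but none of this is copied from the PR's actual documentation example:

```yaml
# Hypothetical sketch: field names and the snowflake-sql tool type are
# assumptions inferred from this compare, not the PR's documented example.
kind: sources
name: my-snowflake-source
type: snowflake
account: my_account_identifier
user: my_user
password: my_password
database: test
schema: PUBLIC
---
kind: tools
name: list_recent_orders
type: snowflake-sql
source: my-snowflake-source
description: List the ten most recent orders.
statement: SELECT * FROM orders ORDER BY created_at DESC LIMIT 10;
```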

---------

Co-authored-by: duwenxin <duwenxin@google.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2026-01-07 19:02:20 -05:00
Wenxin Du
4d3332d37d refactor: add configurable result options for ExecuteSql tests (#2271)
Adds WithExecuteCreateWant, WithExecuteDropWant, and
WithExecuteSelectEmptyWant to RunExecuteSqlToolInvokeTest to allow
sources like Snowflake to validate specific DDL/DML status responses
instead of defaulting to null.
2026-01-07 22:30:41 +00:00
dependabot[bot]
1203b7370a chore(deps): bump jws from 4.0.0 to 4.0.1 in /docs/en/getting-started/quickstart/js/llamaindex (#2260)
Bumps [jws](https://github.com/brianloveswords/node-jws) from 4.0.0 to
4.0.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/brianloveswords/node-jws/releases">jws's
releases</a>.</em></p>
<blockquote>
<h2>v4.0.1</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 2.0.1, addressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/auth0/node-jws/blob/master/CHANGELOG.md">jws's
changelog</a>.</em></p>
<blockquote>
<h2>[4.0.1]</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 2.0.1, addressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
<h2>[3.2.3]</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 1.4.2, addressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
<h2>[3.0.0]</h2>
<h3>Changed</h3>
<ul>
<li><strong>BREAKING</strong>: <code>jwt.verify</code> now requires an
<code>algorithm</code> parameter, and
<code>jws.createVerify</code> requires an <code>algorithm</code> option.
The <code>&quot;alg&quot;</code> field
signature headers is ignored. This mitigates a critical security flaw
in the library which would allow an attacker to generate signatures with
arbitrary contents that would be accepted by <code>jwt.verify</code>.
See
<a
href="https://auth0.com/blog/2015/03/31/critical-vulnerabilities-in-json-web-token-libraries/">https://auth0.com/blog/2015/03/31/critical-vulnerabilities-in-json-web-token-libraries/</a>
for details.</li>
</ul>
<h2><a
href="https://github.com/brianloveswords/node-jws/compare/v1.0.1...v2.0.0">2.0.0</a>
- 2015-01-30</h2>
<h3>Changed</h3>
<ul>
<li>
<p><strong>BREAKING</strong>: Default payload encoding changed from
<code>binary</code> to
<code>utf8</code>. <code>utf8</code> is a more sensible default
than <code>binary</code> because
many payloads, as far as I can tell, will contain user-facing
strings that could be in any language. (<!-- raw HTML omitted
-->[6b6de48]<!-- raw HTML omitted -->)</p>
</li>
<li>
<p>Code reorganization, thanks [<a
href="https://github.com/fearphage"><code>@​fearphage</code></a>]! (<!--
raw HTML omitted --><a
href="https://github.com/brianloveswords/node-jws/commit/7880050">7880050</a><!--
raw HTML omitted -->)</p>
</li>
</ul>
<h3>Added</h3>
<ul>
<li>Option in all relevant methods for <code>encoding</code>. For those
few users
that might be depending on a <code>binary</code> encoding of the
messages, this
is for them. (<!-- raw HTML omitted -->[6b6de48]<!-- raw HTML omitted
-->)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="34c45b2c04"><code>34c45b2</code></a>
Merge commit from fork</li>
<li><a
href="49bc39b1f5"><code>49bc39b</code></a>
version 4.0.1</li>
<li><a
href="d42350ccab"><code>d42350c</code></a>
Enhance tests for HMAC streaming sign and verify</li>
<li><a
href="5cb007cf82"><code>5cb007c</code></a>
Improve secretOrKey initialization in VerifyStream</li>
<li><a
href="f9a2e1c8c6"><code>f9a2e1c</code></a>
Improve secret handling in SignStream</li>
<li><a
href="b9fb8d30e9"><code>b9fb8d3</code></a>
Merge pull request <a
href="https://redirect.github.com/brianloveswords/node-jws/issues/102">#102</a>
from auth0/SRE-57-Upload-opslevel-yaml</li>
<li><a
href="95b75ee56c"><code>95b75ee</code></a>
Upload OpsLevel YAML</li>
<li><a
href="8857ee7762"><code>8857ee7</code></a>
test: remove unused variable (<a
href="https://redirect.github.com/brianloveswords/node-jws/issues/96">#96</a>)</li>
<li>See full diff in <a
href="https://github.com/brianloveswords/node-jws/compare/v4.0.0...v4.0.1">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~julien.wollscheid">julien.wollscheid</a>, a
new releaser for jws since your current version.</p>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=jws&package-manager=npm_and_yarn&previous-version=4.0.0&new-version=4.0.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2026-01-07 21:37:47 +00:00
dependabot[bot]
4a26ce3c1b chore(deps): bump jws from 4.0.0 to 4.0.1 in /docs/en/getting-started/quickstart/js/genAI (#2259)
Bumps [jws](https://github.com/brianloveswords/node-jws) from 4.0.0 to
4.0.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/brianloveswords/node-jws/releases">jws's
releases</a>.</em></p>
<blockquote>
<h2>v4.0.1</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 2.0.1, addressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/auth0/node-jws/blob/master/CHANGELOG.md">jws's
changelog</a>.</em></p>
<blockquote>
<h2>[4.0.1]</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 2.0.1, addressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
<h2>[3.2.3]</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 1.4.2, addressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
<h2>[3.0.0]</h2>
<h3>Changed</h3>
<ul>
<li><strong>BREAKING</strong>: <code>jwt.verify</code> now requires an
<code>algorithm</code> parameter, and
<code>jws.createVerify</code> requires an <code>algorithm</code> option.
The <code>&quot;alg&quot;</code> field
signature headers is ignored. This mitigates a critical security flaw
in the library which would allow an attacker to generate signatures with
arbitrary contents that would be accepted by <code>jwt.verify</code>.
See
<a
href="https://auth0.com/blog/2015/03/31/critical-vulnerabilities-in-json-web-token-libraries/">https://auth0.com/blog/2015/03/31/critical-vulnerabilities-in-json-web-token-libraries/</a>
for details.</li>
</ul>
<h2><a
href="https://github.com/brianloveswords/node-jws/compare/v1.0.1...v2.0.0">2.0.0</a>
- 2015-01-30</h2>
<h3>Changed</h3>
<ul>
<li>
<p><strong>BREAKING</strong>: Default payload encoding changed from
<code>binary</code> to
<code>utf8</code>. <code>utf8</code> is a more sensible default
than <code>binary</code> because
many payloads, as far as I can tell, will contain user-facing
strings that could be in any language. (<!-- raw HTML omitted
-->[6b6de48]<!-- raw HTML omitted -->)</p>
</li>
<li>
<p>Code reorganization, thanks [<a
href="https://github.com/fearphage"><code>@​fearphage</code></a>]! (<!--
raw HTML omitted --><a
href="https://github.com/brianloveswords/node-jws/commit/7880050">7880050</a><!--
raw HTML omitted -->)</p>
</li>
</ul>
<h3>Added</h3>
<ul>
<li>Option in all relevant methods for <code>encoding</code>. For those
few users
that might be depending on a <code>binary</code> encoding of the
messages, this
is for them. (<!-- raw HTML omitted -->[6b6de48]<!-- raw HTML omitted
-->)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="34c45b2c04"><code>34c45b2</code></a>
Merge commit from fork</li>
<li><a
href="49bc39b1f5"><code>49bc39b</code></a>
version 4.0.1</li>
<li><a
href="d42350ccab"><code>d42350c</code></a>
Enhance tests for HMAC streaming sign and verify</li>
<li><a
href="5cb007cf82"><code>5cb007c</code></a>
Improve secretOrKey initialization in VerifyStream</li>
<li><a
href="f9a2e1c8c6"><code>f9a2e1c</code></a>
Improve secret handling in SignStream</li>
<li><a
href="b9fb8d30e9"><code>b9fb8d3</code></a>
Merge pull request <a
href="https://redirect.github.com/brianloveswords/node-jws/issues/102">#102</a>
from auth0/SRE-57-Upload-opslevel-yaml</li>
<li><a
href="95b75ee56c"><code>95b75ee</code></a>
Upload OpsLevel YAML</li>
<li><a
href="8857ee7762"><code>8857ee7</code></a>
test: remove unused variable (<a
href="https://redirect.github.com/brianloveswords/node-jws/issues/96">#96</a>)</li>
<li>See full diff in <a
href="https://github.com/brianloveswords/node-jws/compare/v4.0.0...v4.0.1">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~julien.wollscheid">julien.wollscheid</a>, a
new releaser for jws since your current version.</p>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=jws&package-manager=npm_and_yarn&previous-version=4.0.0&new-version=4.0.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2026-01-07 13:17:26 -08:00
Mend Renovate
306b5becda chore(deps): update pip (#2258)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [langchain](https://redirect.github.com/langchain-ai/langchain) ([source](https://redirect.github.com/langchain-ai/langchain/tree/HEAD/libs/langchain), [changelog](https://redirect.github.com/langchain-ai/langchain/releases?q=tag%3A%22langchain%3D%3D1%22)) | `==1.2.0` → `==1.2.1` | ![age](https://developer.mend.io/api/mc/badges/age/pypi/langchain/1.2.1?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/langchain/1.2.0/1.2.1?slim=true) |
| [langchain-google-vertexai](https://redirect.github.com/langchain-ai/langchain-google) ([source](https://redirect.github.com/langchain-ai/langchain-google/tree/HEAD/libs/vertexai), [changelog](https://redirect.github.com/langchain-ai/langchain-google/releases?q=%22vertexai%22)) | `==3.2.0` → `==3.2.1` | ![age](https://developer.mend.io/api/mc/badges/age/pypi/langchain-google-vertexai/3.2.1?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/langchain-google-vertexai/3.2.0/3.2.1?slim=true) |

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0Mi42OS4xIiwidXBkYXRlZEluVmVyIjoiNDIuNjkuMSIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOltdfQ==-->

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2026-01-07 10:31:03 -08:00
865 changed files with 22786 additions and 17439 deletions

View File

@@ -87,7 +87,7 @@ steps:
 - "CLOUD_SQL_POSTGRES_REGION=$_REGION"
 - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
 secretEnv:
-["CLOUD_SQL_POSTGRES_USER", "CLOUD_SQL_POSTGRES_PASS", "CLIENT_ID"]
+["CLOUD_SQL_POSTGRES_USER", "CLOUD_SQL_POSTGRES_PASS", "CLIENT_ID", "API_KEY"]
 volumes:
 - name: "go"
 path: "/gopath"
@@ -134,7 +134,7 @@ steps:
 - "ALLOYDB_POSTGRES_DATABASE=$_DATABASE_NAME"
 - "ALLOYDB_POSTGRES_REGION=$_REGION"
 - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
-secretEnv: ["ALLOYDB_POSTGRES_USER", "ALLOYDB_POSTGRES_PASS", "CLIENT_ID"]
+secretEnv: ["ALLOYDB_POSTGRES_USER", "ALLOYDB_POSTGRES_PASS", "CLIENT_ID", "API_KEY"]
 volumes:
 - name: "go"
 path: "/gopath"
@@ -293,7 +293,7 @@ steps:
 .ci/test_with_coverage.sh \
 "Cloud Healthcare API" \
 cloudhealthcare \
-cloudhealthcare || echo "Integration tests failed."
+cloudhealthcare
 - id: "postgres"
 name: golang:1
@@ -305,7 +305,7 @@ steps:
 - "POSTGRES_HOST=$_POSTGRES_HOST"
 - "POSTGRES_PORT=$_POSTGRES_PORT"
 - "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
-secretEnv: ["POSTGRES_USER", "POSTGRES_PASS", "CLIENT_ID"]
+secretEnv: ["POSTGRES_USER", "POSTGRES_PASS", "CLIENT_ID", "API_KEY"]
 volumes:
 - name: "go"
 path: "/gopath"
@@ -825,6 +825,26 @@ steps:
 elasticsearch \
 elasticsearch
+- id: "snowflake"
+name: golang:1
+waitFor: ["compile-test-binary"]
+entrypoint: /bin/bash
+env:
+- "GOPATH=/gopath"
+- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
+- "SNOWFLAKE_DATABASE=$_SNOWFLAKE_DATABASE"
+- "SNOWFLAKE_SCHEMA=$_SNOWFLAKE_SCHEMA"
+secretEnv: ["CLIENT_ID", "SNOWFLAKE_USER", "SNOWFLAKE_PASS", "SNOWFLAKE_ACCOUNT"]
+volumes:
+- name: "go"
+path: "/gopath"
+args:
+- -c
+- |
+.ci/test_with_coverage.sh \
+"Snowflake" \
+snowflake \
+snowflake
 - id: "cassandra"
 name: golang:1
@@ -944,6 +964,13 @@ steps:
 availableSecrets:
 secretManager:
+# Common secrets
+- versionName: projects/$PROJECT_ID/secrets/client_id/versions/latest
+env: CLIENT_ID
+- versionName: projects/$PROJECT_ID/secrets/api_key/versions/latest
+env: API_KEY
+# Resource-specific secrets
 - versionName: projects/$PROJECT_ID/secrets/cloud_sql_pg_user/versions/latest
 env: CLOUD_SQL_POSTGRES_USER
 - versionName: projects/$PROJECT_ID/secrets/cloud_sql_pg_pass/versions/latest
@@ -960,8 +987,6 @@ availableSecrets:
 env: POSTGRES_USER
 - versionName: projects/$PROJECT_ID/secrets/postgres_pass/versions/latest
 env: POSTGRES_PASS
-- versionName: projects/$PROJECT_ID/secrets/client_id/versions/latest
-env: CLIENT_ID
 - versionName: projects/$PROJECT_ID/secrets/neo4j_user/versions/latest
 env: NEO4J_USER
 - versionName: projects/$PROJECT_ID/secrets/neo4j_pass/versions/latest
@@ -1038,6 +1063,12 @@ availableSecrets:
 env: ELASTICSEARCH_USER
 - versionName: projects/$PROJECT_ID/secrets/elastic_search_pass/versions/latest
 env: ELASTICSEARCH_PASS
+- versionName: projects/$PROJECT_ID/secrets/snowflake_account/versions/latest
+env: SNOWFLAKE_ACCOUNT
+- versionName: projects/$PROJECT_ID/secrets/snowflake_user/versions/latest
+env: SNOWFLAKE_USER
+- versionName: projects/$PROJECT_ID/secrets/snowflake_pass/versions/latest
+env: SNOWFLAKE_PASS
 - versionName: projects/$PROJECT_ID/secrets/cassandra_user/versions/latest
 env: CASSANDRA_USER
 - versionName: projects/$PROJECT_ID/secrets/cassandra_pass/versions/latest
@@ -1124,4 +1155,5 @@ substitutions:
 _SINGLESTORE_USER: "root"
 _MARIADB_PORT: "3307"
 _MARIADB_DATABASE: test_database
+_SNOWFLAKE_DATABASE: "test"
+_SNOWFLAKE_SCHEMA: "PUBLIC"

View File

@@ -51,12 +51,12 @@ jobs:
 extended: true
 - name: Setup Node
-uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6
+uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v6
 with:
 node-version: "22"
 - name: Cache dependencies
-uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5
+uses: actions/cache@8b402f58fbc84540c8b491a91e594a4576fec3d7 # v5
 with:
 path: ~/.npm
 key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}

View File

@@ -57,7 +57,7 @@ jobs:
 with:
 hugo-version: "0.145.0"
 extended: true
-- uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6
+- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v6
 with:
 node-version: "22"

View File

@@ -44,7 +44,7 @@ jobs:
 extended: true
 - name: Setup Node
-uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6
+uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v6
 with:
 node-version: "22"

View File

@@ -62,12 +62,12 @@ jobs:
 extended: true
 - name: Setup Node
-uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6
+uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v6
 with:
 node-version: "22"
 - name: Cache dependencies
-uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5
+uses: actions/cache@8b402f58fbc84540c8b491a91e594a4576fec3d7 # v5
 with:
 path: ~/.npm
 key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}

View File

@@ -25,7 +25,7 @@ jobs:
 uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6
 - name: Restore lychee cache
-uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5
+uses: actions/cache@8b402f58fbc84540c8b491a91e594a4576fec3d7 # v5
 with:
 path: .lycheecache
 key: cache-lychee-${{ github.sha }}
@@ -39,6 +39,7 @@ jobs:
 --no-progress
 --cache
 --max-cache-age 1d
---exclude '^neo4j\+.*'
+--exclude '^bolt://.*'
 README.md
 docs/
 output: /tmp/foo.txt

View File

@@ -51,11 +51,11 @@ jobs:
 console.log('Failed to remove label. Another job may have already removed it!');
 }
 - name: Setup Go
-uses: actions/setup-go@4dc6199c7b1a012772edbd06daecab0f50c9053c # v6.1.0
+uses: actions/setup-go@7a3fe6cf4cb3a834922a1244abfce67bcef6a0c5 # v6.2.0
 with:
 go-version: "1.25"
 - name: Checkout code
-uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
 with:
 ref: ${{ github.event.pull_request.head.sha }}
 repository: ${{ github.event.pull_request.head.repo.full_name }}

View File

@@ -29,7 +29,7 @@ jobs:
 issues: 'write'
 pull-requests: 'write'
 steps:
-- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
 - uses: micnncim/action-label-syncer@3abd5ab72fda571e69fffd97bd4e0033dd5f495c # v1.3.0
 env:
 GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

View File

@@ -57,12 +57,12 @@ jobs:
 }
 - name: Setup Go
-uses: actions/setup-go@4dc6199c7b1a012772edbd06daecab0f50c9053c # v6.1.0
+uses: actions/setup-go@7a3fe6cf4cb3a834922a1244abfce67bcef6a0c5 # v6.2.0
 with:
 go-version: "1.24"
 - name: Checkout code
-uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
 with:
 ref: ${{ github.event.pull_request.head.sha }}
 repository: ${{ github.event.pull_request.head.repo.full_name }}

View File

@@ -51,6 +51,14 @@ ignoreFiles = ["quickstart/shared", "quickstart/python", "quickstart/js", "quick
 # Add a new version block here before every release
 # The order of versions in this file is mirrored into the dropdown
+[[params.versions]]
+version = "v0.26.0"
+url = "https://googleapis.github.io/genai-toolbox/v0.26.0/"
+[[params.versions]]
+version = "v0.25.0"
+url = "https://googleapis.github.io/genai-toolbox/v0.25.0/"
 [[params.versions]]
 version = "v0.24.0"
 url = "https://googleapis.github.io/genai-toolbox/v0.24.0/"

View File

@@ -39,7 +39,7 @@ https://dev.mysql.com/doc/refman/8.4/en/user-names.html
 # npmjs links can occasionally trigger rate limiting during high-frequency CI builds
 https://www.npmjs.com/package/@toolbox-sdk/core
 https://www.npmjs.com/package/@toolbox-sdk/adk
+https://www.oceanbase.com/
 # Ignore social media and blog profiles to reduce external request overhead
 https://medium.com/@mcp_toolbox

View File

@@ -1,5 +1,55 @@
 # Changelog
+## [0.26.0](https://github.com/googleapis/genai-toolbox/compare/v0.25.0...v0.26.0) (2026-01-22)
+### ⚠ BREAKING CHANGES
+* Validate tool naming ([#2305](https://github.com/googleapis/genai-toolbox/issues/2305)) ([5054212](https://github.com/googleapis/genai-toolbox/commit/5054212fa43017207fe83275d27b9fbab96e8ab5))
+* **tools/cloudgda:** Update description and parameter name for cloudgda tool ([#2288](https://github.com/googleapis/genai-toolbox/issues/2288)) ([6b02591](https://github.com/googleapis/genai-toolbox/commit/6b025917032394a66840488259db8ff2c3063016))
+### Features
+* Add new `user-agent-metadata` flag ([#2302](https://github.com/googleapis/genai-toolbox/issues/2302)) ([adc9589](https://github.com/googleapis/genai-toolbox/commit/adc9589766904d9e3cbe0a6399222f8d4bb9d0cc))
+* Add remaining flag to Toolbox server in MCP registry ([#2272](https://github.com/googleapis/genai-toolbox/issues/2272)) ([5e0999e](https://github.com/googleapis/genai-toolbox/commit/5e0999ebf5cdd9046e96857738254b2e0561b6d2))
+* **embeddingModel:** Add embedding model to MCP handler ([#2310](https://github.com/googleapis/genai-toolbox/issues/2310)) ([e4f60e5](https://github.com/googleapis/genai-toolbox/commit/e4f60e56335b755ef55b9553d3f40b31858ec8d9))
+* **sources/bigquery:** Make maximum rows returned from queries configurable ([#2262](https://github.com/googleapis/genai-toolbox/issues/2262)) ([4abf0c3](https://github.com/googleapis/genai-toolbox/commit/4abf0c39e717d53b22cc61efb65e09928c598236))
+* **prebuilt/cloud-sql:** Add create backup tool for Cloud SQL ([#2141](https://github.com/googleapis/genai-toolbox/issues/2141)) ([8e0fb03](https://github.com/googleapis/genai-toolbox/commit/8e0fb0348315a80f63cb47b3c7204869482448f4))
+* **prebuilt/cloud-sql:** Add restore backup tool for Cloud SQL ([#2171](https://github.com/googleapis/genai-toolbox/issues/2171)) ([00c3e6d](https://github.com/googleapis/genai-toolbox/commit/00c3e6d8cba54e2ab6cb271c7e6b378895df53e1))
+* Support combining multiple prebuilt configurations ([#2295](https://github.com/googleapis/genai-toolbox/issues/2295)) ([e535b37](https://github.com/googleapis/genai-toolbox/commit/e535b372ea81864d644a67135a1b07e4e519b4b4))
+* Support MCP specs version 2025-11-25 ([#2303](https://github.com/googleapis/genai-toolbox/issues/2303)) ([4d23a3b](https://github.com/googleapis/genai-toolbox/commit/4d23a3bbf2797b1f7fe328aeb5789e778121da23))
+* **tools:** Add `valueFromParam` support to Tool config ([#2333](https://github.com/googleapis/genai-toolbox/issues/2333)) ([15101b1](https://github.com/googleapis/genai-toolbox/commit/15101b1edbe2b85a4a5f9f819c23cf83138f4ee1))
+### Bug Fixes
+* **tools/cloudhealthcare:** Add check for client authorization before retrieving token string ([#2327](https://github.com/googleapis/genai-toolbox/issues/2327)) ([c25a233](https://github.com/googleapis/genai-toolbox/commit/c25a2330fea2ac382a398842c9e572e4e19bcb08))
+## [0.25.0](https://github.com/googleapis/genai-toolbox/compare/v0.24.0...v0.25.0) (2026-01-08)
+### Features
+* Add `embeddingModel` support ([#2121](https://github.com/googleapis/genai-toolbox/issues/2121)) ([9c62f31](https://github.com/googleapis/genai-toolbox/commit/9c62f313ff5edf0a3b5b8a3e996eba078fba4095))
+* Add `allowed-hosts` flag ([#2254](https://github.com/googleapis/genai-toolbox/issues/2254)) ([17b41f6](https://github.com/googleapis/genai-toolbox/commit/17b41f64531b8fe417c28ada45d1992ba430dc1b))
+* Add parameter default value to manifest ([#2264](https://github.com/googleapis/genai-toolbox/issues/2264)) ([9d1feca](https://github.com/googleapis/genai-toolbox/commit/9d1feca10810fa42cb4c94a409252f1bd373ee36))
+* **snowflake:** Add Snowflake Source and Tools ([#858](https://github.com/googleapis/genai-toolbox/issues/858)) ([b706b5b](https://github.com/googleapis/genai-toolbox/commit/b706b5bc685aeda277f277868bae77d38d5fd7b6))
+* **prebuilt/cloud-sql-mysql:** Update CSQL MySQL prebuilt tools to use IAM ([#2202](https://github.com/googleapis/genai-toolbox/issues/2202)) ([731a32e](https://github.com/googleapis/genai-toolbox/commit/731a32e5360b4d6862d81fcb27d7127c655679a8))
+* **sources/bigquery:** Make credentials scope configurable ([#2210](https://github.com/googleapis/genai-toolbox/issues/2210)) ([a450600](https://github.com/googleapis/genai-toolbox/commit/a4506009b93771b77fb05ae97044f914967e67ed))
+* **sources/trino:** Add ssl verification options and fix docs example ([#2155](https://github.com/googleapis/genai-toolbox/issues/2155)) ([4a4cf1e](https://github.com/googleapis/genai-toolbox/commit/4a4cf1e712b671853678dba99c4dc49dd4fc16a2))
+* **tools/looker:** Add ability to set destination folder with `make_look` and `make_dashboard`. ([#2245](https://github.com/googleapis/genai-toolbox/issues/2245)) ([eb79339](https://github.com/googleapis/genai-toolbox/commit/eb793398cd1cc4006d9808ccda5dc7aea5e92bd5))
+* **tools/postgressql:** Add tool to list store procedure ([#2156](https://github.com/googleapis/genai-toolbox/issues/2156)) ([cf0fc51](https://github.com/googleapis/genai-toolbox/commit/cf0fc515b57d9b84770076f3c0c5597c4597ef62))
+* **tools/postgressql:** Add Parameter `embeddedBy` config support ([#2151](https://github.com/googleapis/genai-toolbox/issues/2151)) ([17b70cc](https://github.com/googleapis/genai-toolbox/commit/17b70ccaa754d15bcc33a1a3ecb7e652520fa600))
+### Bug Fixes
+* **server:** Add `embeddingModel` config initialization ([#2281](https://github.com/googleapis/genai-toolbox/issues/2281)) ([a779975](https://github.com/googleapis/genai-toolbox/commit/a7799757c9345f99b6d2717841fbf792d364e1a2))
+* **sources/cloudgda:** Add import for cloudgda source ([#2217](https://github.com/googleapis/genai-toolbox/issues/2217)) ([7daa411](https://github.com/googleapis/genai-toolbox/commit/7daa4111f4ebfb0a35319fd67a8f7b9f0f99efcf))
+* **tools/alloydb-wait-for-operation:** Fix connection message generation ([#2228](https://github.com/googleapis/genai-toolbox/issues/2228)) ([7053fbb](https://github.com/googleapis/genai-toolbox/commit/7053fbb1953653143d39a8510916ea97a91022a6))
+* **tools/alloydbainl:** Only add psv when NL Config Param is defined ([#2265](https://github.com/googleapis/genai-toolbox/issues/2265)) ([ef8f3b0](https://github.com/googleapis/genai-toolbox/commit/ef8f3b02f2f38ce94a6ba9acf35d08b9469bef4e))
+* **tools/looker:** Looker client OAuth nil pointer error ([#2231](https://github.com/googleapis/genai-toolbox/issues/2231)) ([268700b](https://github.com/googleapis/genai-toolbox/commit/268700bdbf8281de0318d60ca613ed3672990b20))
 ## [0.24.0](https://github.com/googleapis/genai-toolbox/compare/v0.23.0...v0.24.0) (2025-12-19)

View File

@@ -59,6 +59,13 @@ You can manually trigger the bot by commenting on your Pull Request:
 * `/gemini summary`: Posts a summary of the changes in the pull request.
 * `/gemini help`: Overview of the available commands
+## Guidelines for Pull Requests
+1. Please keep your PR small for more thorough review and easier updates. In case of regression, it also allows us to roll back a single feature instead of multiple ones.
+1. For non-trivial changes, consider opening an issue and discussing it with the code owners first.
+1. Provide a good PR description as a record of what change is being made and why it was made. Link to a GitHub issue if it exists.
+1. Make sure your code is thoroughly tested with unit tests and integration tests. Remember to clean up the test instances properly in your code to avoid memory leaks.
 ## Adding a New Database Source or Tool
 Please create an
@@ -85,11 +92,11 @@ implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/s
 `newdb.go`. Create a `Config` struct to include all the necessary parameters
 for connecting to the database (e.g., host, port, username, password, database
 name) and a `Source` struct to store necessary parameters for tools (e.g.,
-Name, Kind, connection object, additional config).
+Name, Type, connection object, additional config).
 * **Implement the
 [`SourceConfig`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/sources/sources.go#L57)
 interface**. This interface requires two methods:
-* `SourceConfigKind() string`: Returns a unique string identifier for your
+* `SourceConfigType() string`: Returns a unique string identifier for your
 data source (e.g., `"newdb"`).
 * `Initialize(ctx context.Context, tracer trace.Tracer) (Source, error)`:
 Creates a new instance of your data source and establishes a connection to
@@ -97,7 +104,7 @@ implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/s
 * **Implement the
 [`Source`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/sources/sources.go#L63)
 interface**. This interface requires one method:
-* `SourceKind() string`: Returns the same string identifier as `SourceConfigKind()`.
+* `SourceType() string`: Returns the same string identifier as `SourceConfigType()`.
 * **Implement `init()`** to register the new Source.
 * **Implement Unit Tests** in a file named `newdb_test.go`.
@@ -110,6 +117,8 @@ implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/s
 We recommend looking at an [example tool
 implementation](https://github.com/googleapis/genai-toolbox/tree/main/internal/tools/postgres/postgressql).
+Remember to keep your PRs small. For example, if you are contributing a new Source, only include one or two core Tools within the same PR, the rest of the Tools can come in subsequent PRs.
 * **Create a new directory** under `internal/tools` for your tool type (e.g., `internal/tools/newdb/newdbtool`).
 * **Define a configuration struct** for your tool in a file named `newdbtool.go`.
 Create a `Config` struct and a `Tool` struct to store necessary parameters for
@@ -117,7 +126,7 @@ tools.
 * **Implement the
 [`ToolConfig`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/tools/tools.go#L61)
 interface**. This interface requires one method:
-* `ToolConfigKind() string`: Returns a unique string identifier for your tool
+* `ToolConfigType() string`: Returns a unique string identifier for your tool
 (e.g., `"newdb-tool"`).
 * `Initialize(sources map[string]Source) (Tool, error)`: Creates a new
 instance of your tool and validates that it can connect to the specified
@@ -163,6 +172,8 @@ tools.
 parameters][temp-param-doc]. Only run this test if template
 parameters apply to your tool.
+* **Add additional tests** for the tools that are not covered by the predefined tests. Every tool must be tested!
 * **Add the new database to the integration test workflow** in
 [integration.cloudbuild.yaml](.ci/integration.cloudbuild.yaml).
@@ -232,7 +243,7 @@ resources.
 | style | Update src code, with only formatting and whitespace updates (e.g. code formatter or linter changes). |
 Pull requests should always add scope whenever possible. The scope is
-formatted as `<scope-type>/<scope-kind>` (e.g., `sources/postgres`, or
+formatted as `<scope-resource>/<scope-type>` (e.g., `sources/postgres`, or
 `tools/mssql-sql`).
 Ideally, **each PR covers only one scope**, if this is

View File

@@ -47,12 +47,13 @@ Before you begin, ensure you have the following:
 ### Tool Naming Conventions
 This section details the purpose and conventions for MCP Toolbox's tools naming
-properties, **tool name** and **tool kind**.
+properties, **tool name** and **tool type**.
 ```
-cancel_hotel: <- tool name
-kind: postgres-sql <- tool kind
-source: my_pg_source
+kind: tools
+name: cancel_hotel <- tool name
+type: postgres-sql <- tool type
+source: my_pg_source
 ```
 #### Tool Name
@@ -76,17 +77,17 @@ The following guidelines apply to tool names:
 to a function) until they can be validated through extensive testing to ensure
 they do not negatively impact agent's performances.
-#### Tool Kind
+#### Tool Type
-Tool kind serves as a category or type that a user can assign to a tool.
+Tool type serves as a category or type that a user can assign to a tool.
-The following guidelines apply to tool kinds:
+The following guidelines apply to tool types:
-* Should user hyphens over underscores (e.g. `firestore-list-collections` or
+* Should use hyphens over underscores (e.g. `firestore-list-collections` or
 `firestore_list_colelctions`).
 * Should use product name in name (e.g. `firestore-list-collections` over
 `list-collections`).
-* Changes to tool kind are breaking changes and should be avoided.
+* Changes to tool type are breaking changes and should be avoided.
 ## Testing
@@ -379,6 +380,23 @@ to approve PRs for main. TeamSync is used to create this team from the MDB
 Group `toolbox-contributors`. Googlers who are developing for MCP-Toolbox
 but aren't part of the core team should join this group.
+### Issue/PR Triage and SLO
+After an issue is created, maintainers will assign the following labels:
+* `Priority` (defaulted to P0)
+* `Type` (if applicable)
+* `Product` (if applicable)
+All incoming issues and PRs will follow the following SLO:
+| Type | Priority | Objective |
+|-----------------|----------|------------------------------------------------------------------------|
+| Feature Request | P0 | Must respond within **5 days** |
+| Process | P0 | Must respond within **5 days** |
+| Bugs | P0 | Must respond within **5 days**, and resolve/closure within **14 days** |
+| Bugs | P1 | Must respond within **7 days**, and resolve/closure within **90 days** |
+| Bugs | P2 | Must respond within **30 days**
+_Types that are not listed in the table do not adhere to any SLO._
 ### Releasing
 Toolbox has two types of releases: versioned and continuous. It uses Google

View File

@@ -2,6 +2,8 @@
 # MCP Toolbox for Databases
+<a href="https://trendshift.io/repositories/13019" target="_blank"><img src="https://trendshift.io/api/badge/repositories/13019" alt="googleapis%2Fgenai-toolbox | Trendshift" style="width: 250px; height: 55px;" width="250" height="55"/></a>
 [![Docs](https://img.shields.io/badge/docs-MCP_Toolbox-blue)](https://googleapis.github.io/genai-toolbox/)
 [![Discord](https://img.shields.io/badge/Discord-%235865F2.svg?style=flat&logo=discord&logoColor=white)](https://discord.gg/Dmm69peqjh)
 [![Medium](https://img.shields.io/badge/Medium-12100E?style=flat&logo=medium&logoColor=white)](https://medium.com/@mcp_toolbox)
@@ -140,7 +142,7 @@ To install Toolbox as a binary:
 >
 > ```sh
 > # see releases page for other versions
-> export VERSION=0.24.0
+> export VERSION=0.26.0
 > curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
 > chmod +x toolbox
 > ```
@@ -153,7 +155,7 @@ To install Toolbox as a binary:
 >
 > ```sh
 > # see releases page for other versions
-> export VERSION=0.24.0
+> export VERSION=0.26.0
 > curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox
 > chmod +x toolbox
 > ```
@@ -166,7 +168,7 @@ To install Toolbox as a binary:
 >
 > ```sh
 > # see releases page for other versions
-> export VERSION=0.24.0
+> export VERSION=0.26.0
 > curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox
 > chmod +x toolbox
 > ```
@@ -179,7 +181,7 @@ To install Toolbox as a binary:
 >
 > ```cmd
 > :: see releases page for other versions
-> set VERSION=0.24.0
+> set VERSION=0.26.0
 > curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v%VERSION%/windows/amd64/toolbox.exe"
 > ```
 >
@@ -191,7 +193,7 @@ To install Toolbox as a binary:
 >
 > ```powershell
 > # see releases page for other versions
-> $VERSION = "0.24.0"
+> $VERSION = "0.26.0"
 > curl.exe -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe"
 > ```
 >
@@ -204,7 +206,7 @@ You can also install Toolbox as a container:
 ```sh
 # see releases page for other versions
-export VERSION=0.24.0
+export VERSION=0.26.0
 docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
 ```
@@ -228,7 +230,7 @@ To install from source, ensure you have the latest version of
 [Go installed](https://go.dev/doc/install), and then run the following command:
 ```sh
-go install github.com/googleapis/genai-toolbox@v0.24.0
+go install github.com/googleapis/genai-toolbox@v0.26.0
 ```
 <!-- {x-release-please-end} -->
@@ -272,7 +274,7 @@ To run Toolbox from binary:
 To run the server after pulling the [container image](#installing-the-server):
 ```sh
-export VERSION=0.11.0 # Use the version you pulled
+export VERSION=0.24.0 # Use the version you pulled
 docker run -p 5000:5000 \
 -v $(pwd)/tools.yaml:/app/tools.yaml \
 us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION \
@@ -938,14 +940,14 @@ Toolbox should have access to. Most tools will have at least one source to
 execute against.
 ```yaml
-sources:
-my-pg-source:
-kind: postgres
+kind: sources
+name: my-pg-source
+type: postgres
 host: 127.0.0.1
 port: 5432
 database: toolbox_db
 user: toolbox_user
 password: my-password
 ```
 For more details on configuring different types of sources, see the
@@ -954,19 +956,19 @@ For more details on configuring different types of sources, see the
 ### Tools
 The `tools` section of a `tools.yaml` define the actions an agent can take: what
-kind of tool it is, which source(s) it affects, what parameters it uses, etc.
+type of tool it is, which source(s) it affects, what parameters it uses, etc.
 ```yaml
-tools:
-search-hotels-by-name:
-kind: postgres-sql
+kind: tools
+name: search-hotels-by-name
+type: postgres-sql
 source: my-pg-source
 description: Search for hotels based on name.
 parameters:
 - name: name
 type: string
 description: The name of the hotel.
 statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
 ```
 For more details on configuring different types of tools, see the

View File

@@ -15,6 +15,7 @@
 package cmd
 import (
+"bytes"
 "context"
 _ "embed"
 "fmt"
@@ -92,11 +93,13 @@ import (
 _ "github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/cloudhealthcaresearchdicomstudies"
 _ "github.com/googleapis/genai-toolbox/internal/tools/cloudmonitoring"
 _ "github.com/googleapis/genai-toolbox/internal/tools/cloudsql/cloudsqlcloneinstance"
+_ "github.com/googleapis/genai-toolbox/internal/tools/cloudsql/cloudsqlcreatebackup"
 _ "github.com/googleapis/genai-toolbox/internal/tools/cloudsql/cloudsqlcreatedatabase"
 _ "github.com/googleapis/genai-toolbox/internal/tools/cloudsql/cloudsqlcreateusers"
 _ "github.com/googleapis/genai-toolbox/internal/tools/cloudsql/cloudsqlgetinstances"
 _ "github.com/googleapis/genai-toolbox/internal/tools/cloudsql/cloudsqllistdatabases"
 _ "github.com/googleapis/genai-toolbox/internal/tools/cloudsql/cloudsqllistinstances"
+_ "github.com/googleapis/genai-toolbox/internal/tools/cloudsql/cloudsqlrestorebackup"
 _ "github.com/googleapis/genai-toolbox/internal/tools/cloudsql/cloudsqlwaitforoperation"
 _ "github.com/googleapis/genai-toolbox/internal/tools/cloudsqlmssql/cloudsqlmssqlcreateinstance"
 _ "github.com/googleapis/genai-toolbox/internal/tools/cloudsqlmysql/cloudsqlmysqlcreateinstance"
@@ -215,6 +218,8 @@ import (
 _ "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/serverlesssparklistbatches"
 _ "github.com/googleapis/genai-toolbox/internal/tools/singlestore/singlestoreexecutesql"
 _ "github.com/googleapis/genai-toolbox/internal/tools/singlestore/singlestoresql"
+_ "github.com/googleapis/genai-toolbox/internal/tools/snowflake/snowflakeexecutesql"
+_ "github.com/googleapis/genai-toolbox/internal/tools/snowflake/snowflakesql"
 _ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannerexecutesql"
 _ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannerlistgraphs"
 _ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannerlisttables"
@@ -263,6 +268,7 @@ import (
 _ "github.com/googleapis/genai-toolbox/internal/sources/redis"
 _ "github.com/googleapis/genai-toolbox/internal/sources/serverlessspark"
 _ "github.com/googleapis/genai-toolbox/internal/sources/singlestore"
+_ "github.com/googleapis/genai-toolbox/internal/sources/snowflake"
 _ "github.com/googleapis/genai-toolbox/internal/sources/spanner"
 _ "github.com/googleapis/genai-toolbox/internal/sources/sqlite"
 _ "github.com/googleapis/genai-toolbox/internal/sources/tidb"
@@ -310,15 +316,15 @@ func Execute() {
 type Command struct {
 *cobra.Command
 cfg server.ServerConfig
 logger log.Logger
 tools_file string
 tools_files []string
 tools_folder string
-prebuiltConfig string
+prebuiltConfigs []string
 inStream io.Reader
 outStream io.Writer
 errStream io.Writer
 }
 // NewCommand returns a Command object representing an invocation of the CLI.
@@ -371,14 +377,17 @@ func NewCommand(opts ...Option) *Command {
 flags.StringVar(&cmd.cfg.TelemetryServiceName, "telemetry-service-name", "toolbox", "Sets the value of the service.name resource attribute for telemetry data.")
 // Fetch prebuilt tools sources to customize the help description
 prebuiltHelp := fmt.Sprintf(
-"Use a prebuilt tool configuration by source type. Allowed: '%s'.",
+"Use a prebuilt tool configuration by source type. Allowed: '%s'. Can be specified multiple times.",
 strings.Join(prebuiltconfigs.GetPrebuiltSources(), "', '"),
 )
-flags.StringVar(&cmd.prebuiltConfig, "prebuilt", "", prebuiltHelp)
+flags.StringSliceVar(&cmd.prebuiltConfigs, "prebuilt", []string{}, prebuiltHelp)
 flags.BoolVar(&cmd.cfg.Stdio, "stdio", false, "Listens via MCP STDIO instead of acting as a remote HTTP server.")
 flags.BoolVar(&cmd.cfg.DisableReload, "disable-reload", false, "Disables dynamic reloading of tools file.")
 flags.BoolVar(&cmd.cfg.UI, "ui", false, "Launches the Toolbox UI web server.")
+// TODO: Insecure by default. Might consider updating this for v1.0.0
 flags.StringSliceVar(&cmd.cfg.AllowedOrigins, "allowed-origins", []string{"*"}, "Specifies a list of origins permitted to access this server. Defaults to '*'.")
flags.StringSliceVar(&cmd.cfg.AllowedHosts, "allowed-hosts", []string{"*"}, "Specifies a list of hosts permitted to access this server. Defaults to '*'.")
flags.StringSliceVar(&cmd.cfg.UserAgentMetadata, "user-agent-metadata", []string{}, "Appends additional metadata to the User-Agent.")
// wrap RunE command so that we have access to original Command object // wrap RunE command so that we have access to original Command object
cmd.RunE = func(*cobra.Command, []string) error { return run(cmd) } cmd.RunE = func(*cobra.Command, []string) error { return run(cmd) }
@@ -388,7 +397,6 @@ func NewCommand(opts ...Option) *Command {
type ToolsFile struct { type ToolsFile struct {
Sources server.SourceConfigs `yaml:"sources"` Sources server.SourceConfigs `yaml:"sources"`
AuthSources server.AuthServiceConfigs `yaml:"authSources"` // Deprecated: Kept for compatibility.
AuthServices server.AuthServiceConfigs `yaml:"authServices"` AuthServices server.AuthServiceConfigs `yaml:"authServices"`
EmbeddingModels server.EmbeddingModelConfigs `yaml:"embeddingModels"` EmbeddingModels server.EmbeddingModelConfigs `yaml:"embeddingModels"`
Tools server.ToolConfigs `yaml:"tools"` Tools server.ToolConfigs `yaml:"tools"`
@@ -419,6 +427,129 @@ func parseEnv(input string) (string, error) {
return output, err return output, err
} }
func convertToolsFile(raw []byte) ([]byte, error) {
var input yaml.MapSlice
decoder := yaml.NewDecoder(bytes.NewReader(raw), yaml.UseOrderedMap())
// convert to tools file v2
var buf bytes.Buffer
encoder := yaml.NewEncoder(&buf)
v1keys := []string{"sources", "authSources", "authServices", "embeddingModels", "tools", "toolsets", "prompts"}
for {
if err := decoder.Decode(&input); err != nil {
if err == io.EOF {
break
}
return nil, err
}
for _, item := range input {
key, ok := item.Key.(string)
if !ok {
return nil, fmt.Errorf("unexpected non-string key in input: %v", item.Key)
}
// check if the key is config file v1's key
if slices.Contains(v1keys, key) {
// check if the value converts to yaml.MapSlice successfully
// fields such as "tools" in toolsets might pass the first check but
// fail to convert to MapSlice
if slice, ok := item.Value.(yaml.MapSlice); ok {
// Deprecated: convert authSources to authServices
if key == "authSources" {
key = "authServices"
}
transformed, err := transformDocs(key, slice)
if err != nil {
return nil, err
}
// encode per-doc
for _, doc := range transformed {
if err := encoder.Encode(doc); err != nil {
return nil, err
}
}
} else {
// invalid input will be ignored
// we don't want to return an error here since the config could
// still be valid but with its keys in a different order, such as:
// ---
// tools:
// - tool_a
// kind: toolsets
// ---
continue
}
} else {
// this doc is already v2, encode to buf
if err := encoder.Encode(input); err != nil {
return nil, err
}
break
}
}
}
return buf.Bytes(), nil
}
// transformDocs transforms the configuration file from v1 format to v2
// yaml.MapSlice will preserve the order in a map
func transformDocs(kind string, input yaml.MapSlice) ([]yaml.MapSlice, error) {
var transformed []yaml.MapSlice
for _, entry := range input {
entryName, ok := entry.Key.(string)
if !ok {
return nil, fmt.Errorf("unexpected non-string key for entry in '%s': %v", kind, entry.Key)
}
entryBody := ProcessValue(entry.Value, kind == "toolsets")
currentTransformed := yaml.MapSlice{
{Key: "kind", Value: kind},
{Key: "name", Value: entryName},
}
// Merge the transformed body into our result
if bodySlice, ok := entryBody.(yaml.MapSlice); ok {
currentTransformed = append(currentTransformed, bodySlice...)
} else {
return nil, fmt.Errorf("unable to convert entryBody to MapSlice")
}
transformed = append(transformed, currentTransformed)
}
return transformed, nil
}
// ProcessValue recursively looks for MapSlices to rename 'kind' -> 'type'
func ProcessValue(v any, isToolset bool) any {
switch val := v.(type) {
case yaml.MapSlice:
// creating a new MapSlice is safer for recursive transformation
newVal := make(yaml.MapSlice, len(val))
for i, item := range val {
// Perform renaming
if item.Key == "kind" {
item.Key = "type"
}
// Recursive call for nested values (e.g., nested objects or lists)
item.Value = ProcessValue(item.Value, false)
newVal[i] = item
}
return newVal
case []any:
// Process lists: If it's a toolset top-level list, wrap it.
if isToolset {
return yaml.MapSlice{{Key: "tools", Value: val}}
}
// Otherwise, recurse into list items (to catch nested objects)
newVal := make([]any, len(val))
for i := range val {
newVal[i] = ProcessValue(val[i], false)
}
return newVal
default:
return val
}
}
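To make the transformation concrete, here is an illustrative sketch (not part of the diff itself) of a small v1-style document and the v2 documents the conversion above would emit for it: each top-level entry becomes its own document, a `kind`/`name` pair is prepended, the old `kind` field is renamed to `type`, and a bare toolset list is wrapped under `tools`. The resource names are taken from the quickstart examples elsewhere in these docs.

```yaml
# v1 input (outdated format)
sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
toolsets:
  my-toolset:
    - search-hotels-by-name

# v2 output produced by the conversion above
---
kind: sources
name: my-pg-source
type: postgres
host: 127.0.0.1
---
kind: toolsets
name: my-toolset
tools:
  - search-hotels-by-name
```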
// parseToolsFile parses the provided yaml into appropriate configs. // parseToolsFile parses the provided yaml into appropriate configs.
func parseToolsFile(ctx context.Context, raw []byte) (ToolsFile, error) { func parseToolsFile(ctx context.Context, raw []byte) (ToolsFile, error) {
var toolsFile ToolsFile var toolsFile ToolsFile
@@ -429,8 +560,13 @@ func parseToolsFile(ctx context.Context, raw []byte) (ToolsFile, error) {
} }
raw = []byte(output) raw = []byte(output)
raw, err = convertToolsFile(raw)
if err != nil {
return toolsFile, fmt.Errorf("error converting tools file: %s", err)
}
// Parse contents // Parse contents
err = yaml.UnmarshalContext(ctx, raw, &toolsFile, yaml.Strict()) toolsFile.Sources, toolsFile.AuthServices, toolsFile.EmbeddingModels, toolsFile.Tools, toolsFile.Toolsets, toolsFile.Prompts, err = server.UnmarshalResourceConfig(ctx, raw)
if err != nil { if err != nil {
return toolsFile, err return toolsFile, err
} }
@@ -462,18 +598,6 @@ func mergeToolsFiles(files ...ToolsFile) (ToolsFile, error) {
} }
} }
// Check for conflicts and merge authSources (deprecated, but still support)
for name, authSource := range file.AuthSources {
if _, exists := merged.AuthSources[name]; exists {
conflicts = append(conflicts, fmt.Sprintf("authSource '%s' (file #%d)", name, fileIndex+1))
} else {
if merged.AuthSources == nil {
merged.AuthSources = make(server.AuthServiceConfigs)
}
merged.AuthSources[name] = authSource
}
}
// Check for conflicts and merge authServices // Check for conflicts and merge authServices
for name, authService := range file.AuthServices { for name, authService := range file.AuthServices {
if _, exists := merged.AuthServices[name]; exists { if _, exists := merged.AuthServices[name]; exists {
@@ -859,24 +983,32 @@ func run(cmd *Command) error {
var allToolsFiles []ToolsFile var allToolsFiles []ToolsFile
// Load Prebuilt Configuration // Load Prebuilt Configuration
if cmd.prebuiltConfig != "" {
buf, err := prebuiltconfigs.Get(cmd.prebuiltConfig)
if err != nil {
cmd.logger.ErrorContext(ctx, err.Error())
return err
}
logMsg := fmt.Sprint("Using prebuilt tool configuration for ", cmd.prebuiltConfig)
cmd.logger.InfoContext(ctx, logMsg)
// Append prebuilt.source to Version string for the User Agent
cmd.cfg.Version += "+prebuilt." + cmd.prebuiltConfig
parsed, err := parseToolsFile(ctx, buf) if len(cmd.prebuiltConfigs) > 0 {
if err != nil { slices.Sort(cmd.prebuiltConfigs)
errMsg := fmt.Errorf("unable to parse prebuilt tool configuration: %w", err) sourcesList := strings.Join(cmd.prebuiltConfigs, ", ")
cmd.logger.ErrorContext(ctx, errMsg.Error()) logMsg := fmt.Sprintf("Using prebuilt tool configurations for: %s", sourcesList)
return errMsg cmd.logger.InfoContext(ctx, logMsg)
for _, configName := range cmd.prebuiltConfigs {
buf, err := prebuiltconfigs.Get(configName)
if err != nil {
cmd.logger.ErrorContext(ctx, err.Error())
return err
}
// Update version string
cmd.cfg.Version += "+prebuilt." + configName
// Parse into ToolsFile struct
parsed, err := parseToolsFile(ctx, buf)
if err != nil {
errMsg := fmt.Errorf("unable to parse prebuilt tool configuration for '%s': %w", configName, err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
allToolsFiles = append(allToolsFiles, parsed)
} }
allToolsFiles = append(allToolsFiles, parsed)
} }
// Determine if Custom Files should be loaded // Determine if Custom Files should be loaded
@@ -884,7 +1016,7 @@ func run(cmd *Command) error {
isCustomConfigured := cmd.tools_file != "" || len(cmd.tools_files) > 0 || cmd.tools_folder != "" isCustomConfigured := cmd.tools_file != "" || len(cmd.tools_files) > 0 || cmd.tools_folder != ""
// Determine if default 'tools.yaml' should be used (No prebuilt AND No custom flags) // Determine if default 'tools.yaml' should be used (No prebuilt AND No custom flags)
useDefaultToolsFile := cmd.prebuiltConfig == "" && !isCustomConfigured useDefaultToolsFile := len(cmd.prebuiltConfigs) == 0 && !isCustomConfigured
if useDefaultToolsFile { if useDefaultToolsFile {
cmd.tools_file = "tools.yaml" cmd.tools_file = "tools.yaml"
@@ -944,24 +1076,11 @@ func run(cmd *Command) error {
cmd.cfg.SourceConfigs = finalToolsFile.Sources cmd.cfg.SourceConfigs = finalToolsFile.Sources
cmd.cfg.AuthServiceConfigs = finalToolsFile.AuthServices cmd.cfg.AuthServiceConfigs = finalToolsFile.AuthServices
cmd.cfg.EmbeddingModelConfigs = finalToolsFile.EmbeddingModels
cmd.cfg.ToolConfigs = finalToolsFile.Tools cmd.cfg.ToolConfigs = finalToolsFile.Tools
cmd.cfg.ToolsetConfigs = finalToolsFile.Toolsets cmd.cfg.ToolsetConfigs = finalToolsFile.Toolsets
cmd.cfg.PromptConfigs = finalToolsFile.Prompts cmd.cfg.PromptConfigs = finalToolsFile.Prompts
authSourceConfigs := finalToolsFile.AuthSources
if authSourceConfigs != nil {
cmd.logger.WarnContext(ctx, "`authSources` is deprecated, use `authServices` instead")
for k, v := range authSourceConfigs {
if _, exists := cmd.cfg.AuthServiceConfigs[k]; exists {
errMsg := fmt.Errorf("resource conflict detected: authSource '%s' has the same name as an existing authService. Please rename your authSource", k)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
cmd.cfg.AuthServiceConfigs[k] = v
}
}
instrumentation, err := telemetry.CreateTelemetryInstrumentation(versionString) instrumentation, err := telemetry.CreateTelemetryInstrumentation(versionString)
if err != nil { if err != nil {
errMsg := fmt.Errorf("unable to create telemetry instrumentation: %w", err) errMsg := fmt.Errorf("unable to create telemetry instrumentation: %w", err)

File diff suppressed because it is too large.


@@ -1 +1 @@
0.24.0 0.26.0


@@ -45,7 +45,7 @@ most popular issues, so make sure to +1 ones you are the most interested in.
## Can Toolbox be used for non-database tools? ## Can Toolbox be used for non-database tools?
**Yes!** While Toolbox is primarily focused on databases, it also supports generic **Yes!** While Toolbox is primarily focused on databases, it also supports generic
**HTTP tools** (`kind: http`). These allow you to connect your agents to REST APIs **HTTP tools** (`type: http`). These allow you to connect your agents to REST APIs
and other web services, enabling workflows that extend beyond database interactions. and other web services, enabling workflows that extend beyond database interactions.
For configuration details, see the [HTTP Tools documentation](../resources/tools/http/http.md). For configuration details, see the [HTTP Tools documentation](../resources/tools/http/http.md).
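For illustration only, a rough sketch of what such an HTTP tool might look like in the v2 configuration format. The resource names are placeholders, and the exact field set (`baseUrl`, `method`, `path`, `queryParams`, and so on) should be confirmed against the HTTP Tools documentation linked above:

```yaml
kind: sources
name: my-rest-api
type: http
baseUrl: https://api.example.com
---
kind: tools
name: search-products
type: http
source: my-rest-api
method: GET
path: /products/search
description: Search the product catalog by keyword.
queryParams:
  - name: q
    type: string
    description: The search keyword.
```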

docs/en/blogs/_index.md (new file)

@@ -0,0 +1,18 @@
---
title: "Featured Articles"
weight: 3
description: Toolbox Medium Blogs
manualLink: "https://medium.com/@mcp_toolbox"
manualLinkTarget: _blank
---
<html>
<head>
<title>Redirecting to Featured Articles</title>
<link rel="canonical" href="https://medium.com/@mcp_toolbox"/>
<meta http-equiv="refresh" content="0;url=https://medium.com/@mcp_toolbox"/>
</head>
<body>
<p>If you are not automatically redirected, please <a href="https://medium.com/@mcp_toolbox">follow this link to our articles</a>.</p>
</body>
</html>


@@ -64,7 +64,7 @@ The structured logging outputs log as JSON:
"timestamp":"2024-11-04T16:45:11.987299-08:00", "timestamp":"2024-11-04T16:45:11.987299-08:00",
"severity":"ERROR", "severity":"ERROR",
"logging.googleapis.com/sourceLocation":{...}, "logging.googleapis.com/sourceLocation":{...},
"message":"unable to parse tool file at \"tools.yaml\": \"cloud-sql-postgres1\" is not a valid kind of data source" "message":"unable to parse tool file at \"tools.yaml\": \"cloud-sql-postgres1\" is not a valid type of data source"
} }
``` ```


@@ -234,7 +234,7 @@
}, },
"outputs": [], "outputs": [],
"source": [ "source": [
"version = \"0.24.0\" # x-release-please-version\n", "version = \"0.26.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n", "! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n", "\n",
"# Make the binary executable\n", "# Make the binary executable\n",
@@ -300,78 +300,89 @@
"# You can also upload a tools file and use that to run toolbox.\n", "# You can also upload a tools file and use that to run toolbox.\n",
"tools_file_name = \"tools.yml\"\n", "tools_file_name = \"tools.yml\"\n",
"file_content = f\"\"\"\n", "file_content = f\"\"\"\n",
"sources:\n", "kind: sources\n",
" my-pg-source:\n", "name: my-pg-source\n",
" kind: postgres\n", "type: postgres\n",
" host: 127.0.0.1\n", "host: 127.0.0.1\n",
" port: 5432\n", "port: 5432\n",
" database: toolbox_db\n", "database: toolbox_db\n",
" user: toolbox_user\n", "user: toolbox_user\n",
" password: my-password\n", "password: my-password\n",
"---\n",
"kind: tools\n",
"name: search-hotels-by-name\n",
"type: postgres-sql\n",
"source: my-pg-source\n",
"description: Search for hotels based on name.\n",
"parameters:\n",
" - name: name\n",
" type: string\n",
" description: The name of the hotel.\n",
"statement: SELECT * FROM hotels WHERE name ILIKE '%' || \\$1 || '%';\n",
"---\n",
"kind: tools\n",
"name: search-hotels-by-location\n",
"type: postgres-sql\n",
"source: my-pg-source\n",
"description: Search for hotels based on location.\n",
"parameters:\n",
" - name: location\n",
" type: string\n",
" description: The location of the hotel.\n",
"statement: SELECT * FROM hotels WHERE location ILIKE '%' || \\$1 || '%';\n",
"---\n",
"kind: tools\n",
"name: book-hotel\n",
"type: postgres-sql\n",
"source: my-pg-source\n",
"description: >-\n",
" Book a hotel by its ID. If the hotel is successfully booked, returns a NULL, raises an error if not.\n",
"parameters:\n",
" - name: hotel_id\n",
" type: string\n",
" description: The ID of the hotel to book.\n",
"statement: UPDATE hotels SET booked = B'1' WHERE id = \\$1;\n",
"---\n",
"kind: tools\n",
"name: update-hotel\n",
"type: postgres-sql\n",
"source: my-pg-source\n",
"description: >-\n",
" Update a hotel's check-in and check-out dates by its ID. Returns a message\n",
" indicating whether the hotel was successfully updated or not.\n",
"parameters:\n",
" - name: hotel_id\n",
" type: string\n",
" description: The ID of the hotel to update.\n",
" - name: checkin_date\n",
" type: string\n",
" description: The new check-in date of the hotel.\n",
" - name: checkout_date\n",
" type: string\n",
" description: The new check-out date of the hotel.\n",
"statement: >-\n",
" UPDATE hotels SET checkin_date = CAST(\\$2 as date), checkout_date = CAST(\\$3\n",
" as date) WHERE id = \\$1;\n",
"---\n",
"kind: tools\n",
"name: cancel-hotel\n",
"type: postgres-sql\n",
"source: my-pg-source\n",
"description: Cancel a hotel by its ID.\n",
"parameters:\n",
" - name: hotel_id\n",
" type: string\n",
" description: The ID of the hotel to cancel.\n",
"statement: UPDATE hotels SET booked = B'0' WHERE id = \\$1;\n",
"---\n",
"kind: toolsets\n",
"name: my-toolset\n",
"tools:\n", "tools:\n",
" search-hotels-by-name:\n", " - search-hotels-by-name\n",
" kind: postgres-sql\n", " - search-hotels-by-location\n",
" source: my-pg-source\n", " - book-hotel\n",
" description: Search for hotels based on name.\n", " - update-hotel\n",
" parameters:\n", " - cancel-hotel\n",
" - name: name\n",
" type: string\n",
" description: The name of the hotel.\n",
" statement: SELECT * FROM hotels WHERE name ILIKE '%' || \\$1 || '%';\n",
" search-hotels-by-location:\n",
" kind: postgres-sql\n",
" source: my-pg-source\n",
" description: Search for hotels based on location.\n",
" parameters:\n",
" - name: location\n",
" type: string\n",
" description: The location of the hotel.\n",
" statement: SELECT * FROM hotels WHERE location ILIKE '%' || \\$1 || '%';\n",
" book-hotel:\n",
" kind: postgres-sql\n",
" source: my-pg-source\n",
" description: >-\n",
" Book a hotel by its ID. If the hotel is successfully booked, returns a NULL, raises an error if not.\n",
" parameters:\n",
" - name: hotel_id\n",
" type: string\n",
" description: The ID of the hotel to book.\n",
" statement: UPDATE hotels SET booked = B'1' WHERE id = \\$1;\n",
" update-hotel:\n",
" kind: postgres-sql\n",
" source: my-pg-source\n",
" description: >-\n",
" Update a hotel's check-in and check-out dates by its ID. Returns a message\n",
" indicating whether the hotel was successfully updated or not.\n",
" parameters:\n",
" - name: hotel_id\n",
" type: string\n",
" description: The ID of the hotel to update.\n",
" - name: checkin_date\n",
" type: string\n",
" description: The new check-in date of the hotel.\n",
" - name: checkout_date\n",
" type: string\n",
" description: The new check-out date of the hotel.\n",
" statement: >-\n",
" UPDATE hotels SET checkin_date = CAST(\\$2 as date), checkout_date = CAST(\\$3\n",
" as date) WHERE id = \\$1;\n",
" cancel-hotel:\n",
" kind: postgres-sql\n",
" source: my-pg-source\n",
" description: Cancel a hotel by its ID.\n",
" parameters:\n",
" - name: hotel_id\n",
" type: string\n",
" description: The ID of the hotel to cancel.\n",
" statement: UPDATE hotels SET booked = B'0' WHERE id = \\$1;\n",
"toolsets:\n",
" my-toolset:\n",
" - search-hotels-by-name\n",
" - search-hotels-by-location\n",
" - book-hotel\n",
" - update-hotel\n",
" - cancel-hotel\n",
"\"\"\"" "\"\"\""
] ]
}, },
@@ -509,8 +520,7 @@
}, },
"outputs": [], "outputs": [],
"source": [ "source": [
"! pip install toolbox-core --quiet\n", "! pip install google-adk[toolbox] --quiet"
"! pip install google-adk --quiet"
] ]
}, },
{ {
@@ -525,14 +535,18 @@
"from google.adk.runners import Runner\n", "from google.adk.runners import Runner\n",
"from google.adk.sessions import InMemorySessionService\n", "from google.adk.sessions import InMemorySessionService\n",
"from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService\n", "from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService\n",
"from google.adk.tools.toolbox_toolset import ToolboxToolset\n",
"from google.genai import types\n", "from google.genai import types\n",
"from toolbox_core import ToolboxSyncClient\n",
"\n", "\n",
"import os\n", "import os\n",
"# TODO(developer): replace this with your Google API key\n", "# TODO(developer): replace this with your Google API key\n",
"os.environ['GOOGLE_API_KEY'] = \"<GOOGLE_API_KEY>\"\n", "os.environ['GOOGLE_API_KEY'] = \"<GOOGLE_API_KEY>\"\n",
"\n", "\n",
"toolbox_client = ToolboxSyncClient(\"http://127.0.0.1:5000\")\n", "# Configure toolset\n",
"toolset = ToolboxToolset(\n",
" server_url=\"http://127.0.0.1:5000\",\n",
" toolset_name=\"my-toolset\"\n",
")\n",
"\n", "\n",
"prompt = \"\"\"\n", "prompt = \"\"\"\n",
" You're a helpful hotel assistant. You handle hotel searching, booking and\n", " You're a helpful hotel assistant. You handle hotel searching, booking and\n",
@@ -549,7 +563,7 @@
" name='hotel_agent',\n", " name='hotel_agent',\n",
" description='A helpful AI assistant.',\n", " description='A helpful AI assistant.',\n",
" instruction=prompt,\n", " instruction=prompt,\n",
" tools=toolbox_client.load_toolset(\"my-toolset\"),\n", " tools=[toolset],\n",
")\n", ")\n",
"\n", "\n",
"session_service = InMemorySessionService()\n", "session_service = InMemorySessionService()\n",


@@ -36,14 +36,14 @@ Toolbox should have access to. Most tools will have at least one source to
execute against. execute against.
```yaml ```yaml
sources: kind: sources
my-pg-source: name: my-pg-source
kind: postgres type: postgres
host: 127.0.0.1 host: 127.0.0.1
port: 5432 port: 5432
database: toolbox_db database: toolbox_db
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} password: ${PASSWORD}
``` ```
For more details on configuring different types of sources, see the For more details on configuring different types of sources, see the
@@ -52,20 +52,20 @@ For more details on configuring different types of sources, see the
### Tools ### Tools
The `tools` section of your `tools.yaml` defines the actions your agent can The `tools` section of your `tools.yaml` defines the actions your agent can
take: what kind of tool it is, which source(s) it affects, what parameters it take: what type of tool it is, which source(s) it affects, what parameters it
uses, etc. uses, etc.
```yaml ```yaml
tools: kind: tools
search-hotels-by-name: name: search-hotels-by-name
kind: postgres-sql type: postgres-sql
source: my-pg-source source: my-pg-source
description: Search for hotels based on name. description: Search for hotels based on name.
parameters: parameters:
- name: name - name: name
type: string type: string
description: The name of the hotel. description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%'; statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
``` ```
For more details on configuring different types of tools, see the For more details on configuring different types of tools, see the
@@ -78,13 +78,17 @@ that you want to be able to load together. This can be useful for defining
different sets for different agents or different applications. different sets for different agents or different applications.
```yaml ```yaml
toolsets: kind: toolsets
my_first_toolset: name: my_first_toolset
- my_first_tool tools:
- my_second_tool - my_first_tool
my_second_toolset: - my_second_tool
- my_second_tool ---
- my_third_tool kind: toolsets
name: my_second_toolset
tools:
- my_second_tool
- my_third_tool
``` ```
You can load toolsets by name: You can load toolsets by name:
@@ -103,14 +107,14 @@ The `prompts` section of your `tools.yaml` defines the templates containing
structured messages and instructions for interacting with language models. structured messages and instructions for interacting with language models.
```yaml ```yaml
prompts: kind: prompts
code_review: name: code_review
description: "Asks the LLM to analyze code quality and suggest improvements." description: "Asks the LLM to analyze code quality and suggest improvements."
messages: messages:
- content: "Please review the following code for quality, correctness, and potential improvements: \n\n{{.code}}" - content: "Please review the following code for quality, correctness, and potential improvements: \n\n{{.code}}"
arguments: arguments:
- name: "code" - name: "code"
description: "The code to review" description: "The code to review"
``` ```
For more details on configuring different types of prompts, see the For more details on configuring different types of prompts, see the


@@ -16,6 +16,12 @@ Databases” as its initial development predated MCP, but was renamed to align
with recently added MCP compatibility. with recently added MCP compatibility.
{{< /notice >}} {{< /notice >}}
{{< notice note >}}
This document has been updated to support the configuration file v2 format. To
view documentation with configuration file v1 format, please navigate to the
top-right menu and select versions v0.26.0 or older.
{{< /notice >}}
## Why Toolbox? ## Why Toolbox?
Toolbox helps you build Gen AI tools that let your agents access data in your Toolbox helps you build Gen AI tools that let your agents access data in your
@@ -103,7 +109,7 @@ To install Toolbox as a binary on Linux (AMD64):
```sh ```sh
# see releases page for other versions # see releases page for other versions
export VERSION=0.24.0 export VERSION=0.26.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox chmod +x toolbox
``` ```
@@ -114,7 +120,7 @@ To install Toolbox as a binary on macOS (Apple Silicon):
```sh ```sh
# see releases page for other versions # see releases page for other versions
export VERSION=0.24.0 export VERSION=0.26.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox
chmod +x toolbox chmod +x toolbox
``` ```
@@ -125,7 +131,7 @@ To install Toolbox as a binary on macOS (Intel):
```sh ```sh
# see releases page for other versions # see releases page for other versions
export VERSION=0.24.0 export VERSION=0.26.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox
chmod +x toolbox chmod +x toolbox
``` ```
@@ -136,7 +142,7 @@ To install Toolbox as a binary on Windows (Command Prompt):
```cmd ```cmd
:: see releases page for other versions :: see releases page for other versions
set VERSION=0.24.0 set VERSION=0.26.0
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v%VERSION%/windows/amd64/toolbox.exe" curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v%VERSION%/windows/amd64/toolbox.exe"
``` ```
@@ -146,7 +152,7 @@ To install Toolbox as a binary on Windows (PowerShell):
```powershell ```powershell
# see releases page for other versions # see releases page for other versions
$VERSION = "0.24.0" $VERSION = "0.26.0"
curl.exe -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe" curl.exe -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe"
``` ```
@@ -158,7 +164,7 @@ You can also install Toolbox as a container:
```sh ```sh
# see releases page for other versions # see releases page for other versions
export VERSION=0.24.0 export VERSION=0.26.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
``` ```
@@ -177,7 +183,7 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command: [Go installed](https://go.dev/doc/install), and then run the following command:
```sh ```sh
go install github.com/googleapis/genai-toolbox@v0.24.0 go install github.com/googleapis/genai-toolbox@v0.26.0
``` ```
{{% /tab %}} {{% /tab %}}


@@ -52,7 +52,7 @@ runtime](https://research.google.com/colaboratory/local-runtimes.html).
{{< tabpane persist=header >}} {{< tabpane persist=header >}}
{{< tab header="ADK" lang="bash" >}} {{< tab header="ADK" lang="bash" >}}
pip install toolbox-core pip install google-adk[toolbox]
{{< /tab >}} {{< /tab >}}
{{< tab header="Langchain" lang="bash" >}} {{< tab header="Langchain" lang="bash" >}}
@@ -73,7 +73,7 @@ pip install toolbox-core
{{< tabpane persist=header >}} {{< tabpane persist=header >}}
{{< tab header="ADK" lang="bash" >}} {{< tab header="ADK" lang="bash" >}}
pip install google-adk # No other dependencies required for ADK
{{< /tab >}} {{< /tab >}}
{{< tab header="Langchain" lang="bash" >}} {{< tab header="Langchain" lang="bash" >}}


@@ -105,7 +105,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} --> <!-- {x-release-please-start-version} -->
```bash ```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64 export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/$OS/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/$OS/toolbox
``` ```
<!-- {x-release-please-end} --> <!-- {x-release-please-end} -->
@@ -125,78 +125,89 @@ In this section, we will download Toolbox, configure our tools in a
{{< /notice >}} {{< /notice >}}
```yaml ```yaml
sources: kind: sources
my-pg-source: name: my-pg-source
kind: postgres type: postgres
host: 127.0.0.1 host: 127.0.0.1
port: 5432 port: 5432
database: toolbox_db database: toolbox_db
user: toolbox_user user: toolbox_user
password: my-password password: my-password
---
kind: tools
name: search-hotels-by-name
type: postgres-sql
source: my-pg-source
description: Search for hotels based on name.
parameters:
- name: name
type: string
description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
---
kind: tools
name: search-hotels-by-location
type: postgres-sql
source: my-pg-source
description: Search for hotels based on location.
parameters:
- name: location
type: string
description: The location of the hotel.
statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
---
kind: tools
name: book-hotel
type: postgres-sql
source: my-pg-source
description: >-
Book a hotel by its ID. If the hotel is successfully booked, returns a NULL, raises an error if not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to book.
statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
---
kind: tools
name: update-hotel
type: postgres-sql
source: my-pg-source
description: >-
Update a hotel's check-in and check-out dates by its ID. Returns a message
indicating whether the hotel was successfully updated or not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to update.
- name: checkin_date
type: string
description: The new check-in date of the hotel.
- name: checkout_date
type: string
description: The new check-out date of the hotel.
statement: >-
UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
as date) WHERE id = $1;
---
kind: tools
name: cancel-hotel
type: postgres-sql
source: my-pg-source
description: Cancel a hotel by its ID.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to cancel.
statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
---
kind: toolsets
name: my-toolset
tools: tools:
search-hotels-by-name: - search-hotels-by-name
kind: postgres-sql - search-hotels-by-location
source: my-pg-source - book-hotel
description: Search for hotels based on name. - update-hotel
parameters: - cancel-hotel
- name: name
type: string
description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
search-hotels-by-location:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on location.
parameters:
- name: location
type: string
description: The location of the hotel.
statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
book-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Book a hotel by its ID. If the hotel is successfully booked, returns a NULL, raises an error if not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to book.
statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
update-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Update a hotel's check-in and check-out dates by its ID. Returns a message
indicating whether the hotel was successfully updated or not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to update.
- name: checkin_date
type: string
description: The new check-in date of the hotel.
- name: checkout_date
type: string
description: The new check-out date of the hotel.
statement: >-
UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
as date) WHERE id = $1;
cancel-hotel:
kind: postgres-sql
source: my-pg-source
description: Cancel a hotel by its ID.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to cancel.
statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
toolsets:
my-toolset:
- search-hotels-by-name
- search-hotels-by-location
- book-hotel
- update-hotel
- cancel-hotel
``` ```
For more info on tools, check out the For more info on tools, check out the


@@ -157,61 +157,67 @@ Create a file named `tools.yaml`. This file defines the database connection, the
SQL tools available, and the prompts the agents will use. SQL tools available, and the prompts the agents will use.
```yaml ```yaml
sources: kind: sources
my-foodiefind-db: name: my-foodiefind-db
kind: postgres type: postgres
host: 127.0.0.1 host: 127.0.0.1
port: 5432 port: 5432
database: toolbox_db database: toolbox_db
user: toolbox_user user: toolbox_user
password: my-password password: my-password
tools: ---
find_user_by_email: kind: tools
kind: postgres-sql name: find_user_by_email
source: my-foodiefind-db type: postgres-sql
description: Find a user's ID by their email address. source: my-foodiefind-db
parameters: description: Find a user's ID by their email address.
- name: email parameters:
type: string - name: email
description: The email address of the user to find. type: string
statement: SELECT id FROM users WHERE email = $1; description: The email address of the user to find.
find_restaurant_by_name: statement: SELECT id FROM users WHERE email = $1;
kind: postgres-sql ---
source: my-foodiefind-db kind: tools
description: Find a restaurant's ID by its exact name. name: find_restaurant_by_name
parameters: type: postgres-sql
- name: name source: my-foodiefind-db
type: string description: Find a restaurant's ID by its exact name.
description: The name of the restaurant to find. parameters:
statement: SELECT id FROM restaurants WHERE name = $1; - name: name
find_review_by_user_and_restaurant: type: string
kind: postgres-sql description: The name of the restaurant to find.
source: my-foodiefind-db statement: SELECT id FROM restaurants WHERE name = $1;
description: Find the full record for a specific review using the user's ID and the restaurant's ID. ---
parameters: kind: tools
- name: user_id name: find_review_by_user_and_restaurant
type: integer type: postgres-sql
description: The numerical ID of the user. source: my-foodiefind-db
- name: restaurant_id description: Find the full record for a specific review using the user's ID and the restaurant's ID.
type: integer parameters:
description: The numerical ID of the restaurant. - name: user_id
statement: SELECT * FROM reviews WHERE user_id = $1 AND restaurant_id = $2; type: integer
prompts: description: The numerical ID of the user.
investigate_missing_review: - name: restaurant_id
description: "Investigates a user's missing review by finding the user, restaurant, and the review itself, then analyzing its status." type: integer
arguments: description: The numerical ID of the restaurant.
- name: "user_email" statement: SELECT * FROM reviews WHERE user_id = $1 AND restaurant_id = $2;
description: "The email of the user who wrote the review." ---
- name: "restaurant_name" kind: prompts
description: "The name of the restaurant being reviewed." name: investigate_missing_review
messages: description: "Investigates a user's missing review by finding the user, restaurant, and the review itself, then analyzing its status."
- content: >- arguments:
**Goal:** Find the review written by the user with email '{{.user_email}}' for the restaurant named '{{.restaurant_name}}' and understand its status. - name: "user_email"
**Workflow:** description: "The email of the user who wrote the review."
1. Use the `find_user_by_email` tool with the email '{{.user_email}}' to get the `user_id`. - name: "restaurant_name"
2. Use the `find_restaurant_by_name` tool with the name '{{.restaurant_name}}' to get the `restaurant_id`. description: "The name of the restaurant being reviewed."
3. Use the `find_review_by_user_and_restaurant` tool with the `user_id` and `restaurant_id` you just found. messages:
4. Analyze the results from the final tool call. Examine the `is_published` and `moderation_status` fields and explain the review's status to the user in a clear, human-readable sentence. - content: >-
**Goal:** Find the review written by the user with email '{{.user_email}}' for the restaurant named '{{.restaurant_name}}' and understand its status.
**Workflow:**
1. Use the `find_user_by_email` tool with the email '{{.user_email}}' to get the `user_id`.
2. Use the `find_restaurant_by_name` tool with the name '{{.restaurant_name}}' to get the `restaurant_id`.
3. Use the `find_review_by_user_and_restaurant` tool with the `user_id` and `restaurant_id` you just found.
4. Analyze the results from the final tool call. Examine the `is_published` and `moderation_status` fields and explain the review's status to the user in a clear, human-readable sentence.
``` ```
## Step 3: Connect to Gemini CLI ## Step 3: Connect to Gemini CLI


@@ -569,11 +569,12 @@
} }
}, },
"node_modules/jws": { "node_modules/jws": {
"version": "4.0.0", "version": "4.0.1",
"resolved": "https://registry.npmjs.org/jws/-/jws-4.0.0.tgz", "resolved": "https://registry.npmjs.org/jws/-/jws-4.0.1.tgz",
"integrity": "sha512-KDncfTmOZoOMTFG4mBlG0qUIOlc03fmzH+ru6RgYVZhPkyiy/92Owlt/8UEN+a4TXR1FQetfIpJE8ApdvdVxTg==", "integrity": "sha512-EKI/M/yqPncGUUh44xz0PxSidXFr/+r0pA70+gIYhjv+et7yxM+s29Y+VGDkovRofQem0fs7Uvf4+YmAdyRduA==",
"license": "MIT",
"dependencies": { "dependencies": {
"jwa": "^2.0.0", "jwa": "^2.0.1",
"safe-buffer": "^5.0.1" "safe-buffer": "^5.0.1"
} }
}, },


@@ -24,12 +24,13 @@
} }
}, },
"node_modules/@dabh/diagnostics": { "node_modules/@dabh/diagnostics": {
"version": "2.0.3", "version": "2.0.8",
"resolved": "https://registry.npmjs.org/@dabh/diagnostics/-/diagnostics-2.0.3.tgz", "resolved": "https://registry.npmjs.org/@dabh/diagnostics/-/diagnostics-2.0.8.tgz",
"integrity": "sha512-hrlQOIi7hAfzsMqlGSFyVucrx38O+j6wiGOf//H2ecvIEqYN4ADBSS2iLMh5UFyDunCNniUIPk/q3riFv45xRA==", "integrity": "sha512-R4MSXTVnuMzGD7bzHdW2ZhhdPC/igELENcq5IjEverBvq5hn1SXCWcsi6eSsdWP0/Ur+SItRRjAktmdoX/8R/Q==",
"license": "MIT",
"optional": true, "optional": true,
"dependencies": { "dependencies": {
"colorspace": "1.1.x", "@so-ric/colorspace": "^1.1.6",
"enabled": "2.0.x", "enabled": "2.0.x",
"kuler": "^2.0.0" "kuler": "^2.0.0"
} }
@@ -578,9 +579,10 @@
} }
}, },
"node_modules/@google-cloud/firestore": { "node_modules/@google-cloud/firestore": {
"version": "7.11.3", "version": "7.11.6",
"resolved": "https://registry.npmjs.org/@google-cloud/firestore/-/firestore-7.11.3.tgz", "resolved": "https://registry.npmjs.org/@google-cloud/firestore/-/firestore-7.11.6.tgz",
"integrity": "sha512-qsM3/WHpawF07SRVvEJJVRwhYzM7o9qtuksyuqnrMig6fxIrwWnsezECWsG/D5TyYru51Fv5c/RTqNDQ2yU+4w==", "integrity": "sha512-EW/O8ktzwLfyWBOsNuhRoMi8lrC3clHM5LVFhGvO1HCsLozCOOXRAlHrYBoE6HL42Sc8yYMuCb2XqcnJ4OOEpw==",
"license": "Apache-2.0",
"optional": true, "optional": true,
"peer": true, "peer": true,
"dependencies": { "dependencies": {
@@ -2887,6 +2889,17 @@
"resolved": "https://registry.npmjs.org/@protobufjs/utf8/-/utf8-1.1.0.tgz", "resolved": "https://registry.npmjs.org/@protobufjs/utf8/-/utf8-1.1.0.tgz",
"integrity": "sha512-Vvn3zZrhQZkkBE8LSuW3em98c0FwgO4nxzv6OdSxPKJIEKY2bGbHn+mhGIPerzI4twdxaP8/0+06HBpwf345Lw==" "integrity": "sha512-Vvn3zZrhQZkkBE8LSuW3em98c0FwgO4nxzv6OdSxPKJIEKY2bGbHn+mhGIPerzI4twdxaP8/0+06HBpwf345Lw=="
}, },
"node_modules/@so-ric/colorspace": {
"version": "1.1.6",
"resolved": "https://registry.npmjs.org/@so-ric/colorspace/-/colorspace-1.1.6.tgz",
"integrity": "sha512-/KiKkpHNOBgkFJwu9sh48LkHSMYGyuTcSFK/qMBdnOAlrRJzRSXAOFB5qwzaVQuDl8wAvHVMkaASQDReTahxuw==",
"license": "MIT",
"optional": true,
"dependencies": {
"color": "^5.0.2",
"text-hex": "1.0.x"
}
},
"node_modules/@toolbox-sdk/core": { "node_modules/@toolbox-sdk/core": {
"version": "0.1.2", "version": "0.1.2",
"resolved": "https://registry.npmjs.org/@toolbox-sdk/core/-/core-0.1.2.tgz", "resolved": "https://registry.npmjs.org/@toolbox-sdk/core/-/core-0.1.2.tgz",
@@ -3515,38 +3528,53 @@
} }
}, },
"node_modules/color": { "node_modules/color": {
"version": "3.2.1", "version": "5.0.3",
"resolved": "https://registry.npmjs.org/color/-/color-3.2.1.tgz", "resolved": "https://registry.npmjs.org/color/-/color-5.0.3.tgz",
"integrity": "sha512-aBl7dZI9ENN6fUGC7mWpMTPNHmWUSNan9tuWN6ahh5ZLNk9baLJOnSMlrQkHcrfFgz2/RigjUVAjdx36VcemKA==", "integrity": "sha512-ezmVcLR3xAVp8kYOm4GS45ZLLgIE6SPAFoduLr6hTDajwb3KZ2F46gulK3XpcwRFb5KKGCSezCBAY4Dw4HsyXA==",
"license": "MIT",
"optional": true, "optional": true,
"dependencies": { "dependencies": {
"color-convert": "^1.9.3", "color-convert": "^3.1.3",
"color-string": "^1.6.0" "color-string": "^2.1.3"
},
"engines": {
"node": ">=18"
} }
}, },
"node_modules/color-convert": { "node_modules/color-convert": {
"version": "1.9.3", "version": "3.1.3",
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-1.9.3.tgz", "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-3.1.3.tgz",
"integrity": "sha512-QfAUtd+vFdAtFQcC8CCyYt1fYWxSqAiK2cSD6zDB8N3cpsEBAvRxp9zOGg6G/SHHJYAT88/az/IuDGALsNVbGg==", "integrity": "sha512-fasDH2ont2GqF5HpyO4w0+BcewlhHEZOFn9c1ckZdHpJ56Qb7MHhH/IcJZbBGgvdtwdwNbLvxiBEdg336iA9Sg==",
"license": "MIT",
"optional": true, "optional": true,
"dependencies": { "dependencies": {
"color-name": "1.1.3" "color-name": "^2.0.0"
},
"engines": {
"node": ">=14.6"
} }
}, },
"node_modules/color-name": { "node_modules/color-name": {
"version": "1.1.3", "version": "2.1.0",
"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.3.tgz", "resolved": "https://registry.npmjs.org/color-name/-/color-name-2.1.0.tgz",
"integrity": "sha512-72fSenhMw2HZMTVHeCA9KCmpEIbzWiQsjN+BHcBbS9vr1mtt+vJjPdksIBNUmKAW8TFUDPJK5SUU3QhE9NEXDw==", "integrity": "sha512-1bPaDNFm0axzE4MEAzKPuqKWeRaT43U/hyxKPBdqTfmPF+d6n7FSoTFxLVULUJOmiLp01KjhIPPH+HrXZJN4Rg==",
"optional": true "license": "MIT",
"optional": true,
"engines": {
"node": ">=12.20"
}
}, },
"node_modules/color-string": { "node_modules/color-string": {
"version": "1.9.1", "version": "2.1.4",
"resolved": "https://registry.npmjs.org/color-string/-/color-string-1.9.1.tgz", "resolved": "https://registry.npmjs.org/color-string/-/color-string-2.1.4.tgz",
"integrity": "sha512-shrVawQFojnZv6xM40anx4CkoDP+fZsw/ZerEMsW/pyzsRbElpsL/DBVW7q3ExxwusdNXI3lXpuhEZkzs8p5Eg==", "integrity": "sha512-Bb6Cq8oq0IjDOe8wJmi4JeNn763Xs9cfrBcaylK1tPypWzyoy2G3l90v9k64kjphl/ZJjPIShFztenRomi8WTg==",
"license": "MIT",
"optional": true, "optional": true,
"dependencies": { "dependencies": {
"color-name": "^1.0.0", "color-name": "^2.0.0"
"simple-swizzle": "^0.2.2" },
"engines": {
"node": ">=18"
} }
}, },
"node_modules/colorette": { "node_modules/colorette": {
@@ -3554,16 +3582,6 @@
"resolved": "https://registry.npmjs.org/colorette/-/colorette-2.0.20.tgz", "resolved": "https://registry.npmjs.org/colorette/-/colorette-2.0.20.tgz",
"integrity": "sha512-IfEDxwoWIjkeXL1eXcDiow4UbKjhLdq6/EuSVR9GMN7KVH3r9gQ83e73hsz1Nd1T3ijd5xv1wcWRYO+D6kCI2w==" "integrity": "sha512-IfEDxwoWIjkeXL1eXcDiow4UbKjhLdq6/EuSVR9GMN7KVH3r9gQ83e73hsz1Nd1T3ijd5xv1wcWRYO+D6kCI2w=="
}, },
"node_modules/colorspace": {
"version": "1.1.4",
"resolved": "https://registry.npmjs.org/colorspace/-/colorspace-1.1.4.tgz",
"integrity": "sha512-BgvKJiuVu1igBUF2kEjRCZXol6wiiGbY5ipL/oVPwm0BL9sIpMIzM8IK7vwuxIIzOXMV3Ey5w+vxhm0rR/TN8w==",
"optional": true,
"dependencies": {
"color": "^3.1.3",
"text-hex": "1.0.x"
}
},
"node_modules/combined-stream": { "node_modules/combined-stream": {
"version": "1.0.8", "version": "1.0.8",
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz", "resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
@@ -4968,12 +4986,6 @@
"node": ">= 0.10" "node": ">= 0.10"
} }
}, },
"node_modules/is-arrayish": {
"version": "0.3.2",
"resolved": "https://registry.npmjs.org/is-arrayish/-/is-arrayish-0.3.2.tgz",
"integrity": "sha512-eVRqCvVlZbuw3GrM63ovNSNAeA1K16kaR/LRY/92w0zxQ5/1YzwblUX652i4Xs9RwAGjW9d9y6X88t8OaAJfWQ==",
"optional": true
},
"node_modules/is-core-module": { "node_modules/is-core-module": {
"version": "2.16.1", "version": "2.16.1",
"resolved": "https://registry.npmjs.org/is-core-module/-/is-core-module-2.16.1.tgz", "resolved": "https://registry.npmjs.org/is-core-module/-/is-core-module-2.16.1.tgz",
@@ -5114,13 +5126,14 @@
} }
}, },
"node_modules/jsonwebtoken/node_modules/jws": { "node_modules/jsonwebtoken/node_modules/jws": {
"version": "3.2.2", "version": "3.2.3",
"resolved": "https://registry.npmjs.org/jws/-/jws-3.2.2.tgz", "resolved": "https://registry.npmjs.org/jws/-/jws-3.2.3.tgz",
"integrity": "sha512-YHlZCB6lMTllWDtSPHz/ZXTsi8S00usEV6v1tjq8tOUZzw7DpSDWVXjXDre6ed1w/pd495ODpHZYSdkRTsa0HA==", "integrity": "sha512-byiJ0FLRdLdSVSReO/U4E7RoEyOCKnEnEPMjq3HxWtvzLsV08/i5RQKsFVNkCldrCaPr2vDNAOMsfs8T/Hze7g==",
"license": "MIT",
"optional": true, "optional": true,
"peer": true, "peer": true,
"dependencies": { "dependencies": {
"jwa": "^1.4.1", "jwa": "^1.4.2",
"safe-buffer": "^5.0.1" "safe-buffer": "^5.0.1"
} }
}, },
@@ -5153,11 +5166,12 @@
} }
}, },
"node_modules/jws": { "node_modules/jws": {
"version": "4.0.0", "version": "4.0.1",
"resolved": "https://registry.npmjs.org/jws/-/jws-4.0.0.tgz", "resolved": "https://registry.npmjs.org/jws/-/jws-4.0.1.tgz",
"integrity": "sha512-KDncfTmOZoOMTFG4mBlG0qUIOlc03fmzH+ru6RgYVZhPkyiy/92Owlt/8UEN+a4TXR1FQetfIpJE8ApdvdVxTg==", "integrity": "sha512-EKI/M/yqPncGUUh44xz0PxSidXFr/+r0pA70+gIYhjv+et7yxM+s29Y+VGDkovRofQem0fs7Uvf4+YmAdyRduA==",
"license": "MIT",
"dependencies": { "dependencies": {
"jwa": "^2.0.0", "jwa": "^2.0.1",
"safe-buffer": "^5.0.1" "safe-buffer": "^5.0.1"
} }
}, },
@@ -5424,9 +5438,10 @@
} }
}, },
"node_modules/node-forge": { "node_modules/node-forge": {
"version": "1.3.1", "version": "1.3.3",
"resolved": "https://registry.npmjs.org/node-forge/-/node-forge-1.3.1.tgz", "resolved": "https://registry.npmjs.org/node-forge/-/node-forge-1.3.3.tgz",
"integrity": "sha512-dPEtOeMvF9VMcYV/1Wb8CPoVAXtp6MKMlcbAt4ddqmGqUJ6fQZFXkNZNkNlfevtNkGtaSoXf/vNNNSvgrdXwtA==", "integrity": "sha512-rLvcdSyRCyouf6jcOIPe/BgwG/d7hKjzMKOas33/pHEr6gbq18IK9zV7DiPvzsz0oBJPme6qr6H6kGZuI9/DZg==",
"license": "(BSD-3-Clause OR GPL-2.0)",
"optional": true, "optional": true,
"peer": true, "peer": true,
"engines": { "engines": {
@@ -6038,15 +6053,6 @@
"url": "https://github.com/sponsors/ljharb" "url": "https://github.com/sponsors/ljharb"
} }
}, },
"node_modules/simple-swizzle": {
"version": "0.2.2",
"resolved": "https://registry.npmjs.org/simple-swizzle/-/simple-swizzle-0.2.2.tgz",
"integrity": "sha512-JA//kQgZtbuY83m+xT+tXJkmJncGMTFT+C+g2h2R9uxkYIrE2yy9sgmcLhCnw57/WSD+Eh3J97FPEDFnbXnDUg==",
"optional": true,
"dependencies": {
"is-arrayish": "^0.3.1"
}
},
"node_modules/source-map": { "node_modules/source-map": {
"version": "0.6.1", "version": "0.6.1",
"resolved": "https://registry.npmjs.org/source-map/-/source-map-0.6.1.tgz", "resolved": "https://registry.npmjs.org/source-map/-/source-map-0.6.1.tgz",
@@ -6233,6 +6239,7 @@
"version": "1.0.0", "version": "1.0.0",
"resolved": "https://registry.npmjs.org/text-hex/-/text-hex-1.0.0.tgz", "resolved": "https://registry.npmjs.org/text-hex/-/text-hex-1.0.0.tgz",
"integrity": "sha512-uuVGNWzgJ4yhRaNSiubPY7OjISw4sw4E5Uv0wbjp+OzcbmVU/rsT8ujgcXJhn9ypzsgr5vlzpPqP+MBBKcGvbg==", "integrity": "sha512-uuVGNWzgJ4yhRaNSiubPY7OjISw4sw4E5Uv0wbjp+OzcbmVU/rsT8ujgcXJhn9ypzsgr5vlzpPqP+MBBKcGvbg==",
"license": "MIT",
"optional": true "optional": true
}, },
"node_modules/thriftrw": { "node_modules/thriftrw": {
@@ -6416,13 +6423,14 @@
} }
}, },
"node_modules/winston": { "node_modules/winston": {
"version": "3.17.0", "version": "3.19.0",
"resolved": "https://registry.npmjs.org/winston/-/winston-3.17.0.tgz", "resolved": "https://registry.npmjs.org/winston/-/winston-3.19.0.tgz",
"integrity": "sha512-DLiFIXYC5fMPxaRg832S6F5mJYvePtmO5G9v9IgUFPhXm9/GkXarH/TUrBAVzhTCzAj9anE/+GjrgXp/54nOgw==", "integrity": "sha512-LZNJgPzfKR+/J3cHkxcpHKpKKvGfDZVPS4hfJCc4cCG0CgYzvlD6yE/S3CIL/Yt91ak327YCpiF/0MyeZHEHKA==",
"license": "MIT",
"optional": true, "optional": true,
"dependencies": { "dependencies": {
"@colors/colors": "^1.6.0", "@colors/colors": "^1.6.0",
"@dabh/diagnostics": "^2.0.2", "@dabh/diagnostics": "^2.0.8",
"async": "^3.2.3", "async": "^3.2.3",
"is-stream": "^2.0.0", "is-stream": "^2.0.0",
"logform": "^2.7.0", "logform": "^2.7.0",


@@ -882,11 +882,12 @@
} }
}, },
"node_modules/jws": { "node_modules/jws": {
"version": "4.0.0", "version": "4.0.1",
"resolved": "https://registry.npmjs.org/jws/-/jws-4.0.0.tgz", "resolved": "https://registry.npmjs.org/jws/-/jws-4.0.1.tgz",
"integrity": "sha512-KDncfTmOZoOMTFG4mBlG0qUIOlc03fmzH+ru6RgYVZhPkyiy/92Owlt/8UEN+a4TXR1FQetfIpJE8ApdvdVxTg==", "integrity": "sha512-EKI/M/yqPncGUUh44xz0PxSidXFr/+r0pA70+gIYhjv+et7yxM+s29Y+VGDkovRofQem0fs7Uvf4+YmAdyRduA==",
"license": "MIT",
"dependencies": { "dependencies": {
"jwa": "^2.0.0", "jwa": "^2.0.1",
"safe-buffer": "^5.0.1" "safe-buffer": "^5.0.1"
} }
}, },
@@ -974,9 +975,10 @@
} }
}, },
"node_modules/lodash": { "node_modules/lodash": {
"version": "4.17.21", "version": "4.17.23",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz", "resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.23.tgz",
"integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==" "integrity": "sha512-LgVTMpQtIopCi79SJeDiP0TfWi5CNEc/L/aRdTh3yIvmZXTnheWpKjSZhnvMl8iXbC1tFg9gdHHDMLoV7CnG+w==",
"license": "MIT"
}, },
"node_modules/magic-bytes.js": { "node_modules/magic-bytes.js": {
"version": "1.12.1", "version": "1.12.1",


@@ -1,15 +1,17 @@
from google.adk import Agent from google.adk import Agent
from google.adk.apps import App from google.adk.apps import App
from toolbox_core import ToolboxSyncClient from google.adk.tools.toolbox_toolset import ToolboxToolset
# TODO(developer): update the TOOLBOX_URL to your toolbox endpoint # TODO(developer): update the TOOLBOX_URL to your toolbox endpoint
client = ToolboxSyncClient("http://127.0.0.1:5000") toolset = ToolboxToolset(
server_url="http://127.0.0.1:5000",
)
root_agent = Agent( root_agent = Agent(
name='root_agent', name='root_agent',
model='gemini-2.5-flash', model='gemini-2.5-flash',
instruction="You are a helpful AI assistant designed to provide accurate and useful information.", instruction="You are a helpful AI assistant designed to provide accurate and useful information.",
tools=client.load_toolset(), tools=[toolset],
) )
app = App(root_agent=root_agent, name="my_agent") app = App(root_agent=root_agent, name="my_agent")


@@ -1,3 +1,2 @@
google-adk==1.21.0 google-adk[toolbox]==1.23.0
toolbox-core==0.5.4
pytest==9.0.2 pytest==9.0.2


@@ -1,3 +1,3 @@
google-genai==1.56.0 google-genai==1.57.0
toolbox-core==0.5.4 toolbox-core==0.5.4
pytest==9.0.2 pytest==9.0.2


@@ -1,5 +1,5 @@
langchain==1.2.0 langchain==1.2.2
langchain-google-vertexai==3.2.0 langchain-google-vertexai==3.2.1
langgraph==1.0.5 langgraph==1.0.5
toolbox-langchain==0.5.4 toolbox-langchain==0.5.4
pytest==9.0.2 pytest==9.0.2


@@ -13,7 +13,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} --> <!-- {x-release-please-start-version} -->
```bash ```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64 export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/$OS/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/$OS/toolbox
``` ```
<!-- {x-release-please-end} --> <!-- {x-release-please-end} -->
@@ -33,78 +33,89 @@ In this section, we will download Toolbox, configure our tools in a
{{< /notice >}} {{< /notice >}}
```yaml ```yaml
sources: kind: sources
my-pg-source: name: my-pg-source
kind: postgres type: postgres
host: 127.0.0.1 host: 127.0.0.1
port: 5432 port: 5432
database: toolbox_db database: toolbox_db
user: ${USER_NAME} user: toolbox_user
password: ${PASSWORD} password: my-password
---
kind: tools
name: search-hotels-by-name
type: postgres-sql
source: my-pg-source
description: Search for hotels based on name.
parameters:
- name: name
type: string
description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
---
kind: tools
name: search-hotels-by-location
type: postgres-sql
source: my-pg-source
description: Search for hotels based on location.
parameters:
- name: location
type: string
description: The location of the hotel.
statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
---
kind: tools
name: book-hotel
type: postgres-sql
source: my-pg-source
description: >-
Book a hotel by its ID. If the hotel is successfully booked, returns a NULL, raises an error if not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to book.
statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
---
kind: tools
name: update-hotel
type: postgres-sql
source: my-pg-source
description: >-
Update a hotel's check-in and check-out dates by its ID. Returns a message
indicating whether the hotel was successfully updated or not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to update.
- name: checkin_date
type: string
description: The new check-in date of the hotel.
- name: checkout_date
type: string
description: The new check-out date of the hotel.
statement: >-
UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
as date) WHERE id = $1;
---
kind: tools
name: cancel-hotel
type: postgres-sql
source: my-pg-source
description: Cancel a hotel by its ID.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to cancel.
statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
---
kind: toolsets
name: my-toolset
tools: tools:
search-hotels-by-name: - search-hotels-by-name
kind: postgres-sql - search-hotels-by-location
source: my-pg-source - book-hotel
description: Search for hotels based on name. - update-hotel
parameters: - cancel-hotel
- name: name
type: string
description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
search-hotels-by-location:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on location.
parameters:
- name: location
type: string
description: The location of the hotel.
statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
book-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Book a hotel by its ID. If the hotel is successfully booked, returns a NULL, raises an error if not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to book.
statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
update-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Update a hotel's check-in and check-out dates by its ID. Returns a message
indicating whether the hotel was successfully updated or not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to update.
- name: checkin_date
type: string
description: The new check-in date of the hotel.
- name: checkout_date
type: string
description: The new check-out date of the hotel.
statement: >-
UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
as date) WHERE id = $1;
cancel-hotel:
kind: postgres-sql
source: my-pg-source
description: Cancel a hotel by its ID.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to cancel.
statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
toolsets:
my-toolset:
- search-hotels-by-name
- search-hotels-by-location
- book-hotel
- update-hotel
- cancel-hotel
``` ```
For more info on tools, check out the `Resources` section of the docs. For more info on tools, check out the `Resources` section of the docs.
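As a rough sketch of how the toolset above is consumed, the `my-toolset` defined in this file can be loaded with the `toolbox-core` client used in these quickstarts (assuming Toolbox is running locally on port 5000; the hotel name below is just an illustrative value):

```python
from toolbox_core import ToolboxSyncClient

# Connect to the locally running Toolbox server.
client = ToolboxSyncClient("http://127.0.0.1:5000")

# Load every tool registered under the "my-toolset" toolset defined above...
tools = client.load_toolset("my-toolset")

# ...or load a single tool and invoke it directly with keyword arguments
# matching its declared parameters.
search_hotels_by_name = client.load_tool("search-hotels-by-name")
print(search_hotels_by_name(name="Hilton"))
```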

View File

@@ -48,11 +48,13 @@ instance, database and users:
* `roles/cloudsql.editor`: Provides permissions to manage existing resources. * `roles/cloudsql.editor`: Provides permissions to manage existing resources.
* All `viewer` tools * All `viewer` tools
* `create_database` * `create_database`
* `create_backup`
* `roles/cloudsql.admin`: Provides full control over all resources. * `roles/cloudsql.admin`: Provides full control over all resources.
* All `editor` and `viewer` tools * All `editor` and `viewer` tools
* `create_instance` * `create_instance`
* `create_user` * `create_user`
* `clone_instance` * `clone_instance`
* `restore_backup`
## Install MCP Toolbox ## Install MCP Toolbox
@@ -299,6 +301,8 @@ instances and interacting with your database:
* **create_user**: Creates a new user in a Cloud SQL instance. * **create_user**: Creates a new user in a Cloud SQL instance.
* **wait_for_operation**: Waits for a Cloud SQL operation to complete. * **wait_for_operation**: Waits for a Cloud SQL operation to complete.
* **clone_instance**: Creates a clone of an existing Cloud SQL for SQL Server instance. * **clone_instance**: Creates a clone of an existing Cloud SQL for SQL Server instance.
* **create_backup**: Creates a backup on a Cloud SQL instance.
* **restore_backup**: Restores a backup of a Cloud SQL instance.
{{< notice note >}} {{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs

View File

@@ -48,11 +48,13 @@ database and users:
* `roles/cloudsql.editor`: Provides permissions to manage existing resources. * `roles/cloudsql.editor`: Provides permissions to manage existing resources.
* All `viewer` tools * All `viewer` tools
* `create_database` * `create_database`
* `create_backup`
* `roles/cloudsql.admin`: Provides full control over all resources. * `roles/cloudsql.admin`: Provides full control over all resources.
* All `editor` and `viewer` tools * All `editor` and `viewer` tools
* `create_instance` * `create_instance`
* `create_user` * `create_user`
* `clone_instance` * `clone_instance`
* `restore_backup`
## Install MCP Toolbox ## Install MCP Toolbox
@@ -299,6 +301,8 @@ instances and interacting with your database:
* **create_user**: Creates a new user in a Cloud SQL instance. * **create_user**: Creates a new user in a Cloud SQL instance.
* **wait_for_operation**: Waits for a Cloud SQL operation to complete. * **wait_for_operation**: Waits for a Cloud SQL operation to complete.
* **clone_instance**: Creates a clone of an existing Cloud SQL for MySQL instance. * **clone_instance**: Creates a clone of an existing Cloud SQL for MySQL instance.
* **create_backup**: Creates a backup on a Cloud SQL instance.
* **restore_backup**: Restores a backup of a Cloud SQL instance.
{{< notice note >}} {{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs

View File

@@ -48,11 +48,13 @@ instance, database and users:
* `roles/cloudsql.editor`: Provides permissions to manage existing resources. * `roles/cloudsql.editor`: Provides permissions to manage existing resources.
* All `viewer` tools * All `viewer` tools
* `create_database` * `create_database`
* `create_backup`
* `roles/cloudsql.admin`: Provides full control over all resources. * `roles/cloudsql.admin`: Provides full control over all resources.
* All `editor` and `viewer` tools * All `editor` and `viewer` tools
* `create_instance` * `create_instance`
* `create_user` * `create_user`
* `clone_instance` * `clone_instance`
* `restore_backup`
## Install MCP Toolbox ## Install MCP Toolbox
@@ -299,6 +301,8 @@ instances and interacting with your database:
* **create_user**: Creates a new user in a Cloud SQL instance. * **create_user**: Creates a new user in a Cloud SQL instance.
* **wait_for_operation**: Waits for a Cloud SQL operation to complete. * **wait_for_operation**: Waits for a Cloud SQL operation to complete.
* **clone_instance**: Creates a clone of an existing Cloud SQL for PostgreSQL instance. * **clone_instance**: Creates a clone of an existing Cloud SQL for PostgreSQL instance.
* **create_backup**: Creates a backup on a Cloud SQL instance.
* **restore_backup**: Restores a backup of a Cloud SQL instance.
{{< notice note >}} {{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs

View File

@@ -100,19 +100,19 @@ After you install Looker in the MCP Store, resources and tools from the server a
{{< tabpane persist=header >}} {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}} {{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/linux/amd64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/linux/amd64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}} {{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/arm64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/darwin/arm64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}} {{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/amd64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/darwin/amd64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}} {{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/windows/amd64/toolbox.exe curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/windows/amd64/toolbox.exe
{{< /tab >}} {{< /tab >}}
{{< /tabpane >}} {{< /tabpane >}}
<!-- {x-release-please-end} --> <!-- {x-release-please-end} -->

View File

@@ -45,19 +45,19 @@ instance:
<!-- {x-release-please-start-version} --> <!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}} {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}} {{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/linux/amd64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/linux/amd64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}} {{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/arm64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/darwin/arm64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}} {{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/amd64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/darwin/amd64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}} {{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/windows/amd64/toolbox.exe curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/windows/amd64/toolbox.exe
{{< /tab >}} {{< /tab >}}
{{< /tabpane >}} {{< /tabpane >}}
<!-- {x-release-please-end} --> <!-- {x-release-please-end} -->

View File

@@ -43,19 +43,19 @@ expose your developer assistant tools to a MySQL instance:
<!-- {x-release-please-start-version} --> <!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}} {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}} {{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/linux/amd64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/linux/amd64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}} {{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/arm64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/darwin/arm64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}} {{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/amd64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/darwin/amd64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}} {{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/windows/amd64/toolbox.exe curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/windows/amd64/toolbox.exe
{{< /tab >}} {{< /tab >}}
{{< /tabpane >}} {{< /tabpane >}}
<!-- {x-release-please-end} --> <!-- {x-release-please-end} -->

View File

@@ -44,19 +44,19 @@ expose your developer assistant tools to a Neo4j instance:
<!-- {x-release-please-start-version} --> <!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}} {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}} {{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/linux/amd64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/linux/amd64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}} {{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/arm64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/darwin/arm64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}} {{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/amd64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/darwin/amd64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}} {{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/windows/amd64/toolbox.exe curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/windows/amd64/toolbox.exe
{{< /tab >}} {{< /tab >}}
{{< /tabpane >}} {{< /tabpane >}}
<!-- {x-release-please-end} --> <!-- {x-release-please-end} -->

View File

@@ -56,19 +56,19 @@ Omni](https://cloud.google.com/alloydb/omni/current/docs/overview).
<!-- {x-release-please-start-version} --> <!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}} {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}} {{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/linux/amd64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/linux/amd64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}} {{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/arm64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/darwin/arm64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}} {{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/amd64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/darwin/amd64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}} {{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/windows/amd64/toolbox.exe curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/windows/amd64/toolbox.exe
{{< /tab >}} {{< /tab >}}
{{< /tabpane >}} {{< /tabpane >}}
<!-- {x-release-please-end} --> <!-- {x-release-please-end} -->

View File

@@ -43,19 +43,19 @@ to expose your developer assistant tools to a SQLite instance:
<!-- {x-release-please-start-version} --> <!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}} {{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}} {{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/linux/amd64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/linux/amd64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}} {{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/arm64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/darwin/arm64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}} {{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/amd64/toolbox curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/darwin/amd64/toolbox
{{< /tab >}} {{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}} {{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/windows/amd64/toolbox.exe curl -O https://storage.googleapis.com/genai-toolbox/v0.26.0/windows/amd64/toolbox.exe
{{< /tab >}} {{< /tab >}}
{{< /tabpane >}} {{< /tabpane >}}
<!-- {x-release-please-end} --> <!-- {x-release-please-end} -->

View File

@@ -20,6 +20,7 @@ The native SDKs can be combined with MCP clients in many cases.
Toolbox currently supports the following versions of the MCP specification: Toolbox currently supports the following versions of the MCP specification:
* [2025-11-25](https://modelcontextprotocol.io/specification/2025-11-25)
* [2025-06-18](https://modelcontextprotocol.io/specification/2025-06-18) * [2025-06-18](https://modelcontextprotocol.io/specification/2025-06-18)
* [2025-03-26](https://modelcontextprotocol.io/specification/2025-03-26) * [2025-03-26](https://modelcontextprotocol.io/specification/2025-03-26)
* [2024-11-05](https://modelcontextprotocol.io/specification/2024-11-05) * [2024-11-05](https://modelcontextprotocol.io/specification/2024-11-05)

View File

@@ -46,10 +46,10 @@ with the necessary configuration for deployment to Vertex AI Agent Engine.
process will generate deployment configuration files (like a `Makefile` and process will generate deployment configuration files (like a `Makefile` and
`Dockerfile`) in your project directory. `Dockerfile`) in your project directory.
4. Add `toolbox-core` as a dependency to the new project: 4. Add `google-adk[toolbox]` as a dependency to the new project:
```bash ```bash
uv add toolbox-core uv add google-adk[toolbox]
``` ```
## Step 3: Configure Google Cloud Authentication ## Step 3: Configure Google Cloud Authentication
@@ -95,22 +95,23 @@ authentication token.
```python ```python
from google.adk import Agent from google.adk import Agent
from google.adk.apps import App from google.adk.apps import App
from toolbox_core import ToolboxSyncClient, auth_methods from google.adk.tools.toolbox_toolset import ToolboxToolset
from toolbox_adk import CredentialStrategy
# TODO(developer): Replace with your Toolbox Cloud Run Service URL # TODO(developer): Replace with your Toolbox Cloud Run Service URL
TOOLBOX_URL = "https://your-toolbox-service-xyz.a.run.app" TOOLBOX_URL = "https://your-toolbox-service-xyz.a.run.app"
# Initialize the client with the Cloud Run URL and Auth headers # Initialize the toolset with Workload Identity (generates ID token for the URL)
client = ToolboxSyncClient( toolset = ToolboxToolset(
TOOLBOX_URL, server_url=TOOLBOX_URL,
client_headers={"Authorization": auth_methods.get_google_id_token(TOOLBOX_URL)} credentials=CredentialStrategy.workload_identity(target_audience=TOOLBOX_URL)
) )
root_agent = Agent( root_agent = Agent(
name='root_agent', name='root_agent',
model='gemini-2.5-flash', model='gemini-2.5-flash',
instruction="You are a helpful AI assistant designed to provide accurate and useful information.", instruction="You are a helpful AI assistant designed to provide accurate and useful information.",
tools=client.load_toolset(), tools=[toolset],
) )
app = App(root_agent=root_agent, name="my_agent") app = App(root_agent=root_agent, name="my_agent")

View File

@@ -68,7 +68,12 @@ networks:
``` ```
{{< notice tip >}} {{< notice tip >}}
To prevent DNS rebinding attacks, use the `--allowed-origins` flag to specify a To prevent DNS rebinding attacks, use the `--allowed-hosts` flag to specify a
list of hosts for validation. E.g. `command: [ "toolbox",
"--tools-file", "/config/tools.yaml", "--address", "0.0.0.0",
"--allowed-hosts", "localhost:5000"]`
To configure CORS, use the `--allowed-origins` flag to specify a
list of origins permitted to access the server. E.g. `command: [ "toolbox", list of origins permitted to access the server. E.g. `command: [ "toolbox",
"--tools-file", "/config/tools.yaml", "--address", "0.0.0.0", "--tools-file", "/config/tools.yaml", "--address", "0.0.0.0",
"--allowed-origins", "https://foo.bar"]` "--allowed-origins", "https://foo.bar"]`

View File

@@ -191,6 +191,10 @@ description: >
{{< notice tip >}} {{< notice tip >}}
To prevent DNS rebinding attacks, use the `--allowed-origins` flag to specify a To prevent DNS rebinding attacks, use the `--allowed-hosts` flag to specify a
list of origins permitted to access the server. E.g. `args: ["--address", list of hosts permitted to access the server. E.g. `args: ["--address",
"0.0.0.0", "--allowed-hosts", "foo.bar:5000"]`
To configure CORS, use the `--allowed-origins` flag to specify a
list of origins permitted to access the server. E.g. `args: ["--address",
"0.0.0.0", "--allowed-origins", "https://foo.bar"]` "0.0.0.0", "--allowed-origins", "https://foo.bar"]`
{{< /notice >}} {{< /notice >}}

View File

@@ -142,14 +142,18 @@ deployment will time out.
### Update deployed server to be secure ### Update deployed server to be secure
To prevent DNS rebinding attacks, use the `--allowed-origins` flag to specify a To prevent DNS rebinding attacks, use the `--allowed-hosts` flag to specify a
list of origins permitted to access the server. In order to do that, you will list of hosts permitted to access the server. In order to do that, you will
have to re-deploy the Cloud Run service with the new flag. have to re-deploy the Cloud Run service with the new flag.
To configure CORS, use the `--allowed-origins` flag to specify a list of
origins permitted to access the server.
1. Set an environment variable to the cloud run url: 1. Set an environment variable to the cloud run url:
```bash ```bash
export URL=<cloud run url> export URL=<cloud run url>
export HOST=<cloud run host>
``` ```
2. Redeploy Toolbox: 2. Redeploy Toolbox:
@@ -160,7 +164,7 @@ have to re-deploy the cloud run service with the new flag.
--service-account toolbox-identity \ --service-account toolbox-identity \
--region us-central1 \ --region us-central1 \
--set-secrets "/app/tools.yaml=tools:latest" \ --set-secrets "/app/tools.yaml=tools:latest" \
--args="--tools-file=/app/tools.yaml","--address=0.0.0.0","--port=8080","--allowed-origins=$URL" --args="--tools-file=/app/tools.yaml","--address=0.0.0.0","--port=8080","--allowed-origins=$URL","--allowed-hosts=$HOST"
# --allow-unauthenticated # https://cloud.google.com/run/docs/authenticating/public#gcloud # --allow-unauthenticated # https://cloud.google.com/run/docs/authenticating/public#gcloud
``` ```
@@ -172,7 +176,7 @@ have to re-deploy the cloud run service with the new flag.
--service-account toolbox-identity \ --service-account toolbox-identity \
--region us-central1 \ --region us-central1 \
--set-secrets "/app/tools.yaml=tools:latest" \ --set-secrets "/app/tools.yaml=tools:latest" \
--args="--tools-file=/app/tools.yaml","--address=0.0.0.0","--port=8080","--allowed-origins=$URL" \ --args="--tools-file=/app/tools.yaml","--address=0.0.0.0","--port=8080","--allowed-origins=$URL","--allowed-hosts=$HOST" \
# TODO(dev): update the following to match your VPC if necessary # TODO(dev): update the following to match your VPC if necessary
--network default \ --network default \
--subnet default --subnet default
@@ -203,6 +207,7 @@ You can connect to Toolbox Cloud Run instances directly through the SDK.
{{< tab header="Python" lang="python" >}} {{< tab header="Python" lang="python" >}}
import asyncio import asyncio
from toolbox_core import ToolboxClient, auth_methods from toolbox_core import ToolboxClient, auth_methods
from toolbox_core.protocol import Protocol
# Replace with the Cloud Run service URL generated in the previous step # Replace with the Cloud Run service URL generated in the previous step
URL = "https://cloud-run-url.app" URL = "https://cloud-run-url.app"
@@ -213,6 +218,7 @@ async def main():
async with ToolboxClient( async with ToolboxClient(
URL, URL,
client_headers={"Authorization": auth_token_provider}, client_headers={"Authorization": auth_token_provider},
protocol=Protocol.TOOLBOX,
) as toolbox: ) as toolbox:
toolset = await toolbox.load_toolset() toolset = await toolbox.load_toolset()
# ... # ...
@@ -277,3 +283,5 @@ contain the specific error message needed to diagnose the problem.
Manager, it means the Toolbox service account is missing permissions. Manager, it means the Toolbox service account is missing permissions.
- Ensure the `toolbox-identity` service account has the **Secret Manager - Ensure the `toolbox-identity` service account has the **Secret Manager
Secret Accessor** (`roles/secretmanager.secretAccessor`) IAM role. Secret Accessor** (`roles/secretmanager.secretAccessor`) IAM role.
- **Cloud Run Connections via IAP:** Currently we do not support Cloud Run connections via [IAP](https://docs.cloud.google.com/iap/docs/concepts-overview). Please disable IAP if you are using it.

View File

@@ -8,25 +8,27 @@ description: >
## Reference ## Reference
| Flag (Short) | Flag (Long) | Description | Default | | Flag (Short) | Flag (Long) | Description | Default |
|--------------|----------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------| |--------------|----------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------|
| `-a` | `--address` | Address of the interface the server will listen on. | `127.0.0.1` | | `-a` | `--address` | Address of the interface the server will listen on. | `127.0.0.1` |
| | `--disable-reload` | Disables dynamic reloading of tools file. | | | | `--disable-reload` | Disables dynamic reloading of tools file. | |
| `-h` | `--help` | help for toolbox | | | `-h` | `--help` | help for toolbox | |
| | `--log-level` | Specify the minimum level logged. Allowed: 'DEBUG', 'INFO', 'WARN', 'ERROR'. | `info` | | | `--log-level` | Specify the minimum level logged. Allowed: 'DEBUG', 'INFO', 'WARN', 'ERROR'. | `info` |
| | `--logging-format` | Specify logging format to use. Allowed: 'standard' or 'JSON'. | `standard` | | | `--logging-format` | Specify logging format to use. Allowed: 'standard' or 'JSON'. | `standard` |
| `-p` | `--port` | Port the server will listen on. | `5000` | | `-p` | `--port` | Port the server will listen on. | `5000` |
| | `--prebuilt` | Use a prebuilt tool configuration by source type. See [Prebuilt Tools Reference](prebuilt-tools.md) for allowed values. | | | | `--prebuilt` | Use one or more prebuilt tool configurations by source type. See [Prebuilt Tools Reference](prebuilt-tools.md) for allowed values. | |
| | `--stdio` | Listens via MCP STDIO instead of acting as a remote HTTP server. | | | | `--stdio` | Listens via MCP STDIO instead of acting as a remote HTTP server. | |
| | `--telemetry-gcp` | Enable exporting directly to Google Cloud Monitoring. | | | | `--telemetry-gcp` | Enable exporting directly to Google Cloud Monitoring. | |
| | `--telemetry-otlp` | Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. 'http://127.0.0.1:4318') | | | | `--telemetry-otlp` | Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. 'http://127.0.0.1:4318') | |
| | `--telemetry-service-name` | Sets the value of the service.name resource attribute for telemetry data. | `toolbox` | | | `--telemetry-service-name` | Sets the value of the service.name resource attribute for telemetry data. | `toolbox` |
| | `--tools-file` | File path specifying the tool configuration. Cannot be used with --tools-files or --tools-folder. | | | | `--tools-file` | File path specifying the tool configuration. Cannot be used with --tools-files or --tools-folder. | |
| | `--tools-files` | Multiple file paths specifying tool configurations. Files will be merged. Cannot be used with --tools-file or --tools-folder. | | | | `--tools-files` | Multiple file paths specifying tool configurations. Files will be merged. Cannot be used with --tools-file or --tools-folder. | |
| | `--tools-folder` | Directory path containing YAML tool configuration files. All .yaml and .yml files in the directory will be loaded and merged. Cannot be used with --tools-file or --tools-files. | | | | `--tools-folder` | Directory path containing YAML tool configuration files. All .yaml and .yml files in the directory will be loaded and merged. Cannot be used with --tools-file or --tools-files. | |
| | `--ui` | Launches the Toolbox UI web server. | | | | `--ui` | Launches the Toolbox UI web server. | |
| | `--allowed-origins` | Specifies a list of origins permitted to access this server. | `*` | | | `--allowed-origins` | Specifies a list of origins permitted to access this server (CORS). | `*` |
| `-v` | `--version` | version for toolbox | | | | `--allowed-hosts` | Specifies a list of hosts permitted to access this server to prevent DNS rebinding attacks. | `*` |
| | `--user-agent-metadata` | Appends additional metadata to the User-Agent. | |
| `-v` | `--version` | version for toolbox | |
## Examples ## Examples
@@ -49,6 +51,11 @@ description: >
# Server with prebuilt + custom tools configurations # Server with prebuilt + custom tools configurations
./toolbox --tools-file tools.yaml --prebuilt alloydb-postgres ./toolbox --tools-file tools.yaml --prebuilt alloydb-postgres
# Server with multiple prebuilt tools configurations
./toolbox --prebuilt alloydb-postgres,alloydb-postgres-admin
# OR
./toolbox --prebuilt alloydb-postgres --prebuilt alloydb-postgres-admin
``` ```
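As a hedged sketch (the host and origin values below are hypothetical), the two new access-control flags from the table above can be combined with any of the configurations shown here:

```bash
# Restrict accepted Host headers (DNS rebinding protection) and
# permitted CORS origins for a locally served tools file.
./toolbox --tools-file tools.yaml \
  --allowed-hosts "localhost:5000" \
  --allowed-origins "https://app.example.com"
```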
### Tool Configuration Sources ### Tool Configuration Sources
@@ -69,7 +76,7 @@ The CLI supports multiple mutually exclusive ways to specify tool configurations
**Prebuilt Configurations:** **Prebuilt Configurations:**
- `--prebuilt`: Use predefined configurations for specific database types (e.g., - `--prebuilt`: Use one or more predefined configurations for specific database types (e.g.,
'bigquery', 'postgres', 'spanner'). See [Prebuilt Tools 'bigquery', 'postgres', 'spanner'). See [Prebuilt Tools
Reference](prebuilt-tools.md) for allowed values. Reference](prebuilt-tools.md) for allowed values.

View File

@@ -16,6 +16,9 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
{{< notice tip >}} {{< notice tip >}}
You can now use `--prebuilt` alongside `--tools-file`, `--tools-files`, or You can now use `--prebuilt` alongside `--tools-file`, `--tools-files`, or
`--tools-folder` to combine prebuilt configs with custom tools. `--tools-folder` to combine prebuilt configs with custom tools.
You can also combine multiple prebuilt configs.
See [Usage Examples](../reference/cli.md#examples). See [Usage Examples](../reference/cli.md#examples).
{{< /notice >}} {{< /notice >}}
@@ -187,12 +190,14 @@ See [Usage Examples](../reference/cli.md#examples).
manage existing resources. manage existing resources.
* All `viewer` tools * All `viewer` tools
* `create_database` * `create_database`
* `create_backup`
* **Cloud SQL Admin** (`roles/cloudsql.admin`): Provides full control over * **Cloud SQL Admin** (`roles/cloudsql.admin`): Provides full control over
all resources. all resources.
* All `editor` and `viewer` tools * All `editor` and `viewer` tools
* `create_instance` * `create_instance`
* `create_user` * `create_user`
* `clone_instance` * `clone_instance`
* `restore_backup`
* **Tools:** * **Tools:**
* `create_instance`: Creates a new Cloud SQL for MySQL instance. * `create_instance`: Creates a new Cloud SQL for MySQL instance.
@@ -203,6 +208,8 @@ See [Usage Examples](../reference/cli.md#examples).
* `create_user`: Creates a new user in a Cloud SQL instance. * `create_user`: Creates a new user in a Cloud SQL instance.
* `wait_for_operation`: Waits for a Cloud SQL operation to complete. * `wait_for_operation`: Waits for a Cloud SQL operation to complete.
* `clone_instance`: Creates a clone for an existing Cloud SQL for MySQL instance. * `clone_instance`: Creates a clone for an existing Cloud SQL for MySQL instance.
* `create_backup`: Creates a backup on a Cloud SQL instance.
* `restore_backup`: Restores a backup of a Cloud SQL instance.
## Cloud SQL for PostgreSQL ## Cloud SQL for PostgreSQL
@@ -275,12 +282,14 @@ See [Usage Examples](../reference/cli.md#examples).
manage existing resources. manage existing resources.
* All `viewer` tools * All `viewer` tools
* `create_database` * `create_database`
* `create_backup`
* **Cloud SQL Admin** (`roles/cloudsql.admin`): Provides full control over * **Cloud SQL Admin** (`roles/cloudsql.admin`): Provides full control over
all resources. all resources.
* All `editor` and `viewer` tools * All `editor` and `viewer` tools
* `create_instance` * `create_instance`
* `create_user` * `create_user`
* `clone_instance` * `clone_instance`
* `restore_backup`
* **Tools:** * **Tools:**
* `create_instance`: Creates a new Cloud SQL for PostgreSQL instance. * `create_instance`: Creates a new Cloud SQL for PostgreSQL instance.
* `get_instance`: Gets information about a Cloud SQL instance. * `get_instance`: Gets information about a Cloud SQL instance.
@@ -290,6 +299,8 @@ See [Usage Examples](../reference/cli.md#examples).
* `create_user`: Creates a new user in a Cloud SQL instance. * `create_user`: Creates a new user in a Cloud SQL instance.
* `wait_for_operation`: Waits for a Cloud SQL operation to complete. * `wait_for_operation`: Waits for a Cloud SQL operation to complete.
* `clone_instance`: Creates a clone for an existing Cloud SQL for PostgreSQL instance. * `clone_instance`: Creates a clone for an existing Cloud SQL for PostgreSQL instance.
* `create_backup`: Creates a backup on a Cloud SQL instance.
* `restore_backup`: Restores a backup of a Cloud SQL instance.
## Cloud SQL for SQL Server ## Cloud SQL for SQL Server
@@ -336,12 +347,14 @@ See [Usage Examples](../reference/cli.md#examples).
manage existing resources. manage existing resources.
* All `viewer` tools * All `viewer` tools
* `create_database` * `create_database`
* `create_backup`
* **Cloud SQL Admin** (`roles/cloudsql.admin`): Provides full control over * **Cloud SQL Admin** (`roles/cloudsql.admin`): Provides full control over
all resources. all resources.
* All `editor` and `viewer` tools * All `editor` and `viewer` tools
* `create_instance` * `create_instance`
* `create_user` * `create_user`
* `clone_instance` * `clone_instance`
* `restore_backup`
* **Tools:** * **Tools:**
* `create_instance`: Creates a new Cloud SQL for SQL Server instance. * `create_instance`: Creates a new Cloud SQL for SQL Server instance.
* `get_instance`: Gets information about a Cloud SQL instance. * `get_instance`: Gets information about a Cloud SQL instance.
@@ -351,6 +364,8 @@ See [Usage Examples](../reference/cli.md#examples).
* `create_user`: Creates a new user in a Cloud SQL instance. * `create_user`: Creates a new user in a Cloud SQL instance.
* `wait_for_operation`: Waits for a Cloud SQL operation to complete. * `wait_for_operation`: Waits for a Cloud SQL operation to complete.
* `clone_instance`: Creates a clone for an existing Cloud SQL for SQL Server instance. * `clone_instance`: Creates a clone for an existing Cloud SQL for SQL Server instance.
* `create_backup`: Creates a backup on a Cloud SQL instance.
* `restore_backup`: Restores a backup of a Cloud SQL instance.
## Dataplex ## Dataplex

View File

@@ -28,17 +28,19 @@ The following configurations are placed at the top level of a `tools.yaml` file.
{{< notice tip >}} {{< notice tip >}}
If you are accessing Toolbox with multiple applications, each If you are accessing Toolbox with multiple applications, each
application should register their own Client ID even if they use the same application should register their own Client ID even if they use the same
"kind" of auth provider. "type" of auth provider.
{{< /notice >}} {{< /notice >}}
```yaml ```yaml
authServices: kind: authServices
my_auth_app_1: name: my_auth_app_1
kind: google type: google
clientId: ${YOUR_CLIENT_ID_1} clientId: ${YOUR_CLIENT_ID_1}
my_auth_app_2: ---
kind: google kind: authServices
clientId: ${YOUR_CLIENT_ID_2} name: my_auth_app_2
type: google
clientId: ${YOUR_CLIENT_ID_2}
``` ```
{{< notice tip >}} {{< notice tip >}}

View File

@@ -40,10 +40,10 @@ id-token][provided-claims] can be used for the parameter.
## Example ## Example
```yaml ```yaml
authServices: kind: authServices
my-google-auth: name: my-google-auth
kind: google type: google
clientId: ${YOUR_GOOGLE_CLIENT_ID} clientId: ${YOUR_GOOGLE_CLIENT_ID}
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -55,5 +55,5 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|------------------------------------------------------------------| |-----------|:--------:|:------------:|------------------------------------------------------------------|
| kind | string | true | Must be "google". | | type | string | true | Must be "google". |
| clientId | string | true | Client ID of your application from registering your application. | | clientId | string | true | Client ID of your application from registering your application. |

View File

@@ -3,13 +3,14 @@ title: "EmbeddingModels"
type: docs type: docs
weight: 2 weight: 2
description: > description: >
EmbeddingModels represent services that transform text into vector embeddings for semantic search. EmbeddingModels represent services that transform text into vector embeddings
for semantic search.
--- ---
EmbeddingModels represent services that generate vector representations of text EmbeddingModels represent services that generate vector representations of text
data. In the MCP Toolbox, these models enable **Semantic Queries**, data. In the MCP Toolbox, these models enable **Semantic Queries**, allowing
allowing [Tools](../tools/) to automatically convert human-readable text into [Tools](../tools/) to automatically convert human-readable text into numerical
numerical vectors before using them in a query. vectors before using them in a query.
This is primarily used in two scenarios: This is primarily used in two scenarios:
@@ -19,14 +20,33 @@ This is primarily used in two scenarios:
- **Semantic Search**: Converting a natural language query into a vector to - **Semantic Search**: Converting a natural language query into a vector to
perform similarity searches. perform similarity searches.
## Hidden Parameter Duplication (valueFromParam)
When building tools for vector ingestion, you often need the same input string
twice:
1. To store the original text in a TEXT column.
1. To generate the vector embedding for a VECTOR column.
Requesting an Agent (LLM) to output the exact same string twice is inefficient
and error-prone. The `valueFromParam` field solves this by allowing a parameter
to inherit its value from another parameter in the same tool.
### Key Behaviors
1. **Hidden from the manifest**: Parameters with `valueFromParam` set are excluded from
the tool definition sent to the Agent. The Agent does not know this parameter
exists.
1. **Auto-filled**: When the tool is executed, the Toolbox automatically copies the
value from the referenced parameter before processing embeddings.
## Example ## Example
The following configuration defines an embedding model and applies it to The following configuration defines an embedding model and applies it to
specific tool parameters. specific tool parameters.
{{< notice tip >}} {{< notice tip >}} Use environment variable replacement with the format
Use environment variable replacement with the format ${ENV_NAME} ${ENV_NAME} instead of hardcoding your API keys into the configuration file.
instead of hardcoding your API keys into the configuration file.
{{< /notice >}} {{< /notice >}}
### Step 1 - Define an Embedding Model ### Step 1 - Define an Embedding Model
@@ -34,51 +54,53 @@ instead of hardcoding your API keys into the configuration file.
Define an embedding model in the `embeddingModels` section: Define an embedding model in the `embeddingModels` section:
```yaml ```yaml
embeddingModels: kind: embeddingModels
gemini-model: # Name of the embedding model name: gemini-model # Name of the embedding model
kind: gemini type: gemini
model: gemini-embedding-001 model: gemini-embedding-001
apiKey: ${GOOGLE_API_KEY} apiKey: ${GOOGLE_API_KEY}
dimension: 768 dimension: 768
``` ```
### Step 2 - Embed Tool Parameters ### Step 2 - Embed Tool Parameters
Using the defined embedding model, embed your query parameters via the Using the defined embedding model, embed your query parameters via the
`embeddedBy` field. Only string-typed `embeddedBy` field. Only string-typed parameters can be embedded:
parameters can be embedded:
```yaml ```yaml
tools: # Vector ingestion tool
# Vector ingestion tool kind: tools
insert_embedding: name: insert_embedding
kind: postgres-sql type: postgres-sql
source: my-pg-instance source: my-pg-instance
statement: | statement: |
INSERT INTO documents (content, embedding) INSERT INTO documents (content, embedding)
VALUES ($1, $2); VALUES ($1, $2);
parameters: parameters:
- name: content - name: content
type: string type: string
- name: vector_string description: The raw text content to be stored in the database.
type: string - name: vector_string
description: The text to be vectorized and stored. type: string
embeddedBy: gemini-model # refers to the name of a defined embedding model # This parameter is hidden from the LLM.
# It automatically copies the value from 'content' and embeds it.
# Semantic search tool valueFromParam: content
search_embedding: embeddedBy: gemini-model
kind: postgres-sql ---
source: my-pg-instance # Semantic search tool
statement: | kind: tools
SELECT id, content, embedding <-> $1 AS distance name: search_embedding
FROM documents type: postgres-sql
ORDER BY distance LIMIT 1 source: my-pg-instance
parameters: statement: |
- name: semantic_search_string SELECT id, content, embedding <-> $1 AS distance
type: string FROM documents
description: The search query that will be converted to a vector. ORDER BY distance LIMIT 1
embeddedBy: gemini-model # refers to the name of a defined embedding model parameters:
- name: semantic_search_string
type: string
description: The search query that will be converted to a vector.
embeddedBy: gemini-model # refers to the name of a defined embedding model
``` ```
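To make the hidden-parameter behavior concrete, here is a small sketch of calling the `insert_embedding` tool above with the `toolbox-core` Python client (assuming Toolbox is running locally with this configuration); only `content` is supplied, because `vector_string` is auto-filled via `valueFromParam`:

```python
from toolbox_core import ToolboxSyncClient

# Connect to the locally running Toolbox server.
client = ToolboxSyncClient("http://127.0.0.1:5000")

insert_embedding = client.load_tool("insert_embedding")

# Only `content` is visible to the caller (and to the LLM); `vector_string`
# is hidden, copied from `content`, and embedded by `gemini-model` server-side.
insert_embedding(content="Toolbox supports semantic search over documents.")
```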
## Kinds of Embedding Models ## Kinds of Embedding Models

View File

@@ -50,12 +50,12 @@ information.
## Example ## Example
```yaml ```yaml
embeddingModels: kind: embeddingModels
gemini-model: name: gemini-model
kind: gemini type: gemini
model: gemini-embedding-001 model: gemini-embedding-001
apiKey: ${GOOGLE_API_KEY} apiKey: ${GOOGLE_API_KEY}
dimension: 768 dimension: 768
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -67,7 +67,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------| |-----------|:--------:|:------------:|--------------------------------------------------------------|
| kind | string | true | Must be `gemini`. | | type | string | true | Must be `gemini`. |
| model | string | true | The Gemini model ID to use (e.g., `gemini-embedding-001`). | | model | string | true | The Gemini model ID to use (e.g., `gemini-embedding-001`). |
| apiKey | string | false | Your API Key from Google AI Studio. | | apiKey | string | false | Your API Key from Google AI Studio. |
| dimension | integer | false | The number of dimensions in the output vector (e.g., `768`). | | dimension | integer | false | The number of dimensions in the output vector (e.g., `768`). |

View File

@@ -16,14 +16,14 @@ can be sent to a Large Language Model (LLM). The Toolbox server implements the
specification, allowing clients to discover and retrieve these prompts. specification, allowing clients to discover and retrieve these prompts.
```yaml ```yaml
prompts: kind: prompts
code_review: name: code_review
description: "Asks the LLM to analyze code quality and suggest improvements." description: "Asks the LLM to analyze code quality and suggest improvements."
messages: messages:
- content: "Please review the following code for quality, correctness, and potential improvements: \n\n{{.code}}" - content: "Please review the following code for quality, correctness, and potential improvements: \n\n{{.code}}"
arguments: arguments:
- name: "code" - name: "code"
description: "The code to review" description: "The code to review"
``` ```
## Prompt Schema ## Prompt Schema
@@ -31,7 +31,7 @@ prompts:
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|-------------|--------------------------------|--------------|--------------------------------------------------------------------------| |-------------|--------------------------------|--------------|--------------------------------------------------------------------------|
| description | string | No | A brief explanation of what the prompt does. | | description | string | No | A brief explanation of what the prompt does. |
| kind | string | No | The kind of prompt. Defaults to `"custom"`. | | type | string | No | The type of prompt. Defaults to `"custom"`. |
| messages | [][Message](#message-schema) | Yes | A list of one or more message objects that make up the prompt's content. | | messages | [][Message](#message-schema) | Yes | A list of one or more message objects that make up the prompt's content. |
| arguments | [][Argument](#argument-schema) | No | A list of arguments that can be interpolated into the prompt's content. | | arguments | [][Argument](#argument-schema) | No | A list of arguments that can be interpolated into the prompt's content. |

View File

@@ -17,14 +17,14 @@ Here is an example of a simple prompt that takes a single argument, code, and
asks an LLM to review it. asks an LLM to review it.
```yaml ```yaml
prompts: kind: prompts
code_review: name: code_review
description: "Asks the LLM to analyze code quality and suggest improvements." description: "Asks the LLM to analyze code quality and suggest improvements."
messages: messages:
- content: "Please review the following code for quality, correctness, and potential improvements: \n\n{{.code}}" - content: "Please review the following code for quality, correctness, and potential improvements: \n\n{{.code}}"
arguments: arguments:
- name: "code" - name: "code"
description: "The code to review" description: "The code to review"
``` ```
### Multi-message prompt ### Multi-message prompt
@@ -33,19 +33,19 @@ You can define prompts with multiple messages to set up more complex
conversational contexts, like a role-playing scenario. conversational contexts, like a role-playing scenario.
```yaml ```yaml
prompts: kind: prompts
roleplay_scenario: name: roleplay_scenario
description: "Sets up a roleplaying scenario with initial messages." description: "Sets up a roleplaying scenario with initial messages."
arguments: arguments:
- name: "character" - name: "character"
description: "The character the AI should embody." description: "The character the AI should embody."
- name: "situation" - name: "situation"
description: "The initial situation for the roleplay." description: "The initial situation for the roleplay."
messages: messages:
- role: "user" - role: "user"
content: "Let's roleplay. You are {{.character}}. The situation is: {{.situation}}" content: "Let's roleplay. You are {{.character}}. The situation is: {{.situation}}"
- role: "assistant" - role: "assistant"
content: "Okay, I understand. I am ready. What happens next?" content: "Okay, I understand. I am ready. What happens next?"
``` ```
## Reference ## Reference
@@ -54,7 +54,7 @@ prompts:
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|-------------|--------------------------------|--------------|--------------------------------------------------------------------------| |-------------|--------------------------------|--------------|--------------------------------------------------------------------------|
| kind | string | No | The kind of prompt. Must be `"custom"`. | | type | string | No | The type of prompt. Must be `"custom"`. |
| description | string | No | A brief explanation of what the prompt does. | | description | string | No | A brief explanation of what the prompt does. |
| messages | [][Message](#message-schema) | Yes | A list of one or more message objects that make up the prompt's content. | | messages | [][Message](#message-schema) | Yes | A list of one or more message objects that make up the prompt's content. |
| arguments | [][Argument](#argument-schema) | No | A list of arguments that can be interpolated into the prompt's content. | | arguments | [][Argument](#argument-schema) | No | A list of arguments that can be interpolated into the prompt's content. |

View File

@@ -17,15 +17,15 @@ instead of hardcoding your secrets into the configuration file.
{{< /notice >}} {{< /notice >}}
```yaml ```yaml
sources: kind: sources
my-cloud-sql-source: name: my-cloud-sql-source
kind: cloud-sql-postgres type: cloud-sql-postgres
project: my-project-id project: my-project-id
region: us-central1 region: us-central1
instance: my-instance-name instance: my-instance-name
database: my_db database: my_db
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} password: ${PASSWORD}
``` ```
In implementation, each source is a different connection pool or client that used In implementation, each source is a different connection pool or client that used

View File

@@ -25,19 +25,20 @@ Authentication can be handled in two ways:
## Example ## Example
```yaml ```yaml
sources: kind: sources
my-alloydb-admin: name: my-alloydb-admin
kind: alloy-admin type: alloydb-admin
---
my-oauth-alloydb-admin: kind: sources
kind: alloydb-admin name: my-oauth-alloydb-admin
useClientOAuth: true type: alloydb-admin
useClientOAuth: true
``` ```
## Reference ## Reference
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
| -------------- | :------: | :----------: | ---------------------------------------------------------------------------------------------------------------------------------------------- | | -------------- | :------: | :----------: | ---------------------------------------------------------------------------------------------------------------------------------------------- |
| kind | string | true | Must be "alloydb-admin". | | type | string | true | Must be "alloydb-admin". |
| defaultProject | string | false | The Google Cloud project ID to use for AlloyDB infrastructure tools. | | defaultProject | string | false | The Google Cloud project ID to use for AlloyDB infrastructure tools. |
| useClientOAuth | boolean | false | If true, the source will use client-side OAuth for authorization. Otherwise, it will use Application Default Credentials. Defaults to `false`. | | useClientOAuth | boolean | false | If true, the source will use client-side OAuth for authorization. Otherwise, it will use Application Default Credentials. Defaults to `false`. |

View File

@@ -176,17 +176,17 @@ To connect using IAM authentication:
## Example ## Example
```yaml ```yaml
sources: kind: sources
my-alloydb-pg-source: name: my-alloydb-pg-source
kind: alloydb-postgres type: alloydb-postgres
project: my-project-id project: my-project-id
region: us-central1 region: us-central1
cluster: my-cluster cluster: my-cluster
instance: my-instance instance: my-instance
database: my_db database: my_db
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} password: ${PASSWORD}
# ipType: "public" # ipType: "public"
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -198,7 +198,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------| |-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "alloydb-postgres". | | type | string | true | Must be "alloydb-postgres". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). | | project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). | | region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| cluster | string | true | Name of the AlloyDB cluster (e.g. "my-cluster"). | | cluster | string | true | Name of the AlloyDB cluster (e.g. "my-cluster"). |

View File

@@ -121,45 +121,47 @@ identity used has been granted the correct IAM permissions.
Initialize a BigQuery source that uses ADC: Initialize a BigQuery source that uses ADC:
```yaml ```yaml
sources: kind: sources
my-bigquery-source: name: my-bigquery-source
kind: "bigquery" type: "bigquery"
project: "my-project-id" project: "my-project-id"
# location: "US" # Optional: Specifies the location for query jobs. # location: "US" # Optional: Specifies the location for query jobs.
# writeMode: "allowed" # One of: allowed, blocked, protected. Defaults to "allowed". # writeMode: "allowed" # One of: allowed, blocked, protected. Defaults to "allowed".
# allowedDatasets: # Optional: Restricts tool access to a specific list of datasets. # allowedDatasets: # Optional: Restricts tool access to a specific list of datasets.
# - "my_dataset_1" # - "my_dataset_1"
# - "other_project.my_dataset_2" # - "other_project.my_dataset_2"
# impersonateServiceAccount: "service-account@project-id.iam.gserviceaccount.com" # Optional: Service account to impersonate # impersonateServiceAccount: "service-account@project-id.iam.gserviceaccount.com" # Optional: Service account to impersonate
# scopes: # Optional: List of OAuth scopes to request. # scopes: # Optional: List of OAuth scopes to request.
# - "https://www.googleapis.com/auth/bigquery" # - "https://www.googleapis.com/auth/bigquery"
# - "https://www.googleapis.com/auth/drive.readonly" # - "https://www.googleapis.com/auth/drive.readonly"
# maxQueryResultRows: 50 # Optional: Limits the number of rows returned by queries. Defaults to 50.
``` ```
Initialize a BigQuery source that uses the client's access token: Initialize a BigQuery source that uses the client's access token:
```yaml ```yaml
sources: kind: sources
my-bigquery-client-auth-source: name: my-bigquery-client-auth-source
kind: "bigquery" type: "bigquery"
project: "my-project-id" project: "my-project-id"
useClientOAuth: true useClientOAuth: true
# location: "US" # Optional: Specifies the location for query jobs. # location: "US" # Optional: Specifies the location for query jobs.
# writeMode: "allowed" # One of: allowed, blocked, protected. Defaults to "allowed". # writeMode: "allowed" # One of: allowed, blocked, protected. Defaults to "allowed".
# allowedDatasets: # Optional: Restricts tool access to a specific list of datasets. # allowedDatasets: # Optional: Restricts tool access to a specific list of datasets.
# - "my_dataset_1" # - "my_dataset_1"
# - "other_project.my_dataset_2" # - "other_project.my_dataset_2"
# impersonateServiceAccount: "service-account@project-id.iam.gserviceaccount.com" # Optional: Service account to impersonate # impersonateServiceAccount: "service-account@project-id.iam.gserviceaccount.com" # Optional: Service account to impersonate
# scopes: # Optional: List of OAuth scopes to request. # scopes: # Optional: List of OAuth scopes to request.
# - "https://www.googleapis.com/auth/bigquery" # - "https://www.googleapis.com/auth/bigquery"
# - "https://www.googleapis.com/auth/drive.readonly" # - "https://www.googleapis.com/auth/drive.readonly"
# maxQueryResultRows: 50 # Optional: Limits the number of rows returned by queries. Defaults to 50.
``` ```
## Reference ## Reference
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|---------------------------|:--------:|:------------:|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| |---------------------------|:--------:|:------------:|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery". | | type | string | true | Must be "bigquery". |
| project | string | true | Id of the Google Cloud project to use for billing and as the default project for BigQuery resources. | | project | string | true | Id of the Google Cloud project to use for billing and as the default project for BigQuery resources. |
| location | string | false | Specifies the location (e.g., 'us', 'asia-northeast1') in which to run the query job. This location must match the location of any tables referenced in the query. Defaults to the table's location or 'US' if the location cannot be determined. [Learn More](https://cloud.google.com/bigquery/docs/locations) | | location | string | false | Specifies the location (e.g., 'us', 'asia-northeast1') in which to run the query job. This location must match the location of any tables referenced in the query. Defaults to the table's location or 'US' if the location cannot be determined. [Learn More](https://cloud.google.com/bigquery/docs/locations) |
| writeMode | string | false | Controls the write behavior for tools. `allowed` (default): All queries are permitted. `blocked`: Only `SELECT` statements are allowed for the `bigquery-execute-sql` tool. `protected`: Enables session-based execution where all tools associated with this source instance share the same [BigQuery session](https://cloud.google.com/bigquery/docs/sessions-intro). This allows for stateful operations using temporary tables (e.g., `CREATE TEMP TABLE`). For `bigquery-execute-sql`, `SELECT` statements can be used on all tables, but write operations are restricted to the session's temporary dataset. For tools like `bigquery-sql`, `bigquery-forecast`, and `bigquery-analyze-contribution`, the `writeMode` restrictions do not apply, but they will operate within the shared session. **Note:** The `protected` mode cannot be used with `useClientOAuth: true`. It is also not recommended for multi-user server environments, as all users would share the same session. A session is terminated automatically after 24 hours of inactivity or after 7 days, whichever comes first. A new session is created on the next request, and any temporary data from the previous session will be lost. | | writeMode | string | false | Controls the write behavior for tools. `allowed` (default): All queries are permitted. `blocked`: Only `SELECT` statements are allowed for the `bigquery-execute-sql` tool. `protected`: Enables session-based execution where all tools associated with this source instance share the same [BigQuery session](https://cloud.google.com/bigquery/docs/sessions-intro). This allows for stateful operations using temporary tables (e.g., `CREATE TEMP TABLE`). For `bigquery-execute-sql`, `SELECT` statements can be used on all tables, but write operations are restricted to the session's temporary dataset. For tools like `bigquery-sql`, `bigquery-forecast`, and `bigquery-analyze-contribution`, the `writeMode` restrictions do not apply, but they will operate within the shared session. **Note:** The `protected` mode cannot be used with `useClientOAuth: true`. It is also not recommended for multi-user server environments, as all users would share the same session. A session is terminated automatically after 24 hours of inactivity or after 7 days, whichever comes first. A new session is created on the next request, and any temporary data from the previous session will be lost. |
@@ -167,3 +169,4 @@ sources:
| useClientOAuth | bool | false | If true, forwards the client's OAuth access token from the "Authorization" header to downstream queries. **Note:** This cannot be used with `writeMode: protected`. | | useClientOAuth | bool | false | If true, forwards the client's OAuth access token from the "Authorization" header to downstream queries. **Note:** This cannot be used with `writeMode: protected`. |
| scopes | []string | false | A list of OAuth 2.0 scopes to use for the credentials. If not provided, default scopes are used. | | scopes | []string | false | A list of OAuth 2.0 scopes to use for the credentials. If not provided, default scopes are used. |
| impersonateServiceAccount | string | false | Service account email to impersonate when making BigQuery and Dataplex API calls. The authenticated principal must have the `roles/iam.serviceAccountTokenCreator` role on the target service account. [Learn More](https://cloud.google.com/iam/docs/service-account-impersonation) | | impersonateServiceAccount | string | false | Service account email to impersonate when making BigQuery and Dataplex API calls. The authenticated principal must have the `roles/iam.serviceAccountTokenCreator` role on the target service account. [Learn More](https://cloud.google.com/iam/docs/service-account-impersonation) |
| maxQueryResultRows | int | false | The maximum number of rows to return from a query. Defaults to 50. |
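To make the `writeMode` options above concrete, here is a minimal v2-format sketch that opts into the session-based `protected` mode; the source name and project ID are placeholders, and the comments only restate the behaviour described in the reference table:

```yaml
kind: sources
name: my-bigquery-protected-source   # hypothetical name
type: "bigquery"
project: "my-project-id"             # placeholder project ID
writeMode: "protected"               # all tools on this source share one BigQuery session;
                                     # writes are confined to the session's temporary dataset
# useClientOAuth is left at its default (false): protected mode cannot be combined with it
```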


@@ -59,17 +59,17 @@ applying IAM permissions and roles to an identity.
## Example
```yaml
-sources:
-  my-bigtable-source:
-    kind: "bigtable"
+kind: sources
+name: my-bigtable-source
+type: "bigtable"
project: "my-project-id"
instance: "test-instance"
```
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-----------------|
-| kind      | string   |     true     | Must be "bigtable". |
+| type      | string   |     true     | Must be "bigtable". |
| project   | string   |     true     | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| instance  | string   |     true     | Name of the Bigtable instance. |


@@ -23,19 +23,19 @@ distributed architectures, and a flexible approach to schema definition.
## Example
```yaml
-sources:
-  my-cassandra-source:
-    kind: cassandra
+kind: sources
+name: my-cassandra-source
+type: cassandra
hosts:
- 127.0.0.1
keyspace: my_keyspace
protoVersion: 4
username: ${USER_NAME}
password: ${PASSWORD}
caPath: /path/to/ca.crt # Optional: path to CA certificate
certPath: /path/to/client.crt # Optional: path to client certificate
keyPath: /path/to/client.key # Optional: path to client key
enableHostVerification: true # Optional: enable host verification
```
{{< notice tip >}}
@@ -47,7 +47,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|------------------------|:--------:|:------------:|-----------------|
-| kind                   | string   |     true     | Must be "cassandra". |
+| type                   | string   |     true     | Must be "cassandra". |
| hosts | string[] | true | List of IP addresses to connect to (e.g., ["192.168.1.1:9042", "192.168.1.2:9042","192.168.1.3:9042"]). The default port is 9042 if not specified. |
| keyspace | string | true | Name of the Cassandra keyspace to connect to (e.g., "my_keyspace"). |
| protoVersion | integer | false | Protocol version for the Cassandra connection (e.g., 4). |


@@ -46,31 +46,31 @@ ClickHouse supports multiple protocols:
### Secure Connection Example ### Secure Connection Example
```yaml ```yaml
sources: kind: sources
secure-clickhouse-source: name: secure-clickhouse-source
kind: clickhouse type: clickhouse
host: clickhouse.example.com host: clickhouse.example.com
port: "8443" port: "8443"
database: analytics database: analytics
user: ${CLICKHOUSE_USER} user: ${CLICKHOUSE_USER}
password: ${CLICKHOUSE_PASSWORD} password: ${CLICKHOUSE_PASSWORD}
protocol: https protocol: https
secure: true secure: true
``` ```
### HTTP Protocol Example ### HTTP Protocol Example
```yaml ```yaml
sources: kind: sources
http-clickhouse-source: name: http-clickhouse-source
kind: clickhouse type: clickhouse
host: localhost host: localhost
port: "8123" port: "8123"
database: logs database: logs
user: ${CLICKHOUSE_USER} user: ${CLICKHOUSE_USER}
password: ${CLICKHOUSE_PASSWORD} password: ${CLICKHOUSE_PASSWORD}
protocol: http protocol: http
secure: false secure: false
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -82,7 +82,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-------------------------------------------------------------------------------------| |-----------|:--------:|:------------:|-------------------------------------------------------------------------------------|
| kind | string | true | Must be "clickhouse". | | type | string | true | Must be "clickhouse". |
| host | string | true | IP address or hostname to connect to (e.g. "127.0.0.1" or "clickhouse.example.com") | | host | string | true | IP address or hostname to connect to (e.g. "127.0.0.1" or "clickhouse.example.com") |
| port | string | true | Port to connect to (e.g. "8443" for HTTPS, "8123" for HTTP) | | port | string | true | Port to connect to (e.g. "8443" for HTTPS, "8123" for HTTP) |
| database | string | true | Name of the ClickHouse database to connect to (e.g. "my_database"). | | database | string | true | Name of the ClickHouse database to connect to (e.g. "my_database"). |


@@ -20,21 +20,22 @@ Authentication can be handled in two ways:
## Example
```yaml
-sources:
-  my-gda-source:
-    kind: cloud-gemini-data-analytics
-    projectId: my-project-id
-  my-oauth-gda-source:
-    kind: cloud-gemini-data-analytics
-    projectId: my-project-id
-    useClientOAuth: true
+kind: sources
+name: my-gda-source
+type: cloud-gemini-data-analytics
+projectId: my-project-id
+---
+kind: sources
+name: my-oauth-gda-source
+type: cloud-gemini-data-analytics
+projectId: my-project-id
+useClientOAuth: true
```
## Reference
| **field** | **type** | **required** | **description** |
| -------------- | :------: | :----------: | --------------- |
-| kind           | string   |     true     | Must be "cloud-gemini-data-analytics". |
+| type           | string   |     true     | Must be "cloud-gemini-data-analytics". |
| projectId      | string   |     true     | The Google Cloud Project ID where the API is enabled. |
| useClientOAuth | boolean  |    false     | If true, the source uses the token provided by the caller (forwarded to the API). Otherwise, it uses server-side Application Default Credentials (ADC). Defaults to `false`. |


@@ -123,41 +123,41 @@ identity used has been granted the correct IAM permissions.
Initialize a Cloud Healthcare API source that uses ADC:
```yaml
-sources:
-  my-healthcare-source:
-    kind: "cloud-healthcare"
+kind: sources
+name: my-healthcare-source
+type: "cloud-healthcare"
project: "my-project-id"
region: "us-central1"
dataset: "my-healthcare-dataset-id"
# allowedFhirStores: # Optional: Restricts tool access to a specific list of FHIR store IDs.
# - "my_fhir_store_1"
# allowedDicomStores: # Optional: Restricts tool access to a specific list of DICOM store IDs.
# - "my_dicom_store_1"
# - "my_dicom_store_2"
```
Initialize a Cloud Healthcare API source that uses the client's access token:
```yaml
-sources:
-  my-healthcare-client-auth-source:
-    kind: "cloud-healthcare"
+kind: sources
+name: my-healthcare-client-auth-source
+type: "cloud-healthcare"
project: "my-project-id"
region: "us-central1"
dataset: "my-healthcare-dataset-id"
useClientOAuth: true
# allowedFhirStores: # Optional: Restricts tool access to a specific list of FHIR store IDs.
# - "my_fhir_store_1"
# allowedDicomStores: # Optional: Restricts tool access to a specific list of DICOM store IDs.
# - "my_dicom_store_1"
# - "my_dicom_store_2"
```
## Reference
| **field** | **type** | **required** | **description** |
|--------------------|:--------:|:------------:|-----------------|
-| kind               | string   |     true     | Must be "cloud-healthcare". |
+| type               | string   |     true     | Must be "cloud-healthcare". |
| project | string | true | ID of the GCP project that the dataset lives in. |
| region | string | true | Specifies the region (e.g., 'us', 'asia-northeast1') of the healthcare dataset. [Learn More](https://cloud.google.com/healthcare-api/docs/regions) |
| dataset | string | true | ID of the healthcare dataset. |
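The commented-out fields in the examples above can be enabled to scope a source to specific stores. A minimal sketch, where the store IDs are placeholders:

```yaml
kind: sources
name: my-scoped-healthcare-source   # hypothetical name
type: "cloud-healthcare"
project: "my-project-id"
region: "us-central1"
dataset: "my-healthcare-dataset-id"
allowedFhirStores:                  # tools can only reach the FHIR stores listed here
  - "my_fhir_store_1"
allowedDicomStores:                 # tools can only reach the DICOM stores listed here
  - "my_dicom_store_1"
```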


@@ -25,18 +25,19 @@ Authentication can be handled in two ways:
## Example
```yaml
-sources:
-  my-cloud-monitoring:
-    kind: cloud-monitoring
-  my-oauth-cloud-monitoring:
-    kind: cloud-monitoring
-    useClientOAuth: true
+kind: sources
+name: my-cloud-monitoring
+type: cloud-monitoring
+---
+kind: sources
+name: my-oauth-cloud-monitoring
+type: cloud-monitoring
+useClientOAuth: true
```
## Reference
| **field** | **type** | **required** | **description** |
|----------------|:--------:|:------------:|-----------------|
-| kind           | string   |     true     | Must be "cloud-monitoring". |
+| type           | string   |     true     | Must be "cloud-monitoring". |
| useClientOAuth | boolean  |    false     | If true, the source will use client-side OAuth for authorization. Otherwise, it will use Application Default Credentials. Defaults to `false`. |


@@ -24,19 +24,20 @@ Authentication can be handled in two ways:
## Example
```yaml
-sources:
-  my-cloud-sql-admin:
-    kind: cloud-sql-admin
-  my-oauth-cloud-sql-admin:
-    kind: cloud-sql-admin
-    useClientOAuth: true
+kind: sources
+name: my-cloud-sql-admin
+type: cloud-sql-admin
+---
+kind: sources
+name: my-oauth-cloud-sql-admin
+type: cloud-sql-admin
+useClientOAuth: true
```
## Reference
| **field** | **type** | **required** | **description** |
| -------------- | :------: | :----------: | --------------- |
-| kind           | string   |     true     | Must be "cloud-sql-admin". |
+| type           | string   |     true     | Must be "cloud-sql-admin". |
| defaultProject | string   |    false     | The Google Cloud project ID to use for Cloud SQL infrastructure tools. |
| useClientOAuth | boolean  |    false     | If true, the source will use client-side OAuth for authorization. Otherwise, it will use Application Default Credentials. Defaults to `false`. |
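The optional `defaultProject` field from the table is not shown in the example above; a minimal sketch with a placeholder project ID:

```yaml
kind: sources
name: my-cloud-sql-admin-with-default   # hypothetical name
type: cloud-sql-admin
defaultProject: my-project-id           # default Google Cloud project ID for Cloud SQL infrastructure tools
```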


@@ -87,16 +87,16 @@ Currently, this source only uses standard authentication. You will need to
## Example ## Example
```yaml ```yaml
sources: kind: sources
my-cloud-sql-mssql-instance: name: my-cloud-sql-mssql-instance
kind: cloud-sql-mssql type: cloud-sql-mssql
project: my-project project: my-project
region: my-region region: my-region
instance: my-instance instance: my-instance
database: my_db database: my_db
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} password: ${PASSWORD}
# ipType: private # ipType: private
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -108,7 +108,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|------------------------------------------------------------------------------------------------------| |-----------|:--------:|:------------:|------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cloud-sql-mssql". | | type | string | true | Must be "cloud-sql-mssql". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). | | project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). | | region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). | | instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |


@@ -128,16 +128,16 @@ To connect using IAM authentication:
## Example ## Example
```yaml ```yaml
sources: kind: sources
my-cloud-sql-mysql-source: name: my-cloud-sql-mysql-source
kind: cloud-sql-mysql type: cloud-sql-mysql
project: my-project-id project: my-project-id
region: us-central1 region: us-central1
instance: my-instance instance: my-instance
database: my_db database: my_db
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} password: ${PASSWORD}
# ipType: "private" # ipType: "private"
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -149,7 +149,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|------------------------------------------------------------------------------------------------------| |-----------|:--------:|:------------:|------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cloud-sql-mysql". | | type | string | true | Must be "cloud-sql-mysql". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). | | project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). | | region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). | | instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |


@@ -178,16 +178,16 @@ To connect using IAM authentication:
## Example ## Example
```yaml ```yaml
sources: kind: sources
my-cloud-sql-pg-source: name: my-cloud-sql-pg-source
kind: cloud-sql-postgres type: cloud-sql-postgres
project: my-project-id project: my-project-id
region: us-central1 region: us-central1
instance: my-instance instance: my-instance
database: my_db database: my_db
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} password: ${PASSWORD}
# ipType: "private" # ipType: "private"
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -199,7 +199,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------| |-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cloud-sql-postgres". | | type | string | true | Must be "cloud-sql-postgres". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). | | project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). | | region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). | | instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |


@@ -19,14 +19,14 @@ allowing tools to execute SQL queries against it.
## Example
```yaml
-sources:
-  my-couchbase-instance:
-    kind: couchbase
+kind: sources
+name: my-couchbase-instance
+type: couchbase
connectionString: couchbase://localhost
bucket: travel-sample
scope: inventory
username: Administrator
password: password
```
{{< notice note >}}
@@ -38,7 +38,7 @@ Connections](https://docs.couchbase.com/java-sdk/current/howtos/managing-connect
| **field** | **type** | **required** | **description** |
|----------------------|:--------:|:------------:|-----------------|
-| kind                 | string   |     true     | Must be "couchbase". |
+| type                 | string   |     true     | Must be "couchbase". |
| connectionString | string | true | Connection string for the Couchbase cluster. |
| bucket | string | true | Name of the bucket to connect to. |
| scope | string | true | Name of the scope within the bucket. |


@@ -23,10 +23,10 @@ applying artificial intelligence and machine learning.
## Example
```yaml
-sources:
-  my-dataplex-source:
-    kind: "dataplex"
+kind: sources
+name: my-dataplex-source
+type: "dataplex"
project: "my-project-id"
```
## Sample System Prompt
@@ -355,5 +355,5 @@ This abbreviated syntax works for the qualified predicates except for `label` in
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-----------------|
-| kind      | string   |     true     | Must be "dataplex". |
+| type      | string   |     true     | Must be "dataplex". |
| project   | string   |     true     | ID of the GCP project used for quota and billing purposes (e.g. "my-project-id"). |


@@ -7,6 +7,17 @@ description: >
--- ---
{{< notice note >}}
**⚠️ Best Effort Maintenance**
This integration is maintained on a best-effort basis by the project
team/community. While we strive to address issues and provide workarounds when
resources are available, there are no guaranteed response times or code fixes.
The automated integration tests for this module are currently non-functional or
failing.
{{< /notice >}}
## About ## About
[Dgraph][dgraph-docs] is an open-source graph database. It is designed for [Dgraph][dgraph-docs] is an open-source graph database. It is designed for
@@ -40,14 +51,14 @@ and user credentials for that namespace.
## Example ## Example
```yaml ```yaml
sources: kind: sources
my-dgraph-source: name: my-dgraph-source
kind: dgraph type: dgraph
dgraphUrl: https://xxxx.cloud.dgraph.io dgraphUrl: https://xxxx.cloud.dgraph.io
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} password: ${PASSWORD}
apiKey: ${API_KEY} apiKey: ${API_KEY}
namespace : 0 namespace : 0
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -59,7 +70,7 @@ instead of hardcoding your secrets into the configuration file.
| **Field** | **Type** | **Required** | **Description** | | **Field** | **Type** | **Required** | **Description** |
|-------------|:--------:|:------------:|--------------------------------------------------------------------------------------------------| |-------------|:--------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "dgraph". | | type | string | true | Must be "dgraph". |
| dgraphUrl | string | true | Connection URI (e.g. "<https://xxx.cloud.dgraph.io>", "<https://localhost:8080>"). | | dgraphUrl | string | true | Connection URI (e.g. "<https://xxx.cloud.dgraph.io>", "<https://localhost:8080>"). |
| user | string | false | Name of the Dgraph user to connect as (e.g., "groot"). | | user | string | false | Name of the Dgraph user to connect as (e.g., "groot"). |
| password | string | false | Password of the Dgraph user (e.g., "password"). | | password | string | false | Password of the Dgraph user (e.g., "password"). |


@@ -59,18 +59,18 @@ applying permissions to an API key.
## Example
```yaml
-sources:
-  my-elasticsearch-source:
-    kind: "elasticsearch"
+kind: sources
+name: my-elasticsearch-source
+type: "elasticsearch"
addresses:
- "http://localhost:9200"
apikey: "my-api-key"
```
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-----------------|
-| kind      | string   |     true     | Must be "elasticsearch". |
+| type      | string   |     true     | Must be "elasticsearch". |
| addresses | []string |     true     | List of Elasticsearch hosts to connect to. |
| apikey    | string   |     true     | The API key to use for authentication. |


@@ -36,14 +36,14 @@ user][fb-users] to login to the database with.
## Example ## Example
```yaml ```yaml
sources: kind: sources
my_firebird_db: name: my_firebird_db
kind: firebird type: firebird
host: "localhost" host: "localhost"
port: 3050 port: 3050
database: "/path/to/your/database.fdb" database: "/path/to/your/database.fdb"
user: ${FIREBIRD_USER} user: ${FIREBIRD_USER}
password: ${FIREBIRD_PASS} password: ${FIREBIRD_PASS}
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -55,7 +55,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|------------------------------------------------------------------------------| |-----------|:--------:|:------------:|------------------------------------------------------------------------------|
| kind | string | true | Must be "firebird". | | type | string | true | Must be "firebird". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1") | | host | string | true | IP address to connect to (e.g. "127.0.0.1") |
| port | string | true | Port to connect to (e.g. "3050") | | port | string | true | Port to connect to (e.g. "3050") |
| database | string | true | Path to the Firebird database file (e.g. "/var/lib/firebird/data/test.fdb"). | | database | string | true | Path to the Firebird database file (e.g. "/var/lib/firebird/data/test.fdb"). |


@@ -61,17 +61,17 @@ database named `(default)` will be used.
## Example
```yaml
-sources:
-  my-firestore-source:
-    kind: "firestore"
+kind: sources
+name: my-firestore-source
+type: "firestore"
project: "my-project-id"
# database: "my-database" # Optional, defaults to "(default)"
```
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-----------------|
-| kind      | string   |     true     | Must be "firestore". |
+| type      | string   |     true     | Must be "firestore". |
| project   | string   |     true     | Id of the GCP project that contains the Firestore database (e.g. "my-project-id"). |
| database  | string   |    false     | Name of the Firestore database to connect to. Defaults to "(default)" if not specified. |


@@ -21,18 +21,18 @@ and other HTTP-accessible resources.
## Example ## Example
```yaml ```yaml
sources: kind: sources
my-http-source: name: my-http-source
kind: http type: http
baseUrl: https://api.example.com/data baseUrl: https://api.example.com/data
timeout: 10s # default to 30s timeout: 10s # default to 30s
headers: headers:
Authorization: Bearer ${API_KEY} Authorization: Bearer ${API_KEY}
Content-Type: application/json Content-Type: application/json
queryParams: queryParams:
param1: value1 param1: value1
param2: value2 param2: value2
# disableSslVerification: false # disableSslVerification: false
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -44,7 +44,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|------------------------|:-----------------:|:------------:|------------------------------------------------------------------------------------------------------------------------------------| |------------------------|:-----------------:|:------------:|------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "http". | | type | string | true | Must be "http". |
| baseUrl | string | true | The base URL for the HTTP requests (e.g., `https://api.example.com`). | | baseUrl | string | true | The base URL for the HTTP requests (e.g., `https://api.example.com`). |
| timeout | string | false | The timeout for HTTP requests (e.g., "5s", "1m", refer to [ParseDuration][parse-duration-doc] for more examples). Defaults to 30s. | | timeout | string | false | The timeout for HTTP requests (e.g., "5s", "1m", refer to [ParseDuration][parse-duration-doc] for more examples). Defaults to 30s. |
| headers | map[string]string | false | Default headers to include in the HTTP requests. | | headers | map[string]string | false | Default headers to include in the HTTP requests. |


@@ -56,16 +56,16 @@ To initialize the application default credential run `gcloud auth login
## Example ## Example
```yaml ```yaml
sources: kind: sources
my-looker-source: name: my-looker-source
kind: looker type: looker
base_url: http://looker.example.com base_url: http://looker.example.com
client_id: ${LOOKER_CLIENT_ID} client_id: ${LOOKER_CLIENT_ID}
client_secret: ${LOOKER_CLIENT_SECRET} client_secret: ${LOOKER_CLIENT_SECRET}
project: ${LOOKER_PROJECT} project: ${LOOKER_PROJECT}
location: ${LOOKER_LOCATION} location: ${LOOKER_LOCATION}
verify_ssl: true verify_ssl: true
timeout: 600s timeout: 600s
``` ```
The Looker base url will look like "https://looker.example.com", don't include The Looker base url will look like "https://looker.example.com", don't include
@@ -93,7 +93,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|----------------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------------------| |----------------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker". | | type | string | true | Must be "looker". |
| base_url | string | true | The URL of your Looker server with no trailing /. | | base_url | string | true | The URL of your Looker server with no trailing /. |
| client_id | string | false | The client id assigned by Looker. | | client_id | string | false | The client id assigned by Looker. |
| client_secret | string | false | The client secret assigned by Looker. | | client_secret | string | false | The client secret assigned by Looker. |


@@ -45,18 +45,18 @@ MariaDB user][mariadb-users] to log in to the database.
## Example ## Example
```yaml ```yaml
sources: kind: sources
my_mariadb_db: name: my_mariadb_db
kind: mysql type: mysql
host: 127.0.0.1 host: 127.0.0.1
port: 3306 port: 3306
database: my_db database: my_db
user: ${MARIADB_USER} user: ${MARIADB_USER}
password: ${MARIADB_PASS} password: ${MARIADB_PASS}
# Optional TLS and other driver parameters. For example, enable preferred TLS: # Optional TLS and other driver parameters. For example, enable preferred TLS:
# queryParams: # queryParams:
# tls: preferred # tls: preferred
queryTimeout: 30s # Optional: query timeout duration queryTimeout: 30s # Optional: query timeout duration
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -68,7 +68,7 @@ Use environment variables instead of committing credentials to source files.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
| ------------ | :------: | :----------: | ----------------------------------------------------------------------------------------------- | | ------------ | :------: | :----------: | ----------------------------------------------------------------------------------------------- |
| kind | string | true | Must be `mysql`. | | type | string | true | Must be `mysql`. |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). | | host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "3307"). | | port | string | true | Port to connect to (e.g. "3307"). |
| database | string | true | Name of the MariaDB database to connect to (e.g. "my_db"). | | database | string | true | Name of the MariaDB database to connect to (e.g. "my_db"). |


@@ -125,15 +125,15 @@ can omit the password field.
## Example ## Example
```yaml ```yaml
sources: kind: sources
my-mindsdb-source: name: my-mindsdb-source
kind: mindsdb type: mindsdb
host: 127.0.0.1 host: 127.0.0.1
port: 3306 port: 3306
database: my_db database: my_db
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} # Optional: omit if MindsDB is configured without authentication password: ${PASSWORD} # Optional: omit if MindsDB is configured without authentication
queryTimeout: 30s # Optional: query timeout duration queryTimeout: 30s # Optional: query timeout duration
``` ```
### Working Configuration Example ### Working Configuration Example
@@ -141,13 +141,13 @@ sources:
Here's a working configuration that has been tested: Here's a working configuration that has been tested:
```yaml ```yaml
sources: kind: sources
my-pg-source: name: my-pg-source
kind: mindsdb type: mindsdb
host: 127.0.0.1 host: 127.0.0.1
port: 47335 port: 47335
database: files database: files
user: mindsdb user: mindsdb
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -176,7 +176,7 @@ With MindsDB integration, you can:
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|--------------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------| |--------------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "mindsdb". | | type | string | true | Must be "mindsdb". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). | | host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "3306"). | | port | string | true | Port to connect to (e.g. "3306"). |
| database | string | true | Name of the MindsDB database to connect to (e.g. "my_db"). | | database | string | true | Name of the MindsDB database to connect to (e.g. "my_db"). |


@@ -17,10 +17,10 @@ flexible, JSON-like documents, making it easy to develop and scale applications.
## Example
```yaml
-sources:
-  my-mongodb:
-    kind: mongodb
+kind: sources
+name: my-mongodb
+type: mongodb
uri: "mongodb+srv://username:password@host.mongodb.net"
```
@@ -28,5 +28,5 @@ sources:
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-----------------|
-| kind      | string   |     true     | Must be "mongodb". |
+| type      | string   |     true     | Must be "mongodb". |
| uri       | string   |     true     | connection string to connect to MongoDB |
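The URI above embeds credentials directly. Assuming the same `${ENV_VAR}` substitution used by the other sources in these docs also applies inside the connection string (an assumption, not confirmed here), a sketch would look like:

```yaml
kind: sources
name: my-mongodb
type: mongodb
# MONGO_USER and MONGO_PASS are hypothetical environment variable names
uri: "mongodb+srv://${MONGO_USER}:${MONGO_PASS}@host.mongodb.net"
```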


@@ -39,15 +39,15 @@ SQL Server user][mssql-users] to login to the database with.
## Example ## Example
```yaml ```yaml
sources: kind: sources
my-mssql-source: name: my-mssql-source
kind: mssql type: mssql
host: 127.0.0.1 host: 127.0.0.1
port: 1433 port: 1433
database: my_db database: my_db
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} password: ${PASSWORD}
# encrypt: strict # encrypt: strict
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -59,7 +59,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| |-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "mssql". | | type | string | true | Must be "mssql". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). | | host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "1433"). | | port | string | true | Port to connect to (e.g. "1433"). |
| database | string | true | Name of the SQL Server database to connect to (e.g. "my_db"). | | database | string | true | Name of the SQL Server database to connect to (e.g. "my_db"). |


@@ -49,18 +49,18 @@ MySQL user][mysql-users] to login to the database with.
## Example ## Example
```yaml ```yaml
sources: kind: sources
my-mysql-source: name: my-mysql-source
kind: mysql type: mysql
host: 127.0.0.1 host: 127.0.0.1
port: 3306 port: 3306
database: my_db database: my_db
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} password: ${PASSWORD}
# Optional TLS and other driver parameters. For example, enable preferred TLS: # Optional TLS and other driver parameters. For example, enable preferred TLS:
# queryParams: # queryParams:
# tls: preferred # tls: preferred
queryTimeout: 30s # Optional: query timeout duration queryTimeout: 30s # Optional: query timeout duration
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -72,7 +72,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
| ------------ | :------: | :----------: | ----------------------------------------------------------------------------------------------- | | ------------ | :------: | :----------: | ----------------------------------------------------------------------------------------------- |
| kind | string | true | Must be "mysql". | | type | string | true | Must be "mysql". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). | | host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "3306"). | | port | string | true | Port to connect to (e.g. "3306"). |
| database | string | true | Name of the MySQL database to connect to (e.g. "my_db"). | | database | string | true | Name of the MySQL database to connect to (e.g. "my_db"). |


@@ -33,13 +33,13 @@ user if available.
## Example
```yaml
-sources:
-  my-neo4j-source:
-    kind: neo4j
+kind: sources
+name: my-neo4j-source
+type: neo4j
uri: neo4j+s://xxxx.databases.neo4j.io:7687
user: ${USER_NAME}
password: ${PASSWORD}
database: "neo4j"
```
{{< notice tip >}}
@@ -51,7 +51,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-----------------|
-| kind      | string   |     true     | Must be "neo4j". |
+| type      | string   |     true     | Must be "neo4j". |
| uri       | string   |     true     | Connect URI ("bolt://localhost", "neo4j+s://xxx.databases.neo4j.io") |
| user      | string   |     true     | Name of the Neo4j user to connect as (e.g. "neo4j"). |
| password  | string   |     true     | Password of the Neo4j user (e.g. "my-password"). |


@@ -33,15 +33,15 @@ with SSL).
## Example ## Example
```yaml ```yaml
sources: kind: sources
my-oceanbase-source: name: my-oceanbase-source
kind: oceanbase type: oceanbase
host: 127.0.0.1 host: 127.0.0.1
port: 2881 port: 2881
database: my_db database: my_db
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} password: ${PASSWORD}
queryTimeout: 30s # Optional: query timeout duration queryTimeout: 30s # Optional: query timeout duration
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -53,7 +53,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
| ------------ | :------: | :----------: |-------------------------------------------------------------------------------------------------| | ------------ | :------: | :----------: |-------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "oceanbase". | | type | string | true | Must be "oceanbase". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). | | host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "2881"). | | port | string | true | Port to connect to (e.g. "2881"). |
| database | string | true | Name of the OceanBase database to connect to (e.g. "my_db"). | | database | string | true | Name of the OceanBase database to connect to (e.g. "my_db"). |


@@ -90,27 +90,27 @@ using a TNS (Transparent Network Substrate) alias.
This example demonstrates the four connection methods you could choose from:
```yaml
-sources:
-  my-oracle-source:
-    kind: oracle
+kind: sources
+name: my-oracle-source
+type: oracle
# --- Choose one connection method ---
# 1. Host, Port, and Service Name
host: 127.0.0.1
port: 1521
serviceName: XEPDB1
# 2. Direct Connection String
connectionString: "127.0.0.1:1521/XEPDB1"
# 3. TNS Alias (requires tnsnames.ora)
tnsAlias: "MY_DB_ALIAS"
tnsAdmin: "/opt/oracle/network/admin" # Optional: overrides TNS_ADMIN env var
user: ${USER_NAME}
password: ${PASSWORD}
# Optional: Set to true to use the OCI-based driver for advanced features (Requires Oracle Instant Client)
```
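Because the block above overlays several mutually exclusive connection methods, here is a minimal sketch of a v2-format source using only method 1 (host, port, and service name); all values are placeholders:

```yaml
kind: sources
name: my-oracle-source
type: oracle
host: 127.0.0.1          # method 1 only: host, port, and service name
port: 1521
serviceName: XEPDB1
user: ${USER_NAME}
password: ${PASSWORD}
```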
### Using an Oracle Wallet ### Using an Oracle Wallet
@@ -122,15 +122,15 @@ Oracle Wallet allows you to store credentails used for database connection. Depe
The `go-ora` driver uses the `walletLocation` field to connect to a database secured with an Oracle Wallet without standard username and password. The `go-ora` driver uses the `walletLocation` field to connect to a database secured with an Oracle Wallet without standard username and password.
```yaml ```yaml
sources: kind: sources
pure-go-wallet: name: pure-go-wallet
kind: oracle type: oracle
connectionString: "127.0.0.1:1521/XEPDB1" connectionString: "127.0.0.1:1521/XEPDB1"
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} password: ${PASSWORD}
# The TNS Alias is often required to connect to a service registered in tnsnames.ora # The TNS Alias is often required to connect to a service registered in tnsnames.ora
tnsAlias: "SECURE_DB_ALIAS" tnsAlias: "SECURE_DB_ALIAS"
walletLocation: "/path/to/my/wallet/directory" walletLocation: "/path/to/my/wallet/directory"
``` ```
#### OCI-Based Driver (`useOCI: true`) - Oracle Wallet #### OCI-Based Driver (`useOCI: true`) - Oracle Wallet
@@ -138,15 +138,15 @@ sources:
For the OCI-based driver, wallet authentication is triggered by setting tnsAdmin to the wallet directory and connecting via a tnsAlias. For the OCI-based driver, wallet authentication is triggered by setting tnsAdmin to the wallet directory and connecting via a tnsAlias.
```yaml ```yaml
sources: kind: sources
oci-wallet: name: oci-wallet
kind: oracle type: oracle
connectionString: "127.0.0.1:1521/XEPDB1" connectionString: "127.0.0.1:1521/XEPDB1"
user: ${USER_NAME} user: ${USER_NAME}
password: ${PASSWORD} password: ${PASSWORD}
tnsAlias: "WALLET_DB_ALIAS" tnsAlias: "WALLET_DB_ALIAS"
tnsAdmin: "/opt/oracle/wallet" # Directory containing tnsnames.ora, sqlnet.ora, and wallet files tnsAdmin: "/opt/oracle/wallet" # Directory containing tnsnames.ora, sqlnet.ora, and wallet files
useOCI: true useOCI: true
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -158,7 +158,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** | | **field** | **type** | **required** | **description** |
|------------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| |------------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "oracle". | | type | string | true | Must be "oracle". |
| user | string | true | Name of the Oracle user to connect as (e.g. "my-oracle-user"). | | user | string | true | Name of the Oracle user to connect as (e.g. "my-oracle-user"). |
| password | string | true | Password of the Oracle user (e.g. "my-password"). | | password | string | true | Password of the Oracle user (e.g. "my-password"). |
| host | string | false | IP address or hostname to connect to (e.g. "127.0.0.1"). Required if not using `connectionString` or `tnsAlias`. | | host | string | false | IP address or hostname to connect to (e.g. "127.0.0.1"). Required if not using `connectionString` or `tnsAlias`. |


@@ -107,14 +107,14 @@ PostgreSQL user][pg-users] to login to the database with.
## Example
```yaml
-sources:
-  my-pg-source:
-    kind: postgres
+kind: sources
+name: my-pg-source
+type: postgres
host: 127.0.0.1
port: 5432
database: my_db
user: ${USER_NAME}
password: ${PASSWORD}
```
{{< notice tip >}}
@@ -126,7 +126,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|-----------------|
-| kind        | string   |     true     | Must be "postgres". |
+| type        | string   |     true     | Must be "postgres". |
| host        | string   |     true     | IP address to connect to (e.g. "127.0.0.1") |
| port        | string   |     true     | Port to connect to (e.g. "5432") |
| database    | string   |     true     | Name of the Postgres database to connect to (e.g. "my_db"). |


@@ -34,16 +34,16 @@ connections must authenticate in order to connect.
Specify your AUTH string in the password field: Specify your AUTH string in the password field:
```yaml ```yaml
sources: kind: sources
my-redis-instance: name: my-redis-instance
kind: redis type: redis
address: address:
- 127.0.0.1:6379 - 127.0.0.1:6379
username: ${MY_USER_NAME} username: ${MY_USER_NAME}
password: ${MY_AUTH_STRING} # Omit this field if you don't have a password. password: ${MY_AUTH_STRING} # Omit this field if you don't have a password.
# database: 0 # database: 0
# clusterEnabled: false # clusterEnabled: false
# useGCPIAM: false # useGCPIAM: false
``` ```
{{< notice tip >}} {{< notice tip >}}
@@ -59,14 +59,14 @@ string.
Here is an example tools.yaml config with [AUTH][auth] enabled: Here is an example tools.yaml config with [AUTH][auth] enabled:
```yaml ```yaml
sources: kind: sources
my-redis-cluster-instance: name: my-redis-cluster-instance
kind: memorystore-redis type: memorystore-redis
address: address:
- 127.0.0.1:6379 - 127.0.0.1:6379
password: ${MY_AUTH_STRING} password: ${MY_AUTH_STRING}
# useGCPIAM: false # useGCPIAM: false
# clusterEnabled: false # clusterEnabled: false
``` ```
Memorystore Redis Cluster supports IAM authentication instead. Grant your Memorystore Redis Cluster supports IAM authentication instead. Grant your
@@ -76,13 +76,13 @@ Here is an example tools.yaml config for Memorystore Redis Cluster instances
using IAM authentication:
```yaml
-sources:
-  my-redis-cluster-instance:
-    kind: memorystore-redis
+kind: sources
+name: my-redis-cluster-instance
+type: memorystore-redis
address:
- 127.0.0.1:6379
useGCPIAM: true
clusterEnabled: true
```
[iam]: https://cloud.google.com/memorystore/docs/cluster/about-iam-auth
@@ -91,7 +91,7 @@ sources:
| **field** | **type** | **required** | **description** |
|----------------|:--------:|:------------:|---------------------------------------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be "memorystore-redis". |
+| type | string | true | Must be "memorystore-redis". |
| address | string | true | Primary endpoint for the Memorystore Redis instance to connect to. |
| username | string | false | If you are using a non-default user, specify the user name here. If you are using Memorystore for Redis, leave this field blank |
| password | string | false | If you have [Redis AUTH][auth] enabled, specify the AUTH string here |

View File

@@ -49,17 +49,17 @@ set up your ADC.
## Example
```yaml
-sources:
-  my-serverless-spark-source:
-    kind: serverless-spark
+kind: sources
+name: my-serverless-spark-source
+type: serverless-spark
project: my-project-id
location: us-central1
```
## Reference
| **field** | **type** | **required** | **description** |
| --------- | :------: | :----------: | ----------------------------------------------------------------- |
-| kind | string | true | Must be "serverless-spark". |
+| type | string | true | Must be "serverless-spark". |
| project | string | true | ID of the GCP project with Serverless for Apache Spark resources. |
| location | string | true | Location containing Serverless for Apache Spark resources. |

View File

@@ -39,15 +39,15 @@ database user][singlestore-user] to login to the database with.
## Example
```yaml
-sources:
-  my-singlestore-source:
-    kind: singlestore
+kind: sources
+name: my-singlestore-source
+type: singlestore
host: 127.0.0.1
port: 3306
database: my_db
user: ${USER_NAME}
password: ${PASSWORD}
queryTimeout: 30s # Optional: query timeout duration
```
{{< notice tip >}}
@@ -59,7 +59,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|--------------|:--------:|:------------:|-------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be "singlestore". |
+| type | string | true | Must be "singlestore". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "3306"). |
| database | string | true | Name of the SingleStore database to connect to (e.g. "my_db"). |

View File

@@ -0,0 +1,63 @@
---
title: "Snowflake"
type: docs
weight: 1
description: >
Snowflake is a cloud-based data platform.
---
## About
[Snowflake][sf-docs] is a cloud data platform that provides data warehousing as a service.
[sf-docs]: https://docs.snowflake.com/
## Available Tools
- [`snowflake-sql`](../tools/snowflake/snowflake-sql.md)
Execute SQL queries as prepared statements in Snowflake.
- [`snowflake-execute-sql`](../tools/snowflake/snowflake-execute-sql.md)
Run parameterized SQL statements in Snowflake.
## Requirements
### Database User
This source only uses standard authentication. You will need to create a
Snowflake user to login to the database with.
## Example
```yaml
kind: sources
name: my-sf-source
type: snowflake
account: ${SNOWFLAKE_ACCOUNT}
user: ${SNOWFLAKE_USER}
password: ${SNOWFLAKE_PASSWORD}
database: ${SNOWFLAKE_DATABASE}
schema: ${SNOWFLAKE_SCHEMA}
warehouse: ${SNOWFLAKE_WAREHOUSE}
role: ${SNOWFLAKE_ROLE}
```
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|------------------------------------------------------------------------|
| type | string | true | Must be "snowflake". |
| account | string | true | Your Snowflake account identifier. |
| user | string | true | Name of the Snowflake user to connect as (e.g. "my-sf-user"). |
| password | string | true | Password of the Snowflake user (e.g. "my-password"). |
| database | string | true | Name of the Snowflake database to connect to (e.g. "my_db"). |
| schema | string | true | Name of the schema to use (e.g. "my_schema"). |
| warehouse | string | false | The virtual warehouse to use. Defaults to "COMPUTE_WH". |
| role | string | false | The security role to use. Defaults to "ACCOUNTADMIN". |
| timeout | integer | false | The connection timeout in seconds. Defaults to 60. |
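Tools can then reference this source by name. Below is a minimal sketch, using the v2 `kind`/`name`/`type` layout shown above, of a `snowflake-sql` tool bound to `my-sf-source`; the tool name, statement, table, and parameter are illustrative placeholders rather than anything defined in these docs:
```yaml
kind: tools
name: search_orders_by_status   # hypothetical tool name
type: snowflake-sql
source: my-sf-source
description: Look up orders with a given status. (placeholder description)
statement: |
  SELECT * FROM orders WHERE status = ?   -- "orders" is a placeholder table
parameters:
- name: status
  type: string
  description: Order status to filter on.
```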

View File

@@ -64,20 +64,20 @@ applying IAM permissions and roles to an identity.
## Example
```yaml
-sources:
-  my-spanner-source:
-    kind: "spanner"
+kind: sources
+name: my-spanner-source
+type: "spanner"
project: "my-project-id"
instance: "my-instance"
database: "my_db"
# dialect: "googlesql"
```
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|---------------------------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be "spanner". |
+| type | string | true | Must be "spanner". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| instance | string | true | Name of the Spanner instance. |
| database | string | true | Name of the database on the Spanner instance |

View File

@@ -48,19 +48,19 @@ You need a SQLite database file. This can be:
## Example
```yaml
-sources:
-  my-sqlite-db:
-    kind: "sqlite"
+kind: sources
+name: my-sqlite-db
+type: "sqlite"
database: "/path/to/database.db"
```
For an in-memory database:
```yaml
-sources:
-  my-sqlite-memory-db:
-    kind: "sqlite"
+kind: sources
+name: my-sqlite-memory-db
+type: "sqlite"
database: ":memory:"
```
## Reference
@@ -69,7 +69,7 @@ sources:
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|---------------------------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be "sqlite". |
+| type | string | true | Must be "sqlite". |
| database | string | true | Path to SQLite database file, or ":memory:" for an in-memory database. |
### Connection Properties

View File

@@ -46,29 +46,29 @@ console.
- TiDB Cloud
```yaml
-sources:
-  my-tidb-cloud-source:
-    kind: tidb
+kind: sources
+name: my-tidb-cloud-source
+type: tidb
host: gateway01.us-west-2.prod.aws.tidbcloud.com
port: 4000
database: my_db
user: ${TIDB_USERNAME}
password: ${TIDB_PASSWORD}
# SSL is automatically enabled for TiDB Cloud
```
- Self-Hosted TiDB
```yaml
-sources:
-  my-tidb-source:
-    kind: tidb
+kind: sources
+name: my-tidb-source
+type: tidb
host: 127.0.0.1
port: 4000
database: my_db
user: ${TIDB_USERNAME}
password: ${TIDB_PASSWORD}
# ssl: true # Optional: enable SSL for secure connections
```
{{< notice tip >}}
@@ -80,7 +80,7 @@ instead of hardcoding your secrets into the configuration file.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------|
-| kind | string | true | Must be "tidb". |
+| type | string | true | Must be "tidb". |
| host | string | true | IP address or hostname to connect to (e.g. "127.0.0.1" or "gateway01.*.tidbcloud.com"). |
| port | string | true | Port to connect to (typically "4000" for TiDB). |
| database | string | true | Name of the TiDB database to connect to (e.g. "my_db"). |

View File

@@ -32,15 +32,15 @@ the catalogs and schemas you want to query.
## Example
```yaml
-sources:
-  my-trino-source:
-    kind: trino
+kind: sources
+name: my-trino-source
+type: trino
host: trino.example.com
port: "8080"
user: ${TRINO_USER} # Optional for anonymous access
password: ${TRINO_PASSWORD} # Optional
catalog: hive
schema: default
```
{{< notice tip >}}
@@ -50,16 +50,19 @@ instead of hardcoding your secrets into the configuration file.
## Reference
| **field** | **type** | **required** | **description** |
| ---------------------- | :------: | :----------: | ---------------------------------------------------------------------------- |
-| kind | string | true | Must be "trino". |
+| type | string | true | Must be "trino". |
| host | string | true | Trino coordinator hostname (e.g. "trino.example.com") |
| port | string | true | Trino coordinator port (e.g. "8080", "8443") |
| user | string | false | Username for authentication (e.g. "analyst"). Optional for anonymous access. |
| password | string | false | Password for basic authentication |
| catalog | string | true | Default catalog to use for queries (e.g. "hive") |
| schema | string | true | Default schema to use for queries (e.g. "default") |
| queryTimeout | string | false | Query timeout duration (e.g. "30m", "1h") |
| accessToken | string | false | JWT access token for authentication |
| kerberosEnabled | boolean | false | Enable Kerberos authentication (default: false) |
| sslEnabled | boolean | false | Enable SSL/TLS (default: false) |
+| disableSslVerification | boolean | false | Skip SSL/TLS certificate verification (default: false) |
+| sslCertPath | string | false | Path to a custom SSL/TLS certificate file |
+| sslCert | string | false | Custom SSL/TLS certificate content |
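For TLS-enabled deployments, the new SSL fields sit alongside the rest of the flat source block. A minimal sketch assuming a CA certificate file on disk; the hostname and certificate path below are placeholders, not values taken from the docs:
```yaml
kind: sources
name: my-secure-trino-source
type: trino
host: trino.internal.example.com   # placeholder hostname
port: "8443"
user: ${TRINO_USER}
password: ${TRINO_PASSWORD}
catalog: hive
schema: default
sslEnabled: true
sslCertPath: /etc/ssl/certs/trino-ca.pem   # placeholder path to a CA certificate
# For throwaway testing only, certificate verification can be skipped instead:
# disableSslVerification: true
```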

View File

@@ -27,16 +27,16 @@ the [official Valkey website](https://valkey.io/topics/quickstart/).
## Example
```yaml
-sources:
-  my-valkey-instance:
-    kind: valkey
+kind: sources
+name: my-valkey-instance
+type: valkey
address:
- 127.0.0.1:6379
username: ${YOUR_USERNAME}
password: ${YOUR_PASSWORD}
# database: 0
# useGCPIAM: false
# disableCache: false
```
{{< notice tip >}}
@@ -51,12 +51,12 @@ authentication. Grant your account the required [IAM role][iam] and set
`useGCPIAM` to `true`:
```yaml
-sources:
-  my-valkey-instance:
-    kind: valkey
+kind: sources
+name: my-valkey-instance
+type: valkey
address:
- 127.0.0.1:6379
useGCPIAM: true
```
[iam]: https://cloud.google.com/memorystore/docs/valkey/about-iam-auth
@@ -65,7 +65,7 @@ sources:
| **field** | **type** | **required** | **description** |
|--------------|:--------:|:------------:|----------------------------------------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be "valkey". |
+| type | string | true | Must be "valkey". |
| address | []string | true | Endpoints for the Valkey instance to connect to. |
| username | string | false | If you are using a non-default user, specify the user name here. If you are using Memorystore for Valkey, leave this field blank |
| password | string | false | Password for the Valkey instance |

View File

@@ -17,23 +17,23 @@ compatibility.
## Example
```yaml
-sources:
-  my-yb-source:
-    kind: yugabytedb
+kind: sources
+name: my-yb-source
+type: yugabytedb
host: 127.0.0.1
port: 5433
database: yugabyte
user: ${USER_NAME}
password: ${PASSWORD}
loadBalance: true
topologyKeys: cloud.region.zone1:1,cloud.region.zone2:2
```
## Reference
| **field** | **type** | **required** | **description** |
|------------------------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
-| kind | string | true | Must be "yugabytedb". |
+| type | string | true | Must be "yugabytedb". |
| host | string | true | IP address to connect to. |
| port | integer | true | Port to connect to. The default port is 5433. |
| database | string | true | Name of the YugabyteDB database to connect to. The default database name is yugabyte. |

View File

@@ -12,41 +12,41 @@ statement. You can define Tools as a map in the `tools` section of your
`tools.yaml` file. Typically, a tool will require a source to act on:
```yaml
-tools:
-  search_flights_by_number:
-    kind: postgres-sql
+kind: tools
+name: search_flights_by_number
+type: postgres-sql
source: my-pg-instance
statement: |
  SELECT * FROM flights
  WHERE airline = $1
  AND flight_number = $2
  LIMIT 10
description: |
  Use this tool to get information for a specific flight.
  Takes an airline code and flight number and returns info on the flight.
  Do NOT use this tool with a flight id. Do NOT guess an airline code or flight number.
  An airline code is a code for an airline service consisting of a two-character
  airline designator and followed by a flight number, which is a 1 to 4 digit number.
  For example, if given CY 0123, the airline is "CY", and flight_number is "123".
  Another example for this is DL 1234, the airline is "DL", and flight_number is "1234".
  If the tool returns more than one option choose the date closest to today.
  Example:
  {{
    "airline": "CY",
    "flight_number": "888",
  }}
  Example:
  {{
    "airline": "DL",
    "flight_number": "1234",
  }}
parameters:
- name: airline
  type: string
  description: Airline unique 2 letter identifier
- name: flight_number
  type: string
  description: 1 to 4 digit number
```
## Specifying Parameters
@@ -55,13 +55,13 @@ Parameters for each Tool will define what inputs the agent will need to provide
to invoke them. Parameters should be passed as a list of Parameter objects:
```yaml
parameters:
- name: airline
  type: string
  description: Airline unique 2 letter identifier
- name: flight_number
  type: string
  description: 1 to 4 digit number
```
### Basic Parameters
@@ -71,10 +71,10 @@ most cases, the description will be provided to the LLM as context on specifying
the parameter.
```yaml
parameters:
- name: airline
  type: string
  description: Airline unique 2 letter identifier
```
| **field** | **type** | **required** | **description** |
@@ -97,16 +97,16 @@ To use the `array` type, you must also specify what kind of items are
in the list using the items field:
```yaml
parameters:
- name: preferred_airlines
  type: array
  description: A list of airlines, ordered by preference.
  items:
    name: name
    type: string
    description: Name of the airline.
statement: |
  SELECT * FROM airlines WHERE preferred_airlines = ANY($1);
```
| **field** | **type** | **required** | **description** |
@@ -141,10 +141,10 @@ This is the default behavior when valueType is omitted. It's useful for passing
a flexible group of settings.
```yaml
parameters:
- name: execution_context
  type: map
  description: A flexible set of key-value pairs for the execution environment.
```
#### Typed Map
@@ -153,11 +153,11 @@ Specify valueType to ensure all values in the map are of the same type. An error
will be thrown in case of value type mismatch.
```yaml
parameters:
- name: user_scores
  type: map
  description: A map of user IDs to their scores. All scores must be integers.
  valueType: integer # This enforces the value type for all entries.
```
### Authenticated Parameters
@@ -171,21 +171,21 @@ the required [authServices](../authServices/) to specific claims within the
user's ID token.
```yaml
-tools:
-  search_flights_by_user_id:
-    kind: postgres-sql
+kind: tools
+name: search_flights_by_user_id
+type: postgres-sql
source: my-pg-instance
statement: |
  SELECT * FROM flights WHERE user_id = $1
parameters:
- name: user_id
  type: string
  description: Auto-populated from Google login
  authServices:
  # Refer to one of the `authServices` defined
  - name: my-google-auth
    # `sub` is the OIDC claim field for user ID
    field: sub
```
| **field** | **type** | **required** | **description** |
@@ -222,31 +222,31 @@ can use `minValue` and `maxValue` to define the allowable range.
{{< /notice >}}
```yaml
-tools:
-  select_columns_from_table:
-    kind: postgres-sql
+kind: tools
+name: select_columns_from_table
+type: postgres-sql
source: my-pg-instance
statement: |
  SELECT {{array .columnNames}} FROM {{.tableName}}
description: |
  Use this tool to list all information from a specific table.
  Example:
  {{
    "tableName": "flights",
    "columnNames": ["id", "name"]
  }}
templateParameters:
- name: tableName
  type: string
  description: Table to select from
- name: columnNames
  type: array
  description: The columns to select
  items:
    name: column
    type: string
    description: Name of a column to select
    escape: double-quotes # with this, the statement will resolve to `SELECT "id", "name" FROM flights`
```
| **field** | **type** | **required** | **description** |
@@ -267,16 +267,16 @@ specifying an `authRequired` field. Specify a list of
[authServices](../authServices/) defined in the previous section.
```yaml
-tools:
-  search_all_flight:
-    kind: postgres-sql
+kind: tools
+name: search_all_flight
+type: postgres-sql
source: my-pg-instance
statement: |
  SELECT * FROM flights
# A list of `authServices` defined previously
authRequired:
- my-google-auth
- other-auth-service
```
## Kinds of tools

View File

@@ -40,17 +40,17 @@ The tool takes the following input parameters:
## Example
```yaml
-tools:
-  create_cluster:
-    kind: alloydb-create-cluster
+kind: tools
+name: create_cluster
+type: alloydb-create-cluster
source: alloydb-admin-source
description: Use this tool to create a new AlloyDB cluster in a given project and location.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|------------------------------------------------------|
-| kind | string | true | Must be alloydb-create-cluster. |
+| type | string | true | Must be alloydb-create-cluster. |
| source | string | true | The name of an `alloydb-admin` source. |
| description | string | false | Description of the tool that is passed to the agent. |

Some files were not shown because too many files have changed in this diff.