Compare commits

...

28 Commits

Author SHA1 Message Date
Harsh Jha
12b25a0beb chore: resolve pr comment 2025-12-12 12:35:17 +05:30
Harsh Jha
073c8b3268 feat: added python sdk introduction 2025-12-12 12:26:02 +05:30
Harsh Jha
0cb3ad9026 Merge branch 'main' into sdk-docs-migrate 2025-12-10 14:03:46 +05:30
Anubhav Dhawan
18d0440f4e docs: separate Windows installation instructions for Command Prompt and PowerShell (#2097)
## Description
Updates the Windows installation instructions in `README.md` and the
Hugo documentation to provide separate, clear steps for Command Prompt
and PowerShell.

## Before
<img width="1836" height="1054" alt="image"
src="https://github.com/user-attachments/assets/46856ba5-e99f-4ea1-b851-921c1f885c40"
/>

## After
<img width="1842" height="1046" alt="image"
src="https://github.com/user-attachments/assets/80212630-1233-496e-98d2-9039de8e4cd0"
/>
<img width="1858" height="1070" alt="image"
src="https://github.com/user-attachments/assets/348a879d-7337-41c1-8358-fbe341c80525"
/>
2025-12-10 06:59:57 +00:00
Mend Renovate
7a135ce078 chore(deps): update module google.golang.org/genai to v1.36.0 (#1971)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [google.golang.org/genai](https://redirect.github.com/googleapis/go-genai) | `v1.35.0` -> `v1.36.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fgenai/v1.36.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fgenai/v1.35.0/v1.36.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googleapis/go-genai (google.golang.org/genai)</summary>

### [`v1.36.0`](https://redirect.github.com/googleapis/go-genai/releases/tag/v1.36.0)

[Compare Source](https://redirect.github.com/googleapis/go-genai/compare/v1.35.0...v1.36.0)

##### Features

- add display name to FunctionResponseBlob
([66bd0fb](66bd0fbf35))
- add display name to FunctionResponseFileData
([f470cff](f470cff5a2))
- Add generate\_content\_config.thinking\_level
([93b2586](93b2586dfb))
- Add image output options to ImageConfig for Vertex
([c4b28c3](c4b28c3414))
- Add part.media\_resolution
([93b2586](93b2586dfb))
- support Function call argument streaming for all languages
([dd5ec01](dd5ec01f21))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Anmol Shukla <shuklaanmol@google.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-10 00:19:55 +00:00
gRedHeadphone
ea9e2d12bd docs: update unit tests execute command (#2074)
## Description

Updates the unit test command in DEVELOPER.md to exclude integration
tests.

## PR Checklist

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-09 23:48:54 +00:00
Srividya Reddy
bea9705450 feat(tools/postgres): Add new postgres-list-roles tool (#2038)
## Description
Adds a custom PostgreSQL list_roles tool that lists all the
user-created roles in the instance. It provides details about each
role's attributes and memberships.
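
For illustration only, a minimal sketch of the kind of catalog query such a
tool runs; the driver (pgx) and the exact SQL are assumptions, not this PR's
implementation:

```go
// Hypothetical sketch: list user-created roles with a few attributes and
// their memberships. The real postgres-list-roles tool may differ.
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/jackc/pgx/v5"
)

func main() {
	ctx := context.Background()
	conn, err := pgx.Connect(ctx, os.Getenv("DATABASE_URL"))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close(ctx)

	// pg_roles holds role attributes; pg_auth_members maps members to roles.
	rows, err := conn.Query(ctx, `
		SELECT r.rolname, r.rolsuper, r.rolcreatedb, r.rolcanlogin,
		       COALESCE(string_agg(g.rolname, ','), '') AS member_of
		FROM pg_roles r
		LEFT JOIN pg_auth_members am ON am.member = r.oid
		LEFT JOIN pg_roles g ON g.oid = am.roleid
		WHERE r.rolname NOT LIKE 'pg\_%'
		GROUP BY r.rolname, r.rolsuper, r.rolcreatedb, r.rolcanlogin
		ORDER BY r.rolname`)
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var name, memberOf string
		var super, createDB, canLogin bool
		if err := rows.Scan(&name, &super, &createDB, &canLogin, &memberOf); err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%s super=%t createdb=%t login=%t member_of=[%s]\n", name, super, createDB, canLogin, memberOf)
	}
}
```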



<img width="1065" height="145" alt="Screenshot 2025-11-26 at 12 59
56 AM"
src="https://github.com/user-attachments/assets/d90131b1-d369-4108-b4db-ee5dc9aafe38"
/>


## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-09 21:48:20 +00:00
Srividya Reddy
489117d747 feat(tools/postgres)!: Add additional filter params for existing postgres tools (#2033)
## Description

Add additional filter parameters for existing PostgreSQL tools:

1.  `list_views`:
- Add a new optional `"schema_name"` filter parameter to return results
based on a specific schema name pattern.
- Add an additional column `"definition"` to return the view definition.
2.  `list_schemas`:
- Add a new optional `"owner"` filter parameter to return results based
on a specific owner name pattern.
- Add a new optional `"limit"` parameter to return a specific number of
rows.
3.  `list_indexes`:
- Add a new optional `"only_unused"` filter parameter to return only
unused indexes.


list_views
<img width="1531" height="763" alt="Screenshot 2025-11-25 at 1 36 39 PM"
src="https://github.com/user-attachments/assets/bd6805b3-43d2-46c7-adc8-62d3a4521d36"
/>

list_schemas
<img width="1519" height="755" alt="Screenshot 2025-11-25 at 1 35 54 PM"
src="https://github.com/user-attachments/assets/62d3e987-b64e-442b-ba1a-84def1df7a58"
/>


list_indexes
<img width="1523" height="774" alt="Screenshot 2025-11-25 at 1 35 32 PM"
src="https://github.com/user-attachments/assets/c6f73b3f-f8a2-4b76-9218-64d7011a2241"
/>


## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-09 20:16:45 +00:00
Srividya Reddy
32367a472f feat(tools/postgres): add list_pg_settings, list_database_stats tools for postgres (#2030)
## Description
Adds the following tools for Postgres:
1. `list_pg_settings`: lists configuration parameters for the PostgreSQL
server.
2. `list_database_stats`: lists the key performance and activity
statistics for each database in the PostgreSQL server.
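
For illustration only, a minimal sketch of the catalog views these tools
surface; the driver (pgx), columns, and queries are assumptions, not this PR's
implementation:

```go
// Hypothetical sketch: read pg_settings (server configuration) and
// pg_stat_database (per-database activity stats). Not the PR's actual SQL.
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/jackc/pgx/v5"
)

func main() {
	ctx := context.Background()
	conn, err := pgx.Connect(ctx, os.Getenv("DATABASE_URL"))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close(ctx)

	// Configuration parameters that were changed from their defaults.
	rows, err := conn.Query(ctx,
		"SELECT name, setting FROM pg_settings WHERE source <> 'default' ORDER BY name")
	if err != nil {
		log.Fatal(err)
	}
	for rows.Next() {
		var name, setting string
		if err := rows.Scan(&name, &setting); err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%s = %s\n", name, setting)
	}
	rows.Close()

	// A few key activity statistics for the current database.
	var db string
	var commits, rollbacks, blksHit int64
	err = conn.QueryRow(ctx,
		`SELECT datname, xact_commit, xact_rollback, blks_hit
		 FROM pg_stat_database WHERE datname = current_database()`).
		Scan(&db, &commits, &rollbacks, &blksHit)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s commits=%d rollbacks=%d blks_hit=%d\n", db, commits, rollbacks, blksHit)
}
```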


list_pg_settings:
<img width="1526" height="803" alt="Screenshot 2025-11-25 at 10 19
48 AM"
src="https://github.com/user-attachments/assets/73634b9b-4936-4bf0-a94b-6b31fe3642a1"
/>
<img width="1064" height="715" alt="Screenshot 2025-11-25 at 10 27
19 AM"
src="https://github.com/user-attachments/assets/36c13585-27e4-4294-b451-1c1a963c0d6c"
/>

list_database_stats:
<img width="1511" height="779" alt="Screenshot 2025-11-25 at 10 21
12 AM"
src="https://github.com/user-attachments/assets/d283e018-ea81-427d-b1b4-7aaf79b9696b"
/>
<img width="1017" height="506" alt="Screenshot 2025-11-25 at 10 27
47 AM"
src="https://github.com/user-attachments/assets/47b72bd7-7114-4f2a-8a9d-cecc80bf47e9"
/>



## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738

Co-authored-by: Averi Kitsch <akitsch@google.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-09 11:55:53 -08:00
Pranjul Kalsi
3b40fea25e feat(sources/mariadb): add MariaDB source and MySQL tools integration (#1908)
## Description
This PR:
1. Adds **MariaDB** as a source. The implementation is similar to the **MySQL**
source.
2. Utilises the pre-implemented **MySQL** tools (see the connection sketch below):
- `mysql-execute-sql`
- `mysql-list-active-queries`
- `mysql-list-table-fragmentation`
- `mysql-list-tables`
- `mysql-list-tables-missing-unique-indexes`
- `mysql-sql`
**Note:** After discussion with @duwenxin99 in issue #1768, I initially
assumed MariaDB required new tools due to different metadata structures
and system tables. That is true for older MariaDB versions, but current
MySQL tooling already works with MariaDB (verified), so a separate tool
set was not needed.

3. Adds a source doc for **MariaDB** in docs 
4. Adds MariaDB integration tests using the existing MySQL test flow.
Note: The test file is based on the MySQL integration test, but
`GetMariaDBWants()` and
`RunMariDBListTablesTest()` are implemented because MariaDB returns
different metadata
and list-tables output, so the assertions must be MariaDB-specific.
5. Updates CI 

Lastly, I considered adding a MariaDB-exclusive Galera cluster monitoring tool,
but skipped it because it requires a multi-node Galera setup for
integration testing and would significantly increase CI complexity with
unclear usage demand.
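
Because the source reuses the MySQL driver and tools, a minimal connection
sketch looks like the following; the DSN, port (3307, matching the CI config
below), and query are illustrative rather than the toolbox implementation:

```go
// Hypothetical sketch: connect to MariaDB with the existing MySQL driver and
// list tables, mirroring how the MariaDB source reuses the MySQL tooling.
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql"
)

func main() {
	// MariaDB speaks the MySQL wire protocol, so the same driver works.
	db, err := sql.Open("mysql", "user:pass@tcp(127.0.0.1:3307)/test_database")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	rows, err := db.Query(
		"SELECT table_name FROM information_schema.tables WHERE table_schema = DATABASE()")
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var name string
		if err := rows.Scan(&name); err != nil {
			log.Fatal(err)
		}
		fmt.Println(name)
	}
}
```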

## PR Checklist
- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1712 #1768

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-09 18:45:05 +00:00
Anubhav Dhawan
f6b6a9fb5d ci: enable mongodb integration tests (#2111)
This PR enables MongoDB integration tests in Cloud Build. It adds the
necessary build step and secret configuration.

Fixes https://github.com/googleapis/genai-toolbox/issues/1816

---------

Co-authored-by: AnmolShukla2002 <anmol.28422@gmail.com>
Co-authored-by: Anmol Shukla <shuklaanmol@google.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-09 17:42:51 +00:00
dependabot[bot]
1dd971b8d5 chore(deps): bump golang.org/x/crypto from 0.43.0 to 0.45.0 in /docs/en/getting-started/quickstart/go/langchain (#2041)
Bumps [golang.org/x/crypto](https://github.com/golang/crypto) from
0.43.0 to 0.45.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4e0068c009"><code>4e0068c</code></a>
go.mod: update golang.org/x dependencies</li>
<li><a
href="e79546e28b"><code>e79546e</code></a>
ssh: curb GSSAPI DoS risk by limiting number of specified OIDs</li>
<li><a
href="f91f7a7c31"><code>f91f7a7</code></a>
ssh/agent: prevent panic on malformed constraint</li>
<li><a
href="2df4153a03"><code>2df4153</code></a>
acme/autocert: let automatic renewal work with short lifetime certs</li>
<li><a
href="bcf6a849ef"><code>bcf6a84</code></a>
acme: pass context to request</li>
<li><a
href="b4f2b62076"><code>b4f2b62</code></a>
ssh: fix error message on unsupported cipher</li>
<li><a
href="79ec3a51fc"><code>79ec3a5</code></a>
ssh: allow to bind to a hostname in remote forwarding</li>
<li><a
href="122a78f140"><code>122a78f</code></a>
go.mod: update golang.org/x dependencies</li>
<li><a
href="c0531f9c34"><code>c0531f9</code></a>
all: eliminate vet diagnostics</li>
<li><a
href="0997000b45"><code>0997000</code></a>
all: fix some comments</li>
<li>Additional commits viewable in <a
href="https://github.com/golang/crypto/compare/v0.43.0...v0.45.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=golang.org/x/crypto&package-manager=go_modules&previous-version=0.43.0&new-version=0.45.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.


---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-05 20:02:27 +00:00
release-please[bot]
cb4529cbaa chore(main): release 0.22.0 (#1997)
🤖 I have created a release *beep* *boop*
---


## [0.22.0](https://github.com/googleapis/genai-toolbox/compare/v0.21.0...v0.22.0) (2025-12-04)


### Features

* Add allowed-origins flag
([#1984](https://github.com/googleapis/genai-toolbox/issues/1984))
([862868f](862868f284))
* **tools/postgres:** Add list-query-stats and get-column-cardinality
functions
([#1976](https://github.com/googleapis/genai-toolbox/issues/1976))
([9f76026](9f76026925))
* **tools/spanner:** Add spanner list graphs to prebuiltconfigs
([#2056](https://github.com/googleapis/genai-toolbox/issues/2056))
([0e7fbf4](0e7fbf465c))
* **prebuilt/cloud-sql:** Add clone instance tool for cloud sql
([#1845](https://github.com/googleapis/genai-toolbox/issues/1845))
([5e43630](5e43630907))
* **serverless-spark:** Add create_pyspark_batch tool
([1bf0b51](1bf0b51f03))
* **serverless-spark:** Add create_spark_batch tool
([17a9792](17a979207d))
* Support alternate accessToken header name
([#1968](https://github.com/googleapis/genai-toolbox/issues/1968))
([18017d6](18017d6545))
* Support for annotations
([#2007](https://github.com/googleapis/genai-toolbox/issues/2007))
([ac21335](ac21335f4e))
* **tool/mssql:** Set default host and port for MSSQL source
([#1943](https://github.com/googleapis/genai-toolbox/issues/1943))
([7a9cc63](7a9cc63376))
* **tools/cloudsqlpg:** Add CloudSQL PostgreSQL pre-check tool
([#1722](https://github.com/googleapis/genai-toolbox/issues/1722))
([8752e05](8752e05ab6))
* **tools/postgres-list-publication-tables:** Add new
postgres-list-publication-tables tool
([#1919](https://github.com/googleapis/genai-toolbox/issues/1919))
([f4b1f0a](f4b1f0a680))
* **tools/postgres-list-tablespaces:** Add new postgres-list-tablespaces
tool ([#1934](https://github.com/googleapis/genai-toolbox/issues/1934))
([5ad7c61](5ad7c6127b))
* **tools/spanner-list-graph:** Tool impl + docs + tests
([#1923](https://github.com/googleapis/genai-toolbox/issues/1923))
([a0f44d3](a0f44d34ea))


### Bug Fixes

* Add import for firebirdsql
([#2045](https://github.com/googleapis/genai-toolbox/issues/2045))
([fb7aae9](fb7aae9d35))
* Correct FAQ to mention HTTP tools
([#2036](https://github.com/googleapis/genai-toolbox/issues/2036))
([7b44237](7b44237d4a))
* Format BigQuery numeric output as decimal strings
([#2084](https://github.com/googleapis/genai-toolbox/issues/2084))
([155bff8](155bff80c1))
* Set default annotations for tools in code if annotation not provided
in yaml
([#2049](https://github.com/googleapis/genai-toolbox/issues/2049))
([565460c](565460c4ea))
* **tools/alloydb-postgres-list-tables:** Exclude google_ml schema from
list_tables
([#2046](https://github.com/googleapis/genai-toolbox/issues/2046))
([a03984c](a03984cc15))
* **tools/alloydbcreateuser:** Remove duplication of project param
([#2028](https://github.com/googleapis/genai-toolbox/issues/2028))
([730ac6d](730ac6d228))
* **tools/mongodb:** Remove `required` tag from the `canonical` field
([#2099](https://github.com/googleapis/genai-toolbox/issues/2099))
([744214e](744214e04c))

---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-04 19:10:51 -05:00
Wenxin Du
ac375114fd chore: add v0.22.0 docs version number (#2115) 2025-12-04 23:37:44 +00:00
Wenxin Du
8a0eba9d62 chore: release 0.22.0 (#2116)
Release-As: 0.22.0
2025-12-04 23:11:24 +00:00
Srividya Reddy
5ad7c6127b feat(tools/postgres-list-tablespaces): add new postgres-list-tablespaces tool (#1934)
## Description
Adds a custom PostgreSQL list_tablespaces tool that returns the details
of the tablespaces present in the database.

<img width="1719" height="698" alt="Screenshot 2025-11-12 at 9 11 13 AM"
src="https://github.com/user-attachments/assets/03964a1b-27e0-4da8-85a2-57db905163ed"
/>
<img width="1077" height="141" alt="Screenshot 2025-11-12 at 9 12 42 AM"
src="https://github.com/user-attachments/assets/f93f5692-eb62-4f30-8192-40c8873d4d00"
/>



Lists all tablespaces in the database. Returns the tablespace name,
owner name, size in bytes, internal object ID, the access control list
regarding permissions, and any specific tablespace options.

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738
2025-12-04 20:38:54 +00:00
Srividya Reddy
f4b1f0a680 feat(tools/postgres-list-publication-tables): add new postgres-list-publication-tables tool (#1919)
## Description
Adds a custom PostgreSQL list_publication_tables tool that returns the
details of the publication tables present in the database.

Test Output:

<img width="845" height="239" alt="Screenshot 2025-11-11 at 12 50 59 AM"
src="https://github.com/user-attachments/assets/b7606e44-c5f6-4fc7-865e-7efadd112eff"
/>

<img width="1529" height="648" alt="Screenshot 2025-11-11 at 1 15 18 AM"
src="https://github.com/user-attachments/assets/6192b772-f0bc-4fb4-8032-ca487434d77c"
/>



## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738

Co-authored-by: Averi Kitsch <akitsch@google.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-04 19:26:45 +00:00
Dave Borowitz
17a979207d feat(serverless-spark): add create_spark_batch tool
This tool is almost identical to create_pyspark_batch, but for Java
Spark batches. There are some minor differences in how the main files
and args are provided.
2025-12-04 11:05:53 -08:00
Dave Borowitz
3bf3fe8fa7 refactor(serverless-spark): extract common create batch tool & config
We are planning to add several very similar tools for creating batches
like the existing pyspark batches: spark (Java), R, etc. They will use
an identical approach of specifying environment and runtime config in
the YAML, differing only in how language-specific args are passed. We
can streamline this.
2025-12-04 11:05:53 -08:00
Dave Borowitz
1bf0b51f03 feat(serverless-spark): add create_pyspark_batch tool
This tool creates a PySpark batch from a minimal set of parameters, to
keep things simple for the LLM. Advanced runtime and environment config
can be specified in tools.yaml.
2025-12-04 11:05:53 -08:00
Wenxin Du
744214e04c fix(tools/mongodb): remove required tag from the canonical field (#2099)
The `required` tag raises a validation error when a boolean field
is set to `false`:
```
ERROR "unable to parse tool file at "mongodb_tools.yaml": unable to parse tool "insert-one-device" as kind "mongodb-insert-one": [2:12] Key: 'Config.Canonical' Error:Field validation for 'Canonical' failed on the 'required' tag\n   1 | authRequired: []\n>  2 | canonical: false\n                  ^\n   3 | collection: Device\n   4 | database: xiar\n   5 | description: Inserts a single new document into the Device collection. The 'data' parameter must be a string containing the JSON object to insert.\n   6 | "
```
All the `required` tags are removed from the boolean `canonical` field
of the MongoDB tools. Unit tests are added.
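
A minimal reproduction of that behavior, assuming the go-playground/validator
library (which matches the error format above); the real config struct is
larger:

```go
// Hypothetical reproduction: "required" treats a field's zero value as
// missing, and false is the zero value for bool.
package main

import (
	"fmt"

	"github.com/go-playground/validator/v10"
)

type Config struct {
	// Field and tag names are illustrative, not the actual tool config.
	Canonical bool `yaml:"canonical" validate:"required"`
}

func main() {
	v := validator.New()

	// canonical: false parses fine, but validation rejects it:
	// Key: 'Config.Canonical' Error:Field validation for 'Canonical' failed on the 'required' tag
	fmt.Println(v.Struct(Config{Canonical: false}))

	// Dropping the "required" tag, as this commit does, accepts false.
}
```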
2025-12-04 18:25:16 +00:00
Shobhit Singh
155bff80c1 fix: format BigQuery numeric output as decimal strings (#2084)
## Description


This change updates both `bigquery-sql` and `bigquery-execute-sql` tools
to format `NUMERIC` and `BIGNUMERIC` values as decimal strings (e.g.,
"9.5") instead of rational fractions (e.g., "19/2"). This ensures the
tools' output matches the BigQuery REST API JSON format.

Key changes:
- Added `NormalizeValue` function in
`internal/tools/bigquery/bigquerycommon` to handle `*big.Rat` conversion
with 38-digit precision and trailing zero trimming.
- Updated `bigquery-sql` and `bigquery-execute-sql` to use
`NormalizeValue`.
- Added comprehensive tests in
`internal/tools/bigquery/bigquerycommon/conversion_test.go`.

With these changes the formatting for NUMERIC and BIGNUMERIC is fixed.

**Before:**

```
[
  {
    "id": 3,
    "numeric_value": "1"
  },
  {
    "id": 2,
    "numeric_value": "333333333/1000000000"
  },
  {
    "id": 4,
    "numeric_value": "12341/10"
  },
  {
    "id": 1,
    "numeric_value": "19/2"
  }
]
```

**After:**

```
[
  {
    "id": 3,
    "numeric_value": "1"
  },
  {
    "id": 2,
    "numeric_value": "0.333333333"
  },
  {
    "id": 4,
    "numeric_value": "1234.1"
  },
  {
    "id": 1,
    "numeric_value": "9.5"
  }
]
```
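
The conversion itself can be sketched as follows; this is an illustrative
example of the approach, not the PR's `NormalizeValue` code:

```go
// Sketch of the conversion described above: render a *big.Rat as a decimal
// string with up to 38 fractional digits and trim trailing zeros, so 19/2
// becomes "9.5" rather than "9.50000...".
package main

import (
	"fmt"
	"math/big"
	"strings"
)

func ratToDecimalString(r *big.Rat) string {
	s := r.FloatString(38)            // fixed 38 digits after the decimal point
	s = strings.TrimRight(s, "0")     // drop trailing zeros
	return strings.TrimSuffix(s, ".") // drop a dangling decimal point ("1." -> "1")
}

func main() {
	fmt.Println(ratToDecimalString(big.NewRat(19, 2)))                 // 9.5
	fmt.Println(ratToDecimalString(big.NewRat(12341, 10)))             // 1234.1
	fmt.Println(ratToDecimalString(big.NewRat(333333333, 1000000000))) // 0.333333333
	fmt.Println(ratToDecimalString(big.NewRat(1, 1)))                  // 1
}
```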

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1194

---------

Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-04 10:06:12 -08:00
Wenxin Du
e84252feb4 ci: fix multiple source test failures (#2114) 2025-12-04 09:43:18 -08:00
Twisha Bansal
1e67810740 docs(toolbox-adk): Add quickstart for ADK JS SDK (#1862)
## Description

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-12-04 12:17:05 +05:30
Harsh Jha
290cba0f1e Merge branch 'main' into sdk-docs-migrate 2025-12-01 14:57:07 +05:30
Harsh Jha
047def93ef Merge branch 'main' into sdk-docs-migrate 2025-12-01 14:55:19 +05:30
Harsh Jha
875b5277e3 Merge branch 'main' into sdk-docs-migrate 2025-11-21 12:39:20 +05:30
Harsh Jha
a29f9e5484 feat: added basic template for sdks doc migrate 2025-11-17 13:57:49 +05:30
119 changed files with 8283 additions and 345 deletions

View File

@@ -589,6 +589,26 @@ steps:
firestore \
firestore
- id: "mongodb"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "MONGODB_DATABASE=$_DATABASE_NAME"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["MONGODB_URI", "CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"MongoDB" \
mongodb \
mongodb
- id: "looker"
name: golang:1
waitFor: ["compile-test-binary"]
@@ -867,6 +887,26 @@ steps:
singlestore \
singlestore
- id: "mariadb"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "MARIADB_DATABASE=$_MARIADB_DATABASE"
- "MARIADB_PORT=$_MARIADB_PORT"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["MARIADB_USER", "MARIADB_PASS", "MARIADB_HOST", "CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
# skip coverage check as it re-uses current MySQL implementation
go test ./tests/mariadb
availableSecrets:
secretManager:
- versionName: projects/$PROJECT_ID/secrets/cloud_sql_pg_user/versions/latest
@@ -979,6 +1019,14 @@ availableSecrets:
env: SINGLESTORE_PASSWORD
- versionName: projects/$PROJECT_ID/secrets/singlestore_host/versions/latest
env: SINGLESTORE_HOST
- versionName: projects/$PROJECT_ID/secrets/mariadb_user/versions/latest
env: MARIADB_USER
- versionName: projects/$PROJECT_ID/secrets/mariadb_pass/versions/latest
env: MARIADB_PASS
- versionName: projects/$PROJECT_ID/secrets/mariadb_host/versions/latest
env: MARIADB_HOST
- versionName: projects/$PROJECT_ID/secrets/mongodb_uri/versions/latest
env: MONGODB_URI
options:
logging: CLOUD_LOGGING_ONLY
@@ -1039,3 +1087,6 @@ substitutions:
_SINGLESTORE_PORT: "3308"
_SINGLESTORE_DATABASE: "singlestore"
_SINGLESTORE_USER: "root"
_MARIADB_PORT: "3307"
_MARIADB_DATABASE: test_database

View File

@@ -13,7 +13,7 @@
# limitations under the License.
steps:
- name: 'node:20'
- name: 'node:22'
id: 'js-quickstart-test'
entrypoint: 'bash'
args:
@@ -44,4 +44,4 @@ availableSecrets:
timeout: 1000s
options:
logging: CLOUD_LOGGING_ONLY
logging: CLOUD_LOGGING_ONLY

View File

@@ -1 +1,9 @@
@import 'td/code-dark';
@import 'td/code-dark';
// Make tabs scrollable horizontally instead of wrapping
.nav-tabs {
flex-wrap: nowrap;
white-space: nowrap;
overflow-x: auto;
overflow-y: hidden;
}

View File

@@ -51,6 +51,10 @@ ignoreFiles = ["quickstart/shared", "quickstart/python", "quickstart/js", "quick
# Add a new version block here before every release
# The order of versions in this file is mirrored into the dropdown
[[params.versions]]
version = "v0.22.0"
url = "https://googleapis.github.io/genai-toolbox/v0.22.0/"
[[params.versions]]
version = "v0.21.0"
url = "https://googleapis.github.io/genai-toolbox/v0.21.0/"

View File

@@ -1,5 +1,35 @@
# Changelog
## [0.22.0](https://github.com/googleapis/genai-toolbox/compare/v0.21.0...v0.22.0) (2025-12-04)
### Features
* **tools/postgres:** Add allowed-origins flag ([#1984](https://github.com/googleapis/genai-toolbox/issues/1984)) ([862868f](https://github.com/googleapis/genai-toolbox/commit/862868f28476ea981575ce412faa7d6a03138f31))
* **tools/postgres:** Add list-query-stats and get-column-cardinality functions ([#1976](https://github.com/googleapis/genai-toolbox/issues/1976)) ([9f76026](https://github.com/googleapis/genai-toolbox/commit/9f760269253a8cc92a357e995c6993ccc4a0fb7b))
* **tools/spanner:** Add spanner list graphs to prebuiltconfigs ([#2056](https://github.com/googleapis/genai-toolbox/issues/2056)) ([0e7fbf4](https://github.com/googleapis/genai-toolbox/commit/0e7fbf465c488397aa9d8cab2e55165fff4eb53c))
* **prebuilt/cloud-sql:** Add clone instance tool for cloud sql ([#1845](https://github.com/googleapis/genai-toolbox/issues/1845)) ([5e43630](https://github.com/googleapis/genai-toolbox/commit/5e43630907aa2d7bc6818142483a33272eab060b))
* **serverless-spark:** Add create_pyspark_batch tool ([1bf0b51](https://github.com/googleapis/genai-toolbox/commit/1bf0b51f033c956790be1577bf5310d0b17e9c12))
* **serverless-spark:** Add create_spark_batch tool ([17a9792](https://github.com/googleapis/genai-toolbox/commit/17a979207dbc4fe70acd0ebda164d1a8d34c1ed3))
* Support alternate accessToken header name ([#1968](https://github.com/googleapis/genai-toolbox/issues/1968)) ([18017d6](https://github.com/googleapis/genai-toolbox/commit/18017d6545335a6fc1c472617101c35254d9a597))
* Support for annotations ([#2007](https://github.com/googleapis/genai-toolbox/issues/2007)) ([ac21335](https://github.com/googleapis/genai-toolbox/commit/ac21335f4e88ca52d954d7f8143a551a35661b94))
* **tool/mssql:** Set default host and port for MSSQL source ([#1943](https://github.com/googleapis/genai-toolbox/issues/1943)) ([7a9cc63](https://github.com/googleapis/genai-toolbox/commit/7a9cc633768d9ae9a7ff8230002da69d6a36ca86))
* **tools/cloudsqlpg:** Add CloudSQL PostgreSQL pre-check tool ([#1722](https://github.com/googleapis/genai-toolbox/issues/1722)) ([8752e05](https://github.com/googleapis/genai-toolbox/commit/8752e05ab6e98812d95673a6f1ff67e9a6ae48d2))
* **tools/postgres-list-publication-tables:** Add new postgres-list-publication-tables tool ([#1919](https://github.com/googleapis/genai-toolbox/issues/1919)) ([f4b1f0a](https://github.com/googleapis/genai-toolbox/commit/f4b1f0a68000ca2fc0325f55a1905705417c38a2))
* **tools/postgres-list-tablespaces:** Add new postgres-list-tablespaces tool ([#1934](https://github.com/googleapis/genai-toolbox/issues/1934)) ([5ad7c61](https://github.com/googleapis/genai-toolbox/commit/5ad7c6127b3e47504fc4afda0b7f3de1dff78b8b))
* **tools/spanner-list-graph:** Tool impl + docs + tests ([#1923](https://github.com/googleapis/genai-toolbox/issues/1923)) ([a0f44d3](https://github.com/googleapis/genai-toolbox/commit/a0f44d34ea3f044dd08501be616f70ddfd63ab45))
### Bug Fixes
* Add import for firebirdsql ([#2045](https://github.com/googleapis/genai-toolbox/issues/2045)) ([fb7aae9](https://github.com/googleapis/genai-toolbox/commit/fb7aae9d35b760d3471d8379642f835a0d84ec41))
* Correct FAQ to mention HTTP tools ([#2036](https://github.com/googleapis/genai-toolbox/issues/2036)) ([7b44237](https://github.com/googleapis/genai-toolbox/commit/7b44237d4a21bfbf8d3cebe4d32a15affa29584d))
* Format BigQuery numeric output as decimal strings ([#2084](https://github.com/googleapis/genai-toolbox/issues/2084)) ([155bff8](https://github.com/googleapis/genai-toolbox/commit/155bff80c1da4fae1e169e425fd82e1dc3373041))
* Set default annotations for tools in code if annotation not provided in yaml ([#2049](https://github.com/googleapis/genai-toolbox/issues/2049)) ([565460c](https://github.com/googleapis/genai-toolbox/commit/565460c4ea8953dbe80070a8e469f957c0f7a70c))
* **tools/alloydb-postgres-list-tables:** Exclude google_ml schema from list_tables ([#2046](https://github.com/googleapis/genai-toolbox/issues/2046)) ([a03984c](https://github.com/googleapis/genai-toolbox/commit/a03984cc15254c928f30085f8fa509ded6a79a0c))
* **tools/alloydbcreateuser:** Remove duplication of project praram ([#2028](https://github.com/googleapis/genai-toolbox/issues/2028)) ([730ac6d](https://github.com/googleapis/genai-toolbox/commit/730ac6d22805fd50b4a675b74c1865f4e7689e7c))
* **tools/mongodb:** Remove `required` tag from the `canonical` field ([#2099](https://github.com/googleapis/genai-toolbox/issues/2099)) ([744214e](https://github.com/googleapis/genai-toolbox/commit/744214e04cd12b11d166e6eb7da8ce4714904abc))
## [0.21.0](https://github.com/googleapis/genai-toolbox/compare/v0.20.0...v0.21.0) (2025-11-19)

View File

@@ -109,7 +109,7 @@ golangci-lint run --fix
Execute unit tests locally:
```bash
go test -race -v ./...
go test -race -v ./cmd/... ./internal/...
```
### Integration Tests

View File

@@ -125,7 +125,7 @@ To install Toolbox as a binary:
>
> ```sh
> # see releases page for other versions
> export VERSION=0.21.0
> export VERSION=0.22.0
> curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
> chmod +x toolbox
> ```
@@ -138,7 +138,7 @@ To install Toolbox as a binary:
>
> ```sh
> # see releases page for other versions
> export VERSION=0.21.0
> export VERSION=0.22.0
> curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox
> chmod +x toolbox
> ```
@@ -151,21 +151,33 @@ To install Toolbox as a binary:
>
> ```sh
> # see releases page for other versions
> export VERSION=0.21.0
> export VERSION=0.22.0
> curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox
> chmod +x toolbox
> ```
>
> </details>
> <details>
> <summary>Windows (AMD64)</summary>
> <summary>Windows (Command Prompt)</summary>
>
> To install Toolbox as a binary on Windows (AMD64):
> To install Toolbox as a binary on Windows (Command Prompt):
>
> ```cmd
> :: see releases page for other versions
> set VERSION=0.22.0
> curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v%VERSION%/windows/amd64/toolbox.exe"
> ```
>
> </details>
> <details>
> <summary>Windows (PowerShell)</summary>
>
> To install Toolbox as a binary on Windows (PowerShell):
>
> ```powershell
> :: see releases page for other versions
> set VERSION=0.21.0
> curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v%VERSION%/windows/amd64/toolbox.exe"
> # see releases page for other versions
> $VERSION = "0.21.0"
> curl.exe -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe"
> ```
>
> </details>
@@ -177,7 +189,7 @@ You can also install Toolbox as a container:
```sh
# see releases page for other versions
export VERSION=0.21.0
export VERSION=0.22.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```
@@ -201,7 +213,7 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:
```sh
go install github.com/googleapis/genai-toolbox@v0.21.0
go install github.com/googleapis/genai-toolbox@v0.22.0
```
<!-- {x-release-please-end} -->
@@ -515,6 +527,36 @@ For more detailed instructions on using the Toolbox Core SDK, see the
```
</details>
<details>
<summary>ADK</summary>
1. Install [Toolbox ADK SDK][toolbox-adk-js]:
```bash
npm install @toolbox-sdk/adk
```
2. Load tools:
```javascript
import { ToolboxClient } from '@toolbox-sdk/adk';
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const tools = await client.loadToolset('toolsetName');
```
For more detailed instructions on using the Toolbox ADK SDK, see the
[project's README][toolbox-adk-js-readme].
[toolbox-adk-js]: https://www.npmjs.com/package/@toolbox-sdk/adk
[toolbox-adk-js-readme]:
https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-adk/README.md
</details>
</details>
</blockquote>
<details>

View File

@@ -184,13 +184,18 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgresgetcolumncardinality"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistactivequeries"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistavailableextensions"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistdatabasestats"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistindexes"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistinstalledextensions"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistlocks"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistpgsettings"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistpublicationtables"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistquerystats"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistroles"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistschemas"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistsequences"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslisttables"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslisttablespaces"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslisttriggers"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistviews"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslongrunningtransactions"
@@ -198,6 +203,8 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgressql"
_ "github.com/googleapis/genai-toolbox/internal/tools/redis"
_ "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/serverlesssparkcancelbatch"
_ "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/serverlesssparkcreatepysparkbatch"
_ "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/serverlesssparkcreatesparkbatch"
_ "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/serverlesssparkgetbatch"
_ "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/serverlesssparklistbatches"
_ "github.com/googleapis/genai-toolbox/internal/tools/singlestore/singlestoreexecutesql"

View File

@@ -1488,7 +1488,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"alloydb_postgres_database_tools": tools.ToolsetConfig{
Name: "alloydb_postgres_database_tools",
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality"},
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality", "list_publication_tables", "list_tablespaces", "list_pg_settings", "list_database_stats", "list_roles"},
},
},
},
@@ -1518,7 +1518,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"cloud_sql_postgres_database_tools": tools.ToolsetConfig{
Name: "cloud_sql_postgres_database_tools",
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality"},
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality", "list_publication_tables", "list_tablespaces", "list_pg_settings", "list_database_stats", "list_roles"},
},
},
},
@@ -1558,7 +1558,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"serverless_spark_tools": tools.ToolsetConfig{
Name: "serverless_spark_tools",
ToolNames: []string{"list_batches", "get_batch", "cancel_batch"},
ToolNames: []string{"list_batches", "get_batch", "cancel_batch", "create_pyspark_batch", "create_spark_batch"},
},
},
},
@@ -1618,7 +1618,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"postgres_database_tools": tools.ToolsetConfig{
Name: "postgres_database_tools",
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality"},
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality", "list_publication_tables", "list_tablespaces", "list_pg_settings", "list_database_stats", "list_roles"},
},
},
},

View File

@@ -1 +1 @@
0.21.0
0.22.0

View File

@@ -234,7 +234,7 @@
},
"outputs": [],
"source": [
"version = \"0.21.0\" # x-release-please-version\n",
"version = \"0.22.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -87,7 +87,7 @@ To install Toolbox as a binary on Linux (AMD64):
```sh
# see releases page for other versions
export VERSION=0.21.0
export VERSION=0.22.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```
@@ -98,7 +98,7 @@ To install Toolbox as a binary on macOS (Apple Silicon):
```sh
# see releases page for other versions
export VERSION=0.21.0
export VERSION=0.22.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox
chmod +x toolbox
```
@@ -109,19 +109,29 @@ To install Toolbox as a binary on macOS (Intel):
```sh
# see releases page for other versions
export VERSION=0.21.0
export VERSION=0.22.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox
chmod +x toolbox
```
{{% /tab %}}
{{% tab header="Windows (AMD64)" lang="en" %}}
To install Toolbox as a binary on Windows (AMD64):
{{% tab header="Windows (Command Prompt)" lang="en" %}}
To install Toolbox as a binary on Windows (Command Prompt):
```cmd
:: see releases page for other versions
set VERSION=0.22.0
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v%VERSION%/windows/amd64/toolbox.exe"
```
{{% /tab %}}
{{% tab header="Windows (PowerShell)" lang="en" %}}
To install Toolbox as a binary on Windows (PowerShell):
```powershell
:: see releases page for other versions
set VERSION=0.21.0
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v%VERSION%/windows/amd64/toolbox.exe"
# see releases page for other versions
$VERSION = "0.21.0"
curl.exe -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe"
```
{{% /tab %}}
@@ -132,7 +142,7 @@ You can also install Toolbox as a container:
```sh
# see releases page for other versions
export VERSION=0.21.0
export VERSION=0.22.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```
@@ -151,7 +161,7 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:
```sh
go install github.com/googleapis/genai-toolbox@v0.21.0
go install github.com/googleapis/genai-toolbox@v0.22.0
```
{{% /tab %}}
@@ -294,6 +304,10 @@ let client = new ToolboxClient(URL);
const toolboxTools = await client.loadToolset('toolsetName');
{{< /highlight >}}
For more detailed instructions on using the Toolbox Core SDK, see the
[project's
README](https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md).
{{% /tab %}}
{{% tab header="LangChain/Langraph" lang="en" %}}
@@ -318,6 +332,10 @@ const getTool = (toolboxTool) => tool(currTool, {
const tools = toolboxTools.map(getTool);
{{< /highlight >}}
For more detailed instructions on using the Toolbox Core SDK, see the
[project's
README](https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md).
{{% /tab %}}
{{% tab header="Genkit" lang="en" %}}
@@ -353,6 +371,10 @@ const getTool = (toolboxTool) => ai.defineTool({
const tools = toolboxTools.map(getTool);
{{< /highlight >}}
For more detailed instructions on using the Toolbox Core SDK, see the
[project's
README](https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md).
{{% /tab %}}
{{% tab header="LlamaIndex" lang="en" %}}
@@ -380,13 +402,33 @@ const tools = toolboxTools.map(getTool);
{{< /highlight >}}
{{% /tab %}}
{{< /tabpane >}}
For more detailed instructions on using the Toolbox Core SDK, see the
[project's
README](https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md).
{{% /tab %}}
{{% tab header="ADK TS" lang="en" %}}
{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/adk';
// Replace with the actual URL where your Toolbox service is running
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
const tools = await client.loadToolset();
// Use the client and tools as per requirement
{{< /highlight >}}
For detailed samples on using the Toolbox JS SDK with ADK JS, see the [project's
README.](https://github.com/googleapis/mcp-toolbox-sdk-js/tree/main/packages/toolbox-adk/README.md)
{{% /tab %}}
{{< /tabpane >}}
#### Go
Once you've installed the [Toolbox Go

View File

@@ -40,11 +40,24 @@ from Toolbox.
```
1. In a new terminal, install the
[SDK](https://www.npmjs.com/package/@toolbox-sdk/core).
```bash
npm install @toolbox-sdk/core
```
SDK package.
{{< tabpane persist=header >}}
{{< tab header="LangChain" lang="bash" >}}
npm install @toolbox-sdk/core
{{< /tab >}}
{{< tab header="GenkitJS" lang="bash" >}}
npm install @toolbox-sdk/core
{{< /tab >}}
{{< tab header="LlamaIndex" lang="bash" >}}
npm install @toolbox-sdk/core
{{< /tab >}}
{{< tab header="GoogleGenAI" lang="bash" >}}
npm install @toolbox-sdk/core
{{< /tab >}}
{{< tab header="ADK" lang="bash" >}}
npm install @toolbox-sdk/adk
{{< /tab >}}
{{< /tabpane >}}
1. Install other required dependencies
@@ -61,6 +74,9 @@ npm install llamaindex @llamaindex/google @llamaindex/workflow
{{< tab header="GoogleGenAI" lang="bash" >}}
npm install @google/genai
{{< /tab >}}
{{< tab header="ADK" lang="bash" >}}
npm install @google/adk
{{< /tab >}}
{{< /tabpane >}}
1. Create a new file named `hotelAgent.js` and copy the following code to create
@@ -91,6 +107,12 @@ npm install @google/genai
{{< /tab >}}
{{< tab header="ADK" lang="js" >}}
{{< include "quickstart/js/adk/quickstart.js" >}}
{{< /tab >}}
{{< /tabpane >}}
1. Run your agent, and observe the results:

View File

@@ -105,7 +105,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -5,7 +5,7 @@ go 1.24.4
require (
github.com/googleapis/mcp-toolbox-sdk-go v0.4.0
google.golang.org/adk v0.1.0
google.golang.org/genai v1.35.0
google.golang.org/genai v1.36.0
)
require (

View File

@@ -108,8 +108,8 @@ google.golang.org/adk v0.1.0 h1:+w/fHuqRVolotOATlujRA+2DKUuDrFH2poRdEX2QjB8=
google.golang.org/adk v0.1.0/go.mod h1:NvtSLoNx7UzZIiUAI1KoJQLMmt9sG3oCgiCx1TLqKFw=
google.golang.org/api v0.255.0 h1:OaF+IbRwOottVCYV2wZan7KUq7UeNUQn1BcPc4K7lE4=
google.golang.org/api v0.255.0/go.mod h1:d1/EtvCLdtiWEV4rAEHDHGh2bCnqsWhw+M8y2ECN4a8=
google.golang.org/genai v1.35.0 h1:Jo6g25CzVqFzGrX5mhWyBgQqXAUzxcx5jeK7U74zv9c=
google.golang.org/genai v1.35.0/go.mod h1:A3kkl0nyBjyFlNjgxIwKq70julKbIxpSxqKO5gw/gmk=
google.golang.org/genai v1.36.0 h1:sJCIjqTAmwrtAIaemtTiKkg2TO1RxnYEusTmEQ3nGxM=
google.golang.org/genai v1.36.0/go.mod h1:A3kkl0nyBjyFlNjgxIwKq70julKbIxpSxqKO5gw/gmk=
google.golang.org/genproto v0.0.0-20251014184007-4626949a642f h1:vLd1CJuJOUgV6qijD7KT5Y2ZtC97ll4dxjTUappMnbo=
google.golang.org/genproto v0.0.0-20251014184007-4626949a642f/go.mod h1:PI3KrSadr00yqfv6UDvgZGFsmLqeRIwt8x4p5Oo7CdM=
google.golang.org/genproto/googleapis/api v0.0.0-20251014184007-4626949a642f h1:OiFuztEyBivVKDvguQJYWq1yDcfAHIID/FVrPR4oiI0=

View File

@@ -4,7 +4,7 @@ go 1.24.6
require (
github.com/googleapis/mcp-toolbox-sdk-go v0.4.0
google.golang.org/genai v1.35.0
google.golang.org/genai v1.36.0
)
require (

View File

@@ -102,8 +102,8 @@ gonum.org/v1/gonum v0.16.0 h1:5+ul4Swaf3ESvrOnidPp4GZbzf0mxVQpDCYUQE7OJfk=
gonum.org/v1/gonum v0.16.0/go.mod h1:fef3am4MQ93R2HHpKnLk4/Tbh/s0+wqD5nfa6Pnwy4E=
google.golang.org/api v0.255.0 h1:OaF+IbRwOottVCYV2wZan7KUq7UeNUQn1BcPc4K7lE4=
google.golang.org/api v0.255.0/go.mod h1:d1/EtvCLdtiWEV4rAEHDHGh2bCnqsWhw+M8y2ECN4a8=
google.golang.org/genai v1.35.0 h1:Jo6g25CzVqFzGrX5mhWyBgQqXAUzxcx5jeK7U74zv9c=
google.golang.org/genai v1.35.0/go.mod h1:A3kkl0nyBjyFlNjgxIwKq70julKbIxpSxqKO5gw/gmk=
google.golang.org/genai v1.36.0 h1:sJCIjqTAmwrtAIaemtTiKkg2TO1RxnYEusTmEQ3nGxM=
google.golang.org/genai v1.36.0/go.mod h1:A3kkl0nyBjyFlNjgxIwKq70julKbIxpSxqKO5gw/gmk=
google.golang.org/genproto v0.0.0-20251014184007-4626949a642f h1:vLd1CJuJOUgV6qijD7KT5Y2ZtC97ll4dxjTUappMnbo=
google.golang.org/genproto v0.0.0-20251014184007-4626949a642f/go.mod h1:PI3KrSadr00yqfv6UDvgZGFsmLqeRIwt8x4p5Oo7CdM=
google.golang.org/genproto/googleapis/api v0.0.0-20251014184007-4626949a642f h1:OiFuztEyBivVKDvguQJYWq1yDcfAHIID/FVrPR4oiI0=

View File

@@ -33,12 +33,12 @@ require (
go.opentelemetry.io/otel v1.38.0 // indirect
go.opentelemetry.io/otel/metric v1.38.0 // indirect
go.opentelemetry.io/otel/trace v1.38.0 // indirect
golang.org/x/crypto v0.43.0 // indirect
golang.org/x/net v0.46.0 // indirect
golang.org/x/crypto v0.45.0 // indirect
golang.org/x/net v0.47.0 // indirect
golang.org/x/oauth2 v0.32.0 // indirect
golang.org/x/sync v0.17.0 // indirect
golang.org/x/sys v0.37.0 // indirect
golang.org/x/text v0.30.0 // indirect
golang.org/x/sync v0.18.0 // indirect
golang.org/x/sys v0.38.0 // indirect
golang.org/x/text v0.31.0 // indirect
golang.org/x/time v0.14.0 // indirect
google.golang.org/api v0.255.0 // indirect
google.golang.org/genproto v0.0.0-20251014184007-4626949a642f // indirect

View File

@@ -100,18 +100,18 @@ go.opentelemetry.io/otel/sdk/metric v1.38.0 h1:aSH66iL0aZqo//xXzQLYozmWrXxyFkBJ6
go.opentelemetry.io/otel/sdk/metric v1.38.0/go.mod h1:dg9PBnW9XdQ1Hd6ZnRz689CbtrUp0wMMs9iPcgT9EZA=
go.opentelemetry.io/otel/trace v1.38.0 h1:Fxk5bKrDZJUH+AMyyIXGcFAPah0oRcT+LuNtJrmcNLE=
go.opentelemetry.io/otel/trace v1.38.0/go.mod h1:j1P9ivuFsTceSWe1oY+EeW3sc+Pp42sO++GHkg4wwhs=
golang.org/x/crypto v0.43.0 h1:dduJYIi3A3KOfdGOHX8AVZ/jGiyPa3IbBozJ5kNuE04=
golang.org/x/crypto v0.43.0/go.mod h1:BFbav4mRNlXJL4wNeejLpWxB7wMbc79PdRGhWKncxR0=
golang.org/x/net v0.46.0 h1:giFlY12I07fugqwPuWJi68oOnpfqFnJIJzaIIm2JVV4=
golang.org/x/net v0.46.0/go.mod h1:Q9BGdFy1y4nkUwiLvT5qtyhAnEHgnQ/zd8PfU6nc210=
golang.org/x/crypto v0.45.0 h1:jMBrvKuj23MTlT0bQEOBcAE0mjg8mK9RXFhRH6nyF3Q=
golang.org/x/crypto v0.45.0/go.mod h1:XTGrrkGJve7CYK7J8PEww4aY7gM3qMCElcJQ8n8JdX4=
golang.org/x/net v0.47.0 h1:Mx+4dIFzqraBXUugkia1OOvlD6LemFo1ALMHjrXDOhY=
golang.org/x/net v0.47.0/go.mod h1:/jNxtkgq5yWUGYkaZGqo27cfGZ1c5Nen03aYrrKpVRU=
golang.org/x/oauth2 v0.32.0 h1:jsCblLleRMDrxMN29H3z/k1KliIvpLgCkE6R8FXXNgY=
golang.org/x/oauth2 v0.32.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
golang.org/x/sync v0.17.0 h1:l60nONMj9l5drqw6jlhIELNv9I0A4OFgRsG9k2oT9Ug=
golang.org/x/sync v0.17.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.37.0 h1:fdNQudmxPjkdUTPnLn5mdQv7Zwvbvpaxqs831goi9kQ=
golang.org/x/sys v0.37.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.30.0 h1:yznKA/E9zq54KzlzBEAWn1NXSQ8DIp/NYMy88xJjl4k=
golang.org/x/text v0.30.0/go.mod h1:yDdHFIX9t+tORqspjENWgzaCVXgk0yYnYuSZ8UzzBVM=
golang.org/x/sync v0.18.0 h1:kr88TuHDroi+UVf+0hZnirlk8o8T+4MrK6mr60WkH/I=
golang.org/x/sync v0.18.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.38.0 h1:3yZWxaJjBmCWXqhN1qh02AkOnCQ1poK6oF+a7xWL6Gc=
golang.org/x/sys v0.38.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.31.0 h1:aC8ghyu4JhP8VojJ2lEHBnochRno1sgL6nEi9WGFGMM=
golang.org/x/text v0.31.0/go.mod h1:tKRAlv61yKIjGGHX/4tP1LTbc13YSec1pxVEWXzfoeM=
golang.org/x/time v0.14.0 h1:MRx4UaLrDotUKUdCIqzPC48t1Y9hANFKIRpNx+Te8PI=
golang.org/x/time v0.14.0/go.mod h1:eL/Oa2bBBK0TkX57Fyni+NgnyQQN4LitPmob2Hjnqw4=
gonum.org/v1/gonum v0.16.0 h1:5+ul4Swaf3ESvrOnidPp4GZbzf0mxVQpDCYUQE7OJfk=

File diff suppressed because it is too large.

View File

@@ -0,0 +1,17 @@
{
"name": "adk",
"version": "1.0.0",
"description": "",
"main": "quickstart.js",
"type": "module",
"scripts": {
"test": "node --test"
},
"keywords": [],
"author": "",
"license": "ISC",
"dependencies": {
"@google/adk": "^0.1.3",
"@toolbox-sdk/adk": "^0.1.5"
}
}

View File

@@ -0,0 +1,56 @@
import { InMemoryRunner, LlmAgent, LogLevel } from '@google/adk';
import { ToolboxClient } from '@toolbox-sdk/adk';
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
process.env.GOOGLE_GENAI_API_KEY = process.env.GOOGLE_API_KEY || 'your-api-key'; // Replace it with your API key
// Load the hotel tools from a running Toolbox server and run each demo query
// through an in-memory ADK agent.
export async function main() {
const userId = 'test_user';
const client = new ToolboxClient('http://127.0.0.1:5000');
const tools = await client.loadToolset("my-toolset");
const rootAgent = new LlmAgent({
name: 'hotel_agent',
model: 'gemini-2.5-flash',
description: 'Agent for hotel bookings and administration.',
instruction: prompt,
tools: tools,
});
const appName = rootAgent.name;
const runner = new InMemoryRunner({ agent: rootAgent, appName, logLevel: LogLevel.ERROR, });
const session = await runner.sessionService.createSession({ appName, userId });
for (const query of queries) {
await runPrompt(runner, userId, session.id, query);
}
}
// Send a single user prompt through the runner and print the accumulated
// model response text.
async function runPrompt(runner, userId, sessionId, prompt) {
const content = { role: 'user', parts: [{ text: prompt }] };
const stream = runner.runAsync({ userId, sessionId, newMessage: content });
const responses = await Array.fromAsync(stream);
const accumulatedResponse = responses
.flatMap((e) => e.content?.parts?.map((p) => p.text) ?? [])
.join('');
console.log(`\nMODEL RESPONSE: ${accumulatedResponse}\n`);
}
main();

View File

@@ -13,7 +13,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -49,19 +49,19 @@ to expose your developer assistant tools to a Looker instance:
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -45,19 +45,19 @@ instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -43,19 +43,19 @@ expose your developer assistant tools to a MySQL instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -44,19 +44,19 @@ expose your developer assistant tools to a Neo4j instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -56,19 +56,19 @@ Omni](https://cloud.google.com/alloydb/omni/current/docs/overview).
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -43,19 +43,19 @@ to expose your developer assistant tools to a SQLite instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -50,6 +50,12 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
* `list_triggers`: Lists triggers in the database.
* `list_indexes`: List available user indexes in a PostgreSQL database.
* `list_sequences`: List sequences in a PostgreSQL database.
* `list_publication_tables`: List publication tables in a PostgreSQL database.
* `list_tablespaces`: Lists tablespaces in the database.
* `list_pg_settings`: List configuration parameters for the PostgreSQL server.
* `list_database_stats`: Lists the key performance and activity statistics for
each database in the AlloyDB instance.
* `list_roles`: Lists all the user-created roles in a PostgreSQL database.
## AlloyDB Postgres Admin
@@ -227,6 +233,12 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
* `list_triggers`: Lists triggers in the database.
* `list_indexes`: List available user indexes in a PostgreSQL database.
* `list_sequences`: List sequences in a PostgreSQL database.
* `list_publication_tables`: List publication tables in a PostgreSQL database.
* `list_tablespaces`: Lists tablespaces in the database.
* `list_pg_settings`: List configuration parameters for the PostgreSQL server.
* `list_database_stats`: Lists the key performance and activity statistics for
each database in the PostgreSQL instance.
* `list_roles`: Lists all the user-created roles in a PostgreSQL database.
## Cloud SQL for PostgreSQL Observability
@@ -532,6 +544,12 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
* `list_triggers`: Lists triggers in the database.
* `list_indexes`: List available user indexes in a PostgreSQL database.
* `list_sequences`: List sequences in a PostgreSQL database.
* `list_publication_tables`: List publication tables in a PostgreSQL database.
* `list_tablespaces`: Lists tablespaces in the database.
* `list_pg_settings`: List configuration parameters for the PostgreSQL server.
* `list_database_stats`: Lists the key performance and activity statistics for
each database in the PostgreSQL server.
* `list_roles`: Lists all the user-created roles in a PostgreSQL database.
## Google Cloud Serverless for Apache Spark

View File

@@ -77,6 +77,22 @@ cluster][alloydb-free-trial].
- [`postgres-get-column-cardinality`](../tools/postgres/postgres-get-column-cardinality.md)
List cardinality of columns in a table in a PostgreSQL database.
- [`postgres-list-publication-tables`](../tools/postgres/postgres-list-publication-tables.md)
List publication tables in a PostgreSQL database.
- [`postgres-list-tablespaces`](../tools/postgres/postgres-list-tablespaces.md)
List tablespaces in an AlloyDB for PostgreSQL database.
- [`postgres-list-pg-settings`](../tools/postgres/postgres-list-pg-settings.md)
List configuration parameters for the PostgreSQL server.
- [`postgres-list-database-stats`](../tools/postgres/postgres-list-database-stats.md)
Lists the key performance and activity statistics for each database in the AlloyDB
instance.
- [`postgres-list-roles`](../tools/postgres/postgres-list-roles.md)
Lists all the user-created roles in a PostgreSQL database.
### Pre-built Configurations
- [AlloyDB using MCP](https://googleapis.github.io/genai-toolbox/how-to/connect-ide/alloydb_pg_mcp/)

View File

@@ -73,6 +73,22 @@ to a database by following these instructions][csql-pg-quickstart].
- [`postgres-get-column-cardinality`](../tools/postgres/postgres-get-column-cardinality.md)
List cardinality of columns in a table in a PostgreSQL database.
- [`postgres-list-publication-tables`](../tools/postgres/postgres-list-publication-tables.md)
List publication tables in a PostgreSQL database.
- [`postgres-list-tablespaces`](../tools/postgres/postgres-list-tablespaces.md)
List tablespaces in a PostgreSQL database.
- [`postgres-list-pg-settings`](../tools/postgres/postgres-list-pg-settings.md)
List configuration parameters for the PostgreSQL server.
- [`postgres-list-database-stats`](../tools/postgres/postgres-list-database-stats.md)
Lists the key performance and activity statistics for each database in the PostgreSQL
instance.
- [`postgres-list-roles`](../tools/postgres/postgres-list-roles.md)
Lists all the user-created roles in a PostgreSQL database.
### Pre-built Configurations
- [Cloud SQL for Postgres using

View File

@@ -0,0 +1,78 @@
---
title: "MariaDB"
type: docs
weight: 1
description: >
MariaDB is an open-source relational database compatible with MySQL.
---
## About
MariaDB is a relational database management system derived from MySQL. It
implements the MySQL protocol and client libraries and supports modern SQL
features with a focus on performance and reliability.
**Note**: MariaDB is supported using the MySQL source.
## Available Tools
- [`mysql-sql`](../tools/mysql/mysql-sql.md)
Execute pre-defined prepared SQL queries in MariaDB.
- [`mysql-execute-sql`](../tools/mysql/mysql-execute-sql.md)
Run parameterized SQL queries in MariaDB.
- [`mysql-list-active-queries`](../tools/mysql/mysql-list-active-queries.md)
List active queries in MariaDB.
- [`mysql-list-tables`](../tools/mysql/mysql-list-tables.md)
List tables in a MariaDB database.
- [`mysql-list-tables-missing-unique-indexes`](../tools/mysql/mysql-list-tables-missing-unique-indexes.md)
List tables in a MariaDB database that do not have primary or unique indices.
- [`mysql-list-table-fragmentation`](../tools/mysql/mysql-list-table-fragmentation.md)
List table fragmentation in MariaDB tables.
## Requirements
### Database User
This source only uses standard authentication. You will need to [create a
MariaDB user][mariadb-users] to log in to the database.
[mariadb-users]: https://mariadb.com/kb/en/create-user/
## Example
```yaml
sources:
my_mariadb_db:
kind: mysql
host: 127.0.0.1
port: 3306
database: my_db
user: ${MARIADB_USER}
password: ${MARIADB_PASS}
# Optional TLS and other driver parameters. For example, enable preferred TLS:
# queryParams:
# tls: preferred
queryTimeout: 30s # Optional: query timeout duration
```
{{< notice tip >}}
Use environment variables instead of committing credentials to source files.
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |
| ------------ | :------: | :----------: | ----------------------------------------------------------------------------------------------- |
| kind | string | true | Must be `mysql`. |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "3307"). |
| database | string | true | Name of the MariaDB database to connect to (e.g. "my_db"). |
| user | string | true | Name of the MariaDB user to connect as (e.g. "my-mysql-user"). |
| password | string | true | Password of the MariaDB user (e.g. "my-password"). |
| queryTimeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, no timeout is applied. |
| queryParams | map<string,string> | false | Arbitrary DSN parameters passed to the driver (e.g. `tls: preferred`, `charset: utf8mb4`). Useful for enabling TLS or other connection options. |

View File

@@ -68,6 +68,22 @@ reputation for reliability, feature robustness, and performance.
- [`postgres-get-column-cardinality`](../tools/postgres/postgres-get-column-cardinality.md)
List cardinality of columns in a table in a PostgreSQL database.
- [`postgres-list-publication-tables`](../tools/postgres/postgres-list-publication-tables.md)
List publication tables in a PostgreSQL database.
- [`postgres-list-tablespaces`](../tools/postgres/postgres-list-tablespaces.md)
List tablespaces in a PostgreSQL database.
- [`postgres-list-pg-settings`](../tools/postgres/postgres-list-pg-settings.md)
List configuration parameters for the PostgreSQL server.
- [`postgres-list-database-stats`](../tools/postgres/postgres-list-database-stats.md)
Lists the key performance and activity statistics for each database in the PostgreSQL
server.
- [`postgres-list-roles`](../tools/postgres/postgres-list-roles.md)
Lists all the user-created roles in a PostgreSQL database.
### Pre-built Configurations
- [PostgreSQL using MCP](https://googleapis.github.io/genai-toolbox/how-to/connect-ide/postgres_mcp/)

View File

@@ -21,6 +21,10 @@ Apache Spark.
Get a Serverless Spark batch.
- [`serverless-spark-cancel-batch`](../tools/serverless-spark/serverless-spark-cancel-batch.md)
Cancel a running Serverless Spark batch operation.
- [`serverless-spark-create-pyspark-batch`](../tools/serverless-spark/serverless-spark-create-pyspark-batch.md)
Create a Serverless Spark PySpark batch operation.
- [`serverless-spark-create-spark-batch`](../tools/serverless-spark/serverless-spark-create-spark-batch.md)
Create a Serverless Spark Java batch operation.
## Requirements

View File

@@ -48,11 +48,11 @@ in the `data` parameter, like this:
## Reference
| **field** | **type** | **required** | **description** |
|:------------|:---------|:-------------|:---------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-insert-many`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection into which the documents will be inserted. |
| canonical | bool | true | Determines if the data string is parsed using MongoDB's Canonical or Relaxed Extended JSON format. |
| **field** | **type** | **required** | **description** |
|:------------|:---------|:-------------|:------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-insert-many`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection into which the documents will be inserted. |
| canonical | bool | false | Determines if the data string is parsed using MongoDB's Canonical or Relaxed Extended JSON format. Defaults to `false`. |

View File

@@ -43,11 +43,11 @@ An LLM would call this tool by providing the document as a JSON string in the
## Reference
| **field** | **type** | **required** | **description** |
|:------------|:---------|:-------------|:---------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-insert-one`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection into which the document will be inserted. |
| canonical | bool | true | Determines if the data string is parsed using MongoDB's Canonical or Relaxed Extended JSON format. |
| **field** | **type** | **required** | **description** |
|:------------|:---------|:-------------|:------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-insert-one`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection into which the document will be inserted. |
| canonical | bool | false | Determines if the data string is parsed using MongoDB's Canonical or Relaxed Extended JSON format. Defaults to `false`. |

View File

@@ -57,16 +57,16 @@ tools:
## Reference
| **field** | **type** | **required** | **description** |
|:--------------|:---------|:-------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-update-many`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection in which to update documents. |
| filterPayload | string | true | The MongoDB query filter document to select the documents for updating. It's written as a Go template, using `{{json .param_name}}` to insert parameters. |
| filterParams | list | false | A list of parameter objects that define the variables used in the `filterPayload`. |
| updatePayload | string | true | The MongoDB update document, It's written as a Go template, using `{{json .param_name}}` to insert parameters. |
| updateParams | list | true | A list of parameter objects that define the variables used in the `updatePayload`. |
| canonical | bool | true | Determines if the `filterPayload` and `updatePayload` strings are parsed using MongoDB's Canonical or Relaxed Extended JSON format. **Canonical** is stricter about type representation, while **Relaxed** is more lenient. |
| upsert | bool | false | If `true`, a new document is created if no document matches the `filterPayload`. Defaults to `false`. |
| **field** | **type** | **required** | **description** |
|:--------------|:---------|:-------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-update-many`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection in which to update documents. |
| filterPayload | string | true | The MongoDB query filter document to select the documents for updating. It's written as a Go template, using `{{json .param_name}}` to insert parameters. |
| filterParams | list | false | A list of parameter objects that define the variables used in the `filterPayload`. |
| updatePayload | string | true | The MongoDB update document, It's written as a Go template, using `{{json .param_name}}` to insert parameters. |
| updateParams | list | true | A list of parameter objects that define the variables used in the `updatePayload`. |
| canonical | bool | false | Determines if the `filterPayload` and `updatePayload` strings are parsed using MongoDB's Canonical or Relaxed Extended JSON format. **Canonical** is stricter about type representation, while **Relaxed** is more lenient. Defaults to `false`. |
| upsert | bool | false | If `true`, a new document is created if no document matches the `filterPayload`. Defaults to `false`. |

View File

@@ -57,16 +57,16 @@ tools:
## Reference
| **field** | **type** | **required** | **description** |
|:--------------|:---------|:-------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-update-one`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection to update a document in. |
| filterPayload | string | true | The MongoDB query filter document to select the document for updating. It's written as a Go template, using `{{json .param_name}}` to insert parameters. |
| filterParams | list | false | A list of parameter objects that define the variables used in the `filterPayload`. |
| updatePayload | string | true | The MongoDB update document, which specifies the modifications. This often uses update operators like `$set`. It's written as a Go template, using `{{json .param_name}}` to insert parameters. |
| updateParams | list | true | A list of parameter objects that define the variables used in the `updatePayload`. |
| canonical | bool | true | Determines if the `updatePayload` string is parsed using MongoDB's Canonical or Relaxed Extended JSON format. **Canonical** is stricter about type representation (e.g., `{"$numberInt": "42"}`), while **Relaxed** is more lenient (e.g., `42`). |
| upsert | bool | false | If `true`, a new document is created if no document matches the `filterPayload`. Defaults to `false`. |
| **field** | **type** | **required** | **description** |
|:--------------|:---------|:-------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be `mongodb-update-one`. |
| source | string | true | The name of the `mongodb` source to use. |
| description | string | true | A description of the tool that is passed to the LLM. |
| database | string | true | The name of the MongoDB database containing the collection. |
| collection | string | true | The name of the MongoDB collection to update a document in. |
| filterPayload | string | true | The MongoDB query filter document to select the document for updating. It's written as a Go template, using `{{json .param_name}}` to insert parameters. |
| filterParams | list | false | A list of parameter objects that define the variables used in the `filterPayload`. |
| updatePayload | string | true | The MongoDB update document, which specifies the modifications. This often uses update operators like `$set`. It's written as a Go template, using `{{json .param_name}}` to insert parameters. |
| updateParams | list | true | A list of parameter objects that define the variables used in the `updatePayload`. |
| canonical | bool | false | Determines if the `updatePayload` string is parsed using MongoDB's Canonical or Relaxed Extended JSON format. **Canonical** is stricter about type representation (e.g., `{"$numberInt": "42"}`), while **Relaxed** is more lenient (e.g., `42`). Defaults to `false`. |
| upsert | bool | false | If `true`, a new document is created if no document matches the `filterPayload`. Defaults to `false`. |

View File

@@ -0,0 +1,95 @@
---
title: "postgres-list-database-stats"
type: docs
weight: 1
description: >
The "postgres-list-database-stats" tool lists lists key performance and activity statistics of PostgreSQL databases.
aliases:
- /resources/tools/postgres-list-database-stats
---
## About
The `postgres-list-database-stats` tool lists the key performance and activity statistics for each PostgreSQL database in the instance, offering insights into cache efficiency, transaction throughput, row-level activity, temporary file usage, and contention. It's compatible with
any of the following sources:
- [alloydb-postgres](../../sources/alloydb-pg.md)
- [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
- [postgres](../../sources/postgres.md)
`postgres-list-database-stats` lists detailed information as JSON for each database. The tool
takes the following input parameters:
- `database_name` (optional): A text to filter results by database name. Default: `""`
- `include_templates` (optional): Boolean, set to `true` to include template databases in the results. Default: `false`
- `database_owner` (optional): A text to filter results by database owner. Default: `""`
- `default_tablespace` (optional): A text to filter results by the default tablespace name. Default: `""`
- `order_by` (optional): Specifies the sorting order. Valid values are `'size'` (descending) or `'commit'` (descending). Default: `database_name` ascending.
- `limit` (optional): The maximum number of databases to return. Default: `10`
## Example
```yaml
tools:
list_database_stats:
kind: postgres-list-database-stats
source: postgres-source
description: |
Lists the key performance and activity statistics for each PostgreSQL
database in the instance, offering insights into cache efficiency,
transaction throughput, row-level activity, temporary file usage, and
contention. It returns: the database name, whether the database is
connectable, database owner, default tablespace name, the percentage of
data blocks found in the buffer cache rather than being read from disk
(a higher value indicates better cache performance), the total number of
disk blocks read from disk, the total number of times disk blocks were
found already in the cache; the total number of committed transactions,
the total number of rolled back transactions, the percentage of rolled
back transactions compared to the total number of completed
transactions, the total number of rows returned by queries, the total
number of live rows fetched by scans, the total number of rows inserted,
the total number of rows updated, the total number of rows deleted, the
number of temporary files created by queries, the total size of
temporary files used by queries in bytes, the number of query
cancellations due to conflicts with recovery, the number of deadlocks
detected, the current number of active backend connections, the
timestamp when the database statistics were last reset, and the total
database size in bytes.
```
The response is a JSON array with the following elements:
```json
{
"database_name": "Name of the database",
"is_connectable": "Boolean indicating Whether the database allows connections",
"database_owner": "Username of the database owner",
"default_tablespace": "Name of the default tablespace for the database",
"cache_hit_ratio_percent": "The percentage of data blocks found in the buffer cache rather than being read from disk",
"blocks_read_from_disk": "The total number of disk blocks read for this database",
"blocks_hit_in_cache": "The total number of times disk blocks were found already in the cache.",
"xact_commit": "The total number of committed transactions",
"xact_rollback": "The total number of rolled back transactions",
"rollback_ratio_percent": "The percentage of rolled back transactions compared to the total number of completed transactions",
"rows_returned_by_queries": "The total number of rows returned by queries",
"rows_fetched_by_scans": "The total number of live rows fetched by scans",
"tup_inserted": "The total number of rows inserted",
"tup_updated": "The total number of rows updated",
"tup_deleted": "The total number of rows deleted",
"temp_files": "The number of temporary files created by queries",
"temp_size_bytes": "The total size of temporary files used by queries in bytes",
"conflicts": "Number of query cancellations due to conflicts",
"deadlocks": "Number of deadlocks detected",
"active_connections": "The current number of active backend connections",
"statistics_last_reset": "The timestamp when the database statistics were last reset",
"database_size_bytes": "The total disk size of the database in bytes"
}
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|------------------------------------------------------|
| kind | string | true | Must be "postgres-list-database-stats". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | false | Description of the tool that is passed to the agent. |
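
For illustration, the optional parameters above are passed in the body of an invoke request. The sketch below is a minimal, hypothetical JavaScript example against a locally running Toolbox server hosting the `list_database_stats` tool from the example; the `/api/tool/<name>/invoke` endpoint shape and response handling are assumptions to verify against your Toolbox version.
```javascript
// Hypothetical sketch: ask the `list_database_stats` tool (configured in the
// YAML example above) for the five largest databases on a local Toolbox server.
const response = await fetch(
  'http://127.0.0.1:5000/api/tool/list_database_stats/invoke',
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // All parameters are optional; unset text filters default to "".
    body: JSON.stringify({ order_by: 'size', limit: 5 }),
  },
);
console.log(await response.json());
```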

View File

@@ -21,12 +21,10 @@ any of the following sources:
`postgres-list-indexes` lists detailed information as JSON for indexes. The tool
takes the following input parameters:
- `table_name` (optional): A text to filter results by table name. The input is
used within a LIKE clause. Default: `""`
- `index_name` (optional): A text to filter results by index name. The input is
used within a LIKE clause. Default: `""`
- `schema_name` (optional): A text to filter results by schema name. The input
is used within a LIKE clause. Default: `""`
- `table_name` (optional): A text to filter results by table name. Default: `""`
- `index_name` (optional): A text to filter results by index name. Default: `""`
- `schema_name` (optional): A text to filter results by schema name. Default: `""`
- `only_unused` (optional): If true, returns indexes that have never been used.
- `limit` (optional): The maximum number of rows to return. Default: `50`.
## Example

View File

@@ -0,0 +1,59 @@
---
title: "postgres-list-pg-settings"
type: docs
weight: 1
description: >
The "postgres-list-pg-settings" tool lists PostgreSQL run-time configuration settings.
aliases:
- /resources/tools/postgres-list-pg-settings
---
## About
The `postgres-list-pg-settings` tool lists the configuration parameters for the PostgreSQL server, their current values, and related information. It's compatible with any of the following sources:
- [alloydb-postgres](../../sources/alloydb-pg.md)
- [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
- [postgres](../../sources/postgres.md)
`postgres-list-pg-settings` lists detailed information as JSON for each setting. The tool
takes the following input parameters:
- `setting_name` (optional): A text to filter results by setting name. Default: `""`
- `limit` (optional): The maximum number of rows to return. Default: `50`.
## Example
```yaml
tools:
list_pg_settings:
kind: postgres-list-pg-settings
source: postgres-source
description: |
Lists configuration parameters for the PostgreSQL server, ordered lexicographically,
with a default limit of 50 rows. It returns the parameter name, its current setting,
unit of measurement, a short description, the source of the current setting (e.g.,
default, configuration file, session), and whether a restart is required when the
parameter value is changed.
```
The response is a JSON array with the following elements:
```json
{
"name": "Setting name",
"current_value": "Current value of the setting",
"unit": "Unit of the setting",
"short_desc": "Short description of the setting",
"source": "Source of the current value (e.g., default, configuration file, session)",
"requires_restart": "Indicates if a server restart is required to apply a change ('Yes', 'No', or 'No (Reload sufficient)')"
}
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|------------------------------------------------------|
| kind | string | true | Must be "postgres-list-pg-settings". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | false | Description of the tool that is passed to the agent. |

View File

@@ -0,0 +1,66 @@
---
title: "postgres-list-publication-tables"
type: docs
weight: 1
description: >
The "postgres-list-publication-tables" tool lists publication tables in a Postgres database.
aliases:
- /resources/tools/postgres-list-publication-tables
---
## About
The `postgres-list-publication-tables` tool lists all publication tables in the database. It's compatible with any of the following sources:
- [alloydb-postgres](../../sources/alloydb-pg.md)
- [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
- [postgres](../../sources/postgres.md)
`postgres-list-publication-tables` lists detailed information as JSON for publication tables. A publication table in PostgreSQL is a
table that is explicitly included as a source for replication within a publication (a set of changes generated from a table or group
of tables) as part of the logical replication feature. The tool takes the following input parameters:
- `table_names` (optional): Filters by a comma-separated list of table names. Default: `""`
- `publication_names` (optional): Filters by a comma-separated list of publication names. Default: `""`
- `schema_names` (optional): Filters by a comma-separated list of schema names. Default: `""`
- `limit` (optional): The maximum number of rows to return. Default: `50`
## Example
```yaml
tools:
list_publication_tables:
kind: postgres-list-publication-tables
source: postgres-source
description: |
Lists all tables that are explicitly part of a publication in the database.
Tables that are part of a publication via 'FOR ALL TABLES' are not included,
unless they are also explicitly added to the publication.
Returns the publication name, schema name, and table name, along with
definition details indicating if it publishes all tables, whether it
replicates inserts, updates, deletes, or truncates, and the publication
owner.
```
The response is a JSON array with the following elements:
```json
{
"publication_name": "Name of the publication",
"schema_name": "Name of the schema the table belongs to",
"table_name": "Name of the table",
"publishes_all_tables": "boolean indicating if the publication was created with FOR ALL TABLES",
"publishes_inserts": "boolean indicating if INSERT operations are replicated",
"publishes_updates": "boolean indicating if UPDATE operations are replicated",
"publishes_deletes": "boolean indicating if DELETE operations are replicated",
"publishes_truncates": "boolean indicating if TRUNCATE operations are replicated",
"publication_owner": "Username of the database role that owns the publication"
}
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|------------------------------------------------------|
| kind | string | true | Must be "postgres-list-publication-tables". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | false | Description of the tool that is passed to the agent. |

View File

@@ -0,0 +1,70 @@
---
title: "postgres-list-roles"
type: docs
weight: 1
description: >
The "postgres-list-roles" tool lists user-created roles in a Postgres database.
aliases:
- /resources/tools/postgres-list-roles
---
## About
The `postgres-list-roles` tool lists all the user-created roles in the instance, excluding system roles (like `cloudsql%` or `pg_%`). It provides details about each role's attributes and memberships. It's compatible with
any of the following sources:
- [alloydb-postgres](../../sources/alloydb-pg.md)
- [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
- [postgres](../../sources/postgres.md)
`postgres-list-roles` lists detailed information as JSON for each role. The tool
takes the following input parameters:
- `role_name` (optional): A text to filter results by role name. Default: `""`
- `limit` (optional): The maximum number of roles to return. Default: `50`
## Example
```yaml
tools:
list_roles:
kind: postgres-list-roles
source: postgres-source
description: |
Lists all the user-created roles in the instance. It returns the role name,
Object ID, the maximum number of concurrent connections the role can make,
along with boolean indicators for: superuser status, privilege inheritance
from member roles, ability to create roles, ability to create databases,
ability to log in, replication privilege, and the ability to bypass
row-level security, the password expiration timestamp, a list of direct
members belonging to this role, and a list of other roles/groups that this
role is a member of.
```
The response is a JSON array with the following elements:
```json
{
"role_name": "Name of the role",
"oid": "Object ID of the role",
"connection_limit": "Maximum concurrent connections allowed (-1 for no limit)",
"is_superuser": "Boolean, true if the role is a superuser",
"inherits_privileges": "Boolean, true if the role inherits privileges of roles it is a member of",
"can_create_roles": "Boolean, true if the role can create other roles",
"can_create_db": "Boolean, true if the role can create databases",
"can_login": "Boolean, true if the role can log in",
"is_replication_role": "Boolean, true if this is a replication role",
"bypass_rls": "Boolean, true if the role bypasses row-level security policies",
"valid_until": "Timestamp until the password is valid (null if forever)",
"direct_members": ["Array of role names that are direct members of this role"],
"member_of": ["Array of role names that this role is a member of"]
}
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|------------------------------------------------------|
| kind | string | true | Must be "postgres-list-roles". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | false | Description of the tool that is passed to the agent. |

View File

@@ -21,9 +21,9 @@ the following sources:
`postgres-list-schemas` lists detailed information as JSON for each schema. The
tool takes the following input parameters:
- `schema_name` (optional): A pattern to filter schema names using SQL LIKE
operator.
If omitted, all user-defined schemas are returned.
- `schema_name` (optional): A text to filter results by schema name. Default: `""`
- `owner` (optional): A text to filter results by owner name. Default: `""`
- `limit` (optional): The maximum number of rows to return. Default: `50`.
## Example

View File

@@ -20,9 +20,9 @@ Postgres database. It's compatible with any of the following sources:
`postgres-list-sequences` lists detailed information as JSON for all sequences.
The tool takes the following input parameters:
- `sequencename` (optional): A text to filter results by sequence name. The
- `sequence_name` (optional): A text to filter results by sequence name. The
input is used within a LIKE clause. Default: `""`
- `schemaname` (optional): A text to filter results by schema name. The input is
- `schema_name` (optional): A text to filter results by schema name. The input is
used within a LIKE clause. Default: `""`
- `limit` (optional): The maximum number of rows to return. Default: `50`.
@@ -45,9 +45,9 @@ The response is a json array with the following elements:
```json
{
"sequencename": "sequence name",
"schemaname": "schema name",
"sequenceowner": "owner of the sequence",
"sequence_name": "sequence name",
"schema_name": "schema name",
"sequence_owner": "owner of the sequence",
"data_type": "data type of the sequence",
"start_value": "starting value of the sequence",
"min_value": "minimum value of the sequence",

View File

@@ -0,0 +1,56 @@
---
title: "postgres-list-tablespaces"
type: docs
weight: 1
description: >
The "postgres-list-tablespaces" tool lists tablespaces in a Postgres database.
aliases:
- /resources/tools/postgres-list-tablespaces
---
## About
The `postgres-list-tablespaces` tool lists available tablespaces in the database. It's compatible with any of the following sources:
- [alloydb-postgres](../../sources/alloydb-pg.md)
- [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
- [postgres](../../sources/postgres.md)
`postgres-list-tablespaces` lists detailed information as JSON for tablespaces. The tool takes the following input parameters:
- `tablespace_name` (optional): A text to filter results by tablespace name. Default: `""`
- `limit` (optional): The maximum number of tablespaces to return. Default: `50`
## Example
```yaml
tools:
list_tablespaces:
kind: postgres-list-tablespaces
source: postgres-source
description: |
Lists all tablespaces in the database. Returns the tablespace name,
owner name, size in bytes (if the current user has CREATE privileges on
the tablespace, otherwise NULL), internal object ID, the access control
list regarding permissions, and any specific tablespace options.
```
The response is a JSON array with the following elements:
```json
{
"tablespace_name": "name of the tablespace",
"owner_username": "owner of the tablespace",
"size_in_bytes": "size in bytes if the current user has CREATE privileges on the tablespace, otherwise NULL",
"oid": "Object ID of the tablespace",
"spcacl": "Access privileges",
"spcoptions": "Tablespace-level options (e.g., seq_page_cost, random_page_cost)"
}
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:-------------:|------------------------------------------------------|
| kind | string | true | Must be "postgres-list-tablespaces". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | false | Description of the tool that is passed to the agent. |

View File

@@ -19,11 +19,11 @@ a Postgres database, excluding those in system schemas (`pg_catalog`,
- [postgres](../../sources/postgres.md)
`postgres-list-views` lists detailed view information (schemaname, viewname,
ownername) as JSON for views in a database. The tool takes the following input
ownername, definition) as JSON for views in a database. The tool takes the following input
parameters:
- `viewname` (optional): A string pattern to filter view names. The search uses
SQL LIKE operator to filter the views. Default: `""`
- `view_name` (optional): A string pattern to filter view names. Default: `""`
- `schema_name` (optional): A string pattern to filter schema names. Default: `""`
- `limit` (optional): The maximum number of rows to return. Default: `50`.
## Example

View File

@@ -9,3 +9,5 @@ description: >
- [serverless-spark-get-batch](./serverless-spark-get-batch.md)
- [serverless-spark-list-batches](./serverless-spark-list-batches.md)
- [serverless-spark-cancel-batch](./serverless-spark-cancel-batch.md)
- [serverless-spark-create-pyspark-batch](./serverless-spark-create-pyspark-batch.md)
- [serverless-spark-create-spark-batch](./serverless-spark-create-spark-batch.md)

View File

@@ -0,0 +1,90 @@
---
title: "serverless-spark-create-pyspark-batch"
type: docs
weight: 2
description: >
A "serverless-spark-create-pyspark-batch" tool submits a Spark batch to run asynchronously.
aliases:
- /resources/tools/serverless-spark-create-pyspark-batch
---
## About
A `serverless-spark-create-pyspark-batch` tool submits a Spark batch to a Google
Cloud Serverless for Apache Spark source. The workload executes asynchronously
and takes around a minute to begin executing; status can be polled using the
[get batch](serverless-spark-get-batch.md) tool.
It's compatible with the following sources:
- [serverless-spark](../../sources/serverless-spark.md)
`serverless-spark-create-pyspark-batch` accepts the following parameters:
- **`mainFile`**: The path to the main Python file, as a gs://... URI.
- **`args`**: Optional. A list of arguments passed to the main file.
- **`version`**: Optional. The Serverless [runtime
version](https://docs.cloud.google.com/dataproc-serverless/docs/concepts/versions/dataproc-serverless-versions)
to execute with.
## Custom Configuration
This tool supports custom
[`runtimeConfig`](https://docs.cloud.google.com/dataproc-serverless/docs/reference/rest/v1/RuntimeConfig)
and
[`environmentConfig`](https://docs.cloud.google.com/dataproc-serverless/docs/reference/rest/v1/EnvironmentConfig)
settings, which can be specified in a `tools.yaml` file. These configurations
are parsed as YAML and passed to the Dataproc API.
**Note:** If your project requires custom runtime or environment configuration,
you must write a custom `tools.yaml`; you cannot use the `serverless-spark`
prebuilt config.
### Example `tools.yaml`
```yaml
tools:
- name: "serverless-spark-create-pyspark-batch"
kind: "serverless-spark-create-pyspark-batch"
source: "my-serverless-spark-source"
runtimeConfig:
properties:
spark.driver.memory: "1024m"
environmentConfig:
executionConfig:
networkUri: "my-network"
```
## Response Format
The response is an [operation](https://docs.cloud.google.com/dataproc-serverless/docs/reference/rest/v1/projects.locations.operations#resource:-operation) metadata JSON
object corresponding to [batch operation metadata](https://pkg.go.dev/cloud.google.com/go/dataproc/v2/apiv1/dataprocpb#BatchOperationMetadata).
Example:
```json
{
"batch": "projects/myproject/locations/us-central1/batches/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
"batchUuid": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
"createTime": "2025-11-19T16:36:47.607119Z",
"description": "Batch",
"labels": {
"goog-dataproc-batch-uuid": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
"goog-dataproc-location": "us-central1"
},
"operationType": "BATCH",
"warnings": [
"No runtime version specified. Using the default runtime version."
]
}
```
## Reference
| **field** | **type** | **required** | **description** |
| ----------------- | :------: | :----------: | -------------------------------------------------------------------------------------------------------------------------------------------------------- |
| kind | string | true | Must be "serverless-spark-create-pyspark-batch". |
| source | string | true | Name of the source the tool should use. |
| description | string | false | Description of the tool that is passed to the LLM. |
| runtimeConfig | map | false | [Runtime config](https://docs.cloud.google.com/dataproc-serverless/docs/reference/rest/v1/RuntimeConfig) for all batches created with this tool. |
| environmentConfig | map | false | [Environment config](https://docs.cloud.google.com/dataproc-serverless/docs/reference/rest/v1/EnvironmentConfig) for all batches created with this tool. |
| authRequired | string[] | false | List of auth services required to invoke this tool. |
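
As a hedged illustration of the `mainFile`, `args`, and `version` parameters, the following JavaScript sketch submits a PySpark batch through the tool defined in the example `tools.yaml` above. The gs:// paths and runtime version are placeholders, and the `/api/tool/<name>/invoke` endpoint shape is an assumption to verify against your Toolbox deployment.
```javascript
// Hypothetical sketch: submit a PySpark batch via the tool configured above.
const response = await fetch(
  'http://127.0.0.1:5000/api/tool/serverless-spark-create-pyspark-batch/invoke',
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      mainFile: 'gs://my-bucket/jobs/wordcount.py', // placeholder URI
      args: ['gs://my-bucket/input/', 'gs://my-bucket/output/'],
      version: '2.2', // optional runtime version
    }),
  },
);
// The returned operation metadata (see Response Format) contains the batch
// name, which can then be polled with the get-batch tool.
console.log(await response.json());
```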

View File

@@ -0,0 +1,97 @@
---
title: "serverless-spark-create-spark-batch"
type: docs
weight: 2
description: >
A "serverless-spark-create-spark-batch" tool submits a Spark batch to run asynchronously.
aliases:
- /resources/tools/serverless-spark-create-spark-batch
---
## About
A `serverless-spark-create-spark-batch` tool submits a Java Spark batch to a
Google Cloud Serverless for Apache Spark source. The workload executes
asynchronously and takes around a minute to begin executing; status can be
polled using the [get batch](serverless-spark-get-batch.md) tool.
It's compatible with the following sources:
- [serverless-spark](../../sources/serverless-spark.md)
`serverless-spark-create-spark-batch` accepts the following parameters:
- **`mainJarFile`**: Optional. The gs:// URI of the jar file that contains the
main class. Exactly one of `mainJarFile` or `mainClass` must be specified.
- **`mainClass`**: Optional. The name of the driver's main class. Exactly one of
`mainJarFile` or `mainClass` must be specified.
- **`jarFiles`**: Optional. A list of gs:// URIs of jar files to add to the CLASSPATHs of
the Spark driver and tasks.
- **`args`**: Optional. A list of arguments passed to the driver.
- **`version`**: Optional. The Serverless [runtime
version](https://docs.cloud.google.com/dataproc-serverless/docs/concepts/versions/dataproc-serverless-versions)
to execute with.
## Custom Configuration
This tool supports custom
[`runtimeConfig`](https://docs.cloud.google.com/dataproc-serverless/docs/reference/rest/v1/RuntimeConfig)
and
[`environmentConfig`](https://docs.cloud.google.com/dataproc-serverless/docs/reference/rest/v1/EnvironmentConfig)
settings, which can be specified in a `tools.yaml` file. These configurations
are parsed as YAML and passed to the Dataproc API.
**Note:** If your project requires custom runtime or environment configuration,
you must write a custom `tools.yaml`; you cannot use the `serverless-spark`
prebuilt config.
### Example `tools.yaml`
```yaml
tools:
- name: "serverless-spark-create-spark-batch"
kind: "serverless-spark-create-spark-batch"
source: "my-serverless-spark-source"
runtimeConfig:
properties:
spark.driver.memory: "1024m"
environmentConfig:
executionConfig:
networkUri: "my-network"
```
## Response Format
The response is an
[operation](https://docs.cloud.google.com/dataproc-serverless/docs/reference/rest/v1/projects.locations.operations#resource:-operation)
metadata JSON object corresponding to [batch operation
metadata](https://pkg.go.dev/cloud.google.com/go/dataproc/v2/apiv1/dataprocpb#BatchOperationMetadata).
Example:
```json
{
"batch": "projects/myproject/locations/us-central1/batches/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
"batchUuid": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
"createTime": "2025-11-19T16:36:47.607119Z",
"description": "Batch",
"labels": {
"goog-dataproc-batch-uuid": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
"goog-dataproc-location": "us-central1"
},
"operationType": "BATCH",
"warnings": [
"No runtime version specified. Using the default runtime version."
]
}
```
## Reference
| **field** | **type** | **required** | **description** |
| ----------------- | :------: | :----------: | -------------------------------------------------------------------------------------------------------------------------------------------------------- |
| kind | string | true | Must be "serverless-spark-create-spark-batch". |
| source | string | true | Name of the source the tool should use. |
| description | string | false | Description of the tool that is passed to the LLM. |
| runtimeConfig | map | false | [Runtime config](https://docs.cloud.google.com/dataproc-serverless/docs/reference/rest/v1/RuntimeConfig) for all batches created with this tool. |
| environmentConfig | map | false | [Environment config](https://docs.cloud.google.com/dataproc-serverless/docs/reference/rest/v1/EnvironmentConfig) for all batches created with this tool. |
| authRequired | string[] | false | List of auth services required to invoke this tool. |
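
Since exactly one of `mainJarFile` or `mainClass` may be set per invocation, a request passes only one of them. The JavaScript sketch below uses `mainClass` plus `jarFiles` with placeholder values; as above, the invoke endpoint shape is an assumption rather than part of this tool's definition.
```javascript
// Hypothetical sketch: submit a Java Spark batch using mainClass + jarFiles
// (omit mainJarFile in this case); all values are placeholders.
const response = await fetch(
  'http://127.0.0.1:5000/api/tool/serverless-spark-create-spark-batch/invoke',
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      mainClass: 'com.example.WordCount',
      jarFiles: ['gs://my-bucket/jars/wordcount.jar'],
      args: ['gs://my-bucket/input/'],
    }),
  },
);
console.log(await response.json());
```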

View File

@@ -771,7 +771,7 @@
},
"outputs": [],
"source": [
"version = \"0.21.0\" # x-release-please-version\n",
"version = \"0.22.0\" # x-release-please-version\n",
"! curl -L -o /content/toolbox https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -123,7 +123,7 @@ In this section, we will download and install the Toolbox binary.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
export VERSION="0.21.0"
export VERSION="0.22.0"
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -220,7 +220,7 @@
},
"outputs": [],
"source": [
"version = \"0.21.0\" # x-release-please-version\n",
"version = \"0.22.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -179,7 +179,7 @@ to use BigQuery, and then run the Toolbox server.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -98,7 +98,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -34,7 +34,7 @@ In this section, we will download Toolbox and run the Toolbox server.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -48,7 +48,7 @@ In this section, we will download Toolbox and run the Toolbox server.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -34,7 +34,7 @@ In this section, we will download Toolbox and run the Toolbox server.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.22.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -0,0 +1,25 @@
---
title: "JS SDK"
type: docs
weight: 7
description: >
JS SDKs to connect to the MCP Toolbox server.
---
## Overview
The MCP Toolbox service provides a centralized way to manage and expose tools
(like API connectors, database query tools, etc.) for use by GenAI applications.
These JS SDKs act as clients for that service. They handle the communication needed to:
* Fetch tool definitions from your running Toolbox instance.
* Provide convenient JS objects or functions representing those tools.
* Invoke the tools (calling the underlying APIs/services configured in Toolbox).
* Handle authentication and parameter binding as needed.
By using these SDKs, you can easily leverage your Toolbox-managed tools directly
within your JS applications or AI orchestration frameworks.
[GitHub](https://github.com/googleapis/mcp-toolbox-sdk-js)

View File

@@ -1,15 +0,0 @@
---
title: "Go SDK"
weight: 2
description: Go lang client SDK
icon: fa-brands fa-golang
manualLink: "https://github.com/googleapis/mcp-toolbox-sdk-go"
manualLinkTarget: _blank
---
<html>
<head>
<link rel="canonical" href="https://github.com/googleapis/mcp-toolbox-sdk-go"/>
<meta http-equiv="refresh" content="0;url=https://github.com/googleapis/mcp-toolbox-sdk-go"/>
</head>
</html>

View File

@@ -0,0 +1,25 @@
---
title: "Go SDK"
type: docs
weight: 7
description: >
Go SDKs to connect to the MCP Toolbox server.
---
## Overview
The MCP Toolbox service provides a centralized way to manage and expose tools
(like API connectors, database query tools, etc.) for use by GenAI applications.
The Go SDK acts as a client for that service. It handles the communication needed to:
* Fetch tool definitions from your running Toolbox instance.
* Provide convenient Go structs representing those tools.
* Invoke the tools (calling the underlying APIs/services configured in Toolbox).
* Handle authentication and parameter binding as needed.
By using the SDK, you can easily leverage your Toolbox-managed tools directly
within your Go applications or AI orchestration frameworks.
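A minimal usage sketch follows. It assumes the SDK's `core` package and a Toolbox server running locally on port 5000; the constructor and the `LoadTool`/`Invoke` calls follow the repository README, but treat the exact names and signatures as assumptions and check the GitHub repository linked below for the current API.
```go
package main

import (
	"context"
	"fmt"

	// Assumed import path; see the repository linked below for the current layout.
	"github.com/googleapis/mcp-toolbox-sdk-go/core"
)

func main() {
	ctx := context.Background()

	// Connect to a locally running Toolbox server.
	client, err := core.NewToolboxClient("http://127.0.0.1:5000")
	if err != nil {
		panic(err)
	}

	// Load a tool definition exposed by the server (the tool name is illustrative).
	tool, err := client.LoadTool("search-hotels-by-name", ctx)
	if err != nil {
		panic(err)
	}

	// Invoke the tool with its input parameters and print the raw result.
	result, err := tool.Invoke(ctx, map[string]any{"name": "Hilton"})
	if err != nil {
		panic(err)
	}
	fmt.Println(result)
}
```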
[GitHub](https://github.com/googleapis/mcp-toolbox-sdk-go)

View File

@@ -1,15 +0,0 @@
---
title: "JS SDK"
weight: 2
description: Javascript client SDK
icon: fa-brands fa-node-js
manualLink: "https://github.com/googleapis/mcp-toolbox-sdk-js"
manualLinkTarget: _blank
---
<html>
<head>
<link rel="canonical" href="https://github.com/googleapis/mcp-toolbox-sdk-js"/>
<meta http-equiv="refresh" content="0;url=https://github.com/googleapis/mcp-toolbox-sdk-js"/>
</head>
</html>

View File

@@ -1,15 +0,0 @@
---
title: "Python SDK"
weight: 2
description: Python client SDK
icon: fa-brands fa-python
manualLink: "https://github.com/googleapis/mcp-toolbox-sdk-python"
manualLinkTarget: _blank
---
<html>
<head>
<link rel="canonical" href="https://github.com/googleapis/mcp-toolbox-sdk-python"/>
<meta http-equiv="refresh" content="0;url=https://github.com/googleapis/mcp-toolbox-sdk-python"/>
</head>
</html>

View File

@@ -0,0 +1,55 @@
---
title: "Python SDK"
type: docs
weight: 7
description: >
Python SDKs to connect to the MCP Toolbox server.
---
## Overview
The MCP Toolbox service provides a centralized way to manage and expose tools
(like API connectors, database query tools, etc.) for use by GenAI applications.
These Python SDKs act as clients for that service. They handle the communication needed to:
* Fetch tool definitions from your running Toolbox instance.
* Provide convenient Python objects or functions representing those tools.
* Invoke the tools (calling the underlying APIs/services configured in Toolbox).
* Handle authentication and parameter binding as needed.
By using these SDKs, you can easily leverage your Toolbox-managed tools directly
within your Python applications or AI orchestration frameworks.
## Which Package Should I Use?
Choosing the right package depends on how you are building your application:
* [`toolbox-langchain`](langchain):
Use this package if you are building your application using the LangChain or
LangGraph frameworks. It provides tools that are directly compatible with the
LangChain ecosystem (`BaseTool` interface), simplifying integration.
* [`toolbox-llamaindex`](llamaindex):
Use this package if you are building your application using the LlamaIndex framework.
It provides tools that are directly compatible with the
LlamaIndex ecosystem (`BaseTool` interface), simplifying integration.
* [`toolbox-core`](core):
Use this package if you are not using LangChain/LangGraph or any other
orchestration framework, or if you need a framework-agnostic way to interact
with Toolbox tools (e.g., for custom orchestration logic or direct use in
Python scripts).
## Available Packages
This repository hosts the following Python packages. See the package-specific
README for detailed installation and usage instructions:
| Package | Target Use Case | Integration | Path | Details (README) | PyPI Status |
| :------ | :---------- | :---------- | :---------------------- | :---------- | :--------- |
| `toolbox-core` | Framework-agnostic / Custom applications | Use directly / Custom | `packages/toolbox-core/` | 📄 [View README](https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-core/README.md) | ![pypi version](https://img.shields.io/pypi/v/toolbox-core.svg) |
| `toolbox-langchain` | LangChain / LangGraph applications | LangChain / LangGraph | `packages/toolbox-langchain/` | 📄 [View README](https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-langchain/README.md) | ![pypi version](https://img.shields.io/pypi/v/toolbox-langchain.svg) |
| `toolbox-llamaindex` | LlamaIndex applications | LlamaIndex | `packages/toolbox-llamaindex/` | 📄 [View README](https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-llamaindex/README.md) | ![pypi version](https://img.shields.io/pypi/v/toolbox-llamaindex.svg) |
[GitHub](https://github.com/googleapis/mcp-toolbox-sdk-python)

View File

@@ -1,6 +1,6 @@
{
"name": "mcp-toolbox-for-databases",
"version": "0.21.0",
"version": "0.22.0",
"description": "MCP Toolbox for Databases is an open-source MCP server for more than 30 different datasources.",
"contextFileName": "MCP-TOOLBOX-EXTENSION.md"
}

View File

@@ -175,7 +175,7 @@ tools:
list_schemas:
kind: postgres-list-schemas
source: alloydb-pg-source
list_indexes:
kind: postgres-list-indexes
source: alloydb-pg-source
@@ -200,6 +200,26 @@ tools:
kind: postgres-get-column-cardinality
source: alloydb-pg-source
list_publication_tables:
kind: postgres-list-publication-tables
source: alloydb-pg-source
list_tablespaces:
kind: postgres-list-tablespaces
source: alloydb-pg-source
list_pg_settings:
kind: postgres-list-pg-settings
source: alloydb-pg-source
list_database_stats:
kind: postgres-list-database-stats
source: alloydb-pg-source
list_roles:
kind: postgres-list-roles
source: alloydb-pg-source
toolsets:
alloydb_postgres_database_tools:
- execute_sql
@@ -224,3 +244,8 @@ toolsets:
- replication_stats
- list_query_stats
- get_column_cardinality
- list_publication_tables
- list_tablespaces
- list_pg_settings
- list_database_stats
- list_roles

View File

@@ -177,7 +177,7 @@ tools:
list_schemas:
kind: postgres-list-schemas
source: cloudsql-pg-source
database_overview:
kind: postgres-database-overview
source: cloudsql-pg-source
@@ -202,6 +202,26 @@ tools:
kind: postgres-get-column-cardinality
source: cloudsql-pg-source
list_publication_tables:
kind: postgres-list-publication-tables
source: cloudsql-pg-source
list_tablespaces:
kind: postgres-list-tablespaces
source: cloudsql-pg-source
list_pg_settings:
kind: postgres-list-pg-settings
source: cloudsql-pg-source
list_database_stats:
kind: postgres-list-database-stats
source: cloudsql-pg-source
list_roles:
kind: postgres-list-roles
source: cloudsql-pg-source
toolsets:
cloud_sql_postgres_database_tools:
- execute_sql
@@ -226,3 +246,8 @@ toolsets:
- replication_stats
- list_query_stats
- get_column_cardinality
- list_publication_tables
- list_tablespaces
- list_pg_settings
- list_database_stats
- list_roles

View File

@@ -176,7 +176,7 @@ tools:
list_schemas:
kind: postgres-list-schemas
source: postgresql-source
database_overview:
kind: postgres-database-overview
source: postgresql-source
@@ -201,6 +201,26 @@ tools:
kind: postgres-get-column-cardinality
source: postgresql-source
list_publication_tables:
kind: postgres-list-publication-tables
source: postgresql-source
list_tablespaces:
kind: postgres-list-tablespaces
source: postgresql-source
list_pg_settings:
kind: postgres-list-pg-settings
source: postgresql-source
list_database_stats:
kind: postgres-list-database-stats
source: postgresql-source
list_roles:
kind: postgres-list-roles
source: postgresql-source
toolsets:
postgres_database_tools:
- execute_sql
@@ -225,4 +245,8 @@ toolsets:
- replication_stats
- list_query_stats
- get_column_cardinality
- list_publication_tables
- list_tablespaces
- list_pg_settings
- list_database_stats
- list_roles

View File

@@ -28,9 +28,17 @@ tools:
cancel_batch:
kind: serverless-spark-cancel-batch
source: serverless-spark-source
create_pyspark_batch:
kind: serverless-spark-create-pyspark-batch
source: serverless-spark-source
create_spark_batch:
kind: serverless-spark-create-spark-batch
source: serverless-spark-source
toolsets:
serverless_spark_tools:
- list_batches
- get_batch
- cancel_batch
- create_pyspark_batch
- create_spark_batch

View File

@@ -0,0 +1,123 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package bigquerycommon
import (
"math/big"
"reflect"
"testing"
)
func TestNormalizeValue(t *testing.T) {
tests := []struct {
name string
input any
expected any
}{
{
name: "big.Rat 1/3 (NUMERIC scale 9)",
input: new(big.Rat).SetFrac64(1, 3), // 0.33333333333...
expected: "0.33333333333333333333333333333333333333", // FloatString(38)
},
{
name: "big.Rat 19/2 (9.5)",
input: new(big.Rat).SetFrac64(19, 2),
expected: "9.5",
},
{
name: "big.Rat 12341/10 (1234.1)",
input: new(big.Rat).SetFrac64(12341, 10),
expected: "1234.1",
},
{
name: "big.Rat 10/1 (10)",
input: new(big.Rat).SetFrac64(10, 1),
expected: "10",
},
{
name: "string",
input: "hello",
expected: "hello",
},
{
name: "int",
input: 123,
expected: 123,
},
{
name: "nested slice of big.Rat",
input: []any{
new(big.Rat).SetFrac64(19, 2),
new(big.Rat).SetFrac64(1, 4),
},
expected: []any{"9.5", "0.25"},
},
{
name: "nested map of big.Rat",
input: map[string]any{
"val1": new(big.Rat).SetFrac64(19, 2),
"val2": new(big.Rat).SetFrac64(1, 2),
},
expected: map[string]any{
"val1": "9.5",
"val2": "0.5",
},
},
{
name: "complex nested structure",
input: map[string]any{
"list": []any{
map[string]any{
"rat": new(big.Rat).SetFrac64(3, 2),
},
},
},
expected: map[string]any{
"list": []any{
map[string]any{
"rat": "1.5",
},
},
},
},
{
name: "slice of *big.Rat",
input: []*big.Rat{
new(big.Rat).SetFrac64(19, 2),
new(big.Rat).SetFrac64(1, 4),
},
expected: []any{"9.5", "0.25"},
},
{
name: "slice of strings",
input: []string{"a", "b"},
expected: []any{"a", "b"},
},
{
name: "byte slice (BYTES)",
input: []byte("hello"),
expected: []byte("hello"),
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
got := NormalizeValue(tt.input)
if !reflect.DeepEqual(got, tt.expected) {
t.Errorf("NormalizeValue() = %v, want %v", got, tt.expected)
}
})
}
}

View File

@@ -17,6 +17,8 @@ package bigquerycommon
import (
"context"
"fmt"
"math/big"
"reflect"
"sort"
"strings"
@@ -118,3 +120,54 @@ func InitializeDatasetParameters(
return projectParam, datasetParam
}
// NormalizeValue converts BigQuery specific types to standard JSON-compatible types.
// Specifically, it handles *big.Rat (used for NUMERIC/BIGNUMERIC) by converting
// them to decimal strings with up to 38 digits of precision, trimming trailing zeros.
// It recursively handles slices (arrays) and maps (structs) using reflection.
func NormalizeValue(v any) any {
if v == nil {
return nil
}
// Handle *big.Rat specifically.
if rat, ok := v.(*big.Rat); ok {
// Convert big.Rat to a decimal string.
// Use a precision of 38 digits (enough for BIGNUMERIC and NUMERIC)
// and trim trailing zeros to match BigQuery's behavior.
s := rat.FloatString(38)
if strings.Contains(s, ".") {
s = strings.TrimRight(s, "0")
s = strings.TrimRight(s, ".")
}
return s
}
// Use reflection for slices and maps to handle various underlying types.
rv := reflect.ValueOf(v)
switch rv.Kind() {
case reflect.Slice, reflect.Array:
// Preserve []byte as is, so json.Marshal encodes it as Base64 string (BigQuery BYTES behavior).
if rv.Type().Elem().Kind() == reflect.Uint8 {
return v
}
newSlice := make([]any, rv.Len())
for i := 0; i < rv.Len(); i++ {
newSlice[i] = NormalizeValue(rv.Index(i).Interface())
}
return newSlice
case reflect.Map:
// Ensure keys are strings to produce a JSON-compatible map.
if rv.Type().Key().Kind() != reflect.String {
return v
}
newMap := make(map[string]any, rv.Len())
iter := rv.MapRange()
for iter.Next() {
newMap[iter.Key().String()] = NormalizeValue(iter.Value().Interface())
}
return newMap
}
return v
}
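To make the effect concrete, here is a small standalone sketch using plain `encoding/json` and mirroring the trimming logic above (it does not import this internal package): `*big.Rat` satisfies `encoding.TextMarshaler`, so a NUMERIC value such as 19/2 would otherwise be serialized as the fraction string "19/2" rather than the decimal "9.5".
```go
package main

import (
	"encoding/json"
	"fmt"
	"math/big"
	"strings"
)

func main() {
	row := map[string]any{"price": new(big.Rat).SetFrac64(19, 2)}

	// Without normalization, *big.Rat marshals via MarshalText as a fraction.
	raw, _ := json.Marshal(row)
	fmt.Println(string(raw)) // {"price":"19/2"}

	// Mimic NormalizeValue: decimal string with trailing zeros (and dot) trimmed.
	s := row["price"].(*big.Rat).FloatString(38)
	s = strings.TrimRight(strings.TrimRight(s, "0"), ".")
	row["price"] = s

	norm, _ := json.Marshal(row)
	fmt.Println(string(norm)) // {"price":"9.5"}
}
```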

View File

@@ -337,7 +337,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
schema := it.Schema
row := orderedmap.Row{}
for i, field := range schema {
row.Add(field.Name, val[i])
row.Add(field.Name, bqutil.NormalizeValue(val[i]))
}
out = append(out, row)
}

View File

@@ -274,7 +274,7 @@ func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, para
}
vMap := make(map[string]any)
for key, value := range row {
vMap[key] = value
vMap[key] = bqutil.NormalizeValue(value)
}
out = append(out, vMap)
}

View File

@@ -54,7 +54,7 @@ type Config struct {
Description string `yaml:"description" validate:"required"`
Database string `yaml:"database" validate:"required"`
Collection string `yaml:"collection" validate:"required"`
Canonical bool `yaml:"canonical" validate:"required"` //i want to force the user to choose
Canonical bool `yaml:"canonical"`
}
// validate interface
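Dropping the `validate:"required"` tag makes `canonical` genuinely optional: under the usual semantics of a `required` rule, a `false` boolean counts as missing, so the old tag effectively rejected `canonical: false` (hence the new "false canonical" test cases below). A standalone sketch of the resulting behavior, using a local struct rather than the real tool `Config`:
```go
package main

import (
	"fmt"

	yaml "github.com/goccy/go-yaml"
)

// cfg mirrors only the relevant field; it is not the actual tool Config.
type cfg struct {
	Canonical bool `yaml:"canonical"`
}

func main() {
	var omitted, explicit cfg

	// Field omitted: unmarshals to Go's zero value, false.
	if err := yaml.Unmarshal([]byte(`{}`), &omitted); err != nil {
		panic(err)
	}

	// Field set explicitly to false: now accepted as a normal optional flag.
	if err := yaml.Unmarshal([]byte("canonical: false"), &explicit); err != nil {
		panic(err)
	}

	fmt.Println(omitted.Canonical, explicit.Canonical) // false false
}
```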

View File

@@ -39,6 +39,30 @@ func TestParseFromYamlMongoQuery(t *testing.T) {
{
desc: "basic example",
in: `
tools:
example_tool:
kind: mongodb-insert-many
source: my-instance
description: some description
database: test_db
collection: test_coll
`,
want: server.ToolConfigs{
"example_tool": mongodbinsertmany.Config{
Name: "example_tool",
Kind: "mongodb-insert-many",
Source: "my-instance",
AuthRequired: []string{},
Database: "test_db",
Collection: "test_coll",
Description: "some description",
Canonical: false,
},
},
},
{
desc: "true canonical",
in: `
tools:
example_tool:
kind: mongodb-insert-many
@@ -61,6 +85,31 @@ func TestParseFromYamlMongoQuery(t *testing.T) {
},
},
},
{
desc: "false canonical",
in: `
tools:
example_tool:
kind: mongodb-insert-many
source: my-instance
description: some description
database: test_db
collection: test_coll
canonical: false
`,
want: server.ToolConfigs{
"example_tool": mongodbinsertmany.Config{
Name: "example_tool",
Kind: "mongodb-insert-many",
Source: "my-instance",
AuthRequired: []string{},
Database: "test_db",
Collection: "test_coll",
Description: "some description",
Canonical: false,
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {

View File

@@ -54,7 +54,7 @@ type Config struct {
Description string `yaml:"description" validate:"required"`
Database string `yaml:"database" validate:"required"`
Collection string `yaml:"collection" validate:"required"`
Canonical bool `yaml:"canonical" validate:"required"` //i want to force the user to choose
Canonical bool `yaml:"canonical"`
}
// validate interface

View File

@@ -39,6 +39,30 @@ func TestParseFromYamlMongoQuery(t *testing.T) {
{
desc: "basic example",
in: `
tools:
example_tool:
kind: mongodb-insert-one
source: my-instance
description: some description
database: test_db
collection: test_coll
`,
want: server.ToolConfigs{
"example_tool": mongodbinsertone.Config{
Name: "example_tool",
Kind: "mongodb-insert-one",
Source: "my-instance",
AuthRequired: []string{},
Database: "test_db",
Collection: "test_coll",
Canonical: false,
Description: "some description",
},
},
},
{
desc: "true canonical",
in: `
tools:
example_tool:
kind: mongodb-insert-one
@@ -61,6 +85,31 @@ func TestParseFromYamlMongoQuery(t *testing.T) {
},
},
},
{
desc: "false canonical",
in: `
tools:
example_tool:
kind: mongodb-insert-one
source: my-instance
description: some description
database: test_db
collection: test_coll
canonical: false
`,
want: server.ToolConfigs{
"example_tool": mongodbinsertone.Config{
Name: "example_tool",
Kind: "mongodb-insert-one",
Source: "my-instance",
AuthRequired: []string{},
Database: "test_db",
Collection: "test_coll",
Canonical: false,
Description: "some description",
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {

View File

@@ -56,7 +56,7 @@ type Config struct {
FilterParams parameters.Parameters `yaml:"filterParams"`
UpdatePayload string `yaml:"updatePayload" validate:"required"`
UpdateParams parameters.Parameters `yaml:"updateParams" validate:"required"`
Canonical bool `yaml:"canonical" validate:"required"`
Canonical bool `yaml:"canonical"`
Upsert bool `yaml:"upsert"`
}

View File

@@ -40,6 +40,62 @@ func TestParseFromYamlMongoQuery(t *testing.T) {
{
desc: "basic example",
in: `
tools:
example_tool:
kind: mongodb-update-many
source: my-instance
description: some description
database: test_db
collection: test_coll
filterPayload: |
{ name: {{json .name}} }
filterParams:
- name: name
type: string
description: small description
updatePayload: |
{ $set: { name: {{json .name}} } }
updateParams:
- name: name
type: string
description: small description
`,
want: server.ToolConfigs{
"example_tool": mongodbupdatemany.Config{
Name: "example_tool",
Kind: "mongodb-update-many",
Source: "my-instance",
AuthRequired: []string{},
Database: "test_db",
Collection: "test_coll",
FilterPayload: "{ name: {{json .name}} }\n",
FilterParams: parameters.Parameters{
&parameters.StringParameter{
CommonParameter: parameters.CommonParameter{
Name: "name",
Type: "string",
Desc: "small description",
},
},
},
UpdatePayload: "{ $set: { name: {{json .name}} } }\n",
UpdateParams: parameters.Parameters{
&parameters.StringParameter{
CommonParameter: parameters.CommonParameter{
Name: "name",
Type: "string",
Desc: "small description",
},
},
},
Description: "some description",
Canonical: false,
},
},
},
{
desc: "true canonical",
in: `
tools:
example_tool:
kind: mongodb-update-many
@@ -94,6 +150,63 @@ func TestParseFromYamlMongoQuery(t *testing.T) {
},
},
},
{
desc: "false canonical",
in: `
tools:
example_tool:
kind: mongodb-update-many
source: my-instance
description: some description
database: test_db
collection: test_coll
filterPayload: |
{ name: {{json .name}} }
filterParams:
- name: name
type: string
description: small description
canonical: false
updatePayload: |
{ $set: { name: {{json .name}} } }
updateParams:
- name: name
type: string
description: small description
`,
want: server.ToolConfigs{
"example_tool": mongodbupdatemany.Config{
Name: "example_tool",
Kind: "mongodb-update-many",
Source: "my-instance",
AuthRequired: []string{},
Database: "test_db",
Collection: "test_coll",
FilterPayload: "{ name: {{json .name}} }\n",
FilterParams: parameters.Parameters{
&parameters.StringParameter{
CommonParameter: parameters.CommonParameter{
Name: "name",
Type: "string",
Desc: "small description",
},
},
},
UpdatePayload: "{ $set: { name: {{json .name}} } }\n",
UpdateParams: parameters.Parameters{
&parameters.StringParameter{
CommonParameter: parameters.CommonParameter{
Name: "name",
Type: "string",
Desc: "small description",
},
},
},
Description: "some description",
Canonical: false,
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {

View File

@@ -57,7 +57,7 @@ type Config struct {
UpdatePayload string `yaml:"updatePayload" validate:"required"`
UpdateParams parameters.Parameters `yaml:"updateParams" validate:"required"`
Canonical bool `yaml:"canonical" validate:"required"`
Canonical bool `yaml:"canonical"`
Upsert bool `yaml:"upsert"`
}

View File

@@ -40,6 +40,123 @@ func TestParseFromYamlMongoQuery(t *testing.T) {
{
desc: "basic example",
in: `
tools:
example_tool:
kind: mongodb-update-one
source: my-instance
description: some description
database: test_db
collection: test_coll
filterPayload: |
{ name: {{json .name}} }
filterParams:
- name: name
type: string
description: small description
updatePayload: |
{ $set : { item: {{json .item}} } }
updateParams:
- name: item
type: string
description: small description
upsert: true
`,
want: server.ToolConfigs{
"example_tool": mongodbupdateone.Config{
Name: "example_tool",
Kind: "mongodb-update-one",
Source: "my-instance",
AuthRequired: []string{},
Database: "test_db",
Collection: "test_coll",
Canonical: false,
FilterPayload: "{ name: {{json .name}} }\n",
FilterParams: parameters.Parameters{
&parameters.StringParameter{
CommonParameter: parameters.CommonParameter{
Name: "name",
Type: "string",
Desc: "small description",
},
},
},
UpdatePayload: "{ $set : { item: {{json .item}} } }\n",
UpdateParams: parameters.Parameters{
&parameters.StringParameter{
CommonParameter: parameters.CommonParameter{
Name: "item",
Type: "string",
Desc: "small description",
},
},
},
Upsert: true,
Description: "some description",
},
},
},
{
desc: "false canonical",
in: `
tools:
example_tool:
kind: mongodb-update-one
source: my-instance
description: some description
database: test_db
collection: test_coll
filterPayload: |
{ name: {{json .name}} }
filterParams:
- name: name
type: string
description: small description
updatePayload: |
{ $set : { item: {{json .item}} } }
updateParams:
- name: item
type: string
description: small description
canonical: false
upsert: true
`,
want: server.ToolConfigs{
"example_tool": mongodbupdateone.Config{
Name: "example_tool",
Kind: "mongodb-update-one",
Source: "my-instance",
AuthRequired: []string{},
Database: "test_db",
Collection: "test_coll",
Canonical: false,
FilterPayload: "{ name: {{json .name}} }\n",
FilterParams: parameters.Parameters{
&parameters.StringParameter{
CommonParameter: parameters.CommonParameter{
Name: "name",
Type: "string",
Desc: "small description",
},
},
},
UpdatePayload: "{ $set : { item: {{json .item}} } }\n",
UpdateParams: parameters.Parameters{
&parameters.StringParameter{
CommonParameter: parameters.CommonParameter{
Name: "item",
Type: "string",
Desc: "small description",
},
},
},
Upsert: true,
Description: "some description",
},
},
},
{
desc: "true canonical",
in: `
tools:
example_tool:
kind: mongodb-update-one

View File

@@ -98,11 +98,10 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
allParameters := parameters.Parameters{}
description := cfg.Description
if description == "" {
description = "Fetches the current state of the PostgreSQL server, returning the version, whether it's a replica, uptime duration, maximum connection limit, number of current connections, number of active connections, and the percentage of connections in use."
if cfg.Description == "" {
cfg.Description = "Fetches the current state of the PostgreSQL server, returning the version, whether it's a replica, uptime duration, maximum connection limit, number of current connections, number of active connections, and the percentage of connections in use."
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters, nil)
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters, nil)
// finish tool setup
return Tool{
@@ -134,7 +133,13 @@ func (t Tool) ToConfig() tools.ToolConfig {
}
func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
sliceParams := params.AsSlice()
paramsMap := params.AsMap()
newParams, err := parameters.GetParams(t.allParams, paramsMap)
if err != nil {
return nil, fmt.Errorf("unable to extract standard params %w", err)
}
sliceParams := newParams.AsSlice()
results, err := t.pool.Query(ctx, databaseOverviewStatement, sliceParams...)
if err != nil {

View File

@@ -0,0 +1,276 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package postgreslistdatabasestats
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/sources/alloydbpg"
"github.com/googleapis/genai-toolbox/internal/sources/cloudsqlpg"
"github.com/googleapis/genai-toolbox/internal/sources/postgres"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util/parameters"
"github.com/jackc/pgx/v5/pgxpool"
)
const kind string = "postgres-list-database-stats"
// SQL query to list database statistics
const listDatabaseStats = `
WITH database_stats AS (
SELECT
s.datname AS database_name,
-- Database Metadata
d.datallowconn AS is_connectable,
pg_get_userbyid(d.datdba) AS database_owner,
ts.spcname AS default_tablespace,
-- Cache Performance
CASE
WHEN (s.blks_hit + s.blks_read) = 0 THEN 0
ELSE round((s.blks_hit * 100.0) / (s.blks_hit + s.blks_read), 2)
END AS cache_hit_ratio_percent,
s.blks_read AS blocks_read_from_disk,
s.blks_hit AS blocks_hit_in_cache,
-- Transaction Throughput
s.xact_commit,
s.xact_rollback,
round(s.xact_rollback * 100.0 / (s.xact_commit + s.xact_rollback + 1), 2) AS rollback_ratio_percent,
-- Tuple Activity
s.tup_returned AS rows_returned_by_queries,
s.tup_fetched AS rows_fetched_by_scans,
s.tup_inserted,
s.tup_updated,
s.tup_deleted,
-- Temporary File Usage
s.temp_files,
s.temp_bytes AS temp_size_bytes,
-- Conflicts & Deadlocks
s.conflicts,
s.deadlocks,
-- General Info
s.numbackends AS active_connections,
s.stats_reset AS statistics_last_reset,
pg_database_size(s.datid) AS database_size_bytes
FROM
pg_stat_database s
JOIN
pg_database d ON d.oid = s.datid
JOIN
pg_tablespace ts ON ts.oid = d.dattablespace
WHERE
-- Exclude cloudsql internal databases
s.datname NOT IN ('cloudsqladmin')
-- Exclude template databases if not requested
AND ( $2::boolean IS TRUE OR d.datistemplate IS FALSE )
)
SELECT *
FROM database_stats
WHERE
($1::text IS NULL OR database_name LIKE '%' || $1::text || '%')
AND ($3::text IS NULL OR database_owner LIKE '%' || $3::text || '%')
AND ($4::text IS NULL OR default_tablespace LIKE '%' || $4::text || '%')
ORDER BY
CASE WHEN $5::text = 'size' THEN database_size_bytes END DESC,
CASE WHEN $5::text = 'commit' THEN xact_commit END DESC,
database_name
LIMIT COALESCE($6::int, 10);
`
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type compatibleSource interface {
PostgresPool() *pgxpool.Pool
}
// validate compatible sources are still compatible
var _ compatibleSource = &alloydbpg.Source{}
var _ compatibleSource = &cloudsqlpg.Source{}
var _ compatibleSource = &postgres.Source{}
var compatibleSources = [...]string{alloydbpg.SourceKind, cloudsqlpg.SourceKind, postgres.SourceKind}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(compatibleSource)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
}
allParameters := parameters.Parameters{
parameters.NewStringParameterWithDefault("database_name", "", "Optional: A specific database name pattern to search for."),
parameters.NewBooleanParameterWithDefault("include_templates", false, "Optional: Whether to include template databases in the results."),
parameters.NewStringParameterWithDefault("database_owner", "", "Optional: A specific database owner name pattern to search for."),
parameters.NewStringParameterWithDefault("default_tablespace", "", "Optional: A specific default tablespace name pattern to search for."),
parameters.NewStringParameterWithDefault("order_by", "", "Optional: The field to order the results by. Valid values are 'size' and 'commit'."),
parameters.NewIntParameterWithDefault("limit", 10, "Optional: The maximum number of rows to return."),
}
description := cfg.Description
if description == "" {
description =
"Lists the key performance and activity statistics for each PostgreSQL database" +
"in the instance, offering insights into cache efficiency, transaction throughput" +
"row-level activity, temporary file " +
"usage, and contention. " +
"It returns: the database name, whether the database is connectable, " +
"database owner, default tablespace name, the percentage of data blocks " +
"found in the buffer cache rather than being read from disk (a higher " +
"value indicates better cache performance), the total number of disk " +
"blocks read from disk, the total number of times disk blocks were found " +
"already in the cache; the total number of committed transactions, the " +
"total number of rolled back transactions, the percentage of rolled back " +
"transactions compared to the total number of completed transactions, the " +
"total number of rows returned by queries, the total number of live rows " +
"fetched by scans, the total number of rows inserted, the total number " +
"of rows updated, the total number of rows deleted, the number of " +
"temporary files created by queries, the total size of all temporary " +
"files created by queries in bytes, the number of query cancellations due " +
"to conflicts with recovery, the number of deadlocks detected, the current " +
"number of active connections to the database, the timestamp of the " +
"last statistics reset, and total database size in bytes."
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters, nil)
// finish tool setup
return Tool{
Config: cfg,
allParams: allParameters,
pool: s.PostgresPool(),
manifest: tools.Manifest{
Description: cfg.Description,
Parameters: allParameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Config
allParams parameters.Parameters `yaml:"allParams"`
pool *pgxpool.Pool
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
paramsMap := params.AsMap()
newParams, err := parameters.GetParams(t.allParams, paramsMap)
if err != nil {
return nil, fmt.Errorf("unable to extract standard params %w", err)
}
sliceParams := newParams.AsSlice()
results, err := t.pool.Query(ctx, listDatabaseStats, sliceParams...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
fields := results.FieldDescriptions()
var out []map[string]any
for results.Next() {
values, err := results.Values()
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
rowMap := make(map[string]any)
for i, field := range fields {
rowMap[string(field.Name)] = values[i]
}
out = append(out, rowMap)
}
// this will catch actual query execution errors
if err := results.Err(); err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
return out, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
return parameters.ParseParams(t.allParams, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
}
func (t Tool) RequiresClientAuthorization(resourceMgr tools.SourceProvider) bool {
return false
}
func (t Tool) ToConfig() tools.ToolConfig {
return t.Config
}
func (t Tool) GetAuthTokenHeaderName() string {
return "Authorization"
}

View File

@@ -0,0 +1,95 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package postgreslistdatabasestats_test
import (
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
"github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistdatabasestats"
)
func TestParseFromYamlPostgresListDatabaseStats(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: postgres-list-database-stats
source: my-postgres-instance
description: some description
authRequired:
- my-google-auth-service
- other-auth-service
`,
want: server.ToolConfigs{
"example_tool": postgreslistdatabasestats.Config{
Name: "example_tool",
Kind: "postgres-list-database-stats",
Source: "my-postgres-instance",
Description: "some description",
AuthRequired: []string{"my-google-auth-service", "other-auth-service"},
},
},
},
{
desc: "basic example",
in: `
tools:
example_tool:
kind: postgres-list-database-stats
source: my-postgres-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": postgreslistdatabasestats.Config{
Name: "example_tool",
Kind: "postgres-list-database-stats",
Source: "my-postgres-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}

View File

@@ -59,7 +59,8 @@ const listIndexesStatement = `
ON i.oid = s.indexrelid
WHERE
t.relkind = 'r'
AND s.schemaname NOT IN ('pg_catalog', 'information_schema')
AND s.schemaname NOT IN ('pg_catalog', 'information_schema', 'pg_toast')
AND s.schemaname NOT LIKE 'pg_temp_%'
)
SELECT *
FROM IndexDetails
@@ -67,11 +68,12 @@ const listIndexesStatement = `
($1::text IS NULL OR schema_name LIKE '%' || $1 || '%')
AND ($2::text IS NULL OR table_name LIKE '%' || $2 || '%')
AND ($3::text IS NULL OR index_name LIKE '%' || $3 || '%')
AND ($4::boolean IS NOT TRUE OR is_used IS FALSE)
ORDER BY
schema_name,
table_name,
index_name
LIMIT COALESCE($4::int, 50);
LIMIT COALESCE($5::int, 50);
`
func init() {
@@ -131,13 +133,14 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
parameters.NewStringParameterWithDefault("schema_name", "", "Optional: a text to filter results by schema name. The input is used within a LIKE clause."),
parameters.NewStringParameterWithDefault("table_name", "", "Optional: a text to filter results by table name. The input is used within a LIKE clause."),
parameters.NewStringParameterWithDefault("index_name", "", "Optional: a text to filter results by index name. The input is used within a LIKE clause."),
parameters.NewBooleanParameterWithDefault("only_unused", false, "Optional: If true, only returns indexes that have never been used."),
parameters.NewIntParameterWithDefault("limit", 50, "Optional: The maximum number of rows to return. Default is 50"),
}
description := cfg.Description
if description == "" {
description = "Lists available user indexes in the database, excluding system schemas (pg_catalog, information_schema). For each index, the following properties are returned: schema name, table name, index name, index type (access method), a boolean indicating if it's a unique index, a boolean indicating if it's for a primary key, the index definition, index size in bytes, the number of index scans, the number of index tuples read, the number of table tuples fetched via index scans, and a boolean indicating if the index has been used at least once."
if cfg.Description == "" {
cfg.Description = "Lists available user indexes in the database, excluding system schemas (pg_catalog, information_schema). For each index, the following properties are returned: schema name, table name, index name, index type (access method), a boolean indicating if it's a unique index, a boolean indicating if it's for a primary key, the index definition, index size in bytes, the number of index scans, the number of index tuples read, the number of table tuples fetched via index scans, and a boolean indicating if the index has been used at least once."
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters, nil)
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters, nil)
// finish tool setup
return Tool{
@@ -169,7 +172,13 @@ func (t Tool) ToConfig() tools.ToolConfig {
}
func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
sliceParams := params.AsSlice()
paramsMap := params.AsMap()
newParams, err := parameters.GetParams(t.allParams, paramsMap)
if err != nil {
return nil, fmt.Errorf("unable to extract standard params %w", err)
}
sliceParams := newParams.AsSlice()
results, err := t.pool.Query(ctx, listIndexesStatement, sliceParams...)
if err != nil {

View File

@@ -0,0 +1,204 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package postgreslistpgsettings
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/sources/alloydbpg"
"github.com/googleapis/genai-toolbox/internal/sources/cloudsqlpg"
"github.com/googleapis/genai-toolbox/internal/sources/postgres"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util/parameters"
"github.com/jackc/pgx/v5/pgxpool"
)
const kind string = "postgres-list-pg-settings"
const listPgSettingsStatement = `
SELECT
name,
setting AS current_value,
unit,
short_desc,
source,
CASE context
WHEN 'postmaster' THEN 'Yes'
WHEN 'sighup' THEN 'No (Reload sufficient)'
ELSE 'No'
END
AS requires_restart
FROM pg_settings
WHERE ($1::text IS NULL OR name LIKE '%' || $1::text || '%')
ORDER BY name
LIMIT COALESCE($2::int, 50);
`
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type compatibleSource interface {
PostgresPool() *pgxpool.Pool
}
// validate compatible sources are still compatible
var _ compatibleSource = &alloydbpg.Source{}
var _ compatibleSource = &cloudsqlpg.Source{}
var _ compatibleSource = &postgres.Source{}
var compatibleSources = [...]string{alloydbpg.SourceKind, cloudsqlpg.SourceKind, postgres.SourceKind}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(compatibleSource)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
}
allParameters := parameters.Parameters{
parameters.NewStringParameterWithDefault("setting_name", "", "Optional: A specific configuration parameter name pattern to search for."),
parameters.NewIntParameterWithDefault("limit", 50, "Optional: The maximum number of rows to return."),
}
description := cfg.Description
if description == "" {
description = "Lists configuration parameters for the postgres server ordered lexicographically, with a default limit of 50 rows. It returns the parameter name, its current setting, unit of measurement, a short description, the source of the current setting (e.g., default, configuration file, session), and whether a restart is required when the parameter value is changed."
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters, nil)
// finish tool setup
return Tool{
Config: cfg,
allParams: allParameters,
pool: s.PostgresPool(),
manifest: tools.Manifest{
Description: cfg.Description,
Parameters: allParameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Config
allParams parameters.Parameters `yaml:"allParams"`
pool *pgxpool.Pool
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
paramsMap := params.AsMap()
newParams, err := parameters.GetParams(t.allParams, paramsMap)
if err != nil {
return nil, fmt.Errorf("unable to extract standard params %w", err)
}
sliceParams := newParams.AsSlice()
results, err := t.pool.Query(ctx, listPgSettingsStatement, sliceParams...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
fields := results.FieldDescriptions()
var out []map[string]any
for results.Next() {
values, err := results.Values()
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
rowMap := make(map[string]any)
for i, field := range fields {
rowMap[string(field.Name)] = values[i]
}
out = append(out, rowMap)
}
// this will catch actual query execution errors
if err := results.Err(); err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
return out, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
return parameters.ParseParams(t.allParams, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
}
func (t Tool) RequiresClientAuthorization(resourceMgr tools.SourceProvider) bool {
return false
}
func (t Tool) ToConfig() tools.ToolConfig {
return t.Config
}
func (t Tool) GetAuthTokenHeaderName() string {
return "Authorization"
}

View File

@@ -0,0 +1,95 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package postgreslistpgsettings_test
import (
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
"github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistpgsettings"
)
func TestParseFromYamlPostgreslistPgSettings(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: postgres-list-pg-settings
source: my-postgres-instance
description: some description
authRequired:
- my-google-auth-service
- other-auth-service
`,
want: server.ToolConfigs{
"example_tool": postgreslistpgsettings.Config{
Name: "example_tool",
Kind: "postgres-list-pg-settings",
Source: "my-postgres-instance",
Description: "some description",
AuthRequired: []string{"my-google-auth-service", "other-auth-service"},
},
},
},
{
desc: "basic example",
in: `
tools:
example_tool:
kind: postgres-list-pg-settings
source: my-postgres-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": postgreslistpgsettings.Config{
Name: "example_tool",
Kind: "postgres-list-pg-settings",
Source: "my-postgres-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}

View File

@@ -0,0 +1,216 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package postgreslistpublicationtables
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/sources/alloydbpg"
"github.com/googleapis/genai-toolbox/internal/sources/cloudsqlpg"
"github.com/googleapis/genai-toolbox/internal/sources/postgres"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util/parameters"
"github.com/jackc/pgx/v5/pgxpool"
)
const kind string = "postgres-list-publication-tables"
const listPublicationTablesStatement = `
WITH
publication_details AS (
SELECT
pt.pubname AS publication_name,
pt.schemaname AS schema_name,
pt.tablename AS table_name,
-- Definition details
p.puballtables AS publishes_all_tables,
p.pubinsert AS publishes_inserts,
p.pubupdate AS publishes_updates,
p.pubdelete AS publishes_deletes,
p.pubtruncate AS publishes_truncates,
-- Owner information
pg_catalog.pg_get_userbyid(p.pubowner) AS publication_owner
FROM pg_catalog.pg_publication_tables pt
JOIN pg_catalog.pg_publication p
ON pt.pubname = p.pubname
)
SELECT *
FROM publication_details
WHERE
(NULLIF(TRIM($1::text), '') IS NULL OR table_name = ANY(regexp_split_to_array(TRIM($1::text), '\s*,\s*')))
AND (NULLIF(TRIM($2::text), '') IS NULL OR publication_name = ANY(regexp_split_to_array(TRIM($2::text), '\s*,\s*')))
AND (NULLIF(TRIM($3::text), '') IS NULL OR schema_name = ANY(regexp_split_to_array(TRIM($3::text), '\s*,\s*')))
ORDER BY
publication_name, schema_name, table_name
LIMIT COALESCE($4::int, 50);
`
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type compatibleSource interface {
PostgresPool() *pgxpool.Pool
}
// validate compatible sources are still compatible
var _ compatibleSource = &alloydbpg.Source{}
var _ compatibleSource = &cloudsqlpg.Source{}
var _ compatibleSource = &postgres.Source{}
var compatibleSources = [...]string{alloydbpg.SourceKind, cloudsqlpg.SourceKind, postgres.SourceKind}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(compatibleSource)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
}
allParameters := parameters.Parameters{
parameters.NewStringParameterWithDefault("table_names", "", "Optional: Filters by a comma-separated list of table names."),
parameters.NewStringParameterWithDefault("publication_names", "", "Optional: Filters by a comma-separated list of publication names."),
parameters.NewStringParameterWithDefault("schema_names", "", "Optional: Filters by a comma-separated list of schema names."),
parameters.NewIntParameterWithDefault("limit", 50, "Optional: The maximum number of rows to return."),
}
description := cfg.Description
if description == "" {
description = "Lists all publication tables in the database. Returns the publication name, schema name, and table name, along with definition details indicating if it publishes all tables, whether it replicates inserts, updates, deletes, or truncates, and the publication owner."
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters, nil)
// finish tool setup
return Tool{
Config: cfg,
allParams: allParameters,
pool: s.PostgresPool(),
manifest: tools.Manifest{
Description: cfg.Description,
Parameters: allParameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Config
allParams parameters.Parameters `yaml:"allParams"`
pool *pgxpool.Pool
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
paramsMap := params.AsMap()
newParams, err := parameters.GetParams(t.allParams, paramsMap)
if err != nil {
return nil, fmt.Errorf("unable to extract standard params %w", err)
}
sliceParams := newParams.AsSlice()
results, err := t.pool.Query(ctx, listPublicationTablesStatement, sliceParams...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
fields := results.FieldDescriptions()
var out []map[string]any
for results.Next() {
values, err := results.Values()
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
rowMap := make(map[string]any)
for i, field := range fields {
rowMap[string(field.Name)] = values[i]
}
out = append(out, rowMap)
}
if err := results.Err(); err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
return out, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
return parameters.ParseParams(t.allParams, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
}
func (t Tool) RequiresClientAuthorization(resourceMgr tools.SourceProvider) bool {
return false
}
func (t Tool) ToConfig() tools.ToolConfig {
return t.Config
}
func (t Tool) GetAuthTokenHeaderName() string {
return "Authorization"
}

View File

@@ -0,0 +1,95 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package postgreslistpublicationtables_test
import (
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
"github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistpublicationtables"
)
func TestParseFromYamlPostgresListPublicationTables(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: postgres-list-publication-tables
source: my-postgres-instance
description: some description
authRequired:
- my-google-auth-service
- other-auth-service
`,
want: server.ToolConfigs{
"example_tool": postgreslistpublicationtables.Config{
Name: "example_tool",
Kind: "postgres-list-publication-tables",
Source: "my-postgres-instance",
Description: "some description",
AuthRequired: []string{"my-google-auth-service", "other-auth-service"},
},
},
},
{
desc: "basic example",
in: `
tools:
example_tool:
kind: postgres-list-publication-tables
source: my-postgres-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": postgreslistpublicationtables.Config{
Name: "example_tool",
Kind: "postgres-list-publication-tables",
Source: "my-postgres-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}

View File

@@ -0,0 +1,228 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package postgreslistroles
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/sources/alloydbpg"
"github.com/googleapis/genai-toolbox/internal/sources/cloudsqlpg"
"github.com/googleapis/genai-toolbox/internal/sources/postgres"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util/parameters"
"github.com/jackc/pgx/v5/pgxpool"
)
const kind string = "postgres-list-roles"
const listRolesStatement = `
WITH RoleDetails AS (
SELECT
r.rolname AS role_name,
r.oid AS oid,
r.rolconnlimit AS connection_limit,
r.rolsuper AS is_superuser,
r.rolinherit AS inherits_privileges,
r.rolcreaterole AS can_create_roles,
r.rolcreatedb AS can_create_db,
r.rolcanlogin AS can_login,
r.rolreplication AS is_replication_role,
r.rolbypassrls AS bypass_rls,
r.rolvaliduntil AS valid_until,
-- List of roles that belong to this role (Direct Members)
ARRAY(
SELECT m_r.rolname
FROM pg_auth_members pam
JOIN pg_roles m_r ON pam.member = m_r.oid
WHERE pam.roleid = r.oid
) AS direct_members,
-- List of roles that this role belongs to (Member Of)
ARRAY(
SELECT g_r.rolname
FROM pg_auth_members pam
JOIN pg_roles g_r ON pam.roleid = g_r.oid
WHERE pam.member = r.oid
) AS member_of
FROM pg_roles r
-- Exclude system and internal roles
WHERE r.rolname NOT LIKE 'cloudsql%'
AND r.rolname NOT LIKE 'alloydb_%'
AND r.rolname NOT LIKE 'pg_%'
)
SELECT *
FROM RoleDetails
WHERE
($1::text IS NULL OR role_name LIKE '%' || $1 || '%')
ORDER BY role_name
LIMIT COALESCE($2::int, 50);
`
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type compatibleSource interface {
PostgresPool() *pgxpool.Pool
}
// validate compatible sources are still compatible
var _ compatibleSource = &alloydbpg.Source{}
var _ compatibleSource = &cloudsqlpg.Source{}
var _ compatibleSource = &postgres.Source{}
var compatibleSources = [...]string{alloydbpg.SourceKind, cloudsqlpg.SourceKind, postgres.SourceKind}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(compatibleSource)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
}
allParameters := parameters.Parameters{
parameters.NewStringParameterWithDefault("role_name", "", "Optional: a text to filter results by role name. The input is used within a LIKE clause."),
parameters.NewIntParameterWithDefault("limit", 50, "Optional: The maximum number of rows to return. Default is 10"),
}
description := cfg.Description
if description == "" {
description = "Lists all the user-created roles in the instance . It returns the role name, Object ID, the maximum number of concurrent connections the role can make, along with boolean indicators for: superuser status, privilege inheritance from member roles, ability to create roles, ability to create databases, ability to log in, replication privilege, and the ability to bypass row-level security, the password expiration timestamp, a list of direct members belonging to this role, and a list of other roles/groups that this role is a member of."
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters, nil)
// finish tool setup
return Tool{
Config: cfg,
allParams: allParameters,
pool: s.PostgresPool(),
manifest: tools.Manifest{
Description: description,
Parameters: allParameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Config
allParams parameters.Parameters `yaml:"allParams"`
pool *pgxpool.Pool
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) ToConfig() tools.ToolConfig {
return t.Config
}
func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
paramsMap := params.AsMap()
newParams, err := parameters.GetParams(t.allParams, paramsMap)
if err != nil {
return nil, fmt.Errorf("unable to extract standard params %w", err)
}
sliceParams := newParams.AsSlice()
results, err := t.pool.Query(ctx, listRolesStatement, sliceParams...)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
fields := results.FieldDescriptions()
var out []map[string]any
for results.Next() {
values, err := results.Values()
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
rowMap := make(map[string]any)
for i, field := range fields {
rowMap[string(field.Name)] = values[i]
}
out = append(out, rowMap)
}
// this will catch actual query execution errors
if err := results.Err(); err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
return out, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
return parameters.ParseParams(t.allParams, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
}
func (t Tool) RequiresClientAuthorization(resourceMgr tools.SourceProvider) bool {
return false
}
func (t Tool) GetAuthTokenHeaderName() string {
return "Authorization"
}

View File

@@ -0,0 +1,95 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package postgreslistroles_test
import (
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
"github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistroles"
)
func TestParseFromYamlPostgresListRoles(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: postgres-list-roles
source: my-postgres-instance
description: some description
authRequired:
- my-google-auth-service
- other-auth-service
`,
want: server.ToolConfigs{
"example_tool": postgreslistroles.Config{
Name: "example_tool",
Kind: "postgres-list-roles",
Source: "my-postgres-instance",
Description: "some description",
AuthRequired: []string{"my-google-auth-service", "other-auth-service"},
},
},
},
{
desc: "basic example",
in: `
tools:
example_tool:
kind: postgres-list-roles
source: my-postgres-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": postgreslistroles.Config{
Name: "example_tool",
Kind: "postgres-list-roles",
Source: "my-postgres-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}

View File

@@ -32,28 +32,28 @@ const kind string = "postgres-list-schemas"
const listSchemasStatement = `
WITH
schema_grants AS (
SELECT schema_oid, jsonb_object_agg(grantee, privileges) AS grants
FROM
(
SELECT
n.oid AS schema_oid,
CASE
WHEN p.grantee = 0 THEN 'PUBLIC'
ELSE pg_catalog.pg_get_userbyid(p.grantee)
END
AS grantee,
jsonb_agg(p.privilege_type ORDER BY p.privilege_type) AS privileges
FROM pg_catalog.pg_namespace n, aclexplode(n.nspacl) p
WHERE n.nspacl IS NOT NULL
GROUP BY n.oid, grantee
) permissions_by_grantee
GROUP BY schema_oid
),
all_schemas AS (
SELECT
n.nspname AS schema_name,
pg_catalog.pg_get_userbyid(n.nspowner) AS owner,
schema_grants AS (
SELECT schema_oid, jsonb_object_agg(grantee, privileges) AS grants
FROM
(
SELECT
n.oid AS schema_oid,
CASE
WHEN p.grantee = 0 THEN 'PUBLIC'
ELSE pg_catalog.pg_get_userbyid(p.grantee)
END
AS grantee,
jsonb_agg(p.privilege_type ORDER BY p.privilege_type) AS privileges
FROM pg_catalog.pg_namespace n, aclexplode(n.nspacl) p
WHERE n.nspacl IS NOT NULL
GROUP BY n.oid, grantee
) permissions_by_grantee
GROUP BY schema_oid
),
all_schemas AS (
SELECT
n.nspname AS schema_name,
pg_catalog.pg_get_userbyid(n.nspowner) AS owner,
COALESCE(sg.grants, '{}'::jsonb) AS grants,
(
SELECT COUNT(*)
@@ -67,18 +67,21 @@ const listSchemasStatement = `
) AS views,
(SELECT COUNT(*) FROM pg_catalog.pg_proc p WHERE p.pronamespace = n.oid)
AS functions
FROM pg_catalog.pg_namespace n
LEFT JOIN schema_grants sg
ON n.oid = sg.schema_oid
)
FROM pg_catalog.pg_namespace n
LEFT JOIN schema_grants sg
ON n.oid = sg.schema_oid
)
SELECT *
FROM all_schemas
-- Exclude system schemas and temporary schemas created per session.
-- Exclude system schemas and temporary schemas created per session.
WHERE
schema_name NOT IN ('pg_catalog', 'information_schema', 'pg_toast')
AND schema_name NOT LIKE 'pg_temp_%'
AND ($1::text IS NULL OR schema_name LIKE '%' || $1::text || '%')
ORDER BY schema_name;
schema_name NOT IN ('pg_catalog', 'information_schema', 'pg_toast')
AND schema_name NOT LIKE 'pg_temp_%'
AND schema_name NOT LIKE 'pg_toast_temp_%'
AND ($1::text IS NULL OR schema_name ILIKE '%' || $1::text || '%')
AND ($2::text IS NULL OR owner ILIKE '%' || $2::text || '%')
ORDER BY schema_name
LIMIT COALESCE($3::int, NULL);
`
func init() {
@@ -136,12 +139,14 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
allParameters := parameters.Parameters{
parameters.NewStringParameterWithDefault("schema_name", "", "Optional: A specific schema name pattern to search for."),
parameters.NewStringParameterWithDefault("owner", "", "Optional: A specific schema owner name pattern to search for."),
parameters.NewIntParameterWithDefault("limit", 10, "Optional: The maximum number of schemas to return."),
}
description := cfg.Description
if description == "" {
description = "Lists all schemas in the database ordered by schema name and excluding system and temporary schemas. It returns the schema name, schema owner, grants, number of functions, number of tables and number of views within each schema."
if cfg.Description == "" {
cfg.Description = "Lists all schemas in the database ordered by schema name and excluding system and temporary schemas. It returns the schema name, schema owner, grants, number of functions, number of tables and number of views within each schema."
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters, nil)
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters, nil)
// finish tool setup
return Tool{
@@ -169,7 +174,13 @@ type Tool struct {
}
func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
sliceParams := params.AsSlice()
paramsMap := params.AsMap()
newParams, err := parameters.GetParams(t.allParams, paramsMap)
if err != nil {
return nil, fmt.Errorf("unable to extract standard params %w", err)
}
sliceParams := newParams.AsSlice()
results, err := t.pool.Query(ctx, listSchemasStatement, sliceParams...)
if err != nil {

View File

@@ -32,9 +32,9 @@ const kind string = "postgres-list-sequences"
const listSequencesStatement = `
SELECT
sequencename,
schemaname,
sequenceowner,
sequencename as sequence_name,
schemaname as schema_name,
sequenceowner as sequence_owner,
data_type,
start_value,
min_value,
@@ -45,7 +45,7 @@ const listSequencesStatement = `
WHERE
($1::text IS NULL OR schemaname LIKE '%' || $1 || '%')
AND ($2::text IS NULL OR sequencename LIKE '%' || $2 || '%')
ORDER BY schemaname, sequencename
ORDER BY schema_name, sequence_name
LIMIT COALESCE($3::int, 50);
`
@@ -104,15 +104,15 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
allParameters := parameters.Parameters{
parameters.NewStringParameterWithDefault("schemaname", "", "Optional: A specific schema name pattern to search for."),
parameters.NewStringParameterWithDefault("sequencename", "", "Optional: A specific sequence name pattern to search for."),
parameters.NewStringParameterWithDefault("schema_name", "", "Optional: A specific schema name pattern to search for."),
parameters.NewStringParameterWithDefault("sequence_name", "", "Optional: A specific sequence name pattern to search for."),
parameters.NewIntParameterWithDefault("limit", 50, "Optional: The maximum number of rows to return. Default is 50"),
}
description := cfg.Description
if description == "" {
description = "Lists sequences in the database. Returns sequence name, schema name, sequence owner, data type of the sequence, starting value, minimum value, maximum value of the sequence, the value by which the sequence is incremented, and the last value generated by the sequence in the current session"
if cfg.Description == "" {
cfg.Description = "Lists sequences in the database. Returns sequence name, schema name, sequence owner, data type of the sequence, starting value, minimum value, maximum value of the sequence, the value by which the sequence is incremented, and the last value generated by the sequence in the current session"
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters, nil)
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters, nil)
// finish tool setup
return Tool{
@@ -144,7 +144,13 @@ func (t Tool) ToConfig() tools.ToolConfig {
}
func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
sliceParams := params.AsSlice()
paramsMap := params.AsMap()
newParams, err := parameters.GetParams(t.allParams, paramsMap)
if err != nil {
return nil, fmt.Errorf("unable to extract standard params %w", err)
}
sliceParams := newParams.AsSlice()
results, err := t.pool.Query(ctx, listSequencesStatement, sliceParams...)
if err != nil {

View File

@@ -0,0 +1,213 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package postgreslisttablespaces
import (
"context"
"fmt"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/sources/alloydbpg"
"github.com/googleapis/genai-toolbox/internal/sources/cloudsqlpg"
"github.com/googleapis/genai-toolbox/internal/sources/postgres"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util/parameters"
"github.com/jackc/pgx/v5/pgxpool"
)
const kind string = "postgres-list-tablespaces"
const listTableSpacesStatement = `
WITH
tablespace_info AS (
SELECT
spcname AS tablespace_name,
pg_catalog.pg_get_userbyid(spcowner) AS owner_name,
CASE
WHEN pg_catalog.has_tablespace_privilege(oid, 'CREATE') THEN pg_tablespace_size(oid)
ELSE NULL
END AS size_in_bytes,
oid,
spcacl,
spcoptions
FROM
pg_tablespace
)
SELECT *
FROM
tablespace_info
WHERE
($1::text IS NULL OR tablespace_name LIKE '%' || $1::text || '%')
ORDER BY
tablespace_name
LIMIT COALESCE($2::int, 50);
`
func init() {
if !tools.Register(kind, newConfig) {
panic(fmt.Sprintf("tool kind %q already registered", kind))
}
}
func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
actual := Config{Name: name}
if err := decoder.DecodeContext(ctx, &actual); err != nil {
return nil, err
}
return actual, nil
}
type compatibleSource interface {
PostgresPool() *pgxpool.Pool
}
// validate compatible sources are still compatible
var _ compatibleSource = &alloydbpg.Source{}
var _ compatibleSource = &cloudsqlpg.Source{}
var _ compatibleSource = &postgres.Source{}
var compatibleSources = [...]string{alloydbpg.SourceKind, cloudsqlpg.SourceKind, postgres.SourceKind}
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description"`
AuthRequired []string `yaml:"authRequired"`
}
// validate interface
var _ tools.ToolConfig = Config{}
func (cfg Config) ToolConfigKind() string {
return kind
}
func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
// verify source exists
rawS, ok := srcs[cfg.Source]
if !ok {
return nil, fmt.Errorf("no source named %q configured", cfg.Source)
}
// verify the source is compatible
s, ok := rawS.(compatibleSource)
if !ok {
return nil, fmt.Errorf("invalid source for %q tool: source kind must be one of %q", kind, compatibleSources)
}
allParameters := parameters.Parameters{
parameters.NewStringParameterWithDefault("tablespace_name", "", "Optional: a text to filter results by tablespace name. The input is used within a LIKE clause."),
parameters.NewIntParameterWithDefault("limit", 50, "Optional: The maximum number of rows to return."),
}
description := cfg.Description
if description == "" {
description = "Lists all tablespaces in the database. Returns the tablespace name, owner name, size in bytes(if the current user has CREATE privileges on the tablespace, otherwise NULL), internal object ID, the access control list regarding permissions, and any specific tablespace options."
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters, nil)
// finish tool setup
return Tool{
Config: cfg,
allParams: allParameters,
pool: s.PostgresPool(),
manifest: tools.Manifest{
Description: description,
Parameters: allParameters.Manifest(),
AuthRequired: cfg.AuthRequired,
},
mcpManifest: mcpManifest,
}, nil
}
// validate interface
var _ tools.Tool = Tool{}
type Tool struct {
Config
allParams parameters.Parameters `yaml:"allParams"`
pool *pgxpool.Pool
manifest tools.Manifest
mcpManifest tools.McpManifest
}
func (t Tool) ToConfig() tools.ToolConfig {
return t.Config
}
func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
paramsMap := params.AsMap()
tablespaceName, ok := paramsMap["tablespace_name"].(string)
if !ok {
return nil, fmt.Errorf("invalid 'tablespace_name' parameter; expected a string")
}
limit, ok := paramsMap["limit"].(int)
if !ok {
return nil, fmt.Errorf("invalid 'limit' parameter; expected an integer")
}
results, err := t.pool.Query(ctx, listTableSpacesStatement, tablespaceName, limit)
if err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
defer results.Close()
fields := results.FieldDescriptions()
var out []map[string]any
for results.Next() {
values, err := results.Values()
if err != nil {
return nil, fmt.Errorf("unable to parse row: %w", err)
}
rowMap := make(map[string]any)
for i, field := range fields {
rowMap[string(field.Name)] = values[i]
}
out = append(out, rowMap)
}
// this will catch actual query execution errors
if err := results.Err(); err != nil {
return nil, fmt.Errorf("unable to execute query: %w", err)
}
return out, nil
}
func (t Tool) ParseParams(data map[string]any, claims map[string]map[string]any) (parameters.ParamValues, error) {
return parameters.ParseParams(t.allParams, data, claims)
}
func (t Tool) Manifest() tools.Manifest {
return t.manifest
}
func (t Tool) McpManifest() tools.McpManifest {
return t.mcpManifest
}
func (t Tool) Authorized(verifiedAuthServices []string) bool {
return tools.IsAuthorized(t.AuthRequired, verifiedAuthServices)
}
func (t Tool) RequiresClientAuthorization(resourceMgr tools.SourceProvider) bool {
return false
}
func (t Tool) GetAuthTokenHeaderName() string {
return "Authorization"
}

View File

@@ -0,0 +1,95 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package postgreslisttablespaces_test
import (
"testing"
yaml "github.com/goccy/go-yaml"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/testutils"
"github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslisttablespaces"
)
func TestParseFromYamlPostgresListTablespaces(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
}
tcs := []struct {
desc string
in string
want server.ToolConfigs
}{
{
desc: "basic example",
in: `
tools:
example_tool:
kind: postgres-list-tablespaces
source: my-postgres-instance
description: some description
authRequired:
- my-google-auth-service
- other-auth-service
`,
want: server.ToolConfigs{
"example_tool": postgreslisttablespaces.Config{
Name: "example_tool",
Kind: "postgres-list-tablespaces",
Source: "my-postgres-instance",
Description: "some description",
AuthRequired: []string{"my-google-auth-service", "other-auth-service"},
},
},
},
{
desc: "basic example",
in: `
tools:
example_tool:
kind: postgres-list-tablespaces
source: my-postgres-instance
description: some description
`,
want: server.ToolConfigs{
"example_tool": postgreslisttablespaces.Config{
Name: "example_tool",
Kind: "postgres-list-tablespaces",
Source: "my-postgres-instance",
Description: "some description",
AuthRequired: []string{},
},
},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
got := struct {
Tools server.ToolConfigs `yaml:"tools"`
}{}
// Parse contents
err := yaml.UnmarshalContext(ctx, testutils.FormatYaml(tc.in), &got)
if err != nil {
t.Fatalf("unable to unmarshal: %s", err)
}
if diff := cmp.Diff(tc.want, got.Tools); diff != "" {
t.Fatalf("incorrect parse: diff %v", diff)
}
})
}
}

View File

@@ -135,11 +135,11 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
parameters.NewStringParameterWithDefault("table_name", "", "Optional: A specific table name pattern to search for."),
parameters.NewIntParameterWithDefault("limit", 50, "Optional: The maximum number of rows to return."),
}
description := cfg.Description
if description == "" {
description = "Lists all non-internal triggers in a database. Returns trigger name, schema name, table name, whether its enabled or disabled, timing (e.g BEFORE/AFTER of the event), the events that cause the trigger to fire such as INSERT, UPDATE, or DELETE, whether the trigger activates per ROW or per STATEMENT, the handler function executed by the trigger and full definition."
if cfg.Description == "" {
cfg.Description = "Lists all non-internal triggers in a database. Returns trigger name, schema name, table name, whether its enabled or disabled, timing (e.g BEFORE/AFTER of the event), the events that cause the trigger to fire such as INSERT, UPDATE, or DELETE, whether the trigger activates per ROW or per STATEMENT, the handler function executed by the trigger and full definition."
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters, nil)
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters, nil)
// finish tool setup
return Tool{
@@ -171,7 +171,13 @@ func (t Tool) ToConfig() tools.ToolConfig {
}
func (t Tool) Invoke(ctx context.Context, resourceMgr tools.SourceProvider, params parameters.ParamValues, accessToken tools.AccessToken) (any, error) {
sliceParams := params.AsSlice()
paramsMap := params.AsMap()
newParams, err := parameters.GetParams(t.allParams, paramsMap)
if err != nil {
return nil, fmt.Errorf("unable to extract standard params %w", err)
}
sliceParams := newParams.AsSlice()
results, err := t.pool.Query(ctx, listTriggersStatement, sliceParams...)
if err != nil {

View File

@@ -31,13 +31,24 @@ import (
const kind string = "postgres-list-views"
const listViewsStatement = `
SELECT schemaname, viewname, viewowner
FROM pg_views
WHERE
schemaname NOT IN ('pg_catalog', 'information_schema')
AND ($1::text IS NULL OR viewname LIKE '%' || $1::text || '%')
ORDER BY viewname
LIMIT COALESCE($2::int, 50);
WITH list_views AS (
SELECT
schemaname AS schema_name,
viewname AS view_name,
viewowner AS owner_name,
definition
FROM pg_views
)
SELECT *
FROM list_views
WHERE
schema_name NOT IN ('pg_catalog', 'information_schema', 'pg_toast')
AND schema_name NOT LIKE 'pg_temp_%'
AND ($1::text IS NULL OR view_name ILIKE '%' || $1::text || '%')
AND ($2::text IS NULL OR schema_name ILIKE '%' || $2::text || '%')
ORDER BY
schema_name, view_name
LIMIT COALESCE($3::int, 50);
`
func init() {
@@ -94,15 +105,15 @@ func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error)
}
allParameters := parameters.Parameters{
parameters.NewStringParameterWithDefault("viewname", "", "Optional: A specific view name to search for."),
parameters.NewStringParameterWithDefault("view_name", "", "Optional: A specific view name to search for."),
parameters.NewStringParameterWithDefault("schema_name", "", "Optional: A specific schema name to search for."),
parameters.NewIntParameterWithDefault("limit", 50, "Optional: The maximum number of rows to return."),
}
paramManifest := allParameters.Manifest()
description := cfg.Description
if description == "" {
description = "Lists views in the database from pg_views with a default limit of 50 rows. Returns schemaname, viewname and the ownername."
if cfg.Description == "" {
cfg.Description = "Lists views in the database from pg_views with a default limit of 50 rows. Returns schemaname, viewname, ownername and the definition."
}
mcpManifest := tools.GetMcpManifest(cfg.Name, description, cfg.AuthRequired, allParameters, nil)
mcpManifest := tools.GetMcpManifest(cfg.Name, cfg.Description, cfg.AuthRequired, allParameters, nil)
// finish tool setup
return Tool{

View File

@@ -0,0 +1,92 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package createbatch
import (
"context"
"encoding/json"
"fmt"
dataproc "cloud.google.com/go/dataproc/v2/apiv1/dataprocpb"
"github.com/goccy/go-yaml"
"google.golang.org/protobuf/encoding/protojson"
"google.golang.org/protobuf/proto"
)
// unmarshalProto is a helper function to unmarshal a generic interface{} into a proto.Message.
func unmarshalProto(data any, m proto.Message) error {
jsonData, err := json.Marshal(data)
if err != nil {
return fmt.Errorf("failed to marshal to JSON: %w", err)
}
return protojson.Unmarshal(jsonData, m)
}
// Config is a common config that can be used with any type of create batch tool. However, each tool
// will still need its own config type, embedding this Config, so it can provide a type-specific
// Initialize implementation.
type Config struct {
Name string `yaml:"name" validate:"required"`
Kind string `yaml:"kind" validate:"required"`
Source string `yaml:"source" validate:"required"`
Description string `yaml:"description"`
RuntimeConfig *dataproc.RuntimeConfig `yaml:"runtimeConfig"`
EnvironmentConfig *dataproc.EnvironmentConfig `yaml:"environmentConfig"`
AuthRequired []string `yaml:"authRequired"`
}
func NewConfig(ctx context.Context, name string, decoder *yaml.Decoder) (Config, error) {
// Use a temporary struct to decode the YAML, so that we can handle the proto
// conversion for RuntimeConfig and EnvironmentConfig.
var ymlCfg struct {
Name string `yaml:"name"`
Kind string `yaml:"kind"`
Source string `yaml:"source"`
Description string `yaml:"description"`
RuntimeConfig any `yaml:"runtimeConfig"`
EnvironmentConfig any `yaml:"environmentConfig"`
AuthRequired []string `yaml:"authRequired"`
}
if err := decoder.DecodeContext(ctx, &ymlCfg); err != nil {
return Config{}, err
}
cfg := Config{
Name: name,
Kind: ymlCfg.Kind,
Source: ymlCfg.Source,
Description: ymlCfg.Description,
AuthRequired: ymlCfg.AuthRequired,
}
if ymlCfg.RuntimeConfig != nil {
rc := &dataproc.RuntimeConfig{}
if err := unmarshalProto(ymlCfg.RuntimeConfig, rc); err != nil {
return Config{}, fmt.Errorf("failed to unmarshal runtimeConfig: %w", err)
}
cfg.RuntimeConfig = rc
}
if ymlCfg.EnvironmentConfig != nil {
ec := &dataproc.EnvironmentConfig{}
if err := unmarshalProto(ymlCfg.EnvironmentConfig, ec); err != nil {
return Config{}, fmt.Errorf("failed to unmarshal environmentConfig: %w", err)
}
cfg.EnvironmentConfig = ec
}
return cfg, nil
}
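
The comment on Config above notes that each create-batch tool is expected to define its own config type that embeds this shared Config and supplies a type-specific Initialize. As a rough sketch of that pattern (not part of this diff), a hypothetical PySpark variant could look like the following; the package name, kind string, createbatch import path, and the assumption that tools.ToolConfig requires only ToolConfigKind and Initialize (as the postgres tools earlier in this diff suggest) are all illustrative.

// Hypothetical example, not part of this change: a tool-specific config
// embedding the shared createbatch.Config.
package createpysparkbatch

import (
	"context"
	"fmt"

	yaml "github.com/goccy/go-yaml"
	"github.com/googleapis/genai-toolbox/internal/sources"
	"github.com/googleapis/genai-toolbox/internal/tools"
	// Import path for the shared config is assumed here.
	"github.com/googleapis/genai-toolbox/internal/tools/dataproc/createbatch"
)

// Illustrative kind name.
const kind = "dataproc-create-pyspark-batch"

func init() {
	// Same registration pattern as the postgres tools above.
	if !tools.Register(kind, newConfig) {
		panic(fmt.Sprintf("tool kind %q already registered", kind))
	}
}

// Config embeds the shared batch config; only Initialize is tool-specific.
type Config struct {
	createbatch.Config
}

// Assumes tools.ToolConfig requires only ToolConfigKind and Initialize.
var _ tools.ToolConfig = Config{}

func newConfig(ctx context.Context, name string, decoder *yaml.Decoder) (tools.ToolConfig, error) {
	// Delegate YAML decoding (including the proto conversions) to the shared helper.
	common, err := createbatch.NewConfig(ctx, name, decoder)
	if err != nil {
		return nil, err
	}
	return Config{Config: common}, nil
}

func (cfg Config) ToolConfigKind() string { return kind }

func (cfg Config) Initialize(srcs map[string]sources.Source) (tools.Tool, error) {
	// A real implementation would verify the source, then build the batch
	// request from the embedded RuntimeConfig and EnvironmentConfig.
	return nil, fmt.Errorf("not implemented in this sketch")
}

A real implementation would use the embedded RuntimeConfig and EnvironmentConfig when constructing the batch request inside Initialize, mirroring how the postgres tools wire their source pools.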

Some files were not shown because too many files have changed in this diff.