Compare commits

124 Commits

Author SHA1 Message Date
Prerna Kakkar
3117c78af6 add base url check 2025-09-10 12:59:28 +00:00
Prerna Kakkar
738cca7c61 Merge branch 'main' into cloud-sql-wait-for-operation 2025-09-10 12:58:18 +00:00
Anmol Shukla
302faf2513 docs: updated google api key conf (#1398)
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-10 12:24:19 +05:30
Mend Renovate
22cf228b88 chore(deps): update module github.com/googleapis/mcp-toolbox-sdk-go to v0.3.0 (#1360)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [github.com/googleapis/mcp-toolbox-sdk-go](https://redirect.github.com/googleapis/mcp-toolbox-sdk-go) | `v0.2.0` -> `v0.3.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fgoogleapis%2fmcp-toolbox-sdk-go/v0.3.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fgoogleapis%2fmcp-toolbox-sdk-go/v0.2.0/v0.3.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googleapis/mcp-toolbox-sdk-go
(github.com/googleapis/mcp-toolbox-sdk-go)</summary>

###
[`v0.3.0`](https://redirect.github.com/googleapis/mcp-toolbox-sdk-go/releases/tag/v0.3.0)

[Compare
Source](https://redirect.github.com/googleapis/mcp-toolbox-sdk-go/compare/v0.2.0...v0.3.0)

##### Features

- Add support for Map parameter type
([#&#8203;51](https://redirect.github.com/googleapis/mcp-toolbox-sdk-go/issues/51))
([c80d8d6](c80d8d6e2b))
- Add support for nested maps in generic map parameter
([#&#8203;61](https://redirect.github.com/googleapis/mcp-toolbox-sdk-go/issues/61))
([3b33c52](3b33c52150))

##### Miscellaneous Chores

- Release 0.3.0
([#&#8203;59](https://redirect.github.com/googleapis/mcp-toolbox-sdk-go/issues/59))
([a89adcf](a89adcf440))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-09-10 06:41:00 +00:00
Yuan Teoh
794ad91885 chore: fix lint in recent merged prs (#1396)
Run `golangci-lint` on recently merged PRs.
2025-09-09 22:21:40 +00:00
Wenxin Du
b5f9780a59 fix(bigquery)!: Add Bearer parsing to auth token (#1386)
Previously we propagated tokens directly to the BQ API, but the MCP Inspector
adds a "Bearer" prefix to every Authorization header. We now parse the token
accordingly to make it work.
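A minimal Go sketch of the Bearer-prefix handling described above; the helper name and exact trimming behavior are assumptions for illustration, not the PR's actual code:

```go
package main

import (
	"fmt"
	"strings"
)

// stripBearer removes an optional, case-insensitive "Bearer " prefix so the
// raw token can be forwarded to the BigQuery API.
func stripBearer(authHeader string) string {
	const prefix = "bearer "
	if len(authHeader) >= len(prefix) && strings.EqualFold(authHeader[:len(prefix)], prefix) {
		return strings.TrimSpace(authHeader[len(prefix):])
	}
	return strings.TrimSpace(authHeader)
}

func main() {
	fmt.Println(stripBearer("Bearer raw-token")) // raw-token
	fmt.Println(stripBearer("raw-token"))        // raw-token (unchanged)
}
```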
2025-09-09 15:47:52 -04:00
trehanshakuntG
cce602f280 feat(tools/firestore): Add firestore-query tool (#1305)
## Description
---

This PR introduces a new tool kind `firestore-query` that enables
parameterized querying of Firestore collections with support for
Firestore native JSON value types, ensuring proper type handling for
complex queries.

### Feature

A new Firestore tool that allows:

- __Parameterized collection paths, filters, select, orderBy, limit and
analyzeQuery__ using Go template syntax
- __Native JSON value type support__ for proper type handling in queries
- __Complex filter structures__ with AND/OR logical operators
- __Dynamic query building__ with template parameter substitution

Example usage:
<img width="761" height="721" alt="Screenshot 2025-09-09 at 1 21 16 PM"
src="https://github.com/user-attachments/assets/bb359ea8-f750-492d-9f13-cef8f3b6bfd1"
/>
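As a rough illustration of the Go-template-plus-native-JSON idea described above, the following Go sketch renders a hypothetical filter template and checks that the result is valid JSON; the filter shape and field names are invented for illustration, not the tool's actual schema:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"text/template"
)

// A hypothetical filter template: the values are substituted via Go template
// syntax and use Firestore-style native JSON typing (e.g. {"stringValue": ...}).
const filterTmpl = `{
  "and": [
    {"field": "country", "op": "==", "value": {"stringValue": "{{.Country}}"}},
    {"field": "age",     "op": ">=", "value": {"integerValue": "{{.MinAge}}"}}
  ]
}`

func main() {
	t := template.Must(template.New("filter").Parse(filterTmpl))

	var buf bytes.Buffer
	if err := t.Execute(&buf, map[string]any{"Country": "US", "MinAge": 21}); err != nil {
		log.Fatal(err)
	}

	// Confirm the rendered filter is valid JSON before handing it to a tool.
	var filter map[string]any
	if err := json.Unmarshal(buf.Bytes(), &filter); err != nil {
		log.Fatal(err)
	}
	fmt.Println(filter["and"])
}
```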



## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change
2025-09-09 18:24:12 +00:00
Averi Kitsch
70e832bd08 feat(prebuilt): Update default values for prebuilt tools (#1355)
## Description
Provide default values for prebuilt tools

## PR Checklist

---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-09 11:03:26 -07:00
Twisha Bansal
05b14a8824 ci: update blunderbuss to assign issues to blr team (#1387)
## Description

Issues will now be assigned to more people, thus allowing faster
triages.

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-09 22:32:34 +05:30
Sfurti-yb
664711f4b3 feat(yugabytedb): Add YugabyteDB Source and Tool (#732)
- This PR aims to add YugabyteDB as a source and a tool.
- It is based on the PostgreSQL source but uses YugabyteDB's fork of the
pgx driver, which better accommodates the distributed nature of YugabyteDB.
- Added tests for the same.

---------

Co-authored-by: Amogh Shetkar <ashetkar@yugabyte.com>
2025-09-09 10:59:00 -04:00
Prerna Kakkar
9f69b3c8d1 update tests 2025-09-09 08:47:47 +00:00
Prerna Kakkar
47f7c9d918 Merge branch 'main' into cloud-sql-wait-for-operation 2025-09-09 08:45:51 +00:00
nester-neo4j
f819e26443 feat(prebuiltconfig/neo4j): Add prebuiltconfig support for neo4j (#1352)
This pull request adds support and documentation for connecting Neo4j
databases using the Model Context Protocol (MCP) and the MCP Toolbox. It
introduces a new prebuilt configuration for Neo4j, updates documentation
to guide users through setup across multiple IDEs, and adds tests to
ensure the new configuration is loaded correctly.

**Neo4j MCP Integration:**

* Added a new prebuilt configuration file `neo4j.yaml` defining
`execute_cypher` and `schema` tools for Neo4j, with support for
environment-based configuration.
* Updated the documentation (`neo4j_mcp.md`) with step-by-step
instructions for connecting Neo4j to various IDEs (e.g., VS Code,
Cursor, Claude, Gemini) using MCP Toolbox.
* Expanded the reference documentation to include Neo4j, detailing
required environment variables, permissions, and available tools.

**Testing and Codebase Updates:**

* Updated test cases in `prebuiltconfigs_test.go` to include Neo4j in
the list of expected tool sources and verify the Neo4j configuration can
be fetched and loaded.
[[1]](diffhunk://#diff-fa866059efb09c74bb2aabec8cc71ca9fda3b595fbc2a56f05fb608782ad55b8R36)
[[2]](diffhunk://#diff-fa866059efb09c74bb2aabec8cc71ca9fda3b595fbc2a56f05fb608782ad55b8R102)
[[3]](diffhunk://#diff-fa866059efb09c74bb2aabec8cc71ca9fda3b595fbc2a56f05fb608782ad55b8R148-R150)

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-08 23:02:33 -07:00
Mend Renovate
ec9af0a215 chore(deps): update module golang.org/x/oauth2 to v0.31.0 (#1356)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| golang.org/x/oauth2 | `v0.30.0` -> `v0.31.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/golang.org%2fx%2foauth2/v0.31.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/golang.org%2fx%2foauth2/v0.30.0/v0.31.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-08 18:02:19 -07:00
Mend Renovate
3d622b18be chore(deps): update module cloud.google.com/go/bigquery to v1.70.0 (#1370)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [cloud.google.com/go/bigquery](https://redirect.github.com/googleapis/google-cloud-go) | `v1.69.0` -> `v1.70.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/cloud.google.com%2fgo%2fbigquery/v1.70.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/cloud.google.com%2fgo%2fbigquery/v1.69.0/v1.70.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-08 16:25:09 -07:00
Mend Renovate
0c3a268475 chore(deps): update module github.com/jackc/pgx/v5 to v5.7.6 (#1371)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [github.com/jackc/pgx/v5](https://redirect.github.com/jackc/pgx) | `v5.7.5` -> `v5.7.6` | [![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fjackc%2fpgx%2fv5/v5.7.6?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fjackc%2fpgx%2fv5/v5.7.5/v5.7.6?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>jackc/pgx (github.com/jackc/pgx/v5)</summary>

###
[`v5.7.6`](https://redirect.github.com/jackc/pgx/compare/v5.7.5...v5.7.6)

[Compare
Source](https://redirect.github.com/jackc/pgx/compare/v5.7.5...v5.7.6)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-08 16:15:34 -07:00
Wenxin Du
8eda284070 ci(dgraph): Skip Dgraph test temporarily (#1372)
The Dgraph testing URL is down: https://play.dgraph.io/
Skip the test to unblock other contributors.
2025-09-08 20:35:04 +00:00
Prerna Kakkar
07083e3664 add source 2025-09-08 16:45:13 +00:00
prernakakkar-google
a74c1b69ac Merge branch 'main' into cloud-sql-wait-for-operation 2025-09-08 10:53:02 +00:00
Prerna Kakkar
79a6c827f7 Resolve comments 2025-09-08 10:49:59 +00:00
Twisha Bansal
ad53fb4460 chore: remove placeholder project id (#1367)
## Description
Remove the placeholder `project-id` so the server errors out when a user does
not enter their project ID.

## PR Checklist
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-08 14:55:47 +05:30
Prerna Kakkar
41ea1c488f Merge branch 'main' into cloud-sql-wait-for-operation 2025-09-08 06:37:38 +00:00
Anmol Shukla
28333e5faf chore(testing): setup initial directory structure for Go (#1185) 2025-09-08 11:27:47 +05:30
Valeriy
4b32c2a770 feat(prebuiltconfig/mysql): add queryParams field in MySQL prebuilt config (#1318)
Moves MySQL prebuiltconfig changes into a separate PR
- Adds Apache 2.0 license header
- Introduces MYSQL_QUERY_PARAMS env var to inject driver parameters
Fixes #1286
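
A small Go sketch of how such a query-parameter string could be parsed; the exact semantics of `MYSQL_QUERY_PARAMS` in the prebuilt config are assumed here for illustration, not taken from the PR:

```go
package main

import (
	"fmt"
	"log"
	"net/url"
	"os"
)

func main() {
	// Hypothetical: MYSQL_QUERY_PARAMS carries driver parameters in
	// query-string form, which url.ParseQuery turns into url.Values.
	os.Setenv("MYSQL_QUERY_PARAMS", "tls=preferred&parseTime=true")

	params, err := url.ParseQuery(os.Getenv("MYSQL_QUERY_PARAMS"))
	if err != nil {
		log.Fatalf("invalid MYSQL_QUERY_PARAMS: %v", err)
	}
	fmt.Println(params.Encode()) // parseTime=true&tls=preferred
}
```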

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-05 18:30:23 +00:00
Mend Renovate
d6c55195d7 chore(deps): update actions/github-script digest to f28e40c (#1349)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/github-script](https://redirect.github.com/actions/github-script) | action | digest | `60a0d83` -> `f28e40c` |

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-09-05 10:59:57 -07:00
release-please[bot]
4f022b6700 chore(main): release 0.14.0 (#1261)
🤖 I have created a release *beep* *boop*
---


##
[0.14.0](https://github.com/googleapis/genai-toolbox/compare/v0.13.0...v0.14.0)
(2025-09-05)


### ⚠ BREAKING CHANGES

* **bigquery:** Move `useClientOAuth` config from tool to source
([#1279](https://github.com/googleapis/genai-toolbox/issues/1279))
([8d20a48](8d20a48f13))
* **tools/bigquerysql:** remove `useClientOAuth` from tools config
([#1312](https://github.com/googleapis/genai-toolbox/issues/1312))

### Features

* **clickhouse:** Add ClickHouse Source and Tools
([#1088](https://github.com/googleapis/genai-toolbox/issues/1088))
([75a04a5](75a04a55dd))
* **prebuilt/alloydb-postgres:** Support ipType and IAM users
([#1324](https://github.com/googleapis/genai-toolbox/issues/1324))
([0b2121e](0b2121ea72))
* **server/mcp:** Support toolbox auth in mcp
([#1140](https://github.com/googleapis/genai-toolbox/issues/1140))
([ca353e0](ca353e0b66))
* **source/mysql:** Support `queryParams` in MySQL source
([#1299](https://github.com/googleapis/genai-toolbox/issues/1299))
([3ae2526](3ae2526e0f))
* **tools/bigquery:** Support end-user credential passthrough on
multiple BQ tools
([#1314](https://github.com/googleapis/genai-toolbox/issues/1314))
([88f4b30](88f4b3028d))
* **tools/looker:** Add description for looker-get-models tool
([#1266](https://github.com/googleapis/genai-toolbox/issues/1266))
([89af3a4](89af3a4ca3))
* **tools/looker:** Authenticate via end user credentials
([#1257](https://github.com/googleapis/genai-toolbox/issues/1257))
([8755e3d](8755e3db34))
* **tools/looker:** Report field suggestions to agent
([#1267](https://github.com/googleapis/genai-toolbox/issues/1267))
([2cad82e](2cad82e510))


### Bug Fixes

* Do not print usage on runtime error
([#1315](https://github.com/googleapis/genai-toolbox/issues/1315))
([afba7a5](afba7a57cd))
* Update env var to allow empty string
([#1260](https://github.com/googleapis/genai-toolbox/issues/1260))
([03aa9fa](03aa9fabac))
* **tools/firestore:** Add document/collection path validation
([#1229](https://github.com/googleapis/genai-toolbox/issues/1229))
([14c2249](14c224939a))
* **tools/looker-get-dashboards:** Fix Looker client OAuth check
([#1338](https://github.com/googleapis/genai-toolbox/issues/1338))
([36225aa](36225aa6db))
* **tools/oceanbase:** Fix encoded text with mysql driver
([#1283](https://github.com/googleapis/genai-toolbox/issues/1283))
([d16f89f](d16f89fbb6)),
closes [#1161](https://github.com/googleapis/genai-toolbox/issues/1161)

---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-05 09:04:59 -07:00
Yuan Teoh
181a077bef chore: update release please extraFiles (#1342)
Update release please `extraFiles`.
2025-09-05 01:07:15 +00:00
Yuan Teoh
430a97d697 chore: release 0.14.0 (#1341)
Release-As: 0.14.0
2025-09-05 00:34:37 +00:00
Mend Renovate
77411ae8ab chore(deps): update module github.com/redis/go-redis/v9 to v9.13.0 (#1336)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [github.com/redis/go-redis/v9](https://redirect.github.com/redis/go-redis) | `v9.12.1` -> `v9.13.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fredis%2fgo-redis%2fv9/v9.13.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fredis%2fgo-redis%2fv9/v9.12.1/v9.13.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>redis/go-redis (github.com/redis/go-redis/v9)</summary>

###
[`v9.13.0`](https://redirect.github.com/redis/go-redis/releases/tag/v9.13.0):
9.13.0

[Compare
Source](https://redirect.github.com/redis/go-redis/compare/v9.12.1...v9.13.0)

#### Highlights

- Pipeliner expose queued commands
([#&#8203;3496](https://redirect.github.com/redis/go-redis/pull/3496))
- Ensure that JSON.GET returns Nil response
([#&#8203;3470](https://redirect.github.com/redis/go-redis/pull/3470))
- Fixes on Read and Write buffer sizes and UniversalOptions

#### Changes

- Pipeliner expose queued commands
([#&#8203;3496](https://redirect.github.com/redis/go-redis/pull/3496))
- fix(test): fix a timing issue in pubsub test
([#&#8203;3498](https://redirect.github.com/redis/go-redis/pull/3498))
- Allow users to enable read-write splitting in failover mode.
([#&#8203;3482](https://redirect.github.com/redis/go-redis/pull/3482))
- Set the read/write buffer size of the sentinel client to 4KiB
([#&#8203;3476](https://redirect.github.com/redis/go-redis/pull/3476))

#### 🚀 New Features

- fix(otel): register wait metrics
([#&#8203;3499](https://redirect.github.com/redis/go-redis/pull/3499))
- Support subscriptions against cluster slave nodes
([#&#8203;3480](https://redirect.github.com/redis/go-redis/pull/3480))
- Add wait metrics to otel
([#&#8203;3493](https://redirect.github.com/redis/go-redis/pull/3493))
- Clean failing timeout implementation
([#&#8203;3472](https://redirect.github.com/redis/go-redis/pull/3472))

#### 🐛 Bug Fixes

- Do not assume that all non-IP hosts are loopbacks
([#&#8203;3085](https://redirect.github.com/redis/go-redis/pull/3085))
- Ensure that JSON.GET returns Nil response
([#&#8203;3470](https://redirect.github.com/redis/go-redis/pull/3470))

#### 🧰 Maintenance

- fix(otel): register wait metrics
([#&#8203;3499](https://redirect.github.com/redis/go-redis/pull/3499))
- fix(make test): Add default env in makefile
([#&#8203;3491](https://redirect.github.com/redis/go-redis/pull/3491))
- Update the introduction to running tests in README.md
([#&#8203;3495](https://redirect.github.com/redis/go-redis/pull/3495))
- test: Add comprehensive edge case tests for IncrByFloat command
([#&#8203;3477](https://redirect.github.com/redis/go-redis/pull/3477))
- Set the default read/write buffer size of Redis connection to 32KiB
([#&#8203;3483](https://redirect.github.com/redis/go-redis/pull/3483))
- Bumps test image to 8.2.1-pre
([#&#8203;3478](https://redirect.github.com/redis/go-redis/pull/3478))
- fix UniversalOptions miss ReadBufferSize and WriteBufferSize options
([#&#8203;3485](https://redirect.github.com/redis/go-redis/pull/3485))
- chore(deps): bump actions/checkout from 4 to 5
([#&#8203;3484](https://redirect.github.com/redis/go-redis/pull/3484))
- Removes dry run for stale issues policy
([#&#8203;3471](https://redirect.github.com/redis/go-redis/pull/3471))
- Update otel metrics URL
([#&#8203;3474](https://redirect.github.com/redis/go-redis/pull/3474))

#### Contributors

We'd like to thank all the contributors who worked on this release!

[@&#8203;LINKIWI](https://redirect.github.com/LINKIWI),
[@&#8203;cxljs](https://redirect.github.com/cxljs),
[@&#8203;cybersmeashish](https://redirect.github.com/cybersmeashish),
[@&#8203;elena-kolevska](https://redirect.github.com/elena-kolevska),
[@&#8203;htemelski-redis](https://redirect.github.com/htemelski-redis),
[@&#8203;mwhooker](https://redirect.github.com/mwhooker),
[@&#8203;ndyakov](https://redirect.github.com/ndyakov),
[@&#8203;ofekshenawa](https://redirect.github.com/ofekshenawa),
[@&#8203;suever](https://redirect.github.com/suever)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-09-04 17:21:44 -07:00
Mend Renovate
166d7b197c chore(deps): update actions/github-script action to v7.1.0 (#1335)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/github-script](https://redirect.github.com/actions/github-script) | action | minor | `v7.0.1` -> `v7.1.0` |

---

### Release Notes

<details>
<summary>actions/github-script (actions/github-script)</summary>

###
[`v7.1.0`](https://redirect.github.com/actions/github-script/compare/v7.0.1...v7.1.0)

[Compare
Source](https://redirect.github.com/actions/github-script/compare/v7.0.1...v7.1.0)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-09-04 16:53:12 -07:00
Averi Kitsch
0b2121ea72 feat(prebuilt/alloydb-postgres): support ipType and IAM users (#1324)
## Description
---
> Should include a concise description of the changes (bug or feature),
its
> impact, along with a summary of the solution

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-04 21:36:03 +00:00
Wenxin Du
be30c1c08f docs: Fix PR template link (#1340)
Should link to the Toolbox repo
2025-09-04 21:22:03 +00:00
Wenxin Du
36225aa6db fix(looker): Fix Looker client OAuth check (#1338) 2025-09-04 21:00:05 +00:00
Averi Kitsch
e86a3823b0 chore: update contributing guide (#1337)
## Description
---
> Should include a concise description of the changes (bug or feature),
its
> impact, along with a summary of the solution

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-04 20:35:05 +00:00
Averi Kitsch
afba7a57cd fix: do not print usage on runtime error (#1315)
## Description
---
> Should include a concise description of the changes (bug or feature),
its
> impact, along with a summary of the solution

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-04 20:04:09 +00:00
Wenxin Du
88f4b3028d feat(tools/bigquery): Support end-user credential passthrough on multiple BQ tools (#1314)
Support end-user credential passthrough on BQ Tools that are using
clients.
2025-09-04 15:39:39 -04:00
Anushka Saxena
19a7fe2928 docs: restructure contributing guidelines for adding new db source or tool (#1239)
## Description

This PR is for re-structuring the `CONTRIBUTING.md` guide to separate
the documentation for adding new database sources and adding new tools.
The "Adding a New Database Source and Tool" section has been split into
two distinct, standalone sections.

Fixes #1158

---------

Signed-off-by: SaxenaAnushka102 <anushkasaxenaa@google.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-04 11:06:28 -07:00
Prerna Kakkar
afe6a2c4d2 resolve comments 2025-09-04 14:58:31 +00:00
Prerna Kakkar
16f9a418c3 Merge branch 'main' into cloud-sql-wait-for-operation 2025-09-04 14:36:18 +00:00
Anmol Shukla
54d33fbd90 chore(testing): setup initial directory structure for JS (#1193) 2025-09-04 11:54:59 +05:30
Anushka Saxena
e929942b74 docs: update readme to clarify toolbox server run instructions (#1002)
### Description

Update
[README.md](https://github.com/googleapis/genai-toolbox/blob/main/README.md#running-the-server)
to clarify Toolbox server run instructions for Binary, Docker, and
Source installations.

### Related issue(s)

Addresses and resolves #952

---------

Signed-off-by: Anushka Saxena <anushkasaxenaa@google.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-03 22:46:47 +00:00
Ajaykumar Yadav
0a8351c32d docs: corrected copilot mcp config (#1253)
## Description
- [x] corrected config for the mssql. 
- [x] Updated/Corrected the MCP config of copilot as per the latest
official docs

```json
//correct config
{
  "servers": {
    "postgres": {
      "type": "stdio",
      "command": "path",
      "args": [
        "--prebuilt",
        "postgres",
        "--stdio"
      ],
      "env": {
        "POSTGRES_HOST": "localhost",
        "POSTGRES_PORT": "5432",
        "POSTGRES_DATABASE": "qlms_db",
        "POSTGRES_USER": "",
        "POSTGRES_PASSWORD": ""
      }
    },
  }
}

```

```json
//outdated/wrong config 1

{
  "mcpServers": {
    "postgres": {
      "type": "stdio",
      "command": "path",
      "args": [
        "--prebuilt",
        "postgres",
        "--stdio"
      ],
      "env": {
        "POSTGRES_HOST": "localhost",
        "POSTGRES_PORT": "5432",
        "POSTGRES_DATABASE": "qlms_db",
        "POSTGRES_USER": "",
        "POSTGRES_PASSWORD": ""
      }
    },
  }
}


//outdated/wrong config 2
{
"mcp":{
  "servers": {
    "postgres": {
      "type": "stdio",
      "command": "path",
      "args": [
        "--prebuilt",
        "postgres",
        "--stdio"
      ],
      "env": {
        "POSTGRES_HOST": "localhost",
        "POSTGRES_PORT": "5432",
        "POSTGRES_DATABASE": "qlms_db",
        "POSTGRES_USER": "",
        "POSTGRES_PASSWORD": ""
      }
    },
  }
}
}
```

## PR Checklist

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes: https://github.com/googleapis/genai-toolbox/issues/1252

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-03 15:27:56 -07:00
Mend Renovate
e2ddec9675 chore(deps): update dependency go to v1.25.1 (#1325)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [go](https://go.dev/) ([source](https://redirect.github.com/golang/go)) | toolchain | patch | `1.25.0` -> `1.25.1` |

---

### Release Notes

<details>
<summary>golang/go (go)</summary>

###
[`v1.25.1`](https://redirect.github.com/golang/go/compare/go1.25.0...go1.25.1)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-09-03 14:59:33 -07:00
Dr. Strangelove
2cad82e510 feat(tools/looker): Report field suggestions to agent (#1267)
## Description
---
Report the LookML suggestions array, or the suggest_explore and
suggest_dimension values, to the caller so that the caller can use those
suggestions to improve filtering.

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1247
2025-09-03 14:17:22 -04:00
Valeriy
3ae2526e0f feat(source/mysql): support queryParams in MySQL source (#1299)
Fixes #1286

### Motivation
* Allow secure connections to MySQL without custom code.

### Changes
#### Sources
* `mysql`: `Config.QueryParams` added; DSN building rewritten via
`url.Values`.
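
A rough Go sketch of DSN building with `url.Values`, assuming a go-sql-driver/mysql style DSN; this is an illustration under those assumptions, not the source's actual implementation:

```go
package main

import (
	"fmt"
	"net/url"
)

// buildDSN composes a MySQL DSN where extra driver parameters come from a
// queryParams map on the source config (hypothetical shape).
func buildDSN(user, pass, host, port, db string, queryParams map[string]string) string {
	q := url.Values{}
	for k, v := range queryParams {
		q.Set(k, v)
	}
	dsn := fmt.Sprintf("%s:%s@tcp(%s:%s)/%s", user, pass, host, port, db)
	if encoded := q.Encode(); encoded != "" {
		dsn += "?" + encoded
	}
	return dsn
}

func main() {
	fmt.Println(buildDSN("app", "secret", "localhost", "3306", "mydb",
		map[string]string{"tls": "true", "charset": "utf8mb4"}))
	// app:secret@tcp(localhost:3306)/mydb?charset=utf8mb4&tls=true
}
```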

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
Co-authored-by: Yuan Teoh <yuanteoh@google.com>
2025-09-03 10:48:22 -07:00
Dr. Strangelove
8755e3db34 feat(tools/looker): Authenticate via end user credentials (#1257)
Authenticate to Looker with end user credentials

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1258

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-09-03 17:29:25 +00:00
Sri Varshitha
ade9a2515b docs: Redirect firestore doc to official cloud documentation (#1210)
## Description
---

This change redirects the documentation for connecting an IDE to
Firestore to the official Google Cloud documentation.

The new documentation is available at:
https://cloud.google.com/firestore/native/docs/connect-ide-using-mcp-toolbox

Co-authored-by: prernakakkar-google <158031829+prernakakkar-google@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-09-03 17:09:01 +00:00
Harsh Jha
d2978d5395 docs: rename quickstart functions (#1308) 2025-09-03 05:22:07 +00:00
trehanshakuntG
14c224939a fix(tools/firestore): add document/collection path validation (#1229)
This change introduces robust validation for Firestore document and
collection paths across various Firestore tools.

Key changes include:
* **Path Validation:** Ensures that all Firestore paths used in tools
are relative and adhere to correct formatting, preventing issues with
absolute paths or malformed segments.
* **Improved Parameter Descriptions:** Updates the descriptions for
Firestore tool parameters to clearly specify the expectation of relative
paths (e.g., `users/userId` or `users/userId/posts/postId`) instead of
absolute paths.
* **New Utility:** Adds `internal/tools/firestore/util/validator.go` and
its corresponding test file for path validation logic.
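
A simplified Go sketch of the kind of relative-path check described above; the function name and exact rules are assumptions for illustration, not the contents of `validator.go`:

```go
package main

import (
	"fmt"
	"strings"
)

// validateRelativePath rejects absolute or fully-qualified paths and, for
// document paths, requires an even number of non-empty segments.
func validateRelativePath(path string, wantDocument bool) error {
	if strings.HasPrefix(path, "/") || strings.HasPrefix(path, "projects/") {
		return fmt.Errorf("path %q must be relative, e.g. users/userId", path)
	}
	segments := strings.Split(path, "/")
	for _, s := range segments {
		if s == "" {
			return fmt.Errorf("path %q contains an empty segment", path)
		}
	}
	if wantDocument && len(segments)%2 != 0 {
		return fmt.Errorf("document path %q must have an even number of segments", path)
	}
	return nil
}

func main() {
	fmt.Println(validateRelativePath("users/userId/posts/postId", true)) // <nil>
	fmt.Println(validateRelativePath("/users/userId", true))             // error
}
```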

---------

Co-authored-by: prernakakkar-google <158031829+prernakakkar-google@users.noreply.github.com>
2025-09-03 05:10:28 +00:00
Huan Chen
7430f318a2 docs: update bigquery document (#1290)
## Description

Added new tools and optional config in doc.

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-09-02 22:10:55 +00:00
Yuan Teoh
21085ef422 fix(tools/bigquerysql)!: remove useClientOAuth from tools config (#1312)
## Description
---
The `useClientOAuth` config was moved from the tool into the `bigquery`
source in #1279; however, the config was not removed from the tool. The
value is now retrieved from the source instead of being set directly.

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-09-02 21:45:39 +00:00
Yuan Teoh
d16f89fbb6 fix(tools/oceanbase): fix encoded text with mysql driver (#1283)
Update the type conversion according to the types that the mysql driver
returns: https://github.com/go-sql-driver/mysql/blob/v1.9.3/fields.go

All `scanTypeBytes`, `scanTypeString`, or `scanTypeNullString` values will now
be converted into strings. Remaining types (numeric types or
`scanTypeUnknown`) are returned as is.

Separately handle "JSON" to prevent double marshaling.

This update is needed for OceanBase since it uses the mysql driver for its
database connection.
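
A simplified Go sketch of the conversion rule described above; the function shape is invented for illustration, and only the string/JSON/pass-through behavior follows the description:

```go
package main

import (
	"database/sql"
	"encoding/json"
	"fmt"
)

// convertValue turns byte/string-like scan results into Go strings,
// unmarshals JSON columns once to avoid double marshaling, and passes
// everything else through unchanged.
func convertValue(columnType string, v any) (any, error) {
	switch val := v.(type) {
	case []byte:
		if columnType == "JSON" {
			var out any
			if err := json.Unmarshal(val, &out); err != nil {
				return nil, err
			}
			return out, nil
		}
		return string(val), nil
	case sql.NullString:
		if !val.Valid {
			return nil, nil
		}
		return val.String, nil
	default:
		return v, nil // numeric or unknown types are returned as is
	}
}

func main() {
	s, _ := convertValue("VARCHAR", []byte("hello"))
	j, _ := convertValue("JSON", []byte(`{"a":1}`))
	fmt.Println(s, j)
}
```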

ref: #1161
2025-09-02 11:47:54 -07:00
Mend Renovate
4da2fcc055 chore(deps): update module github.com/neo4j/neo4j-go-driver/v5 to v5.28.3 (#1302)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [github.com/neo4j/neo4j-go-driver/v5](https://redirect.github.com/neo4j/neo4j-go-driver) | `v5.28.2` -> `v5.28.3` | [![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fneo4j%2fneo4j-go-driver%2fv5/v5.28.3?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fneo4j%2fneo4j-go-driver%2fv5/v5.28.2/v5.28.3?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>neo4j/neo4j-go-driver
(github.com/neo4j/neo4j-go-driver/v5)</summary>

###
[`v5.28.3`](https://redirect.github.com/neo4j/neo4j-go-driver/releases/tag/v5.28.3)

[Compare
Source](https://redirect.github.com/neo4j/neo4j-go-driver/compare/v5.28.2...v5.28.3)

See <https://github.com/neo4j/neo4j-go-driver/wiki/5.x-changelog> for
more information.

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-09-02 10:53:13 -07:00
Prerna Kakkar
669e6a3a07 add test 2025-09-02 17:23:05 +00:00
Prerna Kakkar
96ba178a1d Merge branch 'main' into cloud-sql-wait-for-operation 2025-09-02 17:19:11 +00:00
Anmol Shukla
27381ed9c0 chore(ci): rename header checker and excluded quickstart files (#1288) 2025-09-02 13:40:52 +05:30
Prerna Kakkar
cbe8b13936 Merge branch 'main' into cloud-sql-wait-for-operation 2025-09-02 07:50:18 +00:00
Prerna Kakkar
f4ffb2fe27 feat(tool/cloudsql): Add cloud sql wait for operation tool with exponential backoff 2025-09-02 07:49:52 +00:00
Mateusz Nowak
670da6e451 ci: fix dataplex integration tests (#1289)
Restoring Dataplex integration after addressing test failure.

🛠️ Ref https://github.com/googleapis/genai-toolbox/issues/1250
2025-09-01 19:09:15 +00:00
Mend Renovate
2f1ed3aaf5 chore(deps): update module cloud.google.com/go/bigtable to v1.39.0 (#1278)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [cloud.google.com/go/bigtable](https://redirect.github.com/googleapis/google-cloud-go) | `v1.38.0` -> `v1.39.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/cloud.google.com%2fgo%2fbigtable/v1.39.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/cloud.google.com%2fgo%2fbigtable/v1.38.0/v1.39.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-29 12:30:43 -07:00
Wenxin Du
8d20a48f13 fix(bigquery): Move useClientOAuth config from tool to source (#1279) 2025-08-29 13:47:00 -04:00
Dr. Strangelove
89af3a4ca3 feat(tools/looker): Add description for looker-get-models tool (#1266)
…tool

## Description
---
Add description to output of get_models, as well as add output details
to docs

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [X] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [X] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [X] Ensure the tests and linter pass
- [X] Code coverage does not decrease (if any source code was changed)
- [X] Appropriate docs were updated (if necessary)
- [X] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1246
2025-08-28 17:16:28 -04:00
Huan Chen
6029e129bc docs: bigquery source project and location description correction (#1270)
## Description
---
Updates the project and location config description.

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-08-28 18:51:57 +00:00
Yuan Teoh
8628f7190b chore(deps): update clickhouse package and integration test (#1271) 2025-08-28 11:35:37 -07:00
Harsh Jha
f544e676ea chore: modify quickstart structure to support testing (#1186)
Co-authored-by: Anmol Shukla <shuklaanmol@google.com>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-08-28 11:52:49 +05:30
Anmol Shukla
a21e68022b docs: renaming sample agent functions to a common name (#1244) 2025-08-28 11:07:51 +05:30
Mend Renovate
08397cf398 chore(deps): update module cloud.google.com/go/cloudsqlconn to v1.18.1 (#1268)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [cloud.google.com/go/cloudsqlconn](https://redirect.github.com/googlecloudplatform/cloud-sql-go-connector) | `v1.18.0` -> `v1.18.1` | [![age](https://developer.mend.io/api/mc/badges/age/go/cloud.google.com%2fgo%2fcloudsqlconn/v1.18.1?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/cloud.google.com%2fgo%2fcloudsqlconn/v1.18.0/v1.18.1?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googlecloudplatform/cloud-sql-go-connector
(cloud.google.com/go/cloudsqlconn)</summary>

###
[`v1.18.1`](https://redirect.github.com/GoogleCloudPlatform/cloud-sql-go-connector/releases/tag/v1.18.1)

[Compare
Source](https://redirect.github.com/googlecloudplatform/cloud-sql-go-connector/compare/v1.18.0...v1.18.1)

##### Bug Fixes

- Use a new context for the domain name check loop.
([#&#8203;1007](https://redirect.github.com/GoogleCloudPlatform/cloud-sql-go-connector/issues/1007))
([908d0cf](908d0cf6a6))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-08-27 19:07:20 -07:00
Pete Hampton
75a04a55dd feat(clickhouse): Add ClickHouse Source and Tools (#1088)
This PR introduces ClickHouse support by finishing off
https://github.com/googleapis/genai-toolbox/pull/713. We have had quite
a few requests, both internally and externally, from people who want
ClickHouse support and wish to contribute but can't use the toolbox because
of the stalled PR. We also tried to reach out to @sidpan1 ourselves without
success, so we are going to assume the contributor doesn't have time to
complete it. We would like to thank @sidpan1 for getting this going -
community contributions help make ClickHouse better for everyone.

This PR introduces

- Adds ClickHouse HTTPS and HTTP source support (drops native support
and compression in connection)
- Adds tools: `execute_sql`, ~~`describe_table`, `list_databases`,
`list_tables`~~
  - Adds unit and integration tests
  - Adds docs

Other contributors can build on this foundation

### Connect to local ClickHouse

```sh
CLICKHOUSE_HOST="localhost" \
  CLICKHOUSE_PORT=8123 \
  CLICKHOUSE_USER="default" \
  CLICKHOUSE_PASSWORD="" \
  CLICKHOUSE_DATABASE=default \
  CLICKHOUSE_PROTOCOL="http" \
  go run main.go --prebuilt clickhouse --ui --port 8080
```

### Connect to ClickHouse Cloud

```sh
CLICKHOUSE_HOST="tsmtweovmw.us-east-2.aws.clickhouse.cloud" \
  CLICKHOUSE_PORT=8443 \
  CLICKHOUSE_USER="default" \
  CLICKHOUSE_PASSWORD="[REDACTED]" \
  CLICKHOUSE_DATABASE=default \
  CLICKHOUSE_PROTOCOL=https \
  go run main.go --prebuilt clickhouse --ui --port 8080
```

### Run tests

```bash
go test -v ./tests/clickhouse/ -run TestClickHouse
go test -v ./tests/clickhouse/ -run TestClickHouseBasicConnection
```

<img width="1318" height="895" alt="Screenshot 2025-08-06 at 10 01 01"
src="https://github.com/user-attachments/assets/034d8f1b-10d6-4097-8033-5b0da93ad3fc"
/>

---------

Co-authored-by: Pete Hampton <pjhampton@users.noreply.github.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
Co-authored-by: duwenxin <duwenxin@google.com>
2025-08-27 18:42:10 -04:00
Mend Renovate
a5e74af104 chore(deps): update module github.com/go-chi/chi/v5 to v5.2.3 (#1265)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [github.com/go-chi/chi/v5](https://redirect.github.com/go-chi/chi) | `v5.2.2` -> `v5.2.3` | [![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fgo-chi%2fchi%2fv5/v5.2.3?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fgo-chi%2fchi%2fv5/v5.2.2/v5.2.3?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>go-chi/chi (github.com/go-chi/chi/v5)</summary>

###
[`v5.2.3`](https://redirect.github.com/go-chi/chi/releases/tag/v5.2.3)

[Compare
Source](https://redirect.github.com/go-chi/chi/compare/v5.2.2...v5.2.3)

#### What's Changed

- Add pathvalue example to README and implement PathValue handler. by
[@&#8203;catatsuy](https://redirect.github.com/catatsuy) in
[#&#8203;985](https://redirect.github.com/go-chi/chi/pull/985)
- Allow multiple whitespace between method & pattern by
[@&#8203;JRaspass](https://redirect.github.com/JRaspass) in
[#&#8203;1013](https://redirect.github.com/go-chi/chi/pull/1013)
- Avoid potential nil dereference by
[@&#8203;ProjectMutilation](https://redirect.github.com/ProjectMutilation)
in [#&#8203;1008](https://redirect.github.com/go-chi/chi/pull/1008)
- feat(mux): support http.Request.Pattern in Go 1.23 by
[@&#8203;Gusted](https://redirect.github.com/Gusted) in
[#&#8203;986](https://redirect.github.com/go-chi/chi/pull/986)
- fix/608 - Fix flaky Throttle middleware test by synchronizing token
usage by
[@&#8203;OtavioBernardes](https://redirect.github.com/OtavioBernardes)
in [#&#8203;1016](https://redirect.github.com/go-chi/chi/pull/1016)
- Optimize throttle middleware by avoiding unnecessary timer creation by
[@&#8203;vasayxtx](https://redirect.github.com/vasayxtx) in
[#&#8203;1011](https://redirect.github.com/go-chi/chi/pull/1011)
- Simplify wildcard replacement in route patterns by
[@&#8203;srpvpn](https://redirect.github.com/srpvpn) in
[#&#8203;1012](https://redirect.github.com/go-chi/chi/pull/1012)
- Replace methodTypString func with reverseMethodMap by
[@&#8203;JRaspass](https://redirect.github.com/JRaspass) in
[#&#8203;1018](https://redirect.github.com/go-chi/chi/pull/1018)

#### New Contributors

-
[@&#8203;ProjectMutilation](https://redirect.github.com/ProjectMutilation)
made their first contribution in
[#&#8203;1008](https://redirect.github.com/go-chi/chi/pull/1008)
- [@&#8203;Gusted](https://redirect.github.com/Gusted) made their first
contribution in
[#&#8203;986](https://redirect.github.com/go-chi/chi/pull/986)
- [@&#8203;OtavioBernardes](https://redirect.github.com/OtavioBernardes)
made their first contribution in
[#&#8203;1016](https://redirect.github.com/go-chi/chi/pull/1016)
- [@&#8203;srpvpn](https://redirect.github.com/srpvpn) made their first
contribution in
[#&#8203;1012](https://redirect.github.com/go-chi/chi/pull/1012)

**Full Changelog**:
<https://github.com/go-chi/chi/compare/v5.2.2...v5.2.3>

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-27 14:54:58 -07:00
Mend Renovate
58d7b3e5f8 chore(deps): update google.golang.org/genproto digest to ef028d9 (#1234)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [google.golang.org/genproto](https://redirect.github.com/googleapis/go-genproto) | require | digest | `3122310` -> `ef028d9` |

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-08-27 14:39:47 -07:00
Yuan Teoh
ca353e0b66 feat(server/mcp): support toolbox auth in mcp (#1140)
The existing `/mcp` endpoint of Toolbox does not support auth (authorized
invocation and authenticated parameters). This PR adds support for
Toolbox auth to the `/mcp` endpoint.

Added integration test for MCP with auth.

Note that Toolbox auth is **NOT** supported over the stdio transport protocol;
invoking tools that require auth will result in an error.
2025-08-27 09:38:56 -07:00
Yuan Teoh
fc707fb561 test: add unit tests for unauthorized calls (#1262)
Our coverage is falling below the minimum (40%). Add unit tests for
unauthorized calls.
2025-08-27 09:01:48 -07:00
Yuan Teoh
03aa9fabac fix: update env var to allow empty string (#1260)
Update the configuration file's capability of reading from environment
variables:
* allow an empty string as the default value
* fix the if statement to check whether `parts[2]` is empty instead, since
`parts` always has a length of 4.
* add unit tests.
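
For illustration, here is a minimal Go sketch of the `${VAR:default}` substitution described above; the regexp and the `expand` helper are invented for this example and are not the actual Toolbox implementation (note how `parts` always has four entries, and `parts[2]` tells whether a default was declared, even an empty one):

```go
package main

import (
	"fmt"
	"os"
	"regexp"
)

// envRe matches "${NAME}" or "${NAME:default}"; FindStringSubmatch returns the
// full match, the variable name, the optional ":default" group, and the default
// value itself, so the parsed slice always has a length of 4.
var envRe = regexp.MustCompile(`\$\{(\w+)(:([^}]*))?\}`)

// expand substitutes environment variables in a config string, falling back to
// the declared default (which may be an empty string) when the variable is unset.
func expand(s string) string {
	return envRe.ReplaceAllStringFunc(s, func(m string) string {
		parts := envRe.FindStringSubmatch(m)
		if v, ok := os.LookupEnv(parts[1]); ok {
			return v
		}
		if parts[2] != "" { // a default was declared, possibly as the empty string
			return parts[3]
		}
		return m // left untouched here; the real server fails fast on undefined variables
	})
}

func main() {
	fmt.Println(expand("host: ${DB_HOST:localhost}"))
}
```
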
2025-08-27 01:25:28 +00:00
release-please[bot]
1a6dfe8d37 chore(main): release 0.13.0 (#1175)
🤖 I have created a release *beep* *boop*
---


##
[0.13.0](https://github.com/googleapis/genai-toolbox/compare/v0.12.0...v0.13.0)
(2025-08-27)


### ⚠ BREAKING CHANGES

* **prebuilt/alloydb:** Add bearer token support for
alloydb-wait-for-operation
([#1183](https://github.com/googleapis/genai-toolbox/issues/1183))


### Features

* Add capability to set default for environment variable in config
([#1248](https://github.com/googleapis/genai-toolbox/issues/1248))
([5bcd52e](5bcd52e7dc))
* **firebird:** Add Firebird SQL 2.5+ source and tool
([#1011](https://github.com/googleapis/genai-toolbox/issues/1011))
([4f6b806](4f6b806de9))
* **oceanbase:** Add Oceanbase source and tool
([#895](https://github.com/googleapis/genai-toolbox/issues/895))
([6fc4982](6fc49826d4))
* **server/mcp:** Support `ping` mechanism
([#1178](https://github.com/googleapis/genai-toolbox/issues/1178))
([5dcc66c](5dcc66c84f))
* **server:** Fail-fast on environment variable substitution
([#1177](https://github.com/googleapis/genai-toolbox/issues/1177))
([212aaba](212aaba74c))
* **server:** Implement Tool call auth error propagation
([#1235](https://github.com/googleapis/genai-toolbox/issues/1235))
([b94a021](b94a021ca1))
* **sources/bigquery:** Add support for user-credential passthrough
([#1067](https://github.com/googleapis/genai-toolbox/issues/1067))
([650e2e2](650e2e26f5))
* **tool/looker:** Add support for `description` field in looker tool
([#1199](https://github.com/googleapis/genai-toolbox/issues/1199))
([97f0dd2](97f0dd2acf))
* **tools/bigquery-ask-data-insights:** Add bigquery `ask-data-insights`
tool ([#932](https://github.com/googleapis/genai-toolbox/issues/932))
([7651357](7651357d42))
* **tools/bigquery-forecast:** Add bigqueryforecast tool
([#1148](https://github.com/googleapis/genai-toolbox/issues/1148))
([2ad0ccf](2ad0ccf83d))
* **tools/firestore-add-documents:** Add firestore-add-documents tool
([#1107](https://github.com/googleapis/genai-toolbox/issues/1107))
([ee4a70a](ee4a70a0e8))
* **tools/firestore-update-document:** Add firestore-update-document
tool ([#1191](https://github.com/googleapis/genai-toolbox/issues/1191))
([0010123](00101232a3))
* **tools/looker:** Control over whether hidden objects are surfaced
([#1222](https://github.com/googleapis/genai-toolbox/issues/1222))
([bc91559](bc91559cc4))
* **trino:** Add Trino source and tools
([#948](https://github.com/googleapis/genai-toolbox/issues/948))
([7dd123b](7dd123b3d7))


### Bug Fixes

* **tools/looker:** Lookergetdashboards uses proper Authorized helper
func ([#1255](https://github.com/googleapis/genai-toolbox/issues/1255))
([00866bc](00866bc7fc))
* **tools/mongodb-find-one:** ProjectPayload unmarshaling
([#1167](https://github.com/googleapis/genai-toolbox/issues/1167))
([8ea6a98](8ea6a98bd9))
* **tools/mysql:** Fix encoded text for mysql
([#1161](https://github.com/googleapis/genai-toolbox/issues/1161))
([a37cfa8](a37cfa841d)),
closes [#840](https://github.com/googleapis/genai-toolbox/issues/840)


---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-27 01:07:04 +00:00
Yuan Teoh
297f240955 chore: release 0.13.0 (#1259)
Release-As: 0.13.0
2025-08-27 00:25:21 +00:00
Mend Renovate
6ad6d39084 chore(deps): update module github.com/microsoft/go-mssqldb to v1.9.3 (#1236)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
|
[github.com/microsoft/go-mssqldb](https://redirect.github.com/microsoft/go-mssqldb)
| `v1.9.2` -> `v1.9.3` |
[![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fmicrosoft%2fgo-mssqldb/v1.9.3?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fmicrosoft%2fgo-mssqldb/v1.9.2/v1.9.3?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>microsoft/go-mssqldb
(github.com/microsoft/go-mssqldb)</summary>

###
[`v1.9.3`](https://redirect.github.com/microsoft/go-mssqldb/compare/v1.9.2...v1.9.3)

[Compare
Source](https://redirect.github.com/microsoft/go-mssqldb/compare/v1.9.2...v1.9.3)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-26 17:00:23 -07:00
Mend Renovate
f8929e3396 chore(deps): update module github.com/couchbase/gocb/v2 to v2.11.0 (#1249)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
|
[github.com/couchbase/gocb/v2](https://redirect.github.com/couchbase/gocb)
| `v2.10.1` -> `v2.11.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fcouchbase%2fgocb%2fv2/v2.11.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fcouchbase%2fgocb%2fv2/v2.10.1/v2.11.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>couchbase/gocb (github.com/couchbase/gocb/v2)</summary>

###
[`v2.11.0`](https://redirect.github.com/couchbase/gocb/compare/v2.10.1...v2.11.0)

[Compare
Source](https://redirect.github.com/couchbase/gocb/compare/v2.10.1...v2.11.0)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-26 16:50:25 -07:00
Dr. Strangelove
bc91559cc4 feat(tools/looker): Control over whether hidden objects are surfaced (#1222)
The following config is added to the looker source:

* show_hidden_models
* show_hidden_explores
* show_hidden_fields
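
As a rough Go sketch only, the new options might map onto a config struct like the following; the YAML keys are the ones listed above, everything else is illustrative:

```go
package lookersrc

// Config sketches the new Looker source options for surfacing hidden objects.
// Field and package names here are invented; only the YAML keys come from the PR.
type Config struct {
	ShowHiddenModels   bool `yaml:"show_hidden_models"`
	ShowHiddenExplores bool `yaml:"show_hidden_explores"`
	ShowHiddenFields   bool `yaml:"show_hidden_fields"`
}
```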

## Description
---
> Should include a concise description of the changes (bug or feature), its
> impact, along with a summary of the solution

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1214
2025-08-26 15:39:44 -07:00
Wenxin Du
650e2e26f5 feat(sources/bigquery): add support for user-credential passthrough (#1067)
Support end-user credential passthrough with the BigQuery source and the
`bigquery-sql` tool.
Support for other BQ tools will be added in subsequent PRs.

Issue: https://github.com/googleapis/genai-toolbox/issues/813
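
A minimal sketch of the passthrough idea, assuming the end user's access token has already been extracted from the request; the `clientFor` helper is invented for this example and is not the source's real wiring:

```go
package bigquerysource

import (
	"context"

	"cloud.google.com/go/bigquery"
	"golang.org/x/oauth2"
	"google.golang.org/api/option"
)

// clientFor builds a BigQuery client that runs queries as the end user by
// passing their access token through instead of using the server's ADC.
func clientFor(ctx context.Context, project, userAccessToken string) (*bigquery.Client, error) {
	ts := oauth2.StaticTokenSource(&oauth2.Token{AccessToken: userAccessToken})
	return bigquery.NewClient(ctx, project, option.WithTokenSource(ts))
}
```
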
2025-08-26 17:52:24 -04:00
Dr. Strangelove
00866bc7fc fix(tools/looker): lookergetdashboards uses proper Authorized helper func (#1255)
looker-get-dashboards uses the Authorized helper function.

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change
2025-08-26 20:32:08 +00:00
Dr. Strangelove
5bcd52e7dc feat: Add capability to set default for environment variable in config (#1248)
Support `${ENV_VALUE:default_value}`

🛠️ Fixes #1001
2025-08-26 20:28:55 +00:00
Huan Chen
7651357d42 feat(tools/bigquery-ask-data-insights): add bigquery ask-data-insights tool (#932)
1. Add the ask-data-insights tool based on the conversational analytics API.
2. Add a tokenSource for the ask-data-insights tool; it uses an access token
instead of a client or restService.
3. Add a max row count to the source, currently fixed to 50 and used only by
the ask-data-insights tool. Later we may make it user-configurable and apply
it to bigquery-execute-sql and bigquery-sql to avoid accidentally returning
too much data.
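
A loose sketch of the two mechanisms, using ADC as the token source and a hard-coded row cap; the names `tokenSource`, `capRows`, and `maxRowCount` are invented for this example:

```go
package bigquerysource

import (
	"context"

	"golang.org/x/oauth2"
	"golang.org/x/oauth2/google"
)

const maxRowCount = 50 // currently fixed; may become user-configurable later

// tokenSource returns an ADC-backed token source whose access tokens are used
// for the conversational analytics API instead of a generated client.
func tokenSource(ctx context.Context) (oauth2.TokenSource, error) {
	return google.DefaultTokenSource(ctx, "https://www.googleapis.com/auth/cloud-platform")
}

// capRows truncates a result set so a query cannot accidentally return too much data.
func capRows(rows []map[string]any) []map[string]any {
	if len(rows) > maxRowCount {
		return rows[:maxRowCount]
	}
	return rows
}
```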

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-08-26 20:16:17 +00:00
Wenxin Du
b94a021ca1 feat(server): implement Tool call auth error propagation (#1235)
For the Toolbox protocol:
Before - return a 400 error for all tool invocation errors.
After - Propagate auth-related errors (401 & 403) to the client when using
client credentials. When using ADC, return a 500 error instead.

For the MCP protocol:
Before - return 200 with the error message in the response body.
After - Propagate auth-related errors (401 & 403) to the client when using
client credentials. When using ADC, return a 500 error instead.
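
Roughly, the new mapping could look like this sketch; `statusForToolError` and `usesClientCreds` are invented names, not the actual handler code:

```go
package server

import (
	"errors"
	"net/http"

	"google.golang.org/api/googleapi"
)

// statusForToolError decides which HTTP status to surface for a failed tool call.
func statusForToolError(err error, usesClientCreds bool) int {
	var apiErr *googleapi.Error
	if errors.As(err, &apiErr) && (apiErr.Code == http.StatusUnauthorized || apiErr.Code == http.StatusForbidden) {
		if usesClientCreds {
			return apiErr.Code // the client supplied the credentials, so it can act on 401/403
		}
		return http.StatusInternalServerError // ADC problems are the server's fault
	}
	return http.StatusBadRequest
}
```
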
2025-08-26 15:27:46 -04:00
Dr. Strangelove
81c36354cb fix(tools/looker): Minor fixes and refinements to Looker (#1215)
* fix not null expression 
* exclude dimensions that end in _raw
* Add tags to return from get_dimensions, get_measures, etc.
* Add synonyms to return from get_dimensions, get_measures, etc.
* Instruct Agent to only create pie chart visualizations with one
measure and one dimension.
* Added docs about the return types of
get_dimensions/get_measures/get_filters/get_parameters.

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-26 18:55:23 +00:00
Yuan Teoh
febbb01c41 ci: disabling dataplex integration tests due to failure (#1251)
## Description
---
Disabling Dataplex integration tests due to test failure. This is
blocking PRs from merging.


🛠️ Ref #1250
2025-08-26 11:44:34 -07:00
Anmol Shukla
6db9ad9389 docs: updated go-genai sample code (#1212)
Fixes issue #1211

Co-authored-by: dishaprakash <57954147+dishaprakash@users.noreply.github.com>
2025-08-26 20:58:05 +05:30
Yuan Teoh
a37cfa841d fix(tools/mysql): fix encoded text for mysql (#1161)
Update the type conversion according to the types that the mysql driver
returns: https://github.com/go-sql-driver/mysql/blob/v1.9.3/fields.go

All `scanTypeBytes`, `scanTypeString`, or `scanTypeNullString` values will now
be converted into strings. The remaining types (numeric types or
`scanTypeUnknown`) are all returned as is.

Separately handle "JSON" to prevent double marshaling.

Fixes #840
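
A simplified Go sketch of the conversion; it keys off `DatabaseTypeName` for readability rather than the driver's internal scan types, and is not the actual implementation:

```go
package mysqlsource

import (
	"database/sql"
	"encoding/json"
)

// convertRow turns raw driver values into JSON-friendly Go values: byte/string
// columns become strings, JSON columns are unmarshaled once to avoid double
// marshaling, and everything else is returned as is.
func convertRow(cols []*sql.ColumnType, raw []any) ([]any, error) {
	out := make([]any, len(raw))
	for i, v := range raw {
		b, ok := v.([]byte)
		if !ok {
			out[i] = v // numeric or unknown scan types pass through unchanged
			continue
		}
		if cols[i].DatabaseTypeName() == "JSON" {
			var decoded any
			if err := json.Unmarshal(b, &decoded); err != nil {
				return nil, err
			}
			out[i] = decoded
			continue
		}
		out[i] = string(b)
	}
	return out, nil
}
```
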
2025-08-25 21:24:00 -07:00
Ganga4060
c5e1aaa2dc docs: removed database field from reference (#1230)
Removed database field from reference of mongodb sources
2025-08-25 18:35:52 +00:00
Ajaykumar Yadav
2a8ab521ef docs: added GCA, gcli mcp config for postgres (#1209)
## Description
---
Added **MCP Configuration** for the **Gemini Code Assist** and **Gemini
CLI** to use prebuilt tool of `postgres`

## PR Checklist
---

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes: https://github.com/googleapis/genai-toolbox/issues/1204

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-08-23 03:43:53 +00:00
Wenxin Du
b1abbeb380 refactor: Add Tool method to check client authorization (#1217)
Add `RequiresClientAuthorization()` method to the `Tool` interface.
Currently returning false for all tools.
Supports: https://github.com/googleapis/genai-toolbox/pull/1067
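
Sketch of the interface change, with the rest of the `Tool` interface elided:

```go
package tools

// Tool is abbreviated here to the new method; every current implementation
// simply returns false until a source opts in to client-credential passthrough.
type Tool interface {
	RequiresClientAuthorization() bool
	// ... existing Invoke, Manifest, etc. methods elided
}
```
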
2025-08-22 20:53:14 +00:00
Kurtis Van Gent
859301b383 chore: fix looker label (#1221) 2025-08-22 19:39:24 +00:00
Mend Renovate
e57e5355f8 chore(deps): update module github.com/trinodb/trino-go-client to v0.328.0 (#1206)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
|
[github.com/trinodb/trino-go-client](https://redirect.github.com/trinodb/trino-go-client)
| `v0.326.0` -> `v0.328.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2ftrinodb%2ftrino-go-client/v0.328.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2ftrinodb%2ftrino-go-client/v0.326.0/v0.328.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>trinodb/trino-go-client
(github.com/trinodb/trino-go-client)</summary>

###
[`v0.328.0`](https://redirect.github.com/trinodb/trino-go-client/releases/tag/v0.328.0)

[Compare
Source](https://redirect.github.com/trinodb/trino-go-client/compare/v0.327.0...v0.328.0)

<!-- Release notes generated using configuration in .github/release.yml
at v0.328.0 -->

#### What's Changed

##### Bug fixes

- Fix slots not being released for inline segments by
[@&#8203;Flgado](https://redirect.github.com/Flgado) in
[https://github.com/trinodb/trino-go-client/pull/151](https://redirect.github.com/trinodb/trino-go-client/pull/151)

**Full Changelog**:
https://github.com/trinodb/trino-go-client/compare/v0.327.0...v0.328.0

###
[`v0.327.0`](https://redirect.github.com/trinodb/trino-go-client/releases/tag/v0.327.0)

[Compare
Source](https://redirect.github.com/trinodb/trino-go-client/compare/v0.326.0...v0.327.0)

<!-- Release notes generated using configuration in .github/release.yml
at v0.327.0 -->

#### What's Changed

##### Other changes

- Spooling parallelization by
[@&#8203;Flgado](https://redirect.github.com/Flgado) in
[https://github.com/trinodb/trino-go-client/pull/141](https://redirect.github.com/trinodb/trino-go-client/pull/141)

**Full Changelog**:
https://github.com/trinodb/trino-go-client/compare/v0.325.0...v0.327.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-22 18:39:03 +00:00
Mend Renovate
e1f115798f chore(deps): update google.golang.org/genproto digest to 3122310 (#1180)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[google.golang.org/genproto](https://redirect.github.com/googleapis/go-genproto)
| require | digest | `513f239` -> `3122310` |

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-22 18:26:45 +00:00
Yuan Teoh
79c57725bc docs: add docs for test config options (#1170) 2025-08-22 18:11:20 +00:00
Paulo Mota
4f6b806de9 feat(firebird): Add Firebird SQL 2.5+ source and tool support (#1011)
## Description
This Pull Request introduces support for the Firebird database as a new
`source` and adds the `firebird-sql` and `firebird-execute-sql` tools.
The implementation follows the architectural pattern established by the
existing PostgreSQL and MySQL tools.

Comprehensive integration tests have been added in the `tests/firebird`
directory, covering table creation, data insertion, and the invocation
of all related tools. Corresponding user and developer documentation has
also been included.

## Test Results
The core functionality is working as demonstrated by the vast majority
of tests passing. However, a few tests fail due to limitations and
inflexibilities in the generic test harness (`tests/tool.go`), which is
not fully compatible with Firebird's specific syntax and behavior.

### Known and Justified Failures:

* **`invoke_my-tool` / `invoke_my-array-tool`:** The `RunToolInvokeTest`
function reuses the same `want` variable for both sub-tests, which have
different inputs and expected outputs. The test is configured for
`my-tool` to pass, which consequently causes the `my-array-tool` test to
fail its assertion. The `got` log for the array test confirms that the
tool returns the correct response for the given input.
* **`invoke_my-exec-sql-tool`:** This test fails because the harness
sends the query `SELECT 1`, which is invalid syntax in Firebird.
Subsequent tests for this tool using DDL and DML (`create`, `select`,
`drop`) all pass, confirming its core functionality.
* **`invoke_select-fields-templateParams-tool`:** This test fails due to
a case-sensitivity mismatch in column names (`NAME` vs. `name`). The
expected output in the test harness is hardcoded with a lowercase name.
* **Authentication Tests:** All authentication-related tests are
failing. This appears to be a general issue with the local test
environment setup and is not specific to the Firebird implementation.

## Next Steps
I believe the implementation is ready for review. I am available to make
any requested changes.
Feat #935

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
Co-authored-by: Yuan Teoh <yuanteoh@google.com>
2025-08-22 11:01:14 -07:00
Kurtis Van Gent
6f4530e794 chore(ci/cd): add looker label and assignee (#1216)
Adds Looker and @drstrangelooker for issue assignment.
2025-08-22 17:30:06 +00:00
Yuan Teoh
8d6f3bbe2c chore: add options for all tests (#1169)
Add options for `RunToolInvokeTest()`, `RunMCPToolCallMethod()`,
`RunExecuteSqlToolInvokeTest()`.

When I was trying to update the integration test for Firebird, I found it has
a different syntax from the usual `SELECT 1` that the other DBs use. If we
wanted to update the `SELECT 1` statement in the test cases, we would have to
add a new argument and update all the other integration tests.

With `Options`, we can just add customization without having to update all
the other integration tests. This allows the integration tests to be more
flexible. With this, we can also remove `GetNonSpannerWant()`, since those
values can be set as defaults for the `Options`.
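
The pattern reads roughly like this functional-options sketch; the option and field names here are illustrative rather than the exact test helpers:

```go
package tests

// InvokeOption customizes RunToolInvokeTest without changing its signature
// every time a database needs different defaults.
type InvokeOption func(*invokeConfig)

type invokeConfig struct {
	selectStmt string // e.g. Firebird needs something other than "SELECT 1"
	want       string
}

// WithSelectStatement overrides the probe query used by the generic harness.
func WithSelectStatement(stmt string) InvokeOption {
	return func(c *invokeConfig) { c.selectStmt = stmt }
}

func newInvokeConfig(opts ...InvokeOption) invokeConfig {
	c := invokeConfig{selectStmt: "SELECT 1"} // non-Spanner defaults live here
	for _, o := range opts {
		o(&c)
	}
	return c
}
```
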
2025-08-22 10:14:36 -07:00
Harsh Jha
d27c53e3b5 docs(hugo): ignore quickstart folder from site indexing (#1208)
Prevents Hugo from automatically indexing the quickstart folder in site
navigation and search results.
This keeps documentation structure clean by excluding
development/testing quickstart resources from public docs.
2025-08-22 09:10:19 +00:00
trehanshakuntG
00101232a3 feat(tools/firestore-update-document): Add firestore-update-document tool (#1191)
## Add firestore-update-document tool

Adds a new tool for updating existing documents in Firestore
collections.

__What it does:__

- Updates documents at any path in Firestore
- Supports partial updates with field masks for selective field
modification
- Handles all Firestore data types (strings, numbers, booleans,
timestamps, geopoints, arrays, maps, etc.)
- Supports field deletion using updateMask
- Uses Firestore's native JSON format for type safety
- Can update nested fields within maps using dot notation

__Key parameters:__

- `documentPath`: The path of the document to update
- `documentData`: The document content in Firestore JSON format
- `updateMask`: Optional array of field paths for selective updates
- `returnData`: Optional flag to include updated document in response

__Special features:__

- When `updateMask` is provided, only specified fields are updated
- Can access nested fields with dot notation (e.g., 'address.city',
'user.profile.name')
- Without updateMask, performs a merge operation updating all provided
fields
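
Conceptually, an update with `updateMask: ["address.city"]` corresponds to a field-path update in the Firestore Go client, as in this sketch (plain Go values are shown for brevity; the tool itself takes Firestore's JSON value format):

```go
package example

import (
	"context"

	"cloud.google.com/go/firestore"
)

// updateCity shows the effect of updateMask: ["address.city"] — only the
// nested field named by the dot-notation path is touched.
func updateCity(ctx context.Context, c *firestore.Client, docPath, city string) error {
	_, err := c.Doc(docPath).Update(ctx, []firestore.Update{
		{Path: "address.city", Value: city},
	})
	return err
}
```
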
2025-08-22 06:56:23 +00:00
Harsh Jha
e376cce18c docs: add shortcode for including file regions (#1043)
Introduces a new Hugo shortcode, `includeRegion`, to allow embedding
specific, tagged regions from source files directly into Markdown
content.

This helps maintain a single source of truth for code snippets and other
repeated content, such as setup instructions, preventing duplication and
simplifying updates across multiple quickstart guides.

Demo PR: https://github.com/googleapis/genai-toolbox/pull/1179

---------

Co-authored-by: Anmol Shukla <shuklaanmol@google.com>
2025-08-22 06:24:50 +00:00
Abhinav Misra
7dd123b3d7 feat(trino): Add Trino source and tools (#948)
Add support for Trino distributed SQL query engine with complete
implementation including sources, tools, documentation, and tests.

## Summary

- Add Trino source implementation with multiple authentication methods
- Create trino-execute-sql tool for arbitrary SQL execution
- Create trino-sql tool for parameterized queries
- Add comprehensive documentation ~~and prebuilt configurations~~
- Include unit tests for all components

## Features

### Authentication Support
- Basic authentication (username/password)
- JWT token authentication
- Kerberos authentication
- SSL/TLS connections

### Tools
- **trino-execute-sql**: Execute arbitrary SQL queries
- **trino-sql**: Execute parameterized SQL with prepared statements
- Prebuilt tools for common operations (list catalogs, schemas, tables,
etc.)

### Cross-Catalog Queries
- Support for federated queries across multiple data sources
- Query Hive, PostgreSQL, MySQL, and other connectors in single queries
- Leverage Trino's distributed query engine capabilities

## Test Plan
- [x] Unit tests pass for source DSN generation
- [x] Unit tests pass for tool parameter validation
- [x] Project builds successfully with new dependencies
- [x] Documentation includes comprehensive examples

Ref #880
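
For context, connecting from Go and running a cross-catalog query looks roughly like this; the DSN parameters follow trino-go-client conventions and the table names are made up:

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/trinodb/trino-go-client/trino" // registers the "trino" driver
)

func main() {
	// Basic auth, catalog, and schema are encoded in the DSN.
	db, err := sql.Open("trino", "http://user@localhost:8080?catalog=memory&schema=default")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Federated query joining two catalogs in a single statement.
	rows, err := db.Query(`SELECT o.id, c.name
		FROM postgresql.public.orders o
		JOIN hive.web.customers c ON o.customer_id = c.id`)
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()
	for rows.Next() {
		var id int64
		var name string
		if err := rows.Scan(&id, &name); err != nil {
			log.Fatal(err)
		}
		fmt.Println(id, name)
	}
}
```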

---------

Co-authored-by: Yuan Teoh <yuanteoh@google.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-22 00:03:04 +00:00
dishaprakash
7f553a865b chore(prebuilt): dynamically generate --prebuilt flag options (#1205)
## Description
---
This PR changes the `--prebuilt` flag's help description from hardcoded
source options to one fetched dynamically from the prebuilt configs.

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-08-22 04:38:11 +05:30
Averi Kitsch
e8f4ed7e85 docs: add prebuilt reference and CLI reference (#1176)
## Description
---
Add reference documentation

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-21 22:30:39 +00:00
Wenxin Du
bffe7b0661 refactor: Pass Authorization header token to Tool call functions (#1200)
Pass the authorization token in to the Tool invocation functions.
Support: https://github.com/googleapis/genai-toolbox/pull/1067
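
In HTTP terms this amounts to something like the following sketch; the real change threads the token through each tool's invocation path:

```go
package server

import (
	"net/http"
	"strings"
)

// accessTokenFrom pulls the raw token out of an "Authorization: Bearer <token>"
// header so it can be handed to the tool invocation.
func accessTokenFrom(r *http.Request) string {
	h := r.Header.Get("Authorization")
	return strings.TrimSpace(strings.TrimPrefix(h, "Bearer "))
}
```
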
2025-08-21 18:20:42 -04:00
Averi Kitsch
8ea6a98bd9 fix(tools/mongodb-find-one): ProjectPayload unmarshaling (#1167)
## Description
---
Update mongodb find one to unmarshal projectpayload

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1153
2025-08-21 11:01:35 -07:00
Anushka Saxena
8430d97839 docs: add troubleshooting guide for cloud run (#1181)
## Description

Users deploying Toolbox to Cloud Run for the first time can encounter
common friction points that are not covered in the current
documentation. This can lead to deployment failures and a frustrating
setup experience, particularly around container port configuration and
IAM permissions.

## Relevant issue(s)

This PR addresses a subset of #1116.

Signed-off-by: Anushka Saxena <anushkasaxenaa@google.com>
2025-08-21 16:31:03 +00:00
manuka rahul
7158d72d39 docs: add jsgenai tab in the quickstart js (#1037)
Co-authored-by: Averi Kitsch <akitsch@google.com>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-08-21 16:16:25 +00:00
prernakakkar-google
f8f09818c7 feat(prebuilt/alloydb)!: Add bearer token support for alloydb-wait-for-operation (#1183)
## Description
---
The *alloydb-wait-for-operation* tool now automatically obtains a Google
Cloud Platform OAuth2 token and uses it as a bearer token to
authenticate requests to the AlloyDB Admin API. This ensures secure and
proper communication with the API.
<img width="3840" height="2084" alt="image"
src="https://github.com/user-attachments/assets/e756255f-83f9-4719-8d8b-596a628ca1e3"
/>
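
The mechanism boils down to fetching an ADC access token and attaching it as a bearer token to the Admin API request, sketched here with invented names (`authorizedGet`, `operationURL`):

```go
package waitforoperation

import (
	"context"
	"net/http"

	"golang.org/x/oauth2/google"
)

// authorizedGet issues a GET against the AlloyDB Admin API operation URL using
// an OAuth2 access token obtained from Application Default Credentials.
func authorizedGet(ctx context.Context, operationURL string) (*http.Response, error) {
	ts, err := google.DefaultTokenSource(ctx, "https://www.googleapis.com/auth/cloud-platform")
	if err != nil {
		return nil, err
	}
	tok, err := ts.Token()
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequestWithContext(ctx, http.MethodGet, operationURL, nil)
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "Bearer "+tok.AccessToken)
	return http.DefaultClient.Do(req)
}
```
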
## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes
https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/456
2025-08-21 11:58:17 +05:30
Averi Kitsch
2bcac71647 chore(deps): update dependency go to v1.25.0 (#1190)
## Description
---
> Should include a concise description of the changes (bug or feature), its
> impact, along with a summary of the solution

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>

---------

Co-authored-by: Mend Renovate <bot@renovateapp.com>
2025-08-20 23:53:11 +00:00
Dr. Strangelove
97f0dd2acf feat(tool/looker): add support for "description" field in looker tool (#1199)
## Description
---
Adds support for fetching description for looker dimensions and
measures.
Descriptions give important context to any agent about how to filter and
aggregate.

Adds `description` to DimensionFields and MeasureFields.
Handles cases where the description is not available.
Adds tests.

## PR Checklist
---
- [X] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [X] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [X] Ensure the tests and linter pass
- [X] Code coverage does not decrease (if any source code was changed)
- [X] Appropriate docs were updated (if necessary)
- [X] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1197

---------

Co-authored-by: Magnus Benediktsson <magnus.benediktsson@nordnet.se>
2025-08-20 12:41:18 -07:00
Wenxin Du
5dcc66c84f feat(server/mcp): Support ping mechanism (#1178)
Send back an empty result in response to a `ping` request, following the [MCP ping
spec](https://modelcontextprotocol.io/specification/2025-06-18/basic/utilities/ping).

<img width="577" height="481" alt="Screenshot 2025-08-18 at 4 20 09 PM"
src="https://github.com/user-attachments/assets/fd48725e-298e-4938-9368-d704bea0a546"
/>
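
Per that spec, the reply is just a JSON-RPC envelope with an empty result object, as in this sketch (the `handlePing` helper is illustrative):

```go
package mcp

import "encoding/json"

// pingResult is intentionally empty: a ping reply carries no payload.
type pingResult struct{}

// handlePing builds the JSON-RPC 2.0 response for a ping request with the given id.
func handlePing(id any) ([]byte, error) {
	return json.Marshal(map[string]any{
		"jsonrpc": "2.0",
		"id":      id,
		"result":  pingResult{},
	})
}
```
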
2025-08-20 15:04:47 -04:00
Wenxin Du
212aaba74c feat(server): Fail-fast on environment variable substitution (#1177)
Fail immediately with "environment variable not found" error if an
environment variable ${ENV_VAR} is declared but not defined in a
tools.yaml file.

fix: https://github.com/googleapis/genai-toolbox/issues/1159
2025-08-20 14:19:50 -04:00
Ganga4060
ff13093f74 docs: fix connectionString in couchbase sources (#1173)
Removed 8091 from the connectionString of sources in couchbase

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-08-20 17:53:05 +00:00
Twisha Bansal
bf47b9c14c docs: clarify versioning pre 1.0.0 (#1184)
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-20 10:10:09 +05:30
Dr. Strangelove
82defaeba8 docs(looker): minor changes to docs (#1188)
* fixed links to images in `looker_mcp_inspector` by moving it to its own
directory.
* added Gemini-CLI section to looker_mcp.md.
* added newer tools to list in looker_mcp.md.

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-08-19 21:10:51 +00:00
Mend Renovate
45821c60b9 chore(deps): update module google.golang.org/api to v0.248.0 (#1187)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
|
[google.golang.org/api](https://redirect.github.com/googleapis/google-api-go-client)
| `v0.247.0` -> `v0.248.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fapi/v0.248.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fapi/v0.247.0/v0.248.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>googleapis/google-api-go-client
(google.golang.org/api)</summary>

###
[`v0.248.0`](https://redirect.github.com/googleapis/google-api-go-client/releases/tag/v0.248.0)

[Compare
Source](https://redirect.github.com/googleapis/google-api-go-client/compare/v0.247.0...v0.248.0)

##### Features

- **all:** Auto-regenerate discovery clients
([#&#8203;3272](https://redirect.github.com/googleapis/google-api-go-client/issues/3272))
([8f03c8e](8f03c8e1e1))
- **all:** Auto-regenerate discovery clients
([#&#8203;3274](https://redirect.github.com/googleapis/google-api-go-client/issues/3274))
([2900298](29002988e8))
- **all:** Auto-regenerate discovery clients
([#&#8203;3275](https://redirect.github.com/googleapis/google-api-go-client/issues/3275))
([4c66642](4c6664201e))
- **all:** Auto-regenerate discovery clients
([#&#8203;3276](https://redirect.github.com/googleapis/google-api-go-client/issues/3276))
([2692170](2692170524))
- **all:** Auto-regenerate discovery clients
([#&#8203;3277](https://redirect.github.com/googleapis/google-api-go-client/issues/3277))
([23daa11](23daa11938))
- **all:** Auto-regenerate discovery clients
([#&#8203;3279](https://redirect.github.com/googleapis/google-api-go-client/issues/3279))
([71f2c5f](71f2c5fe1e))
- **all:** Auto-regenerate discovery clients
([#&#8203;3280](https://redirect.github.com/googleapis/google-api-go-client/issues/3280))
([59f18fd](59f18fd7f2))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-08-19 10:31:58 -07:00
Twisha Bansal
d93ccdd1e3 docs: add AlloyDB AI NL codelab (#1149)
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-19 22:01:32 +05:30
trehanshakuntG
ee4a70a0e8 feat(tools/firestore-add-documents): Add firestore-add-documents tool (#1107)
## Add firestore-add-documents tool

Adds a new tool for creating documents in Firestore collections.

__What it does:__

- Adds documents to any Firestore collection
- Auto-generates unique document IDs
- Supports all Firestore data types (strings, numbers, booleans,
timestamps, geopoints, arrays, maps, etc.)
- Uses Firestore's native JSON format for type safety

__Key parameters:__

- `collectionPath`: Where to add the document
- `documentData`: The document content in Firestore JSON format
- `returnData`: Optional flag to include created document in response
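
The underlying operation corresponds to Firestore's `Add`, which auto-generates the document ID, as in this rough sketch (plain Go values shown for brevity; the tool itself takes Firestore's JSON value format):

```go
package example

import (
	"context"

	"cloud.google.com/go/firestore"
)

// addDocument writes data into the collection and returns the auto-generated ID.
func addDocument(ctx context.Context, c *firestore.Client, collectionPath string, data map[string]any) (string, error) {
	ref, _, err := c.Collection(collectionPath).Add(ctx, data)
	if err != nil {
		return "", err
	}
	return ref.ID, nil
}
```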

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-08-19 11:24:21 +05:30
Haoming Chen
2ad0ccf83d feat(tools/bigquery-forecast): Add bigqueryforecast tool (#1148)
This tool wraps BigQuery's AI.FORECAST function to do time series
forecasting.

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-08-18 13:45:54 -07:00
Averi Kitsch
3abc9e00b5 docs: remove database from MongoDB source (#1166)
## Description
---
> Should include a concise description of the changes (bug or feature), its
> impact, along with a summary of the solution

## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:
- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-08-18 17:24:01 +00:00
zhouyh
6fc49826d4 feat(tools/oceanbase): Add oceanbase Source and Tool (#895)
1. Add OceanBase source and tool
2. Add tests related to OceanBase
3. Add the documentation for OceanBase

Closes #894

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
Co-authored-by: duwenxin <duwenxin@google.com>
2025-08-18 11:40:10 -04:00
Mend Renovate
3aec5d502e chore(deps): update module github.com/neo4j/neo4j-go-driver/v5 to v5.28.2 (#1163)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
|
[github.com/neo4j/neo4j-go-driver/v5](https://redirect.github.com/neo4j/neo4j-go-driver)
| `v5.28.1` -> `v5.28.2` |
[![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fneo4j%2fneo4j-go-driver%2fv5/v5.28.2?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fneo4j%2fneo4j-go-driver%2fv5/v5.28.1/v5.28.2?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>neo4j/neo4j-go-driver
(github.com/neo4j/neo4j-go-driver/v5)</summary>

###
[`v5.28.2`](https://redirect.github.com/neo4j/neo4j-go-driver/releases/tag/v5.28.2)

[Compare
Source](https://redirect.github.com/neo4j/neo4j-go-driver/compare/v5.28.1...v5.28.2)

See https://github.com/neo4j/neo4j-go-driver/wiki/5.x-changelog for more
information.

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-15 10:28:59 -07:00
Yuan Teoh
2a3c40c344 docs: remove pr template folder (#1162)
The PR template is not working. Since there is only one template, the folder
is unnecessary.
2025-08-15 17:07:53 +00:00
Yuan Teoh
f956495a6c docs: update CHANGELOG.md (#1160)
Fix a wrong scope in the release notes; CHANGELOG.md has been updated accordingly.
2025-08-15 09:42:19 -06:00
289 changed files with 24582 additions and 3915 deletions

View File

@@ -328,23 +328,23 @@ steps:
mssql \
mssql
- id: "dgraph"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "DGRAPH_URL=$_DGRAPHURL"
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Dgraph" \
dgraph \
dgraph
# - id: "dgraph"
# name: golang:1
# waitFor: ["compile-test-binary"]
# entrypoint: /bin/bash
# env:
# - "GOPATH=/gopath"
# - "DGRAPH_URL=$_DGRAPHURL"
# volumes:
# - name: "go"
# path: "/gopath"
# args:
# - -c
# - |
# .ci/test_with_coverage.sh \
# "Dgraph" \
# dgraph \
# dgraph
- id: "http"
name: golang:1
@@ -444,6 +444,27 @@ steps:
valkey \
valkey
- id: "oceanbase"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "OCEANBASE_PORT=$_OCEANBASE_PORT"
- "OCEANBASE_DATABASE=$_OCEANBASE_DATABASE"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["CLIENT_ID", "OCEANBASE_HOST", "OCEANBASE_USER", "OCEANBASE_PASSWORD"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"OceanBase" \
oceanbase \
oceanbase
- id: "firestore"
name: golang:1
waitFor: ["compile-test-binary"]
@@ -510,6 +531,24 @@ steps:
utility \
utility/alloydbwaitforoperation
- id: "cloud-sql"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
secretEnv: ["CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Cloud SQL Wait for Operation" \
cloudsql \
cloudsql
- id: "tidb"
name: golang:1
waitFor: ["compile-test-binary"]
@@ -532,6 +571,92 @@ steps:
tidb \
tidbsql tidbexecutesql
- id: "firebird"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "FIREBIRD_DATABASE=$_FIREBIRD_DATABASE_NAME"
- "FIREBIRD_HOST=$_FIREBIRD_HOST"
- "FIREBIRD_PORT=$_FIREBIRD_PORT"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["CLIENT_ID", "FIREBIRD_USER", "FIREBIRD_PASS"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Firebird" \
firebird \
firebirdsql firebirdexecutesql
- id: "clickhouse"
name : golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "CLICKHOUSE_DATABASE=$_CLICKHOUSE_DATABASE"
- "CLICKHOUSE_PORT=$_CLICKHOUSE_PORT"
- "CLICKHOUSE_PROTOCOL=$_CLICKHOUSE_PROTOCOL"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["CLICKHOUSE_HOST", "CLICKHOUSE_USER", "CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"ClickHouse" \
clickhouse \
clickhouse
- id: "trino"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "TRINO_HOST=$_TRINO_HOST"
- "TRINO_PORT=$_TRINO_PORT"
- "TRINO_CATALOG=$_TRINO_CATALOG"
- "TRINO_SCHEMA=$_TRINO_SCHEMA"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["CLIENT_ID", "TRINO_USER"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Trino" \
trino \
trinosql trinoexecutesql
- id: "yugabytedb"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "YUGABYTEDB_DATABASE=$_YUGABYTEDB_DATABASE"
- "YUGABYTEDB_PORT=$_YUGABYTEDB_PORT"
- "YUGABYTEDB_LOADBALANCE=$_YUGABYTEDB_LOADBALANCE"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["YUGABYTEDB_USER", "YUGABYTEDB_PASS", "YUGABYTEDB_HOST", "CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
./yugabytedb.test -test.v
availableSecrets:
secretManager:
- versionName: projects/$PROJECT_ID/secrets/cloud_sql_pg_user/versions/latest
@@ -594,6 +719,29 @@ availableSecrets:
env: TIDB_USER
- versionName: projects/$PROJECT_ID/secrets/tidb_pass/versions/latest
env: TIDB_PASS
- versionName: projects/$PROJECT_ID/secrets/clickhouse_host/versions/latest
env: CLICKHOUSE_HOST
- versionName: projects/$PROJECT_ID/secrets/clickhouse_user/versions/latest
env: CLICKHOUSE_USER
- versionName: projects/$PROJECT_ID/secrets/firebird_user/versions/latest
env: FIREBIRD_USER
- versionName: projects/$PROJECT_ID/secrets/firebird_pass/versions/latest
env: FIREBIRD_PASS
- versionName: projects/$PROJECT_ID/secrets/trino_user/versions/latest
env: TRINO_USER
- versionName: projects/$PROJECT_ID/secrets/oceanbase_host/versions/latest
env: OCEANBASE_HOST
- versionName: projects/$PROJECT_ID/secrets/oceanbase_user/versions/latest
env: OCEANBASE_USER
- versionName: projects/$PROJECT_ID/secrets/oceanbase_pass/versions/latest
env: OCEANBASE_PASSWORD
- versionName: projects/$PROJECT_ID/secrets/yugabytedb_host/versions/latest
env: YUGABYTEDB_HOST
- versionName: projects/$PROJECT_ID/secrets/yugabytedb_user/versions/latest
env: YUGABYTEDB_USER
- versionName: projects/$PROJECT_ID/secrets/yugabytedb_pass/versions/latest
env: YUGABYTEDB_PASS
options:
logging: CLOUD_LOGGING_ONLY
@@ -605,6 +753,7 @@ options:
substitutions:
_DATABASE_NAME: test_database
_FIREBIRD_DATABASE_NAME: /firebird/test_database.fdb
_REGION: "us-central1"
_CLOUD_SQL_POSTGRES_INSTANCE: "cloud-sql-pg-testing"
_ALLOYDB_POSTGRES_CLUSTER: "alloydb-pg-testing"
@@ -628,3 +777,17 @@ substitutions:
_LOOKER_VERIFY_SSL: "true"
_TIDB_HOST: 127.0.0.1
_TIDB_PORT: "4000"
_CLICKHOUSE_DATABASE: "default"
_CLICKHOUSE_PORT: "8123"
_CLICKHOUSE_PROTOCOL: "http"
_FIREBIRD_HOST: 127.0.0.1
_FIREBIRD_PORT: "3050"
_TRINO_HOST: 127.0.0.1
_TRINO_PORT: "8080"
_TRINO_CATALOG: "memory"
_TRINO_SCHEMA: "default"
_OCEANBASE_PORT: "2883"
_OCEANBASE_DATABASE: "oceanbase"
_YUGABYTEDB_DATABASE: "yugabyte"
_YUGABYTEDB_PORT: "5433"
_YUGABYTEDB_LOADBALANCE: "false"

View File

@@ -1,16 +1,19 @@
## Description
---
> Should include a concise description of the changes (bug or feature), it's
> impact, along with a summary of the solution
## PR Checklist
---
> Thank you for opening a Pull Request! Before submitting your PR, there are a
> few things you can do to make sure it goes smoothly:
- [ ] Make sure you reviewed
[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a
[bug/issue](https://github.com/googleapis/langchain-google-alloydb-pg-python/issues/new/choose)
[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
before writing your code! That way we can discuss the change, evaluate
designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
@@ -18,4 +21,4 @@
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involve a breaking change
🛠️ Fixes #<issue_number_goes_here>
🛠️ Fixes #<issue_number_goes_here>

View File

@@ -1,7 +1,24 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
assign_issues:
- Yuan325
- duwenxin99
- averikitsch
- anubhav756
- dishaprakash
- twishabansal
assign_issues_by:
- labels:
- 'product: bigquery'
@@ -9,6 +26,10 @@ assign_issues_by:
- Genesis929
- shobsi
- jiaxunwu
- labels:
- 'product: looker'
to:
- drstrangelooker
assign_prs:
- Yuan325
- duwenxin99

View File

@@ -20,3 +20,5 @@ sourceFileExtensions:
- 'go'
- 'yaml'
- 'yml'
ignoreFiles:
- 'docs/en/getting-started/quickstart/**'

4
.github/labels.yaml vendored
View File

@@ -92,3 +92,7 @@
- name: 'product: bigquery'
color: 5065c7
description: 'Product: Assigned to the BigQuery team.'
# Product Labels
- name: 'product: looker'
color: 5065c7
description: 'Product: Assigned to the Looker team.'

View File

@@ -20,26 +20,20 @@ extraFiles: [
"README.md",
"docs/en/getting-started/colab_quickstart.ipynb",
"docs/en/getting-started/introduction/_index.md",
"docs/en/getting-started/local_quickstart.md",
"docs/en/getting-started/local_quickstart_js.md",
"docs/en/getting-started/local_quickstart_go.md",
"docs/en/getting-started/mcp_quickstart/_index.md",
"docs/en/getting-started/quickstart/shared/configure_toolbox.md",
"docs/en/samples/alloydb/_index.md",
"docs/en/samples/alloydb/mcp_quickstart.md",
"docs/en/samples/alloydb/ai-nl/alloydb_ai_nl.ipynb",
"docs/en/samples/bigquery/local_quickstart.md",
"docs/en/samples/bigquery/mcp_quickstart/_index.md",
"docs/en/samples/bigquery/colab_quickstart_bigquery.ipynb",
"docs/en/samples/looker/looker_gemini.md",
"docs/en/samples/looker/looker_mcp_inspector.md",
"docs/en/how-to/connect-ide/alloydb_pg_mcp.md",
"docs/en/how-to/connect-ide/alloydb_pg_admin_mcp.md",
"docs/en/how-to/connect-ide/bigquery_mcp.md",
"docs/en/how-to/connect-ide/cloud_sql_pg_mcp.md",
"docs/en/how-to/connect-ide/cloud_sql_mssql_mcp.md",
"docs/en/how-to/connect-ide/cloud_sql_mysql_mcp.md",
"docs/en/how-to/connect-ide/firestore_mcp.md",
"docs/en/samples/looker/looker_gemini_oauth/_index.md",
"docs/en/samples/looker/looker_mcp_inspector/_index.md",
"docs/en/how-to/connect-ide/looker_mcp.md",
"docs/en/how-to/connect-ide/mysql_mcp.md",
"docs/en/how-to/connect-ide/mssql_mcp.md",
"docs/en/how-to/connect-ide/postgres_mcp.md",
"docs/en/how-to/connect-ide/spanner_mcp.md",
"docs/en/how-to/connect-ide/neo4j_mcp.md",
]

View File

@@ -37,7 +37,7 @@ jobs:
runs-on: 'ubuntu-latest'
steps:
- uses: 'actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea' # v7
- uses: 'actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b' # v7
with:
script: |-
// parse test names

View File

@@ -48,7 +48,7 @@ jobs:
git push
- name: Comment
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7
with:
script: |
github.rest.issues.createComment({

View File

@@ -90,7 +90,7 @@ jobs:
commit_message: "stage: PR-${{ github.event.number }}: ${{ github.event.head_commit.message }}"
- name: Comment
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7
with:
script: |
github.rest.issues.createComment({

View File

@@ -36,7 +36,7 @@ jobs:
steps:
- name: Remove PR Label
if: "${{ github.event.action == 'labeled' && github.event.label.name == 'tests: run' }}"
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7.1.0
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |

View File

@@ -16,7 +16,7 @@ name: tests
on:
push:
branches:
- 'main'
- "main"
pull_request:
pull_request_target:
types: [labeled]
@@ -35,13 +35,13 @@ jobs:
os: [macos-latest, windows-latest, ubuntu-latest]
fail-fast: false
permissions:
contents: 'read'
issues: 'write'
pull-requests: 'write'
contents: "read"
issues: "write"
pull-requests: "write"
steps:
- name: Remove PR label
if: "${{ github.event.action == 'labeled' && github.event.label.name == 'tests: run' }}"
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7.1.0
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
@@ -76,6 +76,8 @@ jobs:
- name: Run tests with coverage
if: ${{ runner.os == 'Linux' }}
env:
GOTOOLCHAIN: go1.25.0+auto
run: |
source_dir="./internal/sources/*"
tool_dir="./internal/tools/*"

2
.gitignore vendored
View File

@@ -20,4 +20,4 @@ node_modules
# executable
genai-toolbox
toolbox
toolbox

View File

@@ -8,6 +8,8 @@ defaultContentLanguageInSubdir = false
enableGitInfo = true
enableRobotsTXT = true
ignoreFiles = ["quickstart/shared", "quickstart/python", "quickstart/js", "quickstart/go"]
[languages]
[languages.en]
languageName ="English"

View File

@@ -0,0 +1,47 @@
{{ $notebookFile := .Get 0 }}
{{ with .Page.Resources.Get $notebookFile }}
{{ $content := .Content | transform.Unmarshal }}
{{ range $content.cells }}
{{ if eq .cell_type "markdown" }}
<div class="notebook-markdown">
{{ $markdown := "" }}
{{ range .source }}{{ $markdown = print $markdown . }}{{ end }}
{{ $markdown | markdownify }}
</div>
{{ end }}
{{ if eq .cell_type "code" }}
<div class="notebook-code">
{{ $code := "" }}
{{ range .source }}{{ $code = print $code . }}{{ end }}
{{ highlight $code "python" "" }}
{{ range .outputs }}
<div class="notebook-output">
{{ with .text }}
<pre class="notebook-stream"><code>{{- range . }}{{ . }}{{ end -}}</code></pre>
{{ end }}
{{ with .data }}
{{ with index . "image/png" }}
<img src="data:image/png;base64,{{ . }}" alt="Notebook output image">
{{ end }}
{{ with index . "image/jpeg" }}
<img src="data:image/jpeg;base64,{{ . }}" alt="Notebook output image">
{{ end }}
{{ with index . "text/html" }}
{{ $html := "" }}
{{ range . }}{{ $html = print $html . }}{{ end }}
{{ $html | safeHTML }}
{{ end }}
{{ end }}
</div>
{{ end }}
</div>
{{ end }}
{{ end }}
{{ else }}
<p style="color: red;">Error: Notebook '{{ $notebookFile }}' not found in page resources.</p>
{{ end }}

View File

@@ -0,0 +1,49 @@
{{/*
snippet.html
Usage:
{{< regionInclude "filename.md" "region_name" >}}
{{< regionInclude "filename.python" "region_name" "python" >}}
*/}}
{{ $file := .Get 0 }}
{{ $region := .Get 1 }}
{{ $lang := .Get 2 | default "text" }}
{{ $path := printf "%s%s" .Page.File.Dir $file }}
{{ if or (not $file) (eq $file "") }}
{{ errorf "The file parameter (first argument) is required and must be non-empty in %s" .Page.File.Path }}
{{ end }}
{{ if or (not $region) (eq $region "") }}
{{ errorf "The region parameter (second argument) is required and must be non-empty in %s" .Page.File.Path }}
{{ end }}
{{ if not (fileExists $path) }}
{{ errorf "File %q not found (referenced in %s)" $path .Page.File.Path }}
{{ end }}
{{ $content := readFile $path }}
{{ $start_tag := printf "[START %s]" $region }}
{{ $end_tag := printf "[END %s]" $region }}
{{ $snippet := "" }}
{{ $in_snippet := false }}
{{ range split $content "\n" }}
{{ if $in_snippet }}
{{ if in . $end_tag }}
{{ $in_snippet = false }}
{{ else }}
{{ $snippet = printf "%s%s\n" $snippet . }}
{{ end }}
{{ else if in . $start_tag }}
{{ $in_snippet = true }}
{{ end }}
{{ end }}
{{ if eq (trim $snippet "") "" }}
{{ errorf "Region %q not found or empty in file %s (referenced in %s)" $region $file .Page.File.Path }}
{{ end }}
{{ if eq $lang "text" }}
{{ $snippet | markdownify }}
{{ else }}
{{ highlight (trim $snippet "\n") $lang "" }}
{{ end }}
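
A minimal sketch of how a region-tagged file pairs with this shortcode. The `cloud_setup` call matches the one used by the quickstart pages further down in this diff; the file body shown here is illustrative, not the actual shared content:

```md
<!-- quickstart/shared/cloud_setup.md (content is illustrative) -->
Text outside the markers is never included.

<!-- [START cloud_setup] -->
1. [Install the Google Cloud CLI](https://cloud.google.com/sdk/docs/install)
1. [Set up Application Default Credentials (ADC)](https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment)
<!-- [END cloud_setup] -->
```

A page then pulls the region in with `{{< regionInclude "quickstart/shared/cloud_setup.md" "cloud_setup" >}}`, where the path is resolved relative to the page's directory. The marker check is a substring match (`in . $start_tag`), so wrapping the tags in HTML comments works, the marker lines themselves are dropped, and passing a third argument such as `"python"` switches the output from `markdownify` to syntax highlighting.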

View File

@@ -1,5 +1,64 @@
# Changelog
## [0.14.0](https://github.com/googleapis/genai-toolbox/compare/v0.13.0...v0.14.0) (2025-09-05)
### ⚠ BREAKING CHANGES
* **bigquery:** Move `useClientOAuth` config from tool to source ([#1279](https://github.com/googleapis/genai-toolbox/issues/1279)) ([8d20a48](https://github.com/googleapis/genai-toolbox/commit/8d20a48f13bcda853d41bdf80a162de12b076d1b))
* **tools/bigquerysql:** remove `useClientOAuth` from tools config ([#1312](https://github.com/googleapis/genai-toolbox/issues/1312))
### Features
* **clickhouse:** Add ClickHouse Source and Tools ([#1088](https://github.com/googleapis/genai-toolbox/issues/1088)) ([75a04a5](https://github.com/googleapis/genai-toolbox/commit/75a04a55dd2259bed72fe95119a7a51a906c0b21))
* **prebuilt/alloydb-postgres:** Support ipType and IAM users ([#1324](https://github.com/googleapis/genai-toolbox/issues/1324)) ([0b2121e](https://github.com/googleapis/genai-toolbox/commit/0b2121ea72eb81348dcd9c740a62ccd32e71fe37))
* **server/mcp:** Support toolbox auth in mcp ([#1140](https://github.com/googleapis/genai-toolbox/issues/1140)) ([ca353e0](https://github.com/googleapis/genai-toolbox/commit/ca353e0b66fedc00e9110df57db18632aef49018))
* **source/mysql:** Support `queryParams` in MySQL source ([#1299](https://github.com/googleapis/genai-toolbox/issues/1299)) ([3ae2526](https://github.com/googleapis/genai-toolbox/commit/3ae2526e0fe36b57b05a9b54f1d99f3fc68d9657))
* **tools/bigquery:** Support end-user credential passthrough on multiple BQ tools ([#1314](https://github.com/googleapis/genai-toolbox/issues/1314)) ([88f4b30](https://github.com/googleapis/genai-toolbox/commit/88f4b3028df3b6a400936cdf8a035bf55021924c))
* **tools/looker:** Add description for looker-get-models tool ([#1266](https://github.com/googleapis/genai-toolbox/issues/1266)) ([89af3a4](https://github.com/googleapis/genai-toolbox/commit/89af3a4ca332f029615b2a739d1f6cd50519638d))
* **tools/looker:** Authenticate via end user credentials ([#1257](https://github.com/googleapis/genai-toolbox/issues/1257)) ([8755e3d](https://github.com/googleapis/genai-toolbox/commit/8755e3db3476abb35629b3cca9c78db7366757a4))
* **tools/looker:** Report field suggestions to agent ([#1267](https://github.com/googleapis/genai-toolbox/issues/1267)) ([2cad82e](https://github.com/googleapis/genai-toolbox/commit/2cad82e5107566dd6c9b75e34e9976af63af0bb5))
### Bug Fixes
* Do not print usage on runtime error ([#1315](https://github.com/googleapis/genai-toolbox/issues/1315)) ([afba7a5](https://github.com/googleapis/genai-toolbox/commit/afba7a57cdd4fe7c1b0741dbf8f8c78b14a68089))
* Update env var to allow empty string ([#1260](https://github.com/googleapis/genai-toolbox/issues/1260)) ([03aa9fa](https://github.com/googleapis/genai-toolbox/commit/03aa9fabacda06f860c9f178485126bddb7d5782))
* **tools/firestore:** Add document/collection path validation ([#1229](https://github.com/googleapis/genai-toolbox/issues/1229)) ([14c2249](https://github.com/googleapis/genai-toolbox/commit/14c224939a2f9bb349fa00a7d5227877198530c2))
* **tools/looker-get-dashboards:** Fix Looker client OAuth check ([#1338](https://github.com/googleapis/genai-toolbox/issues/1338)) ([36225aa](https://github.com/googleapis/genai-toolbox/commit/36225aa6db7f8426ad87930866530fde4e9bf0cd))
* **tools/oceanbase:** Fix encoded text with mysql driver ([#1283](https://github.com/googleapis/genai-toolbox/issues/1283)) ([d16f89f](https://github.com/googleapis/genai-toolbox/commit/d16f89fbb6e49c03998f114ef7dc2b584b5e4967)), closes [#1161](https://github.com/googleapis/genai-toolbox/issues/1161)
## [0.13.0](https://github.com/googleapis/genai-toolbox/compare/v0.12.0...v0.13.0) (2025-08-27)
### ⚠ BREAKING CHANGES
* **prebuilt/alloydb:** Add bearer token support for alloydb-wait-for-operation ([#1183](https://github.com/googleapis/genai-toolbox/issues/1183))
### Features
* Add capability to set default for environment variable in config ([#1248](https://github.com/googleapis/genai-toolbox/issues/1248)) ([5bcd52e](https://github.com/googleapis/genai-toolbox/commit/5bcd52e7dcd0773ded723585f4abe29d044e1540))
* **firebird:** Add Firebird SQL 2.5+ source and tool ([#1011](https://github.com/googleapis/genai-toolbox/issues/1011)) ([4f6b806](https://github.com/googleapis/genai-toolbox/commit/4f6b806de947efc4e12bdb50dff7781aedb7b966))
* **oceanbase:** Add Oceanbase source and tool ([#895](https://github.com/googleapis/genai-toolbox/issues/895)) ([6fc4982](https://github.com/googleapis/genai-toolbox/commit/6fc49826d43f46c84028e752ebebddf3d94b3d13))
* **server/mcp:** Support `ping` mechanism ([#1178](https://github.com/googleapis/genai-toolbox/issues/1178)) ([5dcc66c](https://github.com/googleapis/genai-toolbox/commit/5dcc66c84fa72c75ec50a9ac5198018212ec2979))
* **server:** Fail-fast on environment variable substitution ([#1177](https://github.com/googleapis/genai-toolbox/issues/1177)) ([212aaba](https://github.com/googleapis/genai-toolbox/commit/212aaba74c8b431de8a5f7b9822a0af4afcaaa0e))
* **server:** Implement Tool call auth error propagation ([#1235](https://github.com/googleapis/genai-toolbox/issues/1235)) ([b94a021](https://github.com/googleapis/genai-toolbox/commit/b94a021ca11c6637cf8038449483b5e75f2012b3))
* **sources/bigquery:** Add support for user-credential passthrough ([#1067](https://github.com/googleapis/genai-toolbox/issues/1067)) ([650e2e2](https://github.com/googleapis/genai-toolbox/commit/650e2e26f51bff75ce66343f64944d0a89a58b69))
* **tool/looker:** Add support for `description` field in looker tool ([#1199](https://github.com/googleapis/genai-toolbox/issues/1199)) ([97f0dd2](https://github.com/googleapis/genai-toolbox/commit/97f0dd2acf26caf28ecad65abea8779c196a27f1))
* **tools/bigquery-ask-data-insights:** Add bigquery `ask-data-insights` tool ([#932](https://github.com/googleapis/genai-toolbox/issues/932)) ([7651357](https://github.com/googleapis/genai-toolbox/commit/7651357d424a2b6656d8b6818cebc5c8a86ed053))
* **tools/bigquery-forecast:** Add bigqueryforecast tool ([#1148](https://github.com/googleapis/genai-toolbox/issues/1148)) ([2ad0ccf](https://github.com/googleapis/genai-toolbox/commit/2ad0ccf83df542340087742468d6762f81eedee6))
* **tools/firestore-add-documents:** Add firestore-add-documents tool ([#1107](https://github.com/googleapis/genai-toolbox/issues/1107)) ([ee4a70a](https://github.com/googleapis/genai-toolbox/commit/ee4a70a0e82b346b07b5b4c60dfa060da2273f50))
* **tools/firestore-update-document:** Add firestore-update-document tool ([#1191](https://github.com/googleapis/genai-toolbox/issues/1191)) ([0010123](https://github.com/googleapis/genai-toolbox/commit/00101232a39c70288aac5715649c184858d351e3))
* **tools/looker:** Control over whether hidden objects are surfaced ([#1222](https://github.com/googleapis/genai-toolbox/issues/1222)) ([bc91559](https://github.com/googleapis/genai-toolbox/commit/bc91559cc4e5b20385b84cc562b624fabf7e47a8))
* **trino:** Add Trino source and tools ([#948](https://github.com/googleapis/genai-toolbox/issues/948)) ([7dd123b](https://github.com/googleapis/genai-toolbox/commit/7dd123b3d76b8eb2b74b5d960959c1f90684b37e))
### Bug Fixes
* **tools/looker:** Lookergetdashboards uses proper Authorized helper func ([#1255](https://github.com/googleapis/genai-toolbox/issues/1255)) ([00866bc](https://github.com/googleapis/genai-toolbox/commit/00866bc7fc33115c547213e60316ae889735fdbb))
* **tools/mongodb-find-one:** ProjectPayload unmarshaling ([#1167](https://github.com/googleapis/genai-toolbox/issues/1167)) ([8ea6a98](https://github.com/googleapis/genai-toolbox/commit/8ea6a98bd9096ba97722e5f807366887e864004f))
* **tools/mysql:** Fix encoded text for mysql ([#1161](https://github.com/googleapis/genai-toolbox/issues/1161)) ([a37cfa8](https://github.com/googleapis/genai-toolbox/commit/a37cfa841d151b9995d4fab73cfc5e4d30d2cc57)), closes [#840](https://github.com/googleapis/genai-toolbox/issues/840)
## [0.12.0](https://github.com/googleapis/genai-toolbox/compare/v0.11.0...v0.12.0) (2025-08-14)
@@ -7,7 +66,7 @@
* **prebuiltconfig:** Introduce additional parameter to limit context in list_tables ([#1151](https://github.com/googleapis/genai-toolbox/issues/1151)) ([497d3b1](https://github.com/googleapis/genai-toolbox/commit/497d3b126da252a4b59806ca2ca3c56e78efc13d))
* **prebuiltconfig/alloydb-admin:** Add list cluster, instance and users ([#1126](https://github.com/googleapis/genai-toolbox/issues/1126)) ([b42c139](https://github.com/googleapis/genai-toolbox/commit/b42c139158650fb1f3b696965e840c52e2016bf0))
* **prebuiltconfig/alloydb-postgres:** Add tool to create user via Built in user type or IAM ([#1130](https://github.com/googleapis/genai-toolbox/issues/1130)) ([f5bcb9c](https://github.com/googleapis/genai-toolbox/commit/f5bcb9c755a2c1747d0beeda568b6217d7420e7a))
* **prebuiltconfig/alloydb-admin:** Add tool to create user via Built in user type or IAM ([#1130](https://github.com/googleapis/genai-toolbox/issues/1130)) ([f5bcb9c](https://github.com/googleapis/genai-toolbox/commit/f5bcb9c755a2c1747d0beeda568b6217d7420e7a))
* **source/http:** Add User Agent to `http` invocations ([#1102](https://github.com/googleapis/genai-toolbox/issues/1102)) ([6f55b78](https://github.com/googleapis/genai-toolbox/commit/6f55b78e96b8c7aa9aca601cfae4d62f3e1eb42b))
* **sources/postgres:** Add support for `queryParams` ([#1047](https://github.com/googleapis/genai-toolbox/issues/1047)) ([7b57251](https://github.com/googleapis/genai-toolbox/commit/7b5725140279de21fece45e860945b7a7d23e7d0)), closes [#963](https://github.com/googleapis/genai-toolbox/issues/963)
* **tools/bigquery-execute-sql:** Add dry run support ([#1057](https://github.com/googleapis/genai-toolbox/issues/1057)) ([1cac9b5](https://github.com/googleapis/genai-toolbox/commit/1cac9b5b378153c7dc65ff3dfb4ebd852b715a10))

View File

@@ -25,33 +25,42 @@ This project follows
## Contribution process
### Code reviews
> [!NOTE]
> New contributions should always include both unit and integration tests.
All submissions, including submissions by project members, require review. We
use GitHub pull requests for this purpose. Consult
[GitHub Help](https://help.github.com/articles/about-pull-requests/) for more
information on using pull requests.
Within 2-5 days, a reviewer will review your PR. They may approve it, or request
changes. When requesting changes, reviewers should self-assign the PR to ensure
### Code reviews
* Within 2-5 days, a reviewer will review your PR. They may approve it, or request
changes.
* When requesting changes, reviewers should self-assign the PR to ensure
they are aware of any updates.
If additional changes are needed, push additional commits to your PR branch -
this helps the reviewer know which parts of the PR have changed. Commits will be
* If additional changes are needed, push additional commits to your PR branch -
this helps the reviewer know which parts of the PR have changed.
* Commits will be
squashed when merged.
Please follow up with changes promptly. If a PR is awaiting changes by the
* Please follow up with changes promptly.
* If a PR is awaiting changes by the
author for more than 10 days, maintainers may mark that PR as Draft. PRs that
are inactive for more than 30 days may be closed.
### Adding a New Database Source and Tool
## Adding a New Database Source or Tool
We recommend creating an
Please create an
[issue](https://github.com/googleapis/genai-toolbox/issues) before
implementation to ensure we can accept the contribution and no duplicated work.
If you have any questions, reach out on our
[Discord](https://discord.gg/Dmm69peqjh) to chat directly with the team. New
contributions should be added with both unit tests and integration tests.
implementation to ensure we can accept the contribution and avoid duplicated work. This issue
should include an overview of the API design. If you have any questions, reach out on our
[Discord](https://discord.gg/Dmm69peqjh) to chat directly with the team.
#### 1. Implement the New Data Source
> [!NOTE]
> New tools can be added for [pre-existing data sources](https://github.com/googleapis/genai-toolbox/tree/main/internal/sources). However, any new database source should also include at least one new tool type.
### Adding a New Database Source
We recommend looking at an [example source
implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/sources/postgres/postgres.go).
@@ -78,7 +87,7 @@ implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/s
* **Implement `init()`** to register the new Source.
* **Implement Unit Tests** in a file named `newdb_test.go`.
#### 2. Implement the New Tool
### Adding a New Tool
We recommend looking at an [example tool
implementation](https://github.com/googleapis/genai-toolbox/tree/main/internal/tools/postgres/postgressql).
@@ -111,12 +120,14 @@ tools.
* **Implement `init()`** to register the new Tool.
* **Implement Unit Tests** in a file named `newdb_test.go`.
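The `init()` registration called out above is the usual Go pattern of side-effect imports filling a registry; the blank `_` imports in `cmd/root.go` later in this diff rely on it. Below is a generic, self-contained sketch of that pattern only; the registry, types, and function names are illustrative and not the actual interfaces under `internal/sources` or `internal/tools`:

```go
package main

import "fmt"

// A toy registry keyed by kind. The real project keeps similar maps inside
// internal/sources and internal/tools, with richer config/factory types.
var registry = map[string]func() string{}

func register(kind string, factory func() string) {
	registry[kind] = factory
}

// In the real layout this init() lives in the new tool's own package and is
// triggered by a blank import in cmd/root.go, e.g.
//   _ "github.com/googleapis/genai-toolbox/internal/tools/newdb/newdbsql"
func init() {
	register("newdb-sql", func() string { return "newdb-sql tool instance" })
}

func main() {
	// The kind named in tools.yaml resolves to the registered factory.
	if factory, ok := registry["newdb-sql"]; ok {
		fmt.Println(factory())
	}
}
```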
#### 3. Add Integration Tests
### Adding Integration Tests
* **Add a test file** under a new directory `tests/newdb`.
* **Add pre-defined integration test suites** in the
`/tests/newdb/newdb_test.go` that are **required** to be run as long as your
code contains related features:
code contains related features. Please check each test suite for its config
defaults; if your source requires test suite config updates, please refer to the
[config option](./tests/option.go):
1. [RunToolGetTest][tool-get]: tests for the `GET` endpoint that returns the
tool's manifest.
@@ -135,7 +146,7 @@ tools.
parameters][temp-param-doc]. Only run this test if template
parameters apply to your tool.
* **Add the new database to the test config** in
* **Add the new database to the integration test workflow** in
[integration.cloudbuild.yaml](.ci/integration.cloudbuild.yaml).
[tool-get]:
@@ -151,7 +162,7 @@ tools.
[temp-param-doc]:
https://googleapis.github.io/genai-toolbox/resources/tools/#template-parameters
#### 4. Add Documentation
### Adding Documentation
* **Update the documentation** to include information about your new data source
and tool. This includes:
@@ -161,7 +172,7 @@ tools.
* **(Optional) Add samples** to the `docs/en/samples/<newdb>` directory.
#### (Optional) 5. Add Prebuilt Tools
### (Optional) Adding Prebuilt Tools
You can provide developers with a set of "build-time" tools to aid common
software development user journeys like viewing and creating tables/collections
@@ -175,7 +186,7 @@ and data.
[internal/prebuiltconfigs/prebuiltconfigs_test.go](internal/prebuiltconfigs/prebuiltconfigs_test.go)
and [cmd/root_test.go](cmd/root_test.go).
#### 6. Submit a Pull Request
## Submitting a Pull Request
Submit a pull request to the repository with your changes. Be sure to include a
detailed description of your changes and any requests for long term testing
@@ -216,4 +227,4 @@ resources.
* **PR Description:** PR description should **always** be included. It should
include a concise description of the changes and their impact, along with a
summary of the solution. If the PR is related to a specific issue, the issue
number should be mentioned in the PR description (e.g. `Fixes #1`).
number should be mentioned in the PR description (e.g. `Fixes #1`).

View File

@@ -33,12 +33,15 @@ documentation](https://googleapis.github.io/genai-toolbox/).
- [Getting Started](#getting-started)
- [Installing the server](#installing-the-server)
- [Running the server](#running-the-server)
- [Homebrew Users](#homebrew-users)
- [Integrating your application](#integrating-your-application)
- [Configuration](#configuration)
- [Sources](#sources)
- [Tools](#tools)
- [Toolsets](#toolsets)
- [Versioning](#versioning)
- [Pre-1.0.0 Versioning](#pre-100-versioning)
- [Post-1.0.0 Versioning](#post-100-versioning)
- [Contributing](#contributing)
- [Community](#community)
@@ -114,7 +117,7 @@ To install Toolbox as a binary:
<!-- {x-release-please-start-version} -->
```sh
# see releases page for other versions
export VERSION=0.12.0
export VERSION=0.14.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```
@@ -127,7 +130,7 @@ You can also install Toolbox as a container:
```sh
# see releases page for other versions
export VERSION=0.12.0
export VERSION=0.14.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```
@@ -151,7 +154,7 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:
```sh
go install github.com/googleapis/genai-toolbox@v0.12.0
go install github.com/googleapis/genai-toolbox@v0.14.0
```
<!-- {x-release-please-end} -->
@@ -162,22 +165,66 @@ go install github.com/googleapis/genai-toolbox@v0.12.0
[Configure](#configuration) a `tools.yaml` to define your tools, and then
execute `toolbox` to start the server:
<details open>
<summary>Binary</summary>
To run Toolbox from binary:
```sh
./toolbox --tools-file "tools.yaml"
```
> [!NOTE]
> Toolbox enables dynamic reloading by default. To disable, use the
> `--disable-reload` flag.
**NOTE:**
Toolbox enables dynamic reloading by default. To disable, use the `--disable-reload` flag.
#### Homebrew Users
</details>
If you installed Toolbox using Homebrew, the `toolbox` binary is available in your system path. You can start the server with the same command:
<details>
<summary>Container image</summary>
To run the server after pulling the [container image](#installing-the-server):
```sh
export VERSION=0.11.0 # Use the version you pulled
docker run -p 5000:5000 \
-v $(pwd)/tools.yaml:/app/tools.yaml \
us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION \
--tools-file "/app/tools.yaml"
```
**NOTE:**
The `-v` flag mounts your local `tools.yaml` into the container, and `-p` maps the container's port `5000` to your host's port `5000`.
</details>
<details>
<summary>Source</summary>
To run the server directly from source, navigate to the project root directory and run:
```sh
go run .
```
**NOTE:**
This command runs the project from source and is more suitable for development and testing. It does **not** compile a binary into your `$GOPATH`. If you want to compile a binary instead, refer to the [Developer Documentation](./DEVELOPER.md#building-the-binary).
</details>
<details>
<summary>Homebrew</summary>
If you installed Toolbox using [Homebrew](https://brew.sh/), the `toolbox` binary is available in your system path. You can start the server with the same command:
```sh
toolbox --tools-file "tools.yaml"
```
</details>
You can use `toolbox help` for a full list of flags! To stop the server, send a
terminate signal (`ctrl+c` on most platforms).
@@ -185,6 +232,7 @@ For more detailed documentation on deploying to different environments, check
out the resources in the [How-to
section](https://googleapis.github.io/genai-toolbox/how-to/)
### Integrating your application
Once your server is up and running, you can load the tools into your
@@ -724,12 +772,26 @@ my_second_toolset = client.load_toolset("my_second_toolset")
## Versioning
This project uses [semantic versioning](https://semver.org/), including a
`MAJOR.MINOR.PATCH` version number that increments with:
This project uses [semantic versioning](https://semver.org/) (`MAJOR.MINOR.PATCH`).
Since the project is in a pre-release stage (version `0.x.y`), we follow the
standard conventions for initial development:
- MAJOR version when we make incompatible API changes
- MINOR version when we add functionality in a backward compatible manner
- PATCH version when we make backward compatible bug fixes
### Pre-1.0.0 Versioning
While the major version is `0`, the public API should be considered unstable.
The version will be incremented as follows:
- **`0.MINOR.PATCH`**: The **MINOR** version is incremented when we add
new functionality or make breaking, incompatible API changes.
- **`0.MINOR.PATCH`**: The **PATCH** version is incremented for
backward-compatible bug fixes.
### Post-1.0.0 Versioning
Once the project reaches a stable `1.0.0` release, the versioning will follow
the more common convention:
- **`MAJOR.MINOR.PATCH`**: Incremented for incompatible API changes.
- **`MAJOR.MINOR.PATCH`**: Incremented for new, backward-compatible functionality.
- **`MAJOR.MINOR.PATCH`**: Incremented for backward-compatible bug fixes.
The public API that this applies to is the CLI associated with Toolbox, the
interactions with official SDKs, and the definitions in the `tools.yaml` file.

View File

@@ -43,23 +43,33 @@ import (
// Import tool packages for side effect of registration
_ "github.com/googleapis/genai-toolbox/internal/tools/alloydbainl"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigqueryconversationalanalytics"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigqueryexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigqueryforecast"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerygetdatasetinfo"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerygettableinfo"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerylistdatasetids"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerylisttableids"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerysql"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigtable"
_ "github.com/googleapis/genai-toolbox/internal/tools/clickhouse/clickhouseexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/clickhouse/clickhousesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/cloudsql/cloudsqlwaitforoperation"
_ "github.com/googleapis/genai-toolbox/internal/tools/couchbase"
_ "github.com/googleapis/genai-toolbox/internal/tools/dataplex/dataplexlookupentry"
_ "github.com/googleapis/genai-toolbox/internal/tools/dataplex/dataplexsearchaspecttypes"
_ "github.com/googleapis/genai-toolbox/internal/tools/dataplex/dataplexsearchentries"
_ "github.com/googleapis/genai-toolbox/internal/tools/dgraph"
_ "github.com/googleapis/genai-toolbox/internal/tools/firebird/firebirdexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/firebird/firebirdsql"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoreadddocuments"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoredeletedocuments"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoregetdocuments"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoregetrules"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorelistcollections"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorequery"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorequerycollection"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoreupdatedocument"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorevalidaterules"
_ "github.com/googleapis/genai-toolbox/internal/tools/http"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookeradddashboardelement"
@@ -93,6 +103,8 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jcypher"
_ "github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jexecutecypher"
_ "github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jschema"
_ "github.com/googleapis/genai-toolbox/internal/tools/oceanbase/oceanbaseexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/oceanbase/oceanbasesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgresexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgressql"
_ "github.com/googleapis/genai-toolbox/internal/tools/redis"
@@ -101,21 +113,26 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/sqlitesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/tidb/tidbexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/tidb/tidbsql"
_ "github.com/googleapis/genai-toolbox/internal/tools/trino/trinoexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/trino/trinosql"
_ "github.com/googleapis/genai-toolbox/internal/tools/utility/alloydbwaitforoperation"
_ "github.com/googleapis/genai-toolbox/internal/tools/utility/wait"
_ "github.com/googleapis/genai-toolbox/internal/tools/valkey"
_ "github.com/googleapis/genai-toolbox/internal/tools/yugabytedbsql"
"github.com/spf13/cobra"
_ "github.com/googleapis/genai-toolbox/internal/sources/alloydbpg"
_ "github.com/googleapis/genai-toolbox/internal/sources/bigquery"
_ "github.com/googleapis/genai-toolbox/internal/sources/bigtable"
_ "github.com/googleapis/genai-toolbox/internal/sources/clickhouse"
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlmssql"
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlmysql"
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlpg"
_ "github.com/googleapis/genai-toolbox/internal/sources/couchbase"
_ "github.com/googleapis/genai-toolbox/internal/sources/dataplex"
_ "github.com/googleapis/genai-toolbox/internal/sources/dgraph"
_ "github.com/googleapis/genai-toolbox/internal/sources/firebird"
_ "github.com/googleapis/genai-toolbox/internal/sources/firestore"
_ "github.com/googleapis/genai-toolbox/internal/sources/http"
_ "github.com/googleapis/genai-toolbox/internal/sources/looker"
@@ -123,12 +140,15 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/sources/mssql"
_ "github.com/googleapis/genai-toolbox/internal/sources/mysql"
_ "github.com/googleapis/genai-toolbox/internal/sources/neo4j"
_ "github.com/googleapis/genai-toolbox/internal/sources/oceanbase"
_ "github.com/googleapis/genai-toolbox/internal/sources/postgres"
_ "github.com/googleapis/genai-toolbox/internal/sources/redis"
_ "github.com/googleapis/genai-toolbox/internal/sources/spanner"
_ "github.com/googleapis/genai-toolbox/internal/sources/sqlite"
_ "github.com/googleapis/genai-toolbox/internal/sources/tidb"
_ "github.com/googleapis/genai-toolbox/internal/sources/trino"
_ "github.com/googleapis/genai-toolbox/internal/sources/valkey"
_ "github.com/googleapis/genai-toolbox/internal/sources/yugabytedb"
)
var (
@@ -203,6 +223,9 @@ func NewCommand(opts ...Option) *Command {
o(cmd)
}
// Do not print Usage on runtime error
cmd.SilenceUsage = true
// Set server version
cmd.cfg.Version = versionString
@@ -226,7 +249,13 @@ func NewCommand(opts ...Option) *Command {
flags.BoolVar(&cmd.cfg.TelemetryGCP, "telemetry-gcp", false, "Enable exporting directly to Google Cloud Monitoring.")
flags.StringVar(&cmd.cfg.TelemetryOTLP, "telemetry-otlp", "", "Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. 'http://127.0.0.1:4318')")
flags.StringVar(&cmd.cfg.TelemetryServiceName, "telemetry-service-name", "toolbox", "Sets the value of the service.name resource attribute for telemetry data.")
flags.StringVar(&cmd.prebuiltConfig, "prebuilt", "", "Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. Allowed: 'alloydb-postgres-admin', alloydb-postgres', 'bigquery', 'cloud-sql-mysql', 'cloud-sql-postgres', 'cloud-sql-mssql', 'dataplex', 'firestore', 'looker', 'mssql', 'mysql', 'postgres', 'spanner', 'spanner-postgres'.")
// Fetch prebuilt tools sources to customize the help description
prebuiltHelp := fmt.Sprintf(
"Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. Allowed: '%s'.",
strings.Join(prebuiltconfigs.GetPrebuiltSources(), "', '"),
)
flags.StringVar(&cmd.prebuiltConfig, "prebuilt", "", prebuiltHelp)
flags.BoolVar(&cmd.cfg.Stdio, "stdio", false, "Listens via MCP STDIO instead of acting as a remote HTTP server.")
flags.BoolVar(&cmd.cfg.DisableReload, "disable-reload", false, "Disables dynamic reloading of tools file.")
flags.BoolVar(&cmd.cfg.UI, "ui", false, "Launches the Toolbox UI web server.")
@@ -246,32 +275,40 @@ type ToolsFile struct {
}
// parseEnv replaces environment variables ${ENV_NAME} with their values.
func parseEnv(input string) string {
re := regexp.MustCompile(`\$\{(\w+)\}`)
// also support ${ENV_NAME:default_value}.
func parseEnv(input string) (string, error) {
re := regexp.MustCompile(`\$\{(\w+)(:(\w*))?\}`)
return re.ReplaceAllStringFunc(input, func(match string) string {
var err error
output := re.ReplaceAllStringFunc(input, func(match string) string {
parts := re.FindStringSubmatch(match)
if len(parts) < 2 {
// technically shouldn't happen
return match
}
// extract the variable name
variableName := parts[1]
if value, found := os.LookupEnv(variableName); found {
return value
}
return match
if parts[2] != "" {
return parts[3]
}
err = fmt.Errorf("environment variable not found: %q", variableName)
return ""
})
return output, err
}
// parseToolsFile parses the provided yaml into appropriate configs.
func parseToolsFile(ctx context.Context, raw []byte) (ToolsFile, error) {
var toolsFile ToolsFile
// Replace environment variables if found
raw = []byte(parseEnv(string(raw)))
output, err := parseEnv(string(raw))
if err != nil {
return toolsFile, fmt.Errorf("error parsing environment variables: %s", err)
}
raw = []byte(output)
// Parse contents
err := yaml.UnmarshalContext(ctx, raw, &toolsFile, yaml.Strict())
err = yaml.UnmarshalContext(ctx, raw, &toolsFile, yaml.Strict())
if err != nil {
return toolsFile, err
}
@@ -807,7 +844,7 @@ func run(cmd *Command) error {
}
cmd.logger.InfoContext(ctx, "Server ready to serve!")
if cmd.cfg.UI {
cmd.logger.InfoContext(ctx, fmt.Sprintf("Toolbox UI is up and running at: http://localhost:%d/ui", cmd.cfg.Port))
cmd.logger.InfoContext(ctx, fmt.Sprintf("Toolbox UI is up and running at: http://%s:%d/ui", cmd.cfg.Address, cmd.cfg.Port))
}
go func() {

View File

@@ -206,6 +206,72 @@ func TestServerConfigFlags(t *testing.T) {
}
}
func TestParseEnv(t *testing.T) {
tcs := []struct {
desc string
env map[string]string
in string
want string
err bool
errString string
}{
{
desc: "without default without env",
in: "${FOO}",
want: "",
err: true,
errString: `environment variable not found: "FOO"`,
},
{
desc: "without default with env",
env: map[string]string{
"FOO": "bar",
},
in: "${FOO}",
want: "bar",
},
{
desc: "with empty default",
in: "${FOO:}",
want: "",
},
{
desc: "with default",
in: "${FOO:bar}",
want: "bar",
},
{
desc: "with default with env",
env: map[string]string{
"FOO": "hello",
},
in: "${FOO:bar}",
want: "hello",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
if tc.env != nil {
for k, v := range tc.env {
t.Setenv(k, v)
}
}
got, err := parseEnv(tc.in)
if tc.err {
if err == nil {
t.Fatalf("expected error not found")
}
if tc.errString != err.Error() {
t.Fatalf("incorrect error string: got %s, want %s", err, tc.errString)
}
}
if tc.want != got {
t.Fatalf("unexpected want: got %s, want %s", got, tc.want)
}
})
}
}
func TestToolFileFlag(t *testing.T) {
tcs := []struct {
desc string
@@ -820,13 +886,14 @@ func TestParseToolFileWithAuth(t *testing.T) {
func TestEnvVarReplacement(t *testing.T) {
ctx, err := testutils.ContextWithNewLogger()
os.Setenv("TestHeader", "ACTUAL_HEADER")
os.Setenv("API_KEY", "ACTUAL_API_KEY")
os.Setenv("clientId", "ACTUAL_CLIENT_ID")
os.Setenv("clientId2", "ACTUAL_CLIENT_ID_2")
os.Setenv("toolset_name", "ACTUAL_TOOLSET_NAME")
os.Setenv("cat_string", "cat")
os.Setenv("food_string", "food")
t.Setenv("TestHeader", "ACTUAL_HEADER")
t.Setenv("API_KEY", "ACTUAL_API_KEY")
t.Setenv("clientId", "ACTUAL_CLIENT_ID")
t.Setenv("clientId2", "ACTUAL_CLIENT_ID_2")
t.Setenv("toolset_name", "ACTUAL_TOOLSET_NAME")
t.Setenv("cat_string", "cat")
t.Setenv("food_string", "food")
t.Setenv("TestHeader", "ACTUAL_HEADER")
if err != nil {
t.Fatalf("unexpected error: %s", err)
@@ -1161,9 +1228,11 @@ func TestSingleEdit(t *testing.T) {
}
func TestPrebuiltTools(t *testing.T) {
// Get prebuilt configs
alloydb_admin_config, _ := prebuiltconfigs.Get("alloydb-postgres-admin")
alloydb_config, _ := prebuiltconfigs.Get("alloydb-postgres")
bigquery_config, _ := prebuiltconfigs.Get("bigquery")
clickhouse_config, _ := prebuiltconfigs.Get("clickhouse")
cloudsqlpg_config, _ := prebuiltconfigs.Get("cloud-sql-postgres")
cloudsqlmysql_config, _ := prebuiltconfigs.Get("cloud-sql-mysql")
cloudsqlmssql_config, _ := prebuiltconfigs.Get("cloud-sql-mssql")
@@ -1175,6 +1244,86 @@ func TestPrebuiltTools(t *testing.T) {
postgresconfig, _ := prebuiltconfigs.Get("postgres")
spanner_config, _ := prebuiltconfigs.Get("spanner")
spannerpg_config, _ := prebuiltconfigs.Get("spanner-postgres")
neo4jconfig, _ := prebuiltconfigs.Get("neo4j")
// Set environment variables
t.Setenv("API_KEY", "your_api_key")
t.Setenv("BIGQUERY_PROJECT", "your_gcp_project_id")
t.Setenv("DATAPLEX_PROJECT", "your_gcp_project_id")
t.Setenv("FIRESTORE_PROJECT", "your_gcp_project_id")
t.Setenv("FIRESTORE_DATABASE", "your_firestore_db_name")
t.Setenv("SPANNER_PROJECT", "your_gcp_project_id")
t.Setenv("SPANNER_INSTANCE", "your_spanner_instance")
t.Setenv("SPANNER_DATABASE", "your_spanner_db")
t.Setenv("ALLOYDB_POSTGRES_PROJECT", "your_gcp_project_id")
t.Setenv("ALLOYDB_POSTGRES_REGION", "your_gcp_region")
t.Setenv("ALLOYDB_POSTGRES_CLUSTER", "your_alloydb_cluster")
t.Setenv("ALLOYDB_POSTGRES_INSTANCE", "your_alloydb_instance")
t.Setenv("ALLOYDB_POSTGRES_DATABASE", "your_alloydb_db")
t.Setenv("ALLOYDB_POSTGRES_USER", "your_alloydb_user")
t.Setenv("ALLOYDB_POSTGRES_PASSWORD", "your_alloydb_password")
t.Setenv("CLICKHOUSE_PROTOCOL", "your_clickhouse_protocol")
t.Setenv("CLICKHOUSE_DATABASE", "your_clickhouse_database")
t.Setenv("CLICKHOUSE_PASSWORD", "your_clickhouse_password")
t.Setenv("CLICKHOUSE_USER", "your_clickhouse_user")
t.Setenv("CLICKHOUSE_HOST", "your_clickhosue_host")
t.Setenv("CLICKHOUSE_PORT", "8123")
t.Setenv("CLOUD_SQL_POSTGRES_PROJECT", "your_pg_project")
t.Setenv("CLOUD_SQL_POSTGRES_INSTANCE", "your_pg_instance")
t.Setenv("CLOUD_SQL_POSTGRES_DATABASE", "your_pg_db")
t.Setenv("CLOUD_SQL_POSTGRES_REGION", "your_pg_region")
t.Setenv("CLOUD_SQL_POSTGRES_USER", "your_pg_user")
t.Setenv("CLOUD_SQL_POSTGRES_PASS", "your_pg_pass")
t.Setenv("CLOUD_SQL_MYSQL_PROJECT", "your_gcp_project_id")
t.Setenv("CLOUD_SQL_MYSQL_REGION", "your_gcp_region")
t.Setenv("CLOUD_SQL_MYSQL_INSTANCE", "your_instance")
t.Setenv("CLOUD_SQL_MYSQL_DATABASE", "your_cloudsql_mysql_db")
t.Setenv("CLOUD_SQL_MYSQL_USER", "your_cloudsql_mysql_user")
t.Setenv("CLOUD_SQL_MYSQL_PASSWORD", "your_cloudsql_mysql_password")
t.Setenv("CLOUD_SQL_MSSQL_PROJECT", "your_gcp_project_id")
t.Setenv("CLOUD_SQL_MSSQL_REGION", "your_gcp_region")
t.Setenv("CLOUD_SQL_MSSQL_INSTANCE", "your_cloudsql_mssql_instance")
t.Setenv("CLOUD_SQL_MSSQL_DATABASE", "your_cloudsql_mssql_db")
t.Setenv("CLOUD_SQL_MSSQL_IP_ADDRESS", "127.0.0.1")
t.Setenv("CLOUD_SQL_MSSQL_USER", "your_cloudsql_mssql_user")
t.Setenv("CLOUD_SQL_MSSQL_PASSWORD", "your_cloudsql_mssql_password")
t.Setenv("CLOUD_SQL_POSTGRES_PASSWORD", "your_cloudsql_pg_password")
t.Setenv("POSTGRES_HOST", "localhost")
t.Setenv("POSTGRES_PORT", "5432")
t.Setenv("POSTGRES_DATABASE", "your_postgres_db")
t.Setenv("POSTGRES_USER", "your_postgres_user")
t.Setenv("POSTGRES_PASSWORD", "your_postgres_password")
t.Setenv("MYSQL_HOST", "localhost")
t.Setenv("MYSQL_PORT", "3306")
t.Setenv("MYSQL_DATABASE", "your_mysql_db")
t.Setenv("MYSQL_USER", "your_mysql_user")
t.Setenv("MYSQL_PASSWORD", "your_mysql_password")
t.Setenv("MSSQL_HOST", "localhost")
t.Setenv("MSSQL_PORT", "1433")
t.Setenv("MSSQL_DATABASE", "your_mssql_db")
t.Setenv("MSSQL_USER", "your_mssql_user")
t.Setenv("MSSQL_PASSWORD", "your_mssql_password")
t.Setenv("LOOKER_BASE_URL", "https://your_company.looker.com")
t.Setenv("LOOKER_CLIENT_ID", "your_looker_client_id")
t.Setenv("LOOKER_CLIENT_SECRET", "your_looker_client_secret")
t.Setenv("LOOKER_VERIFY_SSL", "true")
t.Setenv("NEO4J_URI", "bolt://localhost:7687")
t.Setenv("NEO4J_DATABASE", "neo4j")
t.Setenv("NEO4J_USERNAME", "your_neo4j_user")
t.Setenv("NEO4J_PASSWORD", "your_neo4j_password")
ctx, err := testutils.ContextWithNewLogger()
if err != nil {
t.Fatalf("unexpected error: %s", err)
@@ -1210,7 +1359,17 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"bigquery-database-tools": tools.ToolsetConfig{
Name: "bigquery-database-tools",
ToolNames: []string{"execute_sql", "get_dataset_info", "get_table_info", "list_dataset_ids", "list_table_ids"},
ToolNames: []string{"ask_data_insights", "execute_sql", "forecast", "get_dataset_info", "get_table_info", "list_dataset_ids", "list_table_ids"},
},
},
},
{
name: "clickhouse prebuilt tools",
in: clickhouse_config,
wantToolset: server.ToolsetConfigs{
"clickhouse-database-tools": tools.ToolsetConfig{
Name: "clickhouse-database-tools",
ToolNames: []string{"execute_sql"},
},
},
},
@@ -1260,7 +1419,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"firestore-database-tools": tools.ToolsetConfig{
Name: "firestore-database-tools",
ToolNames: []string{"firestore-get-documents", "firestore-list-collections", "firestore-delete-documents", "firestore-query-collection", "firestore-get-rules", "firestore-validate-rules"},
ToolNames: []string{"firestore-get-documents", "firestore-add-documents", "firestore-update-document", "firestore-list-collections", "firestore-delete-documents", "firestore-query-collection", "firestore-get-rules", "firestore-validate-rules"},
},
},
},
@@ -1324,6 +1483,16 @@ func TestPrebuiltTools(t *testing.T) {
},
},
},
{
name: "neo4j prebuilt tools",
in: neo4jconfig,
wantToolset: server.ToolsetConfigs{
"neo4j-database-tools": tools.ToolsetConfig{
Name: "neo4j-database-tools",
ToolNames: []string{"execute_cypher", "get_schema"},
},
},
},
}
for _, tc := range tcs {

View File

@@ -1 +1 @@
0.12.0
0.14.0

View File

@@ -234,7 +234,7 @@
},
"outputs": [],
"source": [
"version = \"0.12.0\" # x-release-please-version\n",
"version = \"0.14.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -22,6 +22,11 @@ etc., you could use environment variables instead with the format `${ENV_NAME}`.
user: ${USER_NAME}
password: ${PASSWORD}
```
A default value can be specified like `${ENV_NAME:default}`.
```yaml
port: ${DB_PORT:3306}
```
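For illustration, a sketch combining both forms (placeholder values). Note that in the implementation shown earlier in this compare view the default is matched with `\w*`, so keep defaults to letters, digits, and underscores:

```yaml
sources:
  my-pg-source:
    kind: postgres
    host: ${POSTGRES_HOST:localhost}  # falls back to "localhost" if unset
    port: ${POSTGRES_PORT:5432}       # falls back to "5432" if unset
    database: toolbox_db
    user: ${USER_NAME}                # no default: must be set, or parsing fails
    password: ${PASSWORD}             # no default: must be set, or parsing fails
```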
### Sources

View File

@@ -86,7 +86,7 @@ To install Toolbox as a binary:
```sh
# see releases page for other versions
export VERSION=0.12.0
export VERSION=0.14.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```
@@ -97,7 +97,7 @@ You can also install Toolbox as a container:
```sh
# see releases page for other versions
export VERSION=0.12.0
export VERSION=0.14.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```
@@ -115,7 +115,7 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:
```sh
go install github.com/googleapis/genai-toolbox@v0.12.0
go install github.com/googleapis/genai-toolbox@v0.14.0
```
{{% /tab %}}

View File

@@ -18,265 +18,13 @@ This guide assumes you have already done the following:
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres].
### Cloud Setup (Optional)
If you plan to use **Google Cloud's Vertex AI** with your agent (e.g., using
`vertexai=True` or a Google GenAI model), follow these one-time setup steps for
local development:
1. [Install the Google Cloud CLI](https://cloud.google.com/sdk/docs/install)
1. [Set up Application Default Credentials (ADC)](https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment)
1. Set your project and enable Vertex AI
```bash
gcloud config set project YOUR_PROJECT_ID
gcloud services enable aiplatform.googleapis.com
```
[install-python]: https://wiki.python.org/moin/BeginnersGuide/Download
[install-pip]: https://pip.pypa.io/en/stable/installation/
[install-venv]: https://packaging.python.org/en/latest/tutorials/installing-packages/#creating-virtual-environments
[install-postgres]: https://www.postgresql.org/download/
{{< regionInclude "quickstart/shared/cloud_setup.md" "cloud_setup" >}}
## Step 1: Set up your database
In this section, we will create a database, insert some data that needs to be
accessed by our agent, and create a database user for Toolbox to connect with.
1. Connect to postgres using the `psql` command:
```bash
psql -h 127.0.0.1 -U postgres
```
Here, `postgres` denotes the default postgres superuser.
{{< notice info >}}
#### **Having trouble connecting?**
* **Password Prompt:** If you are prompted for a password for the `postgres`
user and do not know it (or a blank password doesn't work), your PostgreSQL
installation might require a password or a different authentication method.
* **`FATAL: role "postgres" does not exist`:** This error means the default
`postgres` superuser role isn't available under that name on your system.
* **`Connection refused`:** Ensure your PostgreSQL server is actually running.
You can typically check with `sudo systemctl status postgresql` and start it
with `sudo systemctl start postgresql` on Linux systems.
<br/>
#### **Common Solution**
For password issues or if the `postgres` role seems inaccessible directly, try
switching to the `postgres` operating system user first. This user often has
permission to connect without a password for local connections (this is called
peer authentication).
```bash
sudo -i -u postgres
psql -h 127.0.0.1
```
Once you are in the `psql` shell using this method, you can proceed with the
database creation steps below. Afterwards, type `\q` to exit `psql`, and then
`exit` to return to your normal user shell.
If desired, once connected to `psql` as the `postgres` OS user, you can set a
password for the `postgres` *database* user using: `ALTER USER postgres WITH
PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
postgres` and a password next time.
{{< /notice >}}
1. Create a new database and a new user:
{{< notice tip >}}
For a real application, it's best to follow the principle of least privilege
and only grant the privileges your application needs.
{{< /notice >}}
```sql
CREATE USER toolbox_user WITH PASSWORD 'my-password';
CREATE DATABASE toolbox_db;
GRANT ALL PRIVILEGES ON DATABASE toolbox_db TO toolbox_user;
ALTER DATABASE toolbox_db OWNER TO toolbox_user;
```
1. End the database session:
```bash
\q
```
(If you used `sudo -i -u postgres` and then `psql`, remember you might also
need to type `exit` after `\q` to leave the `postgres` user's shell
session.)
1. Connect to your database with your new user:
```bash
psql -h 127.0.0.1 -U toolbox_user -d toolbox_db
```
1. Create a table using the following command:
```sql
CREATE TABLE hotels(
id INTEGER NOT NULL PRIMARY KEY,
name VARCHAR NOT NULL,
location VARCHAR NOT NULL,
price_tier VARCHAR NOT NULL,
checkin_date DATE NOT NULL,
checkout_date DATE NOT NULL,
booked BIT NOT NULL
);
```
1. Insert data into the table.
```sql
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
(4, 'Radisson Blu Lucerne', 'Lucerne', 'Midscale', '2024-04-24', '2024-04-05', B'0'),
(5, 'Best Western Bern', 'Bern', 'Upper Midscale', '2024-04-23', '2024-04-01', B'0'),
(6, 'InterContinental Geneva', 'Geneva', 'Luxury', '2024-04-23', '2024-04-28', B'0'),
(7, 'Sheraton Zurich', 'Zurich', 'Upper Upscale', '2024-04-27', '2024-04-02', B'0'),
(8, 'Holiday Inn Basel', 'Basel', 'Upper Midscale', '2024-04-24', '2024-04-09', B'0'),
(9, 'Courtyard Zurich', 'Zurich', 'Upscale', '2024-04-03', '2024-04-13', B'0'),
(10, 'Comfort Inn Bern', 'Bern', 'Midscale', '2024-04-04', '2024-04-16', B'0');
```
1. End the database session:
```bash
\q
```
{{< regionInclude "quickstart/shared/database_setup.md" "database_setup" >}}
## Step 2: Install and configure Toolbox
In this section, we will download Toolbox, configure our tools in a
`tools.yaml`, and then run the Toolbox server.
1. Download the latest version of Toolbox as a binary:
{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/$OS/toolbox
```
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Write the following into a `tools.yaml` file. Be sure to update any fields
such as `user`, `password`, or `database` that you may have customized in the
previous step.
{{< notice tip >}}
In practice, use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
```yaml
sources:
my-pg-source:
kind: postgres
host: 127.0.0.1
port: 5432
database: toolbox_db
user: ${USER_NAME}
password: ${PASSWORD}
tools:
search-hotels-by-name:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on name.
parameters:
- name: name
type: string
description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
search-hotels-by-location:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on location.
parameters:
- name: location
type: string
description: The location of the hotel.
statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
book-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Book a hotel by its ID. If the hotel is successfully booked, returns a NULL, raises an error if not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to book.
statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
update-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Update a hotel's check-in and check-out dates by its ID. Returns a message
indicating whether the hotel was successfully updated or not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to update.
- name: checkin_date
type: string
description: The new check-in date of the hotel.
- name: checkout_date
type: string
description: The new check-out date of the hotel.
statement: >-
UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
as date) WHERE id = $1;
cancel-hotel:
kind: postgres-sql
source: my-pg-source
description: Cancel a hotel by its ID.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to cancel.
statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
toolsets:
my-toolset:
- search-hotels-by-name
- search-hotels-by-location
- book-hotel
- update-hotel
- cancel-hotel
```
For more info on tools, check out the `Resources` section of the docs.
1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
```bash
./toolbox --tools-file "tools.yaml"
```
{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the
`--disable-reload` flag.
{{< /notice >}}
{{< regionInclude "quickstart/shared/configure_toolbox.md" "configure_toolbox" >}}
## Step 3: Connect your agent to Toolbox
@@ -346,305 +94,23 @@ pip install google-genai
code to create an agent:
{{< tabpane persist=header >}}
{{< tab header="ADK" lang="python" >}}
from google.adk.agents import Agent
from google.adk.runners import Runner
from google.adk.sessions import InMemorySessionService
from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService
from google.genai import types
from toolbox_core import ToolboxSyncClient
import asyncio
import os
{{< include "quickstart/python/adk/quickstart.py" >}}
# TODO(developer): replace this with your Google API key
os.environ['GOOGLE_API_KEY'] = 'your-api-key'
async def main():
with ToolboxSyncClient("<http://127.0.0.1:5000>") as toolbox_client:
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
root_agent = Agent(
model='gemini-2.0-flash-001',
name='hotel_agent',
description='A helpful AI assistant.',
instruction=prompt,
tools=toolbox_client.load_toolset("my-toolset"),
)
session_service = InMemorySessionService()
artifacts_service = InMemoryArtifactService()
session = await session_service.create_session(
state={}, app_name='hotel_agent', user_id='123'
)
runner = Runner(
app_name='hotel_agent',
agent=root_agent,
artifact_service=artifacts_service,
session_service=session_service,
)
queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
]
for query in queries:
content = types.Content(role='user', parts=[types.Part(text=query)])
events = runner.run(session_id=session.id,
user_id='123', new_message=content)
responses = (
part.text
for event in events
for part in event.content.parts
if part.text is not None
)
for text in responses:
print(text)
asyncio.run(main())
{{< /tab >}}
{{< tab header="LangChain" lang="python" >}}
import asyncio
from langgraph.prebuilt import create_react_agent
{{< include "quickstart/python/langchain/quickstart.py" >}}
# TODO(developer): replace this with another import if needed
from langchain_google_vertexai import ChatVertexAI
# from langchain_google_genai import ChatGoogleGenerativeAI
# from langchain_anthropic import ChatAnthropic
from langgraph.checkpoint.memory import MemorySaver
from toolbox_langchain import ToolboxClient
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
]
async def run_application():
# TODO(developer): replace this with another model if needed
model = ChatVertexAI(model_name="gemini-2.0-flash-001")
# model = ChatGoogleGenerativeAI(model="gemini-2.0-flash-001")
# model = ChatAnthropic(model="claude-3-5-sonnet-20240620")
# Load the tools from the Toolbox server
async with ToolboxClient("http://127.0.0.1:5000") as client:
tools = await client.aload_toolset()
agent = create_react_agent(model, tools, checkpointer=MemorySaver())
config = {"configurable": {"thread_id": "thread-1"}}
for query in queries:
inputs = {"messages": [("user", prompt + query)]}
response = agent.invoke(inputs, stream_mode="values", config=config)
print(response["messages"][-1].content)
asyncio.run(run_application())
{{< /tab >}}
{{< tab header="LlamaIndex" lang="python" >}}
import asyncio
import os
from llama_index.core.agent.workflow import AgentWorkflow
{{< include "quickstart/python/llamaindex/quickstart.py" >}}
from llama_index.core.workflow import Context
# TODO(developer): replace this with another import if needed
from llama_index.llms.google_genai import GoogleGenAI
# from llama_index.llms.anthropic import Anthropic
from toolbox_llamaindex import ToolboxClient
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
]
async def run_application():
# TODO(developer): replace this with another model if needed
llm = GoogleGenAI(
model="gemini-2.0-flash-001",
vertexai_config={"project": "project-id", "location": "us-central1"},
)
# llm = GoogleGenAI(
# api_key=os.getenv("GOOGLE_API_KEY"),
# model="gemini-2.0-flash-001",
# )
# llm = Anthropic(
# model="claude-3-7-sonnet-latest",
# api_key=os.getenv("ANTHROPIC_API_KEY")
# )
# Load the tools from the Toolbox server
async with ToolboxClient("http://127.0.0.1:5000") as client:
tools = await client.aload_toolset()
agent = AgentWorkflow.from_tools_or_functions(
tools,
llm=llm,
system_prompt=prompt,
)
ctx = Context(agent)
for query in queries:
response = await agent.run(user_msg=query, ctx=ctx)
print(f"---- {query} ----")
print(str(response))
asyncio.run(run_application())
{{< /tab >}}
{{< tab header="Core" lang="python" >}}
import asyncio
from google import genai
from google.genai.types import (
Content,
FunctionDeclaration,
GenerateContentConfig,
Part,
Tool,
)
from toolbox_core import ToolboxClient
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel id while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
queries = [
"Find hotels in Basel with Basel in its name.",
"Please book the hotel Hilton Basel for me.",
"This is too expensive. Please cancel it.",
"Please book Hyatt Regency for me",
"My check in dates for my booking would be from April 10, 2024 to April 19, 2024.",
]
async def run_application():
async with ToolboxClient("<http://127.0.0.1:5000>") as toolbox_client:
# The toolbox_tools list contains Python callables (functions/methods) designed for LLM tool-use
# integration. While this example uses Google's genai client, these callables can be adapted for
# various function-calling or agent frameworks. For easier integration with supported frameworks
# (https://github.com/googleapis/mcp-toolbox-python-sdk/tree/main/packages), use the
# provided wrapper packages, which handle framework-specific boilerplate.
toolbox_tools = await toolbox_client.load_toolset("my-toolset")
genai_client = genai.Client(
vertexai=True, project="project-id", location="us-central1"
)
genai_tools = [
Tool(
function_declarations=[
FunctionDeclaration.from_callable_with_api_option(callable=tool)
]
)
for tool in toolbox_tools
]
history = []
for query in queries:
user_prompt_content = Content(
role="user",
parts=[Part.from_text(text=query)],
)
history.append(user_prompt_content)
response = genai_client.models.generate_content(
model="gemini-2.0-flash-001",
contents=history,
config=GenerateContentConfig(
system_instruction=prompt,
tools=genai_tools,
),
)
history.append(response.candidates[0].content)
function_response_parts = []
for function_call in response.function_calls:
fn_name = function_call.name
# The tools are sorted alphabetically
if fn_name == "search-hotels-by-name":
function_result = await toolbox_tools[3](**function_call.args)
elif fn_name == "search-hotels-by-location":
function_result = await toolbox_tools[2](**function_call.args)
elif fn_name == "book-hotel":
function_result = await toolbox_tools[0](**function_call.args)
elif fn_name == "update-hotel":
function_result = await toolbox_tools[4](**function_call.args)
elif fn_name == "cancel-hotel":
function_result = await toolbox_tools[1](**function_call.args)
else:
raise ValueError("Function name not present.")
function_response = {"result": function_result}
function_response_part = Part.from_function_response(
name=function_call.name,
response=function_response,
)
function_response_parts.append(function_response_part)
if function_response_parts:
tool_response_content = Content(role="tool", parts=function_response_parts)
history.append(tool_response_content)
response2 = genai_client.models.generate_content(
model="gemini-2.0-flash-001",
contents=history,
config=GenerateContentConfig(
tools=genai_tools,
),
)
final_model_response_content = response2.candidates[0].content
history.append(final_model_response_content)
print(response2.text)
asyncio.run(run_application())
{{< include "quickstart/python/core/quickstart.py" >}}
{{< /tab >}}
{{< /tabpane >}}


@@ -13,266 +13,17 @@ This guide assumes you have already done the following:
1. Installed [Go (v1.24.2 or higher)].
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres].
### Cloud Setup (Optional)
If you plan to use **Google Cloud's Vertex AI** with your agent (e.g., using
Gemini or PaLM models), follow these one-time setup steps:
1. [Install the Google Cloud CLI]
1. [Set up Application Default Credentials (ADC)]
1. Set your project and enable Vertex AI
```bash
gcloud config set project YOUR_PROJECT_ID
gcloud services enable aiplatform.googleapis.com
```
[Go (v1.24.2 or higher)]: https://go.dev/doc/install
[install-postgres]: https://www.postgresql.org/download/
[Install the Google Cloud CLI]: https://cloud.google.com/sdk/docs/install
[Set up Application Default Credentials (ADC)]:
https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment
### Cloud Setup (Optional)
{{< regionInclude "quickstart/shared/cloud_setup.md" "cloud_setup" >}}
## Step 1: Set up your database
In this section, we will create a database, insert some data that needs to be
accessed by our agent, and create a database user for Toolbox to connect with.
1. Connect to postgres using the `psql` command:
```bash
psql -h 127.0.0.1 -U postgres
```
Here, `postgres` denotes the default postgres superuser.
{{< notice info >}}
#### **Having trouble connecting?**
* **Password Prompt:** If you are prompted for a password for the `postgres`
user and do not know it (or a blank password doesn't work), your PostgreSQL
installation might require a password or a different authentication method.
* **`FATAL: role "postgres" does not exist`:** This error means the default
`postgres` superuser role isn't available under that name on your system.
* **`Connection refused`:** Ensure your PostgreSQL server is actually running.
You can typically check with `sudo systemctl status postgresql` and start it
with `sudo systemctl start postgresql` on Linux systems.
<br/>
#### **Common Solution**
For password issues or if the `postgres` role seems inaccessible directly, try
switching to the `postgres` operating system user first. This user often has
permission to connect without a password for local connections (this is called
peer authentication).
```bash
sudo -i -u postgres
psql -h 127.0.0.1
```
Once you are in the `psql` shell using this method, you can proceed with the
database creation steps below. Afterwards, type `\q` to exit `psql`, and then
`exit` to return to your normal user shell.
If desired, once connected to `psql` as the `postgres` OS user, you can set a
password for the `postgres` *database* user using: `ALTER USER postgres WITH
PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
postgres` and a password next time.
{{< /notice >}}
1. Create a new database and a new user:
{{< notice tip >}}
For a real application, it's best to follow the principle of least privilege
and only grant the privileges your application needs.
{{< /notice >}}
```sql
CREATE USER toolbox_user WITH PASSWORD 'my-password';
CREATE DATABASE toolbox_db;
GRANT ALL PRIVILEGES ON DATABASE toolbox_db TO toolbox_user;
ALTER DATABASE toolbox_db OWNER TO toolbox_user;
```
1. End the database session:
```bash
\q
```
(If you used `sudo -i -u postgres` and then `psql`, remember you might also
need to type `exit` after `\q` to leave the `postgres` user's shell
session.)
1. Connect to your database with your new user:
```bash
psql -h 127.0.0.1 -U toolbox_user -d toolbox_db
```
1. Create a table using the following command:
```sql
CREATE TABLE hotels(
id INTEGER NOT NULL PRIMARY KEY,
name VARCHAR NOT NULL,
location VARCHAR NOT NULL,
price_tier VARCHAR NOT NULL,
checkin_date DATE NOT NULL,
checkout_date DATE NOT NULL,
booked BIT NOT NULL
);
```
1. Insert data into the table.
```sql
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
(4, 'Radisson Blu Lucerne', 'Lucerne', 'Midscale', '2024-04-24', '2024-04-05', B'0'),
(5, 'Best Western Bern', 'Bern', 'Upper Midscale', '2024-04-23', '2024-04-01', B'0'),
(6, 'InterContinental Geneva', 'Geneva', 'Luxury', '2024-04-23', '2024-04-28', B'0'),
(7, 'Sheraton Zurich', 'Zurich', 'Upper Upscale', '2024-04-27', '2024-04-02', B'0'),
(8, 'Holiday Inn Basel', 'Basel', 'Upper Midscale', '2024-04-24', '2024-04-09', B'0'),
(9, 'Courtyard Zurich', 'Zurich', 'Upscale', '2024-04-03', '2024-04-13', B'0'),
(10, 'Comfort Inn Bern', 'Bern', 'Midscale', '2024-04-04', '2024-04-16', B'0');
```
1. End the database session:
```bash
\q
```
{{< regionInclude "quickstart/shared/database_setup.md" "database_setup" >}}
## Step 2: Install and configure Toolbox
In this section, we will download Toolbox, configure our tools in a
`tools.yaml`, and then run the Toolbox server.
1. Download the latest version of Toolbox as a binary:
{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/$OS/toolbox
```
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Write the following into a `tools.yaml` file. Be sure to update any fields
such as `user`, `password`, or `database` that you may have customized in the
previous step.
{{< notice tip >}}
In practice, use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
```yaml
sources:
my-pg-source:
kind: postgres
host: 127.0.0.1
port: 5432
database: toolbox_db
user: ${USER_NAME}
password: ${PASSWORD}
tools:
search-hotels-by-name:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on name.
parameters:
- name: name
type: string
description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
search-hotels-by-location:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on location.
parameters:
- name: location
type: string
description: The location of the hotel.
statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
book-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Book a hotel by its ID. If the hotel is successfully booked, returns NULL; raises an error if not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to book.
statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
update-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Update a hotel's check-in and check-out dates by its ID. Returns a message
indicating whether the hotel was successfully updated or not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to update.
- name: checkin_date
type: string
description: The new check-in date of the hotel.
- name: checkout_date
type: string
description: The new check-out date of the hotel.
statement: >-
UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
as date) WHERE id = $1;
cancel-hotel:
kind: postgres-sql
source: my-pg-source
description: Cancel a hotel by its ID.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to cancel.
statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
toolsets:
my-toolset:
- search-hotels-by-name
- search-hotels-by-location
- book-hotel
- update-hotel
- cancel-hotel
```
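As noted in the tip above, `${USER_NAME}` and `${PASSWORD}` are resolved from
environment variables when Toolbox loads the file. A minimal sketch of
supplying them (reusing the sample credentials from Step 1) before starting
the server in the next step:
```bash
export USER_NAME=toolbox_user
export PASSWORD=my-password
```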
For more info on tools, check out the `Resources` section of the docs.
1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
```bash
./toolbox --tools-file "tools.yaml"
```
{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the
`--disable-reload` flag.
{{< /notice >}}
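For example, to run the server with dynamic reloading turned off:
```bash
./toolbox --tools-file "tools.yaml" --disable-reload
```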
{{< regionInclude "quickstart/shared/configure_toolbox.md" "configure_toolbox" >}}
## Step 3: Connect your agent to Toolbox
@@ -298,613 +49,28 @@ from Toolbox.
{{< tabpane persist=header >}}
{{< tab header="LangChain Go" lang="go" >}}
package main
{{< include "quickstart/go/langchain/quickstart.go" >}}
import (
"context"
"encoding/json"
"fmt"
"log"
"os"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/tmc/langchaingo/llms"
"github.com/tmc/langchaingo/llms/googleai"
)
// ConvertToLangchainTool converts a generic core.ToolboxTool into a LangChainGo llms.Tool.
func ConvertToLangchainTool(toolboxTool *core.ToolboxTool) llms.Tool {
// Fetch the tool's input schema
inputschema, err := toolboxTool.InputSchema()
if err != nil {
return llms.Tool{}
}
var paramsSchema map[string]any
_ = json.Unmarshal(inputschema, &paramsSchema)
// Convert into LangChain's llms.Tool
return llms.Tool{
Type: "function",
Function: &llms.FunctionDefinition{
Name: toolboxTool.Name(),
Description: toolboxTool.Description(),
Parameters: paramsSchema,
},
}
}
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it.",
"Please book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
genaiKey := os.Getenv("GOOGLE_API_KEY")
toolboxURL := "http://localhost:5000"
ctx := context.Background()
// Initialize the Google AI client (LLM).
llm, err := googleai.New(ctx, googleai.WithAPIKey(genaiKey), googleai.WithDefaultModel("gemini-1.5-flash"))
if err != nil {
log.Fatalf("Failed to create Google AI client: %v", err)
}
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tool using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
langchainTools := make([]llms.Tool, len(tools))
// Convert the loaded ToolboxTools into the format LangChainGo requires.
for i, tool := range tools {
langchainTools[i] = ConvertToLangchainTool(tool)
toolsMap[tool.Name()] = tool
}
// Start the conversation history.
messageHistory := []llms.MessageContent{
llms.TextParts(llms.ChatMessageTypeSystem, systemPrompt),
}
for _, query := range queries {
messageHistory = append(messageHistory, llms.TextParts(llms.ChatMessageTypeHuman, query))
// Make the first call to the LLM, making it aware of the tool.
resp, err := llm.GenerateContent(ctx, messageHistory, llms.WithTools(langchainTools))
if err != nil {
log.Fatalf("LLM call failed: %v", err)
}
respChoice := resp.Choices[0]
assistantResponse := llms.TextParts(llms.ChatMessageTypeAI, respChoice.Content)
for _, tc := range respChoice.ToolCalls {
assistantResponse.Parts = append(assistantResponse.Parts, tc)
}
messageHistory = append(messageHistory, assistantResponse)
// Process each tool call requested by the model.
for _, tc := range respChoice.ToolCalls {
toolName := tc.FunctionCall.Name
tool := toolsMap[toolName]
var args map[string]any
if err := json.Unmarshal([]byte(tc.FunctionCall.Arguments), &args); err != nil {
log.Fatalf("Failed to unmarshal arguments for tool '%s': %v", toolName, err)
}
toolResult, err := tool.Invoke(ctx, args)
if err != nil {
log.Fatalf("Failed to execute tool '%s': %v", toolName, err)
}
if toolResult == "" || toolResult == nil {
toolResult = "Operation completed successfully with no specific return value."
}
// Create the tool call response message and add it to the history.
toolResponse := llms.MessageContent{
Role: llms.ChatMessageTypeTool,
Parts: []llms.ContentPart{
llms.ToolCallResponse{
Name: toolName,
Content: fmt.Sprintf("%v", toolResult),
},
},
}
messageHistory = append(messageHistory, toolResponse)
}
finalResp, err := llm.GenerateContent(ctx, messageHistory)
if err != nil {
log.Fatalf("Final LLM call failed after tool execution: %v", err)
}
// Add the final textual response from the LLM to the history
messageHistory = append(messageHistory, llms.TextParts(llms.ChatMessageTypeAI, finalResp.Choices[0].Content))
fmt.Println(finalResp.Choices[0].Content)
}
}
{{< /tab >}}
{{< tab header="Genkit Go" lang="go" >}}
package main
{{< include "quickstart/go/genkit/quickstart.go" >}}
import (
"context"
"fmt"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
"github.com/firebase/genkit/go/ai"
"github.com/firebase/genkit/go/genkit"
"github.com/firebase/genkit/go/plugins/googlegenai"
)
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
ctx := context.Background()
// Create Toolbox Client
toolboxClient, err := core.NewToolboxClient("http://127.0.0.1:5000")
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tools using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
// Initialize Genkit
g, err := genkit.Init(ctx,
genkit.WithPlugins(&googlegenai.GoogleAI{}),
genkit.WithDefaultModel("googleai/gemini-1.5-flash"),
)
if err != nil {
log.Fatalf("Failed to init genkit: %v\n", err)
}
// Create a conversation history
conversationHistory := []*ai.Message{
ai.NewSystemTextMessage(systemPrompt),
}
// Convert your tool to a Genkit tool.
genkitTools := make([]ai.Tool, len(tools))
for i, tool := range tools {
newTool, err := tbgenkit.ToGenkitTool(tool, g)
if err != nil {
log.Fatalf("Failed to convert tool: %v\n", err)
}
genkitTools[i] = newTool
}
toolRefs := make([]ai.ToolRef, len(genkitTools))
for i, tool := range genkitTools {
toolRefs[i] = tool
}
for _, query := range queries {
conversationHistory = append(conversationHistory, ai.NewUserTextMessage(query))
response, err := genkit.Generate(ctx, g,
ai.WithMessages(conversationHistory...),
ai.WithTools(toolRefs...),
ai.WithReturnToolRequests(true),
)
if err != nil {
log.Fatalf("%v\n", err)
}
conversationHistory = append(conversationHistory, response.Message)
parts := []*ai.Part{}
for _, req := range response.ToolRequests() {
tool := genkit.LookupTool(g, req.Name)
if tool == nil {
log.Fatalf("tool %q not found", req.Name)
}
output, err := tool.RunRaw(ctx, req.Input)
if err != nil {
log.Fatalf("tool %q execution failed: %v", tool.Name(), err)
}
parts = append(parts,
ai.NewToolResponsePart(&ai.ToolResponse{
Name: req.Name,
Ref: req.Ref,
Output: output,
}))
}
if len(parts) > 0 {
resp, err := genkit.Generate(ctx, g,
ai.WithMessages(append(response.History(), ai.NewMessage(ai.RoleTool, nil, parts...))...),
ai.WithTools(toolRefs...),
)
if err != nil {
log.Fatal(err)
}
fmt.Println("\n", resp.Text())
conversationHistory = append(conversationHistory, resp.Message)
} else {
fmt.Println("\n", response.Text())
}
}
}
{{< /tab >}}
{{< tab header="Go GenAI" lang="go" >}}
package main
import (
"context"
"encoding/json"
"fmt"
"log"
"os"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"google.golang.org/genai"
)
// ConvertToGenaiTool translates a ToolboxTool into the genai.FunctionDeclaration format.
func ConvertToGenaiTool(toolboxTool *core.ToolboxTool) *genai.Tool {
inputschema, err := toolboxTool.InputSchema()
if err != nil {
return &genai.Tool{}
}
var paramsSchema *genai.Schema
_ = json.Unmarshal(inputschema, &paramsSchema)
// First, create the function declaration.
funcDeclaration := &genai.FunctionDeclaration{
Name: toolboxTool.Name(),
Description: toolboxTool.Description(),
Parameters: paramsSchema,
}
// Then, wrap the function declaration in a genai.Tool struct.
return &genai.Tool{
FunctionDeclarations: []*genai.FunctionDeclaration{funcDeclaration},
}
}
func printResponse(resp *genai.GenerateContentResponse) {
for _, cand := range resp.Candidates {
if cand.Content != nil {
for _, part := range cand.Content.Parts {
fmt.Println(part.Text)
}
}
}
}
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it.",
"Please book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
// Setup
ctx := context.Background()
apiKey := os.Getenv("GOOGLE_API_KEY")
toolboxURL := "http://localhost:5000"
// Initialize the Google GenAI client using the explicit ClientConfig.
client, err := genai.NewClient(ctx, &genai.ClientConfig{
APIKey: apiKey,
})
if err != nil {
log.Fatalf("Failed to create Google GenAI client: %v", err)
}
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tool using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
genAITools := make([]*genai.Tool, len(tools))
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
for i, tool := range tools {
genAITools[i] = ConvertToGenaiTool(tool)
toolsMap[tool.Name()] = tool
}
// Set up the generative model with the available tool.
modelName := "gemini-2.0-flash"
// Create the initial content prompt for the model.
messageHistory := []*genai.Content{
genai.NewContentFromText(systemPrompt, genai.RoleUser),
}
config := &genai.GenerateContentConfig{
Tools: genAITools,
ToolConfig: &genai.ToolConfig{
FunctionCallingConfig: &genai.FunctionCallingConfig{
Mode: genai.FunctionCallingConfigModeAny,
},
},
}
for _, query := range queries {
messageHistory = append(messageHistory, genai.NewContentFromText(query, genai.RoleUser))
genContentResp, err := client.Models.GenerateContent(ctx, modelName, messageHistory, config)
if err != nil {
log.Fatalf("LLM call failed for query '%s': %v", query, err)
}
if len(genContentResp.Candidates) > 0 && genContentResp.Candidates[0].Content != nil {
messageHistory = append(messageHistory, genContentResp.Candidates[0].Content)
}
functionCalls := genContentResp.FunctionCalls()
toolResponseParts := []*genai.Part{}
for _, fc := range functionCalls {
toolToInvoke, found := toolsMap[fc.Name]
if !found {
log.Fatalf("Tool '%s' not found in loaded tools map. Check toolset configuration.", fc.Name)
}
toolResult, invokeErr := toolToInvoke.Invoke(ctx, fc.Args)
if invokeErr != nil {
log.Fatalf("Failed to execute tool '%s': %v", fc.Name, invokeErr)
}
// Enhanced Tool Result Handling (retained to prevent nil issues)
toolResultString := ""
if toolResult != nil {
jsonBytes, marshalErr := json.Marshal(toolResult)
if marshalErr == nil {
toolResultString = string(jsonBytes)
} else {
toolResultString = fmt.Sprintf("%v", toolResult)
}
}
responseMap := map[string]any{"result": toolResultString}
toolResponseParts = append(toolResponseParts, genai.NewPartFromFunctionResponse(fc.Name, responseMap))
}
// Add all accumulated tool responses for this turn to the message history.
toolResponseContent := genai.NewContentFromParts(toolResponseParts, "function")
messageHistory = append(messageHistory, toolResponseContent)
finalResponse, err := client.Models.GenerateContent(ctx, modelName, messageHistory, &genai.GenerateContentConfig{})
if err != nil {
log.Fatalf("Error calling GenerateContent (with function result): %v", err)
}
printResponse(finalResponse)
// Add the final textual response from the LLM to the history
if len(finalResponse.Candidates) > 0 && finalResponse.Candidates[0].Content != nil {
messageHistory = append(messageHistory, finalResponse.Candidates[0].Content)
}
}
}
{{< include "quickstart/go/genAI/quickstart.go" >}}
{{< /tab >}}
{{< tab header="OpenAI Go" lang="go" >}}
package main
{{< include "quickstart/go/openAI/quickstart.go" >}}
import (
"context"
"encoding/json"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
openai "github.com/openai/openai-go"
)
// ConvertToOpenAITool converts a ToolboxTool into the go-openai library's Tool format.
func ConvertToOpenAITool(toolboxTool *core.ToolboxTool) openai.ChatCompletionToolParam {
// Get the input schema
jsonSchemaBytes, err := toolboxTool.InputSchema()
if err != nil {
return openai.ChatCompletionToolParam{}
}
// Unmarshal the JSON bytes into FunctionParameters
var paramsSchema openai.FunctionParameters
if err := json.Unmarshal(jsonSchemaBytes, &paramsSchema); err != nil {
return openai.ChatCompletionToolParam{}
}
// Create and return the final tool parameter struct.
return openai.ChatCompletionToolParam{
Function: openai.FunctionDefinitionParam{
Name: toolboxTool.Name(),
Description: openai.String(toolboxTool.Description()),
Parameters: paramsSchema,
},
}
}
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
// Setup
ctx := context.Background()
toolboxURL := "http://localhost:5000"
openAIClient := openai.NewClient()
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tools using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tool : %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
openAITools := make([]openai.ChatCompletionToolParam, len(tools))
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
for i, tool := range tools {
// Convert the Toolbox tool into the openAI FunctionDeclaration format.
openAITools[i] = ConvertToOpenAITool(tool)
// Add tool to a map for lookup later
toolsMap[tool.Name()] = tool
}
params := openai.ChatCompletionNewParams{
Messages: []openai.ChatCompletionMessageParamUnion{
openai.SystemMessage(systemPrompt),
},
Tools: openAITools,
Seed: openai.Int(0),
Model: openai.ChatModelGPT4o,
}
for _, query := range queries {
params.Messages = append(params.Messages, openai.UserMessage(query))
// Make initial chat completion request
completion, err := openAIClient.Chat.Completions.New(ctx, params)
if err != nil {
panic(err)
}
toolCalls := completion.Choices[0].Message.ToolCalls
// Log if the model made no tool calls; the tool-call loop below is simply skipped
if len(toolCalls) == 0 {
log.Println("No function call")
}
// If there was a function call, continue the conversation
params.Messages = append(params.Messages, completion.Choices[0].Message.ToParam())
for _, toolCall := range toolCalls {
toolName := toolCall.Function.Name
toolToInvoke := toolsMap[toolName]
var args map[string]any
err := json.Unmarshal([]byte(toolCall.Function.Arguments), &args)
if err != nil {
panic(err)
}
result, err := toolToInvoke.Invoke(ctx, args)
if err != nil {
log.Fatal("Could not invoke tool", err)
}
params.Messages = append(params.Messages, openai.ToolMessage(result.(string), toolCall.ID))
}
completion, err = openAIClient.Chat.Completions.New(ctx, params)
if err != nil {
panic(err)
}
params.Messages = append(params.Messages, openai.AssistantMessage(completion.Choices[0].Message.Content))
println("\n", completion.Choices[0].Message.Content)
}
}
{{< /tab >}}
{{< /tabpane >}}


@@ -3,7 +3,7 @@ title: "JS Quickstart (Local)"
type: docs
weight: 3
description: >
How to get started running Toolbox locally with [JavaScript](https://github.com/googleapis/mcp-toolbox-sdk-js), PostgreSQL, and orchestration frameworks such as [LangChain](https://js.langchain.com/docs/introduction/), [GenkitJS](https://genkit.dev/docs/get-started/), and [LlamaIndex](https://ts.llamaindex.ai/).
How to get started running Toolbox locally with [JavaScript](https://github.com/googleapis/mcp-toolbox-sdk-js), PostgreSQL, and orchestration frameworks such as [LangChain](https://js.langchain.com/docs/introduction/), [GenkitJS](https://genkit.dev/docs/get-started/), [LlamaIndex](https://ts.llamaindex.ai/), and [GoogleGenAI](https://github.com/googleapis/js-genai).
---
## Before you begin
@@ -13,265 +13,17 @@ This guide assumes you have already done the following:
1. Installed [Node.js (v18 or higher)].
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres].
### Cloud Setup (Optional)
If you plan to use **Google Cloud's Vertex AI** with your agent (e.g., using
Gemini or PaLM models), follow these one-time setup steps:
1. [Install the Google Cloud CLI]
1. [Set up Application Default Credentials (ADC)]
1. Set your project and enable Vertex AI
```bash
gcloud config set project YOUR_PROJECT_ID
gcloud services enable aiplatform.googleapis.com
```
[Node.js (v18 or higher)]: https://nodejs.org/
[install-postgres]: https://www.postgresql.org/download/
[Install the Google Cloud CLI]: https://cloud.google.com/sdk/docs/install
[Set up Application Default Credentials (ADC)]:
https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment
### Cloud Setup (Optional)
{{< regionInclude "quickstart/shared/cloud_setup.md" "cloud_setup" >}}
## Step 1: Set up your database
In this section, we will create a database, insert some data that needs to be
accessed by our agent, and create a database user for Toolbox to connect with.
1. Connect to postgres using the `psql` command:
```bash
psql -h 127.0.0.1 -U postgres
```
Here, `postgres` denotes the default postgres superuser.
{{< notice info >}}
#### **Having trouble connecting?**
* **Password Prompt:** If you are prompted for a password for the `postgres`
user and do not know it (or a blank password doesn't work), your PostgreSQL
installation might require a password or a different authentication method.
* **`FATAL: role "postgres" does not exist`:** This error means the default
`postgres` superuser role isn't available under that name on your system.
* **`Connection refused`:** Ensure your PostgreSQL server is actually running.
You can typically check with `sudo systemctl status postgresql` and start it
with `sudo systemctl start postgresql` on Linux systems.
<br/>
#### **Common Solution**
For password issues or if the `postgres` role seems inaccessible directly, try
switching to the `postgres` operating system user first. This user often has
permission to connect without a password for local connections (this is called
peer authentication).
```bash
sudo -i -u postgres
psql -h 127.0.0.1
```
Once you are in the `psql` shell using this method, you can proceed with the
database creation steps below. Afterwards, type `\q` to exit `psql`, and then
`exit` to return to your normal user shell.
If desired, once connected to `psql` as the `postgres` OS user, you can set a
password for the `postgres` *database* user using: `ALTER USER postgres WITH
PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
postgres` and a password next time.
{{< /notice >}}
1. Create a new database and a new user:
{{< notice tip >}}
For a real application, it's best to follow the principle of least privilege
and only grant the privileges your application needs.
{{< /notice >}}
```sql
CREATE USER toolbox_user WITH PASSWORD 'my-password';
CREATE DATABASE toolbox_db;
GRANT ALL PRIVILEGES ON DATABASE toolbox_db TO toolbox_user;
ALTER DATABASE toolbox_db OWNER TO toolbox_user;
```
1. End the database session:
```bash
\q
```
(If you used `sudo -i -u postgres` and then `psql`, remember you might also
need to type `exit` after `\q` to leave the `postgres` user's shell
session.)
1. Connect to your database with your new user:
```bash
psql -h 127.0.0.1 -U toolbox_user -d toolbox_db
```
1. Create a table using the following command:
```sql
CREATE TABLE hotels(
id INTEGER NOT NULL PRIMARY KEY,
name VARCHAR NOT NULL,
location VARCHAR NOT NULL,
price_tier VARCHAR NOT NULL,
checkin_date DATE NOT NULL,
checkout_date DATE NOT NULL,
booked BIT NOT NULL
);
```
1. Insert data into the table.
```sql
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
(4, 'Radisson Blu Lucerne', 'Lucerne', 'Midscale', '2024-04-24', '2024-04-05', B'0'),
(5, 'Best Western Bern', 'Bern', 'Upper Midscale', '2024-04-23', '2024-04-01', B'0'),
(6, 'InterContinental Geneva', 'Geneva', 'Luxury', '2024-04-23', '2024-04-28', B'0'),
(7, 'Sheraton Zurich', 'Zurich', 'Upper Upscale', '2024-04-27', '2024-04-02', B'0'),
(8, 'Holiday Inn Basel', 'Basel', 'Upper Midscale', '2024-04-24', '2024-04-09', B'0'),
(9, 'Courtyard Zurich', 'Zurich', 'Upscale', '2024-04-03', '2024-04-13', B'0'),
(10, 'Comfort Inn Bern', 'Bern', 'Midscale', '2024-04-04', '2024-04-16', B'0');
```
1. End the database session:
```bash
\q
```
{{< regionInclude "quickstart/shared/database_setup.md" "database_setup" >}}
## Step 2: Install and configure Toolbox
In this section, we will download Toolbox, configure our tools in a
`tools.yaml`, and then run the Toolbox server.
1. Download the latest version of Toolbox as a binary:
{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/$OS/toolbox
```
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Write the following into a `tools.yaml` file. Be sure to update any fields
such as `user`, `password`, or `database` that you may have customized in the
previous step.
{{< notice tip >}}
In practice, use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
```yaml
sources:
my-pg-source:
kind: postgres
host: 127.0.0.1
port: 5432
database: toolbox_db
user: ${USER_NAME}
password: ${PASSWORD}
tools:
search-hotels-by-name:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on name.
parameters:
- name: name
type: string
description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
search-hotels-by-location:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on location.
parameters:
- name: location
type: string
description: The location of the hotel.
statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
book-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Book a hotel by its ID. If the hotel is successfully booked, returns NULL; raises an error if not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to book.
statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
update-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Update a hotel's check-in and check-out dates by its ID. Returns a message
indicating whether the hotel was successfully updated or not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to update.
- name: checkin_date
type: string
description: The new check-in date of the hotel.
- name: checkout_date
type: string
description: The new check-out date of the hotel.
statement: >-
UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
as date) WHERE id = $1;
cancel-hotel:
kind: postgres-sql
source: my-pg-source
description: Cancel a hotel by its ID.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to cancel.
statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
toolsets:
my-toolset:
- search-hotels-by-name
- search-hotels-by-location
- book-hotel
- update-hotel
- cancel-hotel
```
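As noted in the tip above, `${USER_NAME}` and `${PASSWORD}` are read from
environment variables. One minimal way to provide them (reusing the sample
credentials from Step 1) before starting the server:
```bash
export USER_NAME=toolbox_user
export PASSWORD=my-password
```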
For more info on tools, check out the `Resources` section of the docs.
1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
```bash
./toolbox --tools-file "tools.yaml"
```
{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the `--disable-reload` flag.
{{< /notice >}}
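For example, to start the server with dynamic reloading disabled:
```bash
./toolbox --tools-file "tools.yaml" --disable-reload
```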
{{< regionInclude "quickstart/shared/configure_toolbox.md" "configure_toolbox" >}}
## Step 3: Connect your agent to Toolbox
@@ -302,6 +54,9 @@ npm install genkit @genkit-ai/googleai
{{< tab header="LlamaIndex" lang="bash" >}}
npm install llamaindex @llamaindex/google @llamaindex/workflow
{{< /tab >}}
{{< tab header="GoogleGenAI" lang="bash" >}}
npm install @google/genai
{{< /tab >}}
{{< /tabpane >}}
1. Create a new file named `hotelAgent.js` and copy the following code to create an agent:
@@ -309,259 +64,25 @@ npm install llamaindex @llamaindex/google @llamaindex/workflow
{{< tabpane persist=header >}}
{{< tab header="LangChain" lang="js" >}}
import { ChatGoogleGenerativeAI } from "@langchain/google-genai";
import { ToolboxClient } from "@toolbox-sdk/core";
import { tool } from "@langchain/core/tools";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { MemorySaver } from "@langchain/langgraph";
// Replace it with your API key
process.env.GOOGLE_API_KEY = 'your-api-key';
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function runApplication() {
const model = new ChatGoogleGenerativeAI({
model: "gemini-2.0-flash",
});
const client = new ToolboxClient("http://127.0.0.1:5000");
const toolboxTools = await client.loadToolset("my-toolset");
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool(toolboxTool, {
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
schema: toolboxTool.getParamSchema()
});
const tools = toolboxTools.map(getTool);
const agent = createReactAgent({
llm: model,
tools: tools,
checkpointer: new MemorySaver(),
systemPrompt: prompt,
});
const langGraphConfig = {
configurable: {
thread_id: "test-thread",
},
};
for (const query of queries) {
const agentOutput = await agent.invoke(
{
messages: [
{
role: "user",
content: query,
},
],
verbose: true,
},
langGraphConfig
);
const response = agentOutput.messages[agentOutput.messages.length - 1].content;
console.log(response);
}
}
runApplication()
.catch(console.error)
.finally(() => console.log("\nApplication finished."));
{{< include "quickstart/js/langchain/quickstart.js" >}}
{{< /tab >}}
{{< tab header="GenkitJS" lang="js" >}}
import { ToolboxClient } from "@toolbox-sdk/core";
import { genkit } from "genkit";
import { googleAI } from '@genkit-ai/googleai';
{{< include "quickstart/js/genkit/quickstart.js" >}}
// Replace it with your API key
process.env.GOOGLE_API_KEY = 'your-api-key';
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function run() {
const toolboxClient = new ToolboxClient("http://127.0.0.1:5000");
const ai = genkit({
plugins: [
googleAI({
apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
})
],
model: googleAI.model('gemini-2.0-flash'),
});
const toolboxTools = await toolboxClient.loadToolset("my-toolset");
const toolMap = Object.fromEntries(
toolboxTools.map((tool) => {
const definedTool = ai.defineTool(
{
name: tool.getName(),
description: tool.getDescription(),
inputSchema: tool.getParamSchema(),
},
tool
);
return [tool.getName(), definedTool];
})
);
const tools = Object.values(toolMap);
let conversationHistory = [{ role: "system", content: [{ text: systemPrompt }] }];
for (const query of queries) {
conversationHistory.push({ role: "user", content: [{ text: query }] });
let response = await ai.generate({
messages: conversationHistory,
tools: tools,
});
conversationHistory.push(response.message);
const toolRequests = response.toolRequests;
if (toolRequests?.length > 0) {
// Execute tools concurrently and collect their responses.
const toolResponses = await Promise.all(
toolRequests.map(async (call) => {
try {
const toolOutput = await toolMap[call.name].invoke(call.input);
return { role: "tool", content: [{ toolResponse: { name: call.name, output: toolOutput } }] };
} catch (e) {
console.error(`Error executing tool ${call.name}:`, e);
return { role: "tool", content: [{ toolResponse: { name: call.name, output: { error: e.message } } }] };
}
})
);
conversationHistory.push(...toolResponses);
// Call the AI again with the tool results.
response = await ai.generate({ messages: conversationHistory, tools });
conversationHistory.push(response.message);
}
console.log(response.text);
}
}
run();
{{< /tab >}}
{{< tab header="LlamaIndex" lang="js" >}}
import { gemini, GEMINI_MODEL } from "@llamaindex/google";
import { agent } from "@llamaindex/workflow";
import { createMemory, staticBlock, tool } from "llamaindex";
import { ToolboxClient } from "@toolbox-sdk/core";
{{< include "quickstart/js/llamaindex/quickstart.js" >}}
const TOOLBOX_URL = "http://127.0.0.1:5000"; // Update if needed
process.env.GOOGLE_API_KEY = 'your-api-key'; // Replace it with your API key
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking and cancellations.
When the user searches for a hotel, mention its name, id, location and price tier.
Always mention hotel ids while performing any searches — this is very important for operations.
For any bookings or cancellations, please provide the appropriate confirmation.
Update check-in or check-out dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function main() {
// Connect to MCP Toolbox
const client = new ToolboxClient(TOOLBOX_URL);
const toolboxTools = await client.loadToolset("my-toolset");
const tools = toolboxTools.map((toolboxTool) => {
return tool({
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
parameters: toolboxTool.getParamSchema(),
execute: toolboxTool,
});
});
// Initialize LLM
const llm = gemini({
model: GEMINI_MODEL.GEMINI_2_0_FLASH,
apiKey: process.env.GOOGLE_API_KEY,
});
const memory = createMemory({
memoryBlocks: [
staticBlock({
content: prompt,
}),
],
});
// Create the Agent
const myAgent = agent({
tools: tools,
llm,
memory,
systemPrompt: prompt,
});
for (const query of queries) {
const result = await myAgent.run(query);
const output = result.data.result;
console.log(`\nUser: ${query}`);
if (typeof output === "string") {
console.log(output.trim());
} else if (typeof output === "object" && "text" in output) {
console.log(output.text.trim());
} else {
console.log(JSON.stringify(output));
}
}
// You may observe some extra logs during execution due to the run method provided by LlamaIndex.
console.log("Agent run finished.");
}
main();
{{< include "quickstart/js/genAI/quickstart.js" >}}
{{< /tab >}}


@@ -105,7 +105,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/$OS/toolbox
```
<!-- {x-release-please-end} -->


@@ -0,0 +1,37 @@
module genai-quickstart
go 1.24.6
require (
github.com/googleapis/mcp-toolbox-sdk-go v0.3.0
google.golang.org/genai v1.21.0
)
require (
cloud.google.com/go v0.121.1 // indirect
cloud.google.com/go/auth v0.16.2 // indirect
cloud.google.com/go/auth/oauth2adapt v0.2.8 // indirect
cloud.google.com/go/compute/metadata v0.7.0 // indirect
github.com/felixge/httpsnoop v1.0.4 // indirect
github.com/go-logr/logr v1.4.3 // indirect
github.com/go-logr/stdr v1.2.2 // indirect
github.com/google/go-cmp v0.7.0 // indirect
github.com/google/s2a-go v0.1.9 // indirect
github.com/googleapis/enterprise-certificate-proxy v0.3.6 // indirect
github.com/googleapis/gax-go/v2 v2.14.2 // indirect
github.com/gorilla/websocket v1.5.3 // indirect
go.opentelemetry.io/auto/sdk v1.1.0 // indirect
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.61.0 // indirect
go.opentelemetry.io/otel v1.36.0 // indirect
go.opentelemetry.io/otel/metric v1.36.0 // indirect
go.opentelemetry.io/otel/trace v1.36.0 // indirect
golang.org/x/crypto v0.39.0 // indirect
golang.org/x/net v0.41.0 // indirect
golang.org/x/oauth2 v0.30.0 // indirect
golang.org/x/sys v0.33.0 // indirect
golang.org/x/text v0.26.0 // indirect
google.golang.org/api v0.242.0 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250603155806-513f23925822 // indirect
google.golang.org/grpc v1.73.0 // indirect
google.golang.org/protobuf v1.36.6 // indirect
)


@@ -0,0 +1,174 @@
package main
import (
"context"
"encoding/json"
"fmt"
"log"
"os"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"google.golang.org/genai"
)
// ConvertToGenaiTool translates a ToolboxTool into the genai.FunctionDeclaration format.
func ConvertToGenaiTool(toolboxTool *core.ToolboxTool) *genai.Tool {
inputschema, err := toolboxTool.InputSchema()
if err != nil {
return &genai.Tool{}
}
var paramsSchema *genai.Schema
_ = json.Unmarshal(inputschema, &paramsSchema)
// First, create the function declaration.
funcDeclaration := &genai.FunctionDeclaration{
Name: toolboxTool.Name(),
Description: toolboxTool.Description(),
Parameters: paramsSchema,
}
// Then, wrap the function declaration in a genai.Tool struct.
return &genai.Tool{
FunctionDeclarations: []*genai.FunctionDeclaration{funcDeclaration},
}
}
func printResponse(resp *genai.GenerateContentResponse) {
for _, cand := range resp.Candidates {
if cand.Content != nil {
for _, part := range cand.Content.Parts {
fmt.Println(part.Text)
}
}
}
}
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it.",
"Please book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
// Setup
ctx := context.Background()
apiKey := os.Getenv("GOOGLE_API_KEY")
toolboxURL := "http://localhost:5000"
// Initialize the Google GenAI client using the explicit ClientConfig.
client, err := genai.NewClient(ctx, &genai.ClientConfig{
APIKey: apiKey,
})
if err != nil {
log.Fatalf("Failed to create Google GenAI client: %v", err)
}
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tool using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
genAITools := make([]*genai.Tool, len(tools))
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
for i, tool := range tools {
genAITools[i] = ConvertToGenaiTool(tool)
toolsMap[tool.Name()] = tool
}
// Set up the generative model with the available tool.
modelName := "gemini-2.0-flash"
// Create the initial content prompt for the model.
messageHistory := []*genai.Content{
genai.NewContentFromText(systemPrompt, genai.RoleUser),
}
config := &genai.GenerateContentConfig{
Tools: genAITools,
ToolConfig: &genai.ToolConfig{
FunctionCallingConfig: &genai.FunctionCallingConfig{
Mode: genai.FunctionCallingConfigModeAny,
},
},
}
for _, query := range queries {
messageHistory = append(messageHistory, genai.NewContentFromText(query, genai.RoleUser))
genContentResp, err := client.Models.GenerateContent(ctx, modelName, messageHistory, config)
if err != nil {
log.Fatalf("LLM call failed for query '%s': %v", query, err)
}
if len(genContentResp.Candidates) > 0 && genContentResp.Candidates[0].Content != nil {
messageHistory = append(messageHistory, genContentResp.Candidates[0].Content)
}
functionCalls := genContentResp.FunctionCalls()
toolResponseParts := []*genai.Part{}
for _, fc := range functionCalls {
toolToInvoke, found := toolsMap[fc.Name]
if !found {
log.Fatalf("Tool '%s' not found in loaded tools map. Check toolset configuration.", fc.Name)
}
toolResult, invokeErr := toolToInvoke.Invoke(ctx, fc.Args)
if invokeErr != nil {
log.Fatalf("Failed to execute tool '%s': %v", fc.Name, invokeErr)
}
// Enhanced Tool Result Handling (retained to prevent nil issues)
toolResultString := ""
if toolResult != nil {
jsonBytes, marshalErr := json.Marshal(toolResult)
if marshalErr == nil {
toolResultString = string(jsonBytes)
} else {
toolResultString = fmt.Sprintf("%v", toolResult)
}
}
responseMap := map[string]any{"result": toolResultString}
toolResponseParts = append(toolResponseParts, genai.NewPartFromFunctionResponse(fc.Name, responseMap))
}
// Add all accumulated tool responses for this turn to the message history.
toolResponseContent := genai.NewContentFromParts(toolResponseParts, "function")
messageHistory = append(messageHistory, toolResponseContent)
finalResponse, err := client.Models.GenerateContent(ctx, modelName, messageHistory, &genai.GenerateContentConfig{})
if err != nil {
log.Fatalf("Error calling GenerateContent (with function result): %v", err)
}
printResponse(finalResponse)
// Add the final textual response from the LLM to the history
if len(finalResponse.Candidates) > 0 && finalResponse.Candidates[0].Content != nil {
messageHistory = append(messageHistory, finalResponse.Candidates[0].Content)
}
}
}


@@ -0,0 +1,52 @@
module genkit-quickstart
go 1.24.6
require (
github.com/firebase/genkit/go v0.6.2
github.com/googleapis/mcp-toolbox-sdk-go v0.3.0
)
require (
cloud.google.com/go v0.121.1 // indirect
cloud.google.com/go/auth v0.16.2 // indirect
cloud.google.com/go/auth/oauth2adapt v0.2.8 // indirect
cloud.google.com/go/compute/metadata v0.7.0 // indirect
github.com/bahlo/generic-list-go v0.2.0 // indirect
github.com/buger/jsonparser v1.1.1 // indirect
github.com/felixge/httpsnoop v1.0.4 // indirect
github.com/go-logr/logr v1.4.3 // indirect
github.com/go-logr/stdr v1.2.2 // indirect
github.com/goccy/go-yaml v1.17.1 // indirect
github.com/google/dotprompt/go v0.0.0-20250611200215-bb73406b05ca // indirect
github.com/google/go-cmp v0.7.0 // indirect
github.com/google/s2a-go v0.1.9 // indirect
github.com/google/uuid v1.6.0 // indirect
github.com/googleapis/enterprise-certificate-proxy v0.3.6 // indirect
github.com/googleapis/gax-go/v2 v2.14.2 // indirect
github.com/gorilla/websocket v1.5.3 // indirect
github.com/invopop/jsonschema v0.13.0 // indirect
github.com/mailru/easyjson v0.9.0 // indirect
github.com/mbleigh/raymond v0.0.0-20250414171441-6b3a58ab9e0a // indirect
github.com/wk8/go-ordered-map/v2 v2.1.8 // indirect
github.com/xeipuuv/gojsonpointer v0.0.0-20190905194746-02993c407bfb // indirect
github.com/xeipuuv/gojsonreference v0.0.0-20180127040603-bd5ef7bd5415 // indirect
github.com/xeipuuv/gojsonschema v1.2.0 // indirect
go.opentelemetry.io/auto/sdk v1.1.0 // indirect
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.61.0 // indirect
go.opentelemetry.io/otel v1.36.0 // indirect
go.opentelemetry.io/otel/metric v1.36.0 // indirect
go.opentelemetry.io/otel/sdk v1.36.0 // indirect
go.opentelemetry.io/otel/trace v1.36.0 // indirect
golang.org/x/crypto v0.39.0 // indirect
golang.org/x/net v0.41.0 // indirect
golang.org/x/oauth2 v0.30.0 // indirect
golang.org/x/sys v0.33.0 // indirect
golang.org/x/text v0.26.0 // indirect
google.golang.org/api v0.242.0 // indirect
google.golang.org/genai v1.11.1 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250603155806-513f23925822 // indirect
google.golang.org/grpc v1.73.0 // indirect
google.golang.org/protobuf v1.36.6 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
)


@@ -0,0 +1,129 @@
package main
import (
"context"
"fmt"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
"github.com/firebase/genkit/go/ai"
"github.com/firebase/genkit/go/genkit"
"github.com/firebase/genkit/go/plugins/googlegenai"
)
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
ctx := context.Background()
// Create Toolbox Client
toolboxClient, err := core.NewToolboxClient("http://127.0.0.1:5000")
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tools using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
// Initialize Genkit
g, err := genkit.Init(ctx,
genkit.WithPlugins(&googlegenai.GoogleAI{}),
genkit.WithDefaultModel("googleai/gemini-1.5-flash"),
)
if err != nil {
log.Fatalf("Failed to init genkit: %v\n", err)
}
// Create a conversation history
conversationHistory := []*ai.Message{
ai.NewSystemTextMessage(systemPrompt),
}
// Convert your tool to a Genkit tool.
genkitTools := make([]ai.Tool, len(tools))
for i, tool := range tools {
newTool, err := tbgenkit.ToGenkitTool(tool, g)
if err != nil {
log.Fatalf("Failed to convert tool: %v\n", err)
}
genkitTools[i] = newTool
}
toolRefs := make([]ai.ToolRef, len(genkitTools))
for i, tool := range genkitTools {
toolRefs[i] = tool
}
for _, query := range queries {
conversationHistory = append(conversationHistory, ai.NewUserTextMessage(query))
response, err := genkit.Generate(ctx, g,
ai.WithMessages(conversationHistory...),
ai.WithTools(toolRefs...),
ai.WithReturnToolRequests(true),
)
if err != nil {
log.Fatalf("%v\n", err)
}
conversationHistory = append(conversationHistory, response.Message)
parts := []*ai.Part{}
for _, req := range response.ToolRequests() {
tool := genkit.LookupTool(g, req.Name)
if tool == nil {
log.Fatalf("tool %q not found", req.Name)
}
output, err := tool.RunRaw(ctx, req.Input)
if err != nil {
log.Fatalf("tool %q execution failed: %v", tool.Name(), err)
}
parts = append(parts,
ai.NewToolResponsePart(&ai.ToolResponse{
Name: req.Name,
Ref: req.Ref,
Output: output,
}))
}
if len(parts) > 0 {
resp, err := genkit.Generate(ctx, g,
ai.WithMessages(append(response.History(), ai.NewMessage(ai.RoleTool, nil, parts...))...),
ai.WithTools(toolRefs...),
)
if err != nil {
log.Fatal(err)
}
fmt.Println("\n", resp.Text())
conversationHistory = append(conversationHistory, resp.Message)
} else {
fmt.Println("\n", response.Text())
}
}
}


@@ -0,0 +1,49 @@
module langchan-quickstart
go 1.24.6
require (
github.com/googleapis/mcp-toolbox-sdk-go v0.3.0
github.com/tmc/langchaingo v0.1.13
)
require (
cloud.google.com/go v0.121.1 // indirect
cloud.google.com/go/ai v0.7.0 // indirect
cloud.google.com/go/aiplatform v1.85.0 // indirect
cloud.google.com/go/auth v0.16.2 // indirect
cloud.google.com/go/auth/oauth2adapt v0.2.8 // indirect
cloud.google.com/go/compute/metadata v0.7.0 // indirect
cloud.google.com/go/iam v1.5.2 // indirect
cloud.google.com/go/longrunning v0.6.7 // indirect
cloud.google.com/go/vertexai v0.12.0 // indirect
github.com/dlclark/regexp2 v1.10.0 // indirect
github.com/felixge/httpsnoop v1.0.4 // indirect
github.com/go-logr/logr v1.4.3 // indirect
github.com/go-logr/stdr v1.2.2 // indirect
github.com/google/generative-ai-go v0.15.1 // indirect
github.com/google/s2a-go v0.1.9 // indirect
github.com/google/uuid v1.6.0 // indirect
github.com/googleapis/enterprise-certificate-proxy v0.3.6 // indirect
github.com/googleapis/gax-go/v2 v2.14.2 // indirect
github.com/pkoukk/tiktoken-go v0.1.6 // indirect
go.opentelemetry.io/auto/sdk v1.1.0 // indirect
go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc v0.61.0 // indirect
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.61.0 // indirect
go.opentelemetry.io/otel v1.36.0 // indirect
go.opentelemetry.io/otel/metric v1.36.0 // indirect
go.opentelemetry.io/otel/trace v1.36.0 // indirect
golang.org/x/crypto v0.39.0 // indirect
golang.org/x/net v0.41.0 // indirect
golang.org/x/oauth2 v0.30.0 // indirect
golang.org/x/sync v0.15.0 // indirect
golang.org/x/sys v0.33.0 // indirect
golang.org/x/text v0.26.0 // indirect
golang.org/x/time v0.12.0 // indirect
google.golang.org/api v0.242.0 // indirect
google.golang.org/genproto v0.0.0-20250505200425-f936aa4a68b2 // indirect
google.golang.org/genproto/googleapis/api v0.0.0-20250603155806-513f23925822 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250603155806-513f23925822 // indirect
google.golang.org/grpc v1.73.0 // indirect
google.golang.org/protobuf v1.36.6 // indirect
)

View File

@@ -0,0 +1,149 @@
package main
import (
"context"
"encoding/json"
"fmt"
"log"
"os"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/tmc/langchaingo/llms"
"github.com/tmc/langchaingo/llms/googleai"
)
// ConvertToLangchainTool converts a generic core.ToolboxTool into a LangChainGo llms.Tool.
func ConvertToLangchainTool(toolboxTool *core.ToolboxTool) llms.Tool {
// Fetch the tool's input schema
inputschema, err := toolboxTool.InputSchema()
if err != nil {
return llms.Tool{}
}
var paramsSchema map[string]any
_ = json.Unmarshal(inputschema, &paramsSchema)
// Convert into LangChain's llms.Tool
return llms.Tool{
Type: "function",
Function: &llms.FunctionDefinition{
Name: toolboxTool.Name(),
Description: toolboxTool.Description(),
Parameters: paramsSchema,
},
}
}
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it.",
"Please book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
genaiKey := os.Getenv("GOOGLE_API_KEY")
toolboxURL := "http://localhost:5000"
ctx := context.Background()
// Initialize the Google AI client (LLM).
llm, err := googleai.New(ctx, googleai.WithAPIKey(genaiKey), googleai.WithDefaultModel("gemini-1.5-flash"))
if err != nil {
log.Fatalf("Failed to create Google AI client: %v", err)
}
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tool using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
langchainTools := make([]llms.Tool, len(tools))
// Convert the loaded ToolboxTools into the format LangChainGo requires.
for i, tool := range tools {
langchainTools[i] = ConvertToLangchainTool(tool)
toolsMap[tool.Name()] = tool
}
// Start the conversation history.
messageHistory := []llms.MessageContent{
llms.TextParts(llms.ChatMessageTypeSystem, systemPrompt),
}
for _, query := range queries {
messageHistory = append(messageHistory, llms.TextParts(llms.ChatMessageTypeHuman, query))
// Make the first call to the LLM, making it aware of the tool.
resp, err := llm.GenerateContent(ctx, messageHistory, llms.WithTools(langchainTools))
if err != nil {
log.Fatalf("LLM call failed: %v", err)
}
respChoice := resp.Choices[0]
assistantResponse := llms.TextParts(llms.ChatMessageTypeAI, respChoice.Content)
for _, tc := range respChoice.ToolCalls {
assistantResponse.Parts = append(assistantResponse.Parts, tc)
}
messageHistory = append(messageHistory, assistantResponse)
// Process each tool call requested by the model.
for _, tc := range respChoice.ToolCalls {
toolName := tc.FunctionCall.Name
tool := toolsMap[toolName]
var args map[string]any
if err := json.Unmarshal([]byte(tc.FunctionCall.Arguments), &args); err != nil {
log.Fatalf("Failed to unmarshal arguments for tool '%s': %v", toolName, err)
}
toolResult, err := tool.Invoke(ctx, args)
if err != nil {
log.Fatalf("Failed to execute tool '%s': %v", toolName, err)
}
if toolResult == "" || toolResult == nil {
toolResult = "Operation completed successfully with no specific return value."
}
// Create the tool call response message and add it to the history.
toolResponse := llms.MessageContent{
Role: llms.ChatMessageTypeTool,
Parts: []llms.ContentPart{
llms.ToolCallResponse{
Name: toolName,
Content: fmt.Sprintf("%v", toolResult),
},
},
}
messageHistory = append(messageHistory, toolResponse)
}
finalResp, err := llm.GenerateContent(ctx, messageHistory)
if err != nil {
log.Fatalf("Final LLM call failed after tool execution: %v", err)
}
// Add the final textual response from the LLM to the history
messageHistory = append(messageHistory, llms.TextParts(llms.ChatMessageTypeAI, finalResp.Choices[0].Content))
fmt.Println(finalResp.Choices[0].Content)
}
}

View File

@@ -0,0 +1,38 @@
module openai-quickstart
go 1.24.6
require (
github.com/googleapis/mcp-toolbox-sdk-go v0.3.0
github.com/openai/openai-go v1.12.0
)
require (
cloud.google.com/go/auth v0.16.2 // indirect
cloud.google.com/go/auth/oauth2adapt v0.2.8 // indirect
cloud.google.com/go/compute/metadata v0.7.0 // indirect
github.com/felixge/httpsnoop v1.0.4 // indirect
github.com/go-logr/logr v1.4.3 // indirect
github.com/go-logr/stdr v1.2.2 // indirect
github.com/google/s2a-go v0.1.9 // indirect
github.com/googleapis/enterprise-certificate-proxy v0.3.6 // indirect
github.com/googleapis/gax-go/v2 v2.14.2 // indirect
github.com/tidwall/gjson v1.14.4 // indirect
github.com/tidwall/match v1.1.1 // indirect
github.com/tidwall/pretty v1.2.1 // indirect
github.com/tidwall/sjson v1.2.5 // indirect
go.opentelemetry.io/auto/sdk v1.1.0 // indirect
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.61.0 // indirect
go.opentelemetry.io/otel v1.36.0 // indirect
go.opentelemetry.io/otel/metric v1.36.0 // indirect
go.opentelemetry.io/otel/trace v1.36.0 // indirect
golang.org/x/crypto v0.39.0 // indirect
golang.org/x/net v0.41.0 // indirect
golang.org/x/oauth2 v0.30.0 // indirect
golang.org/x/sys v0.33.0 // indirect
golang.org/x/text v0.26.0 // indirect
google.golang.org/api v0.242.0 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250603155806-513f23925822 // indirect
google.golang.org/grpc v1.73.0 // indirect
google.golang.org/protobuf v1.36.6 // indirect
)

View File

@@ -0,0 +1,141 @@
package main
import (
"context"
"encoding/json"
"fmt"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
openai "github.com/openai/openai-go"
)
// ConvertToOpenAITool converts a ToolboxTool into the go-openai library's Tool format.
func ConvertToOpenAITool(toolboxTool *core.ToolboxTool) openai.ChatCompletionToolParam {
// Get the input schema
jsonSchemaBytes, err := toolboxTool.InputSchema()
if err != nil {
return openai.ChatCompletionToolParam{}
}
// Unmarshal the JSON bytes into FunctionParameters
var paramsSchema openai.FunctionParameters
if err := json.Unmarshal(jsonSchemaBytes, &paramsSchema); err != nil {
return openai.ChatCompletionToolParam{}
}
// Create and return the final tool parameter struct.
return openai.ChatCompletionToolParam{
Function: openai.FunctionDefinitionParam{
Name: toolboxTool.Name(),
Description: openai.String(toolboxTool.Description()),
Parameters: paramsSchema,
},
}
}
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
// Setup
ctx := context.Background()
toolboxURL := "http://localhost:5000"
openAIClient := openai.NewClient()
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tools using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tool : %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
openAITools := make([]openai.ChatCompletionToolParam, len(tools))
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
for i, tool := range tools {
// Convert the Toolbox tool into the openAI FunctionDeclaration format.
openAITools[i] = ConvertToOpenAITool(tool)
// Add tool to a map for lookup later
toolsMap[tool.Name()] = tool
}
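// Seed the conversation with the system prompt and register the converted tools.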
params := openai.ChatCompletionNewParams{
Messages: []openai.ChatCompletionMessageParamUnion{
openai.SystemMessage(systemPrompt),
},
Tools: openAITools,
Seed: openai.Int(0),
Model: openai.ChatModelGPT4o,
}
for _, query := range queries {
params.Messages = append(params.Messages, openai.UserMessage(query))
// Make initial chat completion request
completion, err := openAIClient.Chat.Completions.New(ctx, params)
if err != nil {
panic(err)
}
toolCalls := completion.Choices[0].Message.ToolCalls
// Log when the model did not request any tool calls
if len(toolCalls) == 0 {
log.Println("No function call")
}
// If there was a function call, continue the conversation
params.Messages = append(params.Messages, completion.Choices[0].Message.ToParam())
for _, toolCall := range toolCalls {
toolName := toolCall.Function.Name
toolToInvoke := toolsMap[toolName]
var args map[string]any
err := json.Unmarshal([]byte(toolCall.Function.Arguments), &args)
if err != nil {
panic(err)
}
result, err := toolToInvoke.Invoke(ctx, args)
if err != nil {
log.Fatal("Could not invoke tool", err)
}
params.Messages = append(params.Messages, openai.ToolMessage(fmt.Sprintf("%v", result), toolCall.ID))
}
completion, err = openAIClient.Chat.Completions.New(ctx, params)
if err != nil {
panic(err)
}
params.Messages = append(params.Messages, openai.AssistantMessage(completion.Choices[0].Message.Content))
fmt.Println("\n", completion.Choices[0].Message.Content)
}
}

View File

@@ -0,0 +1,119 @@
import { GoogleGenAI } from "@google/genai";
import { ToolboxClient } from "@toolbox-sdk/core";
const TOOLBOX_URL = "http://127.0.0.1:5000"; // Update if needed
const GOOGLE_API_KEY = process.env.GOOGLE_API_KEY || 'your-api-key'; // Replace it with your API key
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, you MUST use the available tools to find information. Mention its name, id,
location and price tier. Always mention hotel id while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
function mapZodTypeToOpenAPIType(zodTypeName) {
console.log(zodTypeName)
const typeMap = {
'ZodString': 'string',
'ZodNumber': 'number',
'ZodBoolean': 'boolean',
'ZodArray': 'array',
'ZodObject': 'object',
};
return typeMap[zodTypeName] || 'string';
}
async function main() {
const toolboxClient = new ToolboxClient(TOOLBOX_URL);
const toolboxTools = await toolboxClient.loadToolset("my-toolset");
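// Build Gemini function declarations from each Toolbox tool's Zod parameter schema.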
const geminiTools = [{
functionDeclarations: toolboxTools.map(tool => {
const schema = tool.getParamSchema();
const properties = {};
const required = [];
for (const [key, param] of Object.entries(schema.shape)) {
properties[key] = {
type: mapZodTypeToOpenAPIType(param.constructor.name),
description: param.description || '',
};
required.push(key)
}
return {
name: tool.getName(),
description: tool.getDescription(),
parameters: { type: 'object', properties, required },
};
})
}];
const genAI = new GoogleGenAI({ apiKey: GOOGLE_API_KEY });
const chat = genAI.chats.create({
model: "gemini-2.5-flash",
config: {
systemInstruction: prompt,
tools: geminiTools,
}
});
for (const query of queries) {
let currentResult = await chat.sendMessage({ message: query });
let finalResponseGiven = false
while (!finalResponseGiven) {
const response = currentResult;
const functionCalls = response.functionCalls || [];
if (functionCalls.length === 0) {
console.log(response.text)
finalResponseGiven = true;
} else {
const toolResponses = [];
for (const call of functionCalls) {
const toolName = call.name
const toolToExecute = toolboxTools.find(t => t.getName() === toolName);
if (toolToExecute) {
try {
const functionResult = await toolToExecute(call.args);
toolResponses.push({
functionResponse: { name: call.name, response: { result: functionResult } }
});
} catch (e) {
console.error(`Error executing tool '${toolName}':`, e);
toolResponses.push({
functionResponse: { name: call.name, response: { error: e.message } }
});
}
}
}
currentResult = await chat.sendMessage({ message: toolResponses });
}
}
}
}
main();

View File

@@ -0,0 +1,88 @@
import { ToolboxClient } from "@toolbox-sdk/core";
import { genkit } from "genkit";
import { googleAI } from '@genkit-ai/googleai';
const GOOGLE_API_KEY = process.env.GOOGLE_API_KEY || 'your-api-key'; // Replace it with your API key
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function main() {
const toolboxClient = new ToolboxClient("http://127.0.0.1:5000");
const ai = genkit({
plugins: [
googleAI({
apiKey: process.env.GEMINI_API_KEY || GOOGLE_API_KEY
})
],
model: googleAI.model('gemini-2.0-flash'),
});
const toolboxTools = await toolboxClient.loadToolset("my-toolset");
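// Define each Toolbox tool as a Genkit tool and index it by name so tool requests can be executed later.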
const toolMap = Object.fromEntries(
toolboxTools.map((tool) => {
const definedTool = ai.defineTool(
{
name: tool.getName(),
description: tool.getDescription(),
inputSchema: tool.getParamSchema(),
},
tool
);
return [tool.getName(), definedTool];
})
);
const tools = Object.values(toolMap);
let conversationHistory = [{ role: "system", content: [{ text: systemPrompt }] }];
for (const query of queries) {
conversationHistory.push({ role: "user", content: [{ text: query }] });
let response = await ai.generate(
messages: conversationHistory,
tools: tools,
});
conversationHistory.push(response.message);
const toolRequests = response.toolRequests;
if (toolRequests?.length > 0) {
// Execute tools concurrently and collect their responses.
const toolResponses = await Promise.all(
toolRequests.map(async (call) => {
try {
const toolOutput = await toolMap[call.name].invoke(call.input);
return { role: "tool", content: [{ toolResponse: { name: call.name, output: toolOutput } }] };
} catch (e) {
console.error(`Error executing tool ${call.name}:`, e);
return { role: "tool", content: [{ toolResponse: { name: call.name, output: { error: e.message } } }] };
}
})
);
conversationHistory.push(...toolResponses);
// Call the AI again with the tool results.
response = await ai.generate({ messages: conversationHistory, tools });
conversationHistory.push(response.message);
}
console.log(response.text);
}
}
main();

View File

@@ -0,0 +1,73 @@
import { ChatGoogleGenerativeAI } from "@langchain/google-genai";
import { ToolboxClient } from "@toolbox-sdk/core";
import { tool } from "@langchain/core/tools";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { MemorySaver } from "@langchain/langgraph";
const GOOGLE_API_KEY = process.env.GOOGLE_API_KEY || 'your-api-key'; // Replace it with your API key
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function main() {
const model = new ChatGoogleGenerativeAI({
model: "gemini-2.0-flash",
});
const client = new ToolboxClient("http://127.0.0.1:5000");
const toolboxTools = await client.loadToolset("my-toolset");
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool(toolboxTool, {
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
schema: toolboxTool.getParamSchema()
});
const tools = toolboxTools.map(getTool);
const agent = createReactAgent({
llm: model,
tools: tools,
checkpointer: new MemorySaver(),
systemPrompt: prompt,
});
const langGraphConfig = {
configurable: {
thread_id: "test-thread",
},
};
for (const query of queries) {
const agentOutput = await agent.invoke(
{
messages: [
{
role: "user",
content: query,
},
],
verbose: true,
},
langGraphConfig
);
const response = agentOutput.messages[agentOutput.messages.length - 1].content;
console.log(response);
}
}
main();

View File

@@ -0,0 +1,79 @@
import { gemini, GEMINI_MODEL } from "@llamaindex/google";
import { agent } from "@llamaindex/workflow";
import { createMemory, staticBlock, tool } from "llamaindex";
import { ToolboxClient } from "@toolbox-sdk/core";
const TOOLBOX_URL = "http://127.0.0.1:5000"; // Update if needed
const GOOGLE_API_KEY = process.env.GOOGLE_API_KEY || 'your-api-key'; // Replace it with your API key
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking and cancellations.
When the user searches for a hotel, mention its name, id, location and price tier.
Always mention hotel ids while performing any searches — this is very important for operations.
For any bookings or cancellations, please provide the appropriate confirmation.
Update check-in or check-out dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function main() {
// Connect to MCP Toolbox
const client = new ToolboxClient(TOOLBOX_URL);
const toolboxTools = await client.loadToolset("my-toolset");
const tools = toolboxTools.map((toolboxTool) => {
return tool({
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
parameters: toolboxTool.getParamSchema(),
execute: toolboxTool,
});
});
// Initialize LLM
const llm = gemini({
model: GEMINI_MODEL.GEMINI_2_0_FLASH,
apiKey: GOOGLE_API_KEY,
});
const memory = createMemory({
memoryBlocks: [
staticBlock({
content: prompt,
}),
],
});
// Create the Agent
const myAgent = agent({
tools: tools,
llm,
memory,
systemPrompt: prompt,
});
for (const query of queries) {
const result = await myAgent.run(query);
const output = result.data.result;
console.log(`\nUser: ${query}`);
if (typeof output === "string") {
console.log(output.trim());
} else if (typeof output === "object" && "text" in output) {
console.log(output.text.trim());
} else {
console.log(JSON.stringify(output));
}
}
// You may observe some extra logs during execution due to the run method provided by LlamaIndex.
console.log("Agent run finished.");
}
main();

View File

@@ -0,0 +1,70 @@
from google.adk.agents import Agent
from google.adk.runners import Runner
from google.adk.sessions import InMemorySessionService
from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService
from google.genai import types
from toolbox_core import ToolboxSyncClient
import asyncio
import os
# TODO(developer): replace this with your Google API key
os.environ['GOOGLE_API_KEY'] = 'your-api-key'
async def main():
with ToolboxSyncClient("http://127.0.0.1:5000") as toolbox_client:
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
root_agent = Agent(
model='gemini-2.0-flash-001',
name='hotel_agent',
description='A helpful AI assistant.',
instruction=prompt,
tools=toolbox_client.load_toolset("my-toolset"),
)
session_service = InMemorySessionService()
artifacts_service = InMemoryArtifactService()
session = await session_service.create_session(
state={}, app_name='hotel_agent', user_id='123'
)
runner = Runner(
app_name='hotel_agent',
agent=root_agent,
artifact_service=artifacts_service,
session_service=session_service,
)
queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
]
for query in queries:
content = types.Content(role='user', parts=[types.Part(text=query)])
events = runner.run(session_id=session.id,
user_id='123', new_message=content)
responses = (
part.text
for event in events
for part in event.content.parts
if part.text is not None
)
for text in responses:
print(text)
asyncio.run(main())

View File

@@ -0,0 +1,108 @@
import asyncio
from google import genai
from google.genai.types import (
Content,
FunctionDeclaration,
GenerateContentConfig,
Part,
Tool,
)
from toolbox_core import ToolboxClient
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel id while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
queries = [
"Find hotels in Basel with Basel in its name.",
"Please book the hotel Hilton Basel for me.",
"This is too expensive. Please cancel it.",
"Please book Hyatt Regency for me",
"My check in dates for my booking would be from April 10, 2024 to April 19, 2024.",
]
async def main():
async with ToolboxClient("http://127.0.0.1:5000") as toolbox_client:
# The toolbox_tools list contains Python callables (functions/methods) designed for LLM tool-use
# integration. While this example uses Google's genai client, these callables can be adapted for
# various function-calling or agent frameworks. For easier integration with supported frameworks
# (https://github.com/googleapis/mcp-toolbox-python-sdk/tree/main/packages), use the
# provided wrapper packages, which handle framework-specific boilerplate.
toolbox_tools = await toolbox_client.load_toolset("my-toolset")
genai_client = genai.Client(
vertexai=True, project="project-id", location="us-central1"
)
genai_tools = [
Tool(
function_declarations=[
FunctionDeclaration.from_callable_with_api_option(callable=tool)
]
)
for tool in toolbox_tools
]
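# Accumulate the multi-turn conversation (user, model, and tool messages) across queries.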
history = []
for query in queries:
user_prompt_content = Content(
role="user",
parts=[Part.from_text(text=query)],
)
history.append(user_prompt_content)
response = genai_client.models.generate_content(
model="gemini-2.0-flash-001",
contents=history,
config=GenerateContentConfig(
system_instruction=prompt,
tools=genai_tools,
),
)
history.append(response.candidates[0].content)
function_response_parts = []
for function_call in response.function_calls:
fn_name = function_call.name
# The tools are sorted alphabetically
if fn_name == "search-hotels-by-name":
function_result = await toolbox_tools[3](**function_call.args)
elif fn_name == "search-hotels-by-location":
function_result = await toolbox_tools[2](**function_call.args)
elif fn_name == "book-hotel":
function_result = await toolbox_tools[0](**function_call.args)
elif fn_name == "update-hotel":
function_result = await toolbox_tools[4](**function_call.args)
elif fn_name == "cancel-hotel":
function_result = await toolbox_tools[1](**function_call.args)
else:
raise ValueError("Function name not present.")
function_response = {"result": function_result}
function_response_part = Part.from_function_response(
name=function_call.name,
response=function_response,
)
function_response_parts.append(function_response_part)
if function_response_parts:
tool_response_content = Content(role="tool", parts=function_response_parts)
history.append(tool_response_content)
response2 = genai_client.models.generate_content(
model="gemini-2.0-flash-001",
contents=history,
config=GenerateContentConfig(
tools=genai_tools,
),
)
final_model_response_content = response2.candidates[0].content
history.append(final_model_response_content)
print(response2.text)
asyncio.run(main())

View File

@@ -0,0 +1,52 @@
import asyncio
from langgraph.prebuilt import create_react_agent
# TODO(developer): replace this with another import if needed
from langchain_google_vertexai import ChatVertexAI
# from langchain_google_genai import ChatGoogleGenerativeAI
# from langchain_anthropic import ChatAnthropic
from langgraph.checkpoint.memory import MemorySaver
from toolbox_langchain import ToolboxClient
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
]
async def main():
# TODO(developer): replace this with another model if needed
model = ChatVertexAI(model_name="gemini-2.0-flash-001")
# model = ChatGoogleGenerativeAI(model="gemini-2.0-flash-001")
# model = ChatAnthropic(model="claude-3-5-sonnet-20240620")
# Load the tools from the Toolbox server
async with ToolboxClient("http://127.0.0.1:5000") as client:
tools = await client.aload_toolset()
agent = create_react_agent(model, tools, checkpointer=MemorySaver())
config = {"configurable": {"thread_id": "thread-1"}}
for query in queries:
inputs = {"messages": [("user", prompt + query)]}
response = agent.invoke(inputs, stream_mode="values", config=config)
print(response["messages"][-1].content)
asyncio.run(main())

View File

@@ -0,0 +1,63 @@
import asyncio
import os
from llama_index.core.agent.workflow import AgentWorkflow
from llama_index.core.workflow import Context
# TODO(developer): replace this with another import if needed
from llama_index.llms.google_genai import GoogleGenAI
# from llama_index.llms.anthropic import Anthropic
from toolbox_llamaindex import ToolboxClient
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
]
async def main():
# TODO(developer): replace this with another model if needed
llm = GoogleGenAI(
model="gemini-2.0-flash-001",
vertexai_config={"project": "project-id", "location": "us-central1"},
)
# llm = GoogleGenAI(
# api_key=os.getenv("GOOGLE_API_KEY"),
# model="gemini-2.0-flash-001",
# )
# llm = Anthropic(
# model="claude-3-7-sonnet-latest",
# api_key=os.getenv("ANTHROPIC_API_KEY")
# )
# Load the tools from the Toolbox server
async with ToolboxClient("http://127.0.0.1:5000") as client:
tools = await client.aload_toolset()
agent = AgentWorkflow.from_tools_or_functions(
tools,
llm=llm,
system_prompt=prompt,
)
ctx = Context(agent)
for query in queries:
response = await agent.run(user_msg=query, ctx=ctx)
print(f"---- {query} ----")
print(str(response))
asyncio.run(main())

View File

@@ -0,0 +1,20 @@
<!-- This file has been used in local_quickstart.md, local_quickstart_go.md & local_quickstart_js.md -->
<!-- [START cloud_setup] -->
If you plan to use **Google Cloud's Vertex AI** with your agent (e.g., using
`vertexai=True` or a Google GenAI model), follow these one-time setup steps for
local development:
1. [Install the Google Cloud CLI](https://cloud.google.com/sdk/docs/install)
1. [Set up Application Default Credentials (ADC)](https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment)
1. Set your project and enable Vertex AI
```bash
gcloud config set project YOUR_PROJECT_ID
gcloud services enable aiplatform.googleapis.com
```
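Optionally, you can sanity-check that ADC is working by printing an access token with the gcloud CLI installed above:
```bash
gcloud auth application-default print-access-token
```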
[install-python]: https://wiki.python.org/moin/BeginnersGuide/Download
[install-pip]: https://pip.pypa.io/en/stable/installation/
[install-venv]: https://packaging.python.org/en/latest/tutorials/installing-packages/#creating-virtual-environments
[install-postgres]: https://www.postgresql.org/download/
<!-- [END cloud_setup] -->

View File

@@ -0,0 +1,122 @@
<!-- This file has been used in local_quickstart.md, local_quickstart_go.md & local_quickstart_js.md -->
<!-- [START configure_toolbox] -->
In this section, we will download Toolbox, configure our tools in a
`tools.yaml`, and then run the Toolbox server.
1. Download the latest version of Toolbox as a binary:
{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/$OS/toolbox
```
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Write the following into a `tools.yaml` file. Be sure to update any fields
such as `user`, `password`, or `database` that you may have customized in the
previous step.
{{< notice tip >}}
In practice, use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
```yaml
sources:
my-pg-source:
kind: postgres
host: 127.0.0.1
port: 5432
database: toolbox_db
user: ${USER_NAME}
password: ${PASSWORD}
tools:
search-hotels-by-name:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on name.
parameters:
- name: name
type: string
description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
search-hotels-by-location:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on location.
parameters:
- name: location
type: string
description: The location of the hotel.
statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
book-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Book a hotel by its ID. Returns NULL if the hotel is successfully booked; raises an error if not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to book.
statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
update-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Update a hotel's check-in and check-out dates by its ID. Returns a message
indicating whether the hotel was successfully updated or not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to update.
- name: checkin_date
type: string
description: The new check-in date of the hotel.
- name: checkout_date
type: string
description: The new check-out date of the hotel.
statement: >-
UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
as date) WHERE id = $1;
cancel-hotel:
kind: postgres-sql
source: my-pg-source
description: Cancel a hotel by its ID.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to cancel.
statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
toolsets:
my-toolset:
- search-hotels-by-name
- search-hotels-by-location
- book-hotel
- update-hotel
- cancel-hotel
```
For more info on tools, check out the `Resources` section of the docs.
1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
```bash
./toolbox --tools-file "tools.yaml"
```
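If you kept the `${ENV_NAME}` placeholders from the tip above, export matching variables in the same shell before starting the server. A minimal sketch, assuming the `toolbox_user` credentials created during the database setup:
```bash
export USER_NAME=toolbox_user
export PASSWORD=my-password
./toolbox --tools-file "tools.yaml"
```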
{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the
`--disable-reload` flag.
{{< /notice >}}
<!-- [END configure_toolbox] -->

View File

@@ -0,0 +1,119 @@
<!-- This file has been used in local_quickstart.md, local_quickstart_go.md & local_quickstart_js.md -->
<!-- [START database_setup] -->
In this section, we will create a database, insert some data that needs to be
accessed by our agent, and create a database user for Toolbox to connect with.
1. Connect to postgres using the `psql` command:
```bash
psql -h 127.0.0.1 -U postgres
```
Here, `postgres` denotes the default postgres superuser.
{{< notice info >}}
#### **Having trouble connecting?**
* **Password Prompt:** If you are prompted for a password for the `postgres`
user and do not know it (or a blank password doesn't work), your PostgreSQL
installation might require a password or a different authentication method.
* **`FATAL: role "postgres" does not exist`:** This error means the default
`postgres` superuser role isn't available under that name on your system.
* **`Connection refused`:** Ensure your PostgreSQL server is actually running.
You can typically check with `sudo systemctl status postgresql` and start it
with `sudo systemctl start postgresql` on Linux systems.
<br/>
#### **Common Solution**
For password issues or if the `postgres` role seems inaccessible directly, try
switching to the `postgres` operating system user first. This user often has
permission to connect without a password for local connections (this is called
peer authentication).
```bash
sudo -i -u postgres
psql -h 127.0.0.1
```
Once you are in the `psql` shell using this method, you can proceed with the
database creation steps below. Afterwards, type `\q` to exit `psql`, and then
`exit` to return to your normal user shell.
If desired, once connected to `psql` as the `postgres` OS user, you can set a
password for the `postgres` *database* user using: `ALTER USER postgres WITH
PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
postgres` and a password next time.
{{< /notice >}}
1. Create a new database and a new user:
{{< notice tip >}}
For a real application, it's best to follow the principle of least permission
and only grant the privileges your application needs.
{{< /notice >}}
```sql
CREATE USER toolbox_user WITH PASSWORD 'my-password';
CREATE DATABASE toolbox_db;
GRANT ALL PRIVILEGES ON DATABASE toolbox_db TO toolbox_user;
ALTER DATABASE toolbox_db OWNER TO toolbox_user;
```
1. End the database session:
```bash
\q
```
(If you used `sudo -i -u postgres` and then `psql`, remember you might also
need to type `exit` after `\q` to leave the `postgres` user's shell
session.)
1. Connect to your database with your new user:
```bash
psql -h 127.0.0.1 -U toolbox_user -d toolbox_db
```
1. Create a table using the following command:
```sql
CREATE TABLE hotels(
id INTEGER NOT NULL PRIMARY KEY,
name VARCHAR NOT NULL,
location VARCHAR NOT NULL,
price_tier VARCHAR NOT NULL,
checkin_date DATE NOT NULL,
checkout_date DATE NOT NULL,
booked BIT NOT NULL
);
```
1. Insert data into the table.
```sql
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
(4, 'Radisson Blu Lucerne', 'Lucerne', 'Midscale', '2024-04-24', '2024-04-05', B'0'),
(5, 'Best Western Bern', 'Bern', 'Upper Midscale', '2024-04-23', '2024-04-01', B'0'),
(6, 'InterContinental Geneva', 'Geneva', 'Luxury', '2024-04-23', '2024-04-28', B'0'),
(7, 'Sheraton Zurich', 'Zurich', 'Upper Upscale', '2024-04-27', '2024-04-02', B'0'),
(8, 'Holiday Inn Basel', 'Basel', 'Upper Midscale', '2024-04-24', '2024-04-09', B'0'),
(9, 'Courtyard Zurich', 'Zurich', 'Upscale', '2024-04-03', '2024-04-13', B'0'),
(10, 'Comfort Inn Bern', 'Bern', 'Midscale', '2024-04-04', '2024-04-16', B'0');
```
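Optionally, before ending the session, verify that the rows were inserted; for example, list the Basel hotels the agent will later search for:
```sql
SELECT id, name, location, price_tier FROM hotels WHERE location = 'Basel';
```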
1. End the database session:
```bash
\q
```
<!-- [END database_setup] -->

View File

@@ -6,328 +6,9 @@ description: >
Connect your IDE to Firestore using Toolbox.
---
[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like Firestore. This guide covers how to use [MCP Toolbox for Databases][toolbox]
to expose your developer assistant tools to a Firestore instance:
* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]
[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client
## Set up Firestore
1. Create or select a Google Cloud project.
* [Create a new
project](https://cloud.google.com/resource-manager/docs/creating-managing-projects)
* [Select an existing
project](https://cloud.google.com/resource-manager/docs/creating-managing-projects#identifying_projects)
1. [Enable the Firestore
API](https://console.cloud.google.com/apis/library/firestore.googleapis.com)
for your project.
1. [Create a Firestore
database](https://cloud.google.com/firestore/docs/create-database-web-mobile-client-library)
if you haven't already.
1. Set up authentication for your local environment.
* [Install gcloud CLI](https://cloud.google.com/sdk/docs/install)
* Run `gcloud auth application-default login` to authenticate
## Install MCP Toolbox
1. Download the latest version of Toolbox as a binary. Select the [correct
binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
to your OS and CPU architecture. You are required to use Toolbox version
V0.10.0+:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/windows/amd64/toolbox
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Verify the installation:
```bash
./toolbox --version
```
## Configure your MCP Client
{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}
1. Install [Claude
Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. Restart Claude code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}
1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}
1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. You should see a green active status after the server is successfully
connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}
1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
Settings > MCP**. You should see a green active status after the server is
successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}
1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}
1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}
1. Install the [Gemini
CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1. In your working directory, create a folder named `.gemini`. Within it, create
a `settings.json` file.
1. Add the following configuration, replace the environment variables with your
values, and then save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}
1. Install the [Gemini Code
Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist)
extension in Visual Studio Code.
1. Enable Agent Mode in Gemini Code Assist chat.
1. In your working directory, create a folder named `.gemini`. Within it, create
a `settings.json` file.
1. Add the following configuration, replace the environment variables with your
values, and then save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{< /tabpane >}}
## Use Tools
Your AI tool is now connected to Firestore using MCP. Try asking your AI
assistant to list collections, get documents, query collections, or manage
security rules.
The following tools are available to the LLM:
1. **firestore-get-documents**: Gets multiple documents from Firestore by their
paths
1. **firestore-list-collections**: List Firestore collections for a given parent
path
1. **firestore-delete-documents**: Delete multiple documents from Firestore
1. **firestore-query-collection**: Query documents from a collection with
filtering, ordering, and limit options
1. **firestore-get-rules**: Retrieves the active Firestore security rules for
the current project
1. **firestore-validate-rules**: Validates Firestore security rules syntax and
errors
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
<html>
<head>
<link rel="canonical" href="https://cloud.google.com/firestore/native/docs/connect-ide-using-mcp-toolbox"/>
<meta http-equiv="refresh" content="0;url=https://cloud.google.com/firestore/native/docs/connect-ide-using-mcp-toolbox"/>
</head>
</html>

View File

@@ -11,6 +11,7 @@ an open protocol for connecting Large Language Models (LLMs) to data sources
like Postgres. This guide covers how to use [MCP Toolbox for Databases][toolbox]
to expose your developer assistant tools to a Looker instance:
* [Gemini-CLI][gemini-cli]
* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
@@ -19,6 +20,7 @@ to expose your developer assistant tools to a Looker instance:
* [Claude code][claudecode]
[toolbox]: https://github.com/googleapis/genai-toolbox
[gemini-cli]: #configure-your-mcp-client
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
@@ -46,19 +48,19 @@ to expose your developer assistant tools to a Looker instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
@@ -78,6 +80,36 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/windows/amd64/toolb
## Configure your MCP Client
{{< tabpane text=true >}}
{{% tab header="Gemini-CLI" lang="en" %}}
1. Install [Gemini-CLI](https://github.com/google-gemini/gemini-cli#install-globally-with-npm).
1. Create a directory `.gemini` in your home directory if it doesn't exist.
1. Create the file `.gemini/settings.json` if it doesn't exist.
1. Add the following configuration, or add the mcpServers stanza if you already
have a `settings.json` with content. Replace the path to the toolbox
executable and the environment variables with your values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "./PATH/TO/toolbox",
"args": ["--stdio", "--prebuilt", "looker"],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
1. Start Gemini-CLI with the `gemini` command and use the command `/mcp` to see
the configured MCP tools.
{{% /tab %}}
{{% tab header="Claude code" lang="en" %}}
1. Install [Claude
@@ -203,7 +235,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/windows/amd64/toolb
```json
{
"mcpServers": {
"servers": {
"looker-toolbox": {
"command": "./PATH/TO/toolbox",
"args": ["--stdio", "--prebuilt", "looker"],
@@ -269,6 +301,9 @@ The following tools are available to the LLM:
1. **get_looks**: Return the saved Looks that match a title or description
1. **run_look**: Run a saved Look and return the data
1. **make_look**: Create a saved Look in Looker and return the URL
1. **get_dashboards**: Return the saved dashboards that match a title or description
1. **make_dashboard**: Create a saved dashboard in Looker and return the URL
1. **add_dashboard_element**: Add a tile to a dashboard
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs

View File

@@ -37,19 +37,19 @@ description: "Connect your IDE to SQL Server using Toolbox."
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
@@ -182,19 +182,17 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/windows/amd64/toolb
```json
{
"mcp" : {
"servers": {
"cloud-sql-sqlserver": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","cloud-sql-mssql","--stdio"],
"env": {
"MSSQL_HOST": "",
"MSSQL_PORT": "",
"MSSQL_DATABASE": "",
"MSSQL_USER": "",
"MSSQL_PASSWORD": ""
}
}
"servers": {
"mssql": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","mssql","--stdio"],
"env": {
"MSSQL_HOST": "",
"MSSQL_PORT": "",
"MSSQL_DATABASE": "",
"MSSQL_USER": "",
"MSSQL_PASSWORD": ""
}
}
}
}
@@ -286,4 +284,4 @@ The following tools are available to the LLM:
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
{{< /notice >}}

View File

@@ -37,19 +37,19 @@ description: "Connect your IDE to MySQL using Toolbox."
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
@@ -182,7 +182,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/windows/amd64/toolb
```json
{
"mcpServers": {
"servers": {
"mysql": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","mysql","--stdio"],

View File

@@ -0,0 +1,281 @@
---
title: Neo4j using MCP
type: docs
weight: 2
description: "Connect your IDE to Neo4j using Toolbox."
---
[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is an open protocol for connecting Large Language Models (LLMs) to data sources like Neo4j. This guide covers how to use [MCP Toolbox for Databases][toolbox] to expose your developer assistant tools to a Neo4j instance:
* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]
[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client
## Set up the database
1. [Create or select a Neo4j instance.](https://neo4j.com/cloud/platform/aura-graph-database/)
## Install MCP Toolbox
1. Download the latest version of Toolbox as a binary. Select the [correct binary](https://github.com/googleapis/genai-toolbox/releases) corresponding to your OS and CPU architecture. You are required to use Toolbox version v0.15.0+:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.15.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Verify the installation:
```bash
./toolbox --version
```
## Configure your MCP Client
{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}
1. Install [Claude Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"neo4j": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","neo4j","--stdio"],
"env": {
"NEO4J_URI": "",
"NEO4J_DATABASE": "",
"NEO4J_USERNAME": "",
"NEO4J_PASSWORD": ""
}
}
}
}
```
1. Restart Claude code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}
1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"neo4j": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","neo4j","--stdio"],
"env": {
"NEO4J_URI": "",
"NEO4J_DATABASE": "",
"NEO4J_USERNAME": "",
"NEO4J_PASSWORD": ""
}
}
}
}
```
1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}
1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"neo4j": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","neo4j","--stdio"],
"env": {
"NEO4J_URI": "",
"NEO4J_DATABASE": "",
"NEO4J_USERNAME": "",
"NEO4J_PASSWORD": ""
}
}
}
}
```
1. You should see a green active status after the server is successfully connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}
1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"neo4j": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","neo4j","--stdio"],
"env": {
"NEO4J_URI": "",
"NEO4J_DATABASE": "",
"NEO4J_USERNAME": "",
"NEO4J_PASSWORD": ""
}
}
}
}
```
1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor Settings > MCP**. You should see a green active status after the server is successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}
1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcp" : {
"servers": {
"neo4j": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","neo4j","--stdio"],
"env": {
"NEO4J_URI": "",
"NEO4J_DATABASE": "",
"NEO4J_USERNAME": "",
"NEO4J_PASSWORD": ""
}
}
}
}
}
```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}
1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your values, and save:
```json
{
"mcpServers": {
"neo4j": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","neo4j","--stdio"],
"env": {
"NEO4J_URI": "",
"NEO4J_DATABASE": "",
"NEO4J_USERNAME": "",
"NEO4J_PASSWORD": ""
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}
1. Install the [Gemini CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1. In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1. Add the following configuration, replace the environment variables with your values, and then save:
```json
{
"mcpServers": {
"neo4j": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","neo4j","--stdio"],
"env": {
"NEO4J_URI": "",
"NEO4J_DATABASE": "",
"NEO4J_USERNAME": "",
"NEO4J_PASSWORD": ""
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}
1. Install the [Gemini Code Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist) extension in Visual Studio Code.
1. Enable Agent Mode in Gemini Code Assist chat.
1. In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1. Add the following configuration, replace the environment variables with your values, and then save:
```json
{
"mcpServers": {
"neo4j": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","neo4j","--stdio"],
"env": {
"NEO4J_URI": "",
"NEO4J_DATABASE": "",
"NEO4J_USERNAME": "",
"NEO4J_PASSWORD": ""
}
}
}
}
```
{{% /tab %}}
{{< /tabpane >}}
## Use Tools
Your AI tool is now connected to Neo4j using MCP. Try asking your AI assistant to get the graph schema or execute Cypher statements.
The following tools are available to the LLM:
1. **get_schema**: extracts the complete database schema, including details about node labels, relationships, properties, constraints, and indexes.
1. **execute_cypher**: executes any arbitrary Cypher statement.
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}

View File

@@ -17,6 +17,8 @@ to expose your developer assistant tools to a Postgres instance:
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]
[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
@@ -25,6 +27,8 @@ to expose your developer assistant tools to a Postgres instance:
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client
{{< notice tip >}}
This guide can be used with [AlloyDB
@@ -52,19 +56,19 @@ Omni](https://cloud.google.com/alloydb/omni/current/docs/overview).
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
@@ -213,7 +217,7 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/windows/amd64/toolb
```json
{
"mcpServers": {
"servers": {
"postgres": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","postgres","--stdio"],
@@ -259,6 +263,57 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/windows/amd64/toolb
```
{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}
1. Install the [Gemini CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1. In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1. Add the following configuration, replace the environment variables with your values, and then save:
```json
{
"mcpServers": {
"postgres": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","postgres","--stdio"],
"env": {
"POSTGRES_HOST": "",
"POSTGRES_PORT": "",
"POSTGRES_DATABASE": "",
"POSTGRES_USER": "",
"POSTGRES_PASSWORD": ""
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}
1. Install the [Gemini Code Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist) extension in Visual Studio Code.
1. Enable Agent Mode in Gemini Code Assist chat.
1. In your working directory, create a folder named `.gemini`. Within it, create a `settings.json` file.
1. Add the following configuration, replace the environment variables with your values, and then save:
```json
{
"mcpServers": {
"postgres": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","postgres","--stdio"],
"env": {
"POSTGRES_HOST": "",
"POSTGRES_PORT": "",
"POSTGRES_DATABASE": "",
"POSTGRES_USER": "",
"POSTGRES_PASSWORD": ""
}
}
}
}
```
{{% /tab %}}
{{< /tabpane >}}
## Use Tools

View File

@@ -103,6 +103,14 @@ section.
```bash
export IMAGE=us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:latest
```
{{< notice note >}}
**The `$PORT` Environment Variable**
Google Cloud Run dictates the port your application must listen on by setting the
`$PORT` environment variable inside your container. This value defaults to
**8080**. Your application's `--port` argument **must** be set to listen on this
port. If there is a mismatch, the container will fail to start and the
deployment will time out.
{{< /notice >}}
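For instance, the command running inside the container should end up looking like this minimal sketch (the tools file name is illustrative):
```bash
# Cloud Run injects $PORT (default 8080); Toolbox must listen on all interfaces at that port
./toolbox --tools-file tools.yaml --address 0.0.0.0 --port 8080
```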
1. Deploy Toolbox to Cloud Run using the following command:
@@ -197,3 +205,22 @@ func main() {
Now, you can use this client to connect to the deployed Cloud Run instance!
## Troubleshooting
{{< notice note >}}
For any deployment or runtime error, the best first step is to check the logs for your service in the Google Cloud Console's Cloud Run section. They often contain the specific error message needed to diagnose the problem.
{{< /notice >}}
* **Deployment Fails with "Container failed to start":** This is almost always
caused by a port mismatch. Ensure your container's `--port` argument is set to
`8080` to match the `$PORT` environment variable provided by Cloud Run.
* **Client Receives Permission Denied Error (401 or 403):** If your client application (e.g., your local SDK) gets a `401 Unauthorized` or `403 Forbidden` error when trying to call your Cloud Run service, it means the client is not properly authenticated as an invoker.
* Ensure the user or service account calling the service has the **Cloud Run Invoker** (`roles/run.invoker`) IAM role.
* If running locally, make sure your Application Default Credentials are set up correctly by running `gcloud auth application-default login`.
* **Service Fails to Access Secrets (in logs):** If your application starts but the logs show errors like "permission denied" when trying to access Secret Manager, it means the Toolbox service account is missing permissions.
* Ensure the `toolbox-identity` service account has the **Secret Manager Secret Accessor** (`roles/secretmanager.secretAccessor`) IAM role.
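As an illustration, the missing roles can typically be granted with `gcloud`; the service name, region, member, and secret name below are placeholders:
```bash
# Grant a caller permission to invoke the Cloud Run service (fixes client 401/403 errors)
gcloud run services add-iam-policy-binding toolbox \
  --region=us-central1 \
  --member="user:you@example.com" \
  --role="roles/run.invoker"

# Grant the Toolbox service account read access to a secret (fixes "permission denied" on Secret Manager)
gcloud secrets add-iam-policy-binding tools \
  --member="serviceAccount:toolbox-identity@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/secretmanager.secretAccessor"
```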

View File

@@ -0,0 +1,8 @@
---
title: "Reference"
type: docs
weight: 7
description: >
This section contains reference documentation.
---

docs/en/reference/cli.md Normal file
View File

@@ -0,0 +1,75 @@
---
title: "CLI"
type: docs
weight: 1
description: >
This page describes the `toolbox` command-line options.
---
## Reference
| Flag (Short) | Flag (Long) | Description | Default |
|---|---|---|---|
| `-a` | `--address` | Address of the interface the server will listen on. | `127.0.0.1` |
| | `--disable-reload` | Disables dynamic reloading of tools file. | |
| `-h` | `--help` | help for toolbox | |
| | `--log-level` | Specify the minimum level logged. Allowed: 'DEBUG', 'INFO', 'WARN', 'ERROR'. | `info` |
| | `--logging-format` | Specify logging format to use. Allowed: 'standard' or 'JSON'. | `standard` |
| `-p` | `--port` | Port the server will listen on. | `5000` |
| | `--prebuilt` | Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. See [Prebuilt Tools Reference](prebuilt-tools.md) for allowed values. | |
| | `--stdio` | Listens via MCP STDIO instead of acting as a remote HTTP server. | |
| | `--telemetry-gcp` | Enable exporting directly to Google Cloud Monitoring. | |
| | `--telemetry-otlp` | Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. 'http://127.0.0.1:4318') | |
| | `--telemetry-service-name` | Sets the value of the service.name resource attribute for telemetry data. | `toolbox` |
| | `--tools-file` | File path specifying the tool configuration. Cannot be used with --prebuilt, --tools-files, or --tools-folder. | |
| | `--tools-files` | Multiple file paths specifying tool configurations. Files will be merged. Cannot be used with --prebuilt, --tools-file, or --tools-folder. | |
| | `--tools-folder` | Directory path containing YAML tool configuration files. All .yaml and .yml files in the directory will be loaded and merged. Cannot be used with --prebuilt, --tools-file, or --tools-files. | |
| | `--ui` | Launches the Toolbox UI web server. | |
| `-v` | `--version` | version for toolbox | |
## Examples
### Transport Configuration
**Server Settings:**
- `--address`, `-a`: Server listening address (default: "127.0.0.1")
- `--port`, `-p`: Server listening port (default: 5000)
**STDIO:**
- `--stdio`: Run in MCP STDIO mode instead of HTTP server
#### Usage Examples
```bash
# Basic server with custom port configuration
./toolbox --tools-file "tools.yaml" --port 8080
```
### Tool Configuration Sources
The CLI supports multiple mutually exclusive ways to specify tool configurations:
**Single File:** (default)
- `--tools-file`: Path to a single YAML configuration file (default: `tools.yaml`)
**Multiple Files:**
- `--tools-files`: Comma-separated list of YAML files to merge
**Directory:**
- `--tools-folder`: Directory containing YAML files to load and merge
**Prebuilt Configurations:**
- `--prebuilt`: Use predefined configurations for specific database types (e.g., 'bigquery', 'postgres', 'spanner'). See [Prebuilt Tools Reference](prebuilt-tools.md) for allowed values.
{{< notice tip >}}
The CLI enforces mutual exclusivity between configuration source flags, preventing simultaneous use of `--prebuilt` with file-based options, and ensuring only one of `--tools-file`, `--tools-files`, or `--tools-folder` is used at a time.
{{< /notice >}}
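For illustration, each configuration source is selected as follows (file and directory names are placeholders):
```bash
# Single file (this is also the default flag, with tools.yaml as the default path)
./toolbox --tools-file "tools.yaml"

# Multiple files, merged together
./toolbox --tools-files "shared.yaml,team.yaml"

# All .yaml/.yml files in a directory, merged together
./toolbox --tools-folder "./tool-configs"

# A prebuilt configuration instead of a tools file
./toolbox --prebuilt postgres
```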
### Hot Reload
Toolbox enables dynamic reloading by default. To disable, use the
`--disable-reload` flag.
### Toolbox UI
To launch Toolbox's interactive UI, use the `--ui` flag. This allows you to test tools and toolsets with features such as authorized parameters. To learn more, visit [Toolbox UI](../how-to/toolbox-ui/index.md).
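For example, assuming a local `tools.yaml`:
```bash
# Serve the configured tools and launch the Toolbox UI web server for testing them
./toolbox --tools-file "tools.yaml" --ui
```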

View File

@@ -0,0 +1,282 @@
---
title: "Prebuilt Tools"
type: docs
weight: 1
description: >
This page lists all the prebuilt tools available.
---
Prebuilt tools are reusable, pre-packaged toolsets that are designed to extend the capabilities of agents. These tools are built to be generic and adaptable, allowing developers to interact with and take action on databases.
See the [Connect from your IDE](../how-to/connect-ide/_index.md) guides for details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
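The usage pattern is the same for every entry below: export the environment variables listed for the source, then start Toolbox with the matching `--prebuilt` value. A minimal sketch using the PostgreSQL configuration (connection values are placeholders):
```bash
# Environment variables read by the "postgres" prebuilt configuration
export POSTGRES_HOST="127.0.0.1"
export POSTGRES_PORT="5432"
export POSTGRES_DATABASE="my_db"
export POSTGRES_USER="my-user"
export POSTGRES_PASSWORD="my-password"

# Run as an HTTP server; add --stdio for MCP clients that spawn the binary directly
./toolbox --prebuilt postgres
```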
## AlloyDB Postgres
* `--prebuilt` value: `alloydb-postgres`
* **Environment Variables:**
* `ALLOYDB_POSTGRES_PROJECT`: The GCP project ID.
* `ALLOYDB_POSTGRES_REGION`: The region of your AlloyDB instance.
* `ALLOYDB_POSTGRES_CLUSTER`: The ID of your AlloyDB cluster.
* `ALLOYDB_POSTGRES_INSTANCE`: The ID of your AlloyDB instance.
* `ALLOYDB_POSTGRES_DATABASE`: The name of the database to connect to.
* `ALLOYDB_POSTGRES_USER`: (Optional) The database username. Defaults to IAM authentication if unspecified.
* `ALLOYDB_POSTGRES_PASSWORD`: (Optional) The password for the database user. Defaults to IAM authentication if unspecified.
* `ALLOYDB_POSTGRES_IP_TYPE`: (Optional) The IP type, either "Public" or "Private" (default: "Public").
* **Permissions:**
* **AlloyDB Client** (`roles/alloydb.client`) to connect to the instance.
* Database-level permissions (e.g., `SELECT`, `INSERT`) are required to execute queries.
* **Tools:**
* `execute_sql`: Executes a SQL query.
* `list_tables`: Lists tables in the database.
## AlloyDB Postgres Admin
* `--prebuilt` value: `alloydb-postgres-admin`
* **Environment Variables:**
* `API_KEY`: Your API key for the AlloyDB API.
* **Permissions:**
* **AlloyDB Admin** (`roles/alloydb.admin`) IAM role is required on the project.
* **Tools:**
* `alloydb-create-cluster`: Creates a new AlloyDB cluster.
* `alloydb-operations-get`: Polls the operations API to track the status of long-running operations.
* `alloydb-create-instance`: Creates a new AlloyDB instance within a cluster.
* `alloydb-list-clusters`: Lists all AlloyDB clusters in a project.
* `alloydb-list-instances`: Lists all instances within an AlloyDB cluster.
* `alloydb-list-users`: Lists all database users within an AlloyDB cluster.
* `alloydb-create-user`: Creates a new database user in an AlloyDB cluster.
## BigQuery
* `--prebuilt` value: `bigquery`
* **Environment Variables:**
* `BIGQUERY_PROJECT`: The GCP project ID.
* `BIGQUERY_LOCATION`: (Optional) The dataset location.
* **Permissions:**
* **BigQuery User** (`roles/bigquery.user`) to execute queries and view metadata.
* **BigQuery Metadata Viewer** (`roles/bigquery.metadataViewer`) to view all datasets.
* **BigQuery Data Editor** (`roles/bigquery.dataEditor`) to create or modify datasets and tables.
* **Gemini for Google Cloud** (`roles/cloudaicompanion.user`) to use the conversational analytics API.
* **Tools:**
* `ask_data_insights`: Use this tool to perform data analysis, get insights, or answer complex questions about the contents of specific BigQuery tables. For more information on required roles, API setup, and IAM configuration, see the setup and authentication section of the [Conversational Analytics API documentation](https://cloud.google.com/gemini/docs/conversational-analytics-api/overview).
* `execute_sql`: Executes a SQL statement.
* `forecast`: Use this tool to forecast time series data.
* `get_dataset_info`: Gets dataset metadata.
* `get_table_info`: Gets table metadata.
* `list_dataset_ids`: Lists datasets.
* `list_table_ids`: Lists tables.
## Cloud SQL for MySQL
* `--prebuilt` value: `cloud-sql-mysql`
* **Environment Variables:**
* `CLOUD_SQL_MYSQL_PROJECT`: The GCP project ID.
* `CLOUD_SQL_MYSQL_REGION`: The region of your Cloud SQL instance.
* `CLOUD_SQL_MYSQL_INSTANCE`: The ID of your Cloud SQL instance.
* `CLOUD_SQL_MYSQL_DATABASE`: The name of the database to connect to.
* `CLOUD_SQL_MYSQL_USER`: The database username.
* `CLOUD_SQL_MYSQL_PASSWORD`: The password for the database user.
* **Permissions:**
* **Cloud SQL Client** (`roles/cloudsql.client`) to connect to the instance.
* Database-level permissions (e.g., `SELECT`, `INSERT`) are required to execute queries.
* **Tools:**
* `execute_sql`: Executes a SQL query.
* `list_tables`: Lists tables in the database.
## Cloud SQL for PostgreSQL
* `--prebuilt` value: `cloud-sql-postgres`
* **Environment Variables:**
* `CLOUD_SQL_POSTGRES_PROJECT`: The GCP project ID.
* `CLOUD_SQL_POSTGRES_REGION`: The region of your Cloud SQL instance.
* `CLOUD_SQL_POSTGRES_INSTANCE`: The ID of your Cloud SQL instance.
* `CLOUD_SQL_POSTGRES_DATABASE`: The name of the database to connect to.
* `CLOUD_SQL_POSTGRES_USER`: (Optional) The database username. Defaults to IAM authentication if unspecified.
* `CLOUD_SQL_POSTGRES_PASSWORD`: (Optional) The password for the database user. Defaults to IAM authentication if unspecified.
* `CLOUD_SQL_POSTGRES_IP_TYPE`: (Optional) The IP type, either "Public" or "Private" (default: "Public").
* **Permissions:**
* **Cloud SQL Client** (`roles/cloudsql.client`) to connect to the instance.
* Database-level permissions (e.g., `SELECT`, `INSERT`) are required to execute queries.
* **Tools:**
* `execute_sql`: Executes a SQL query.
* `list_tables`: Lists tables in the database.
## Cloud SQL for SQL Server
* `--prebuilt` value: `cloud-sql-mssql`
* **Environment Variables:**
* `CLOUD_SQL_MSSQL_PROJECT`: The GCP project ID.
* `CLOUD_SQL_MSSQL_REGION`: The region of your Cloud SQL instance.
* `CLOUD_SQL_MSSQL_INSTANCE`: The ID of your Cloud SQL instance.
* `CLOUD_SQL_MSSQL_DATABASE`: The name of the database to connect to.
* `CLOUD_SQL_MSSQL_IP_ADDRESS`: The IP address of the Cloud SQL instance.
* `CLOUD_SQL_MSSQL_USER`: The database username.
* `CLOUD_SQL_MSSQL_PASSWORD`: The password for the database user.
* `CLOUD_SQL_MSSQL_IP_TYPE`: (Optional) The IP type, either "Public" or "Private" (default: "Public").
* **Permissions:**
* **Cloud SQL Client** (`roles/cloudsql.client`) to connect to the instance.
* Database-level permissions (e.g., `SELECT`, `INSERT`) are required to execute queries.
* **Tools:**
* `execute_sql`: Executes a SQL query.
* `list_tables`: Lists tables in the database.
## Dataplex
* `--prebuilt` value: `dataplex`
* **Environment Variables:**
* `DATAPLEX_PROJECT`: The GCP project ID.
* **Permissions:**
* **Dataplex Reader** (`roles/dataplex.viewer`) to search and look up entries.
* **Dataplex Editor** (`roles/dataplex.editor`) to modify entries.
* **Tools:**
* `dataplex_search_entries`: Searches for entries in Dataplex Catalog.
* `dataplex_lookup_entry`: Retrieves a specific entry from Dataplex Catalog.
* `dataplex_search_aspect_types`: Finds aspect types relevant to the query.
## Firestore
* `--prebuilt` value: `firestore`
* **Environment Variables:**
* `FIRESTORE_PROJECT`: The GCP project ID.
* `FIRESTORE_DATABASE`: (Optional) The Firestore database ID. Defaults to "(default)".
* **Permissions:**
* **Cloud Datastore User** (`roles/datastore.user`) to get documents, list collections, and query collections.
* **Firebase Rules Viewer** (`roles/firebaserules.viewer`) to get and validate Firestore rules.
* **Tools:**
* `firestore-get-documents`: Gets multiple documents from Firestore by their paths.
* `firestore-list-collections`: Lists Firestore collections for a given parent path.
* `firestore-delete-documents`: Deletes multiple documents from Firestore.
* `firestore-query-collection`: Retrieves one or more Firestore documents from a collection.
* `firestore-get-rules`: Retrieves the active Firestore security rules.
* `firestore-validate-rules`: Checks the provided Firestore Rules source for syntax and validation errors.
## Looker
* `--prebuilt` value: `looker`
* **Environment Variables:**
* `LOOKER_BASE_URL`: The URL of your Looker instance.
* `LOOKER_CLIENT_ID`: The client ID for the Looker API.
* `LOOKER_CLIENT_SECRET`: The client secret for the Looker API.
* `LOOKER_VERIFY_SSL`: Whether to verify SSL certificates.
* **Permissions:**
* A Looker account with permissions to access the desired models, explores, and data is required.
* **Tools:**
* `get_models`: Retrieves the list of LookML models.
* `get_explores`: Retrieves the list of explores in a model.
* `get_dimensions`: Retrieves the list of dimensions in an explore.
* `get_measures`: Retrieves the list of measures in an explore.
* `get_filters`: Retrieves the list of filters in an explore.
* `get_parameters`: Retrieves the list of parameters in an explore.
* `query`: Runs a query against the LookML model.
* `query_sql`: Generates the SQL for a query.
* `query_url`: Generates a URL for a query in Looker.
* `get_looks`: Searches for saved looks.
* `run_look`: Runs the query associated with a look.
* `make_look`: Creates a new look.
* `get_dashboards`: Searches for saved dashboards.
* `make_dashboard`: Creates a new dashboard.
* `add_dashboard_element`: Adds a tile to a dashboard.
## Microsoft SQL Server
* `--prebuilt` value: `mssql`
* **Environment Variables:**
* `MSSQL_HOST`: The hostname or IP address of the SQL Server instance.
* `MSSQL_PORT`: The port number for the SQL Server instance.
* `MSSQL_DATABASE`: The name of the database to connect to.
* `MSSQL_USER`: The database username.
* `MSSQL_PASSWORD`: The password for the database user.
* **Permissions:**
* Database-level permissions (e.g., `SELECT`, `INSERT`) are required to execute queries.
* **Tools:**
* `execute_sql`: Executes a SQL query.
* `list_tables`: Lists tables in the database.
## MySQL
* `--prebuilt` value: `mysql`
* **Environment Variables:**
* `MYSQL_HOST`: The hostname or IP address of the MySQL server.
* `MYSQL_PORT`: The port number for the MySQL server.
* `MYSQL_DATABASE`: The name of the database to connect to.
* `MYSQL_USER`: The database username.
* `MYSQL_PASSWORD`: The password for the database user.
* **Permissions:**
* Database-level permissions (e.g., `SELECT`, `INSERT`) are required to execute queries.
* **Tools:**
* `execute_sql`: Executes a SQL query.
* `list_tables`: Lists tables in the database.
## OceanBase
* `--prebuilt` value: `oceanbase`
* **Environment Variables:**
* `OCEANBASE_HOST`: The hostname or IP address of the OceanBase server.
* `OCEANBASE_PORT`: The port number for the OceanBase server.
* `OCEANBASE_DATABASE`: The name of the database to connect to.
* `OCEANBASE_USER`: The database username.
* `OCEANBASE_PASSWORD`: The password for the database user.
* **Permissions:**
* Database-level permissions (e.g., `SELECT`, `INSERT`) are required to execute queries.
* **Tools:**
* `execute_sql`: Executes a SQL query.
* `list_tables`: Lists tables in the database.
## PostgreSQL
* `--prebuilt` value: `postgres`
* **Environment Variables:**
* `POSTGRES_HOST`: The hostname or IP address of the PostgreSQL server.
* `POSTGRES_PORT`: The port number for the PostgreSQL server.
* `POSTGRES_DATABASE`: The name of the database to connect to.
* `POSTGRES_USER`: The database username.
* `POSTGRES_PASSWORD`: The password for the database user.
* `POSTGRES_QUERY_PARAMS`: (Optional) Raw query to be added to the db connection string.
* **Permissions:**
* Database-level permissions (e.g., `SELECT`, `INSERT`) are required to execute queries.
* **Tools:**
* `execute_sql`: Executes a SQL query.
* `list_tables`: Lists tables in the database.
## Spanner (GoogleSQL dialect)
* `--prebuilt` value: `spanner`
* **Environment Variables:**
* `SPANNER_PROJECT`: The GCP project ID.
* `SPANNER_INSTANCE`: The Spanner instance ID.
* `SPANNER_DATABASE`: The Spanner database ID.
* **Permissions:**
* **Cloud Spanner Database Reader** (`roles/spanner.databaseReader`) to execute DQL queries and list tables.
* **Cloud Spanner Database User** (`roles/spanner.databaseUser`) to execute DML queries.
* **Tools:**
* `execute_sql`: Executes a DML SQL query.
* `execute_sql_dql`: Executes a DQL SQL query.
* `list_tables`: Lists tables in the database.
## Spanner (PostgreSQL dialect)
* `--prebuilt` value: `spanner-postgres`
* **Environment Variables:**
* `SPANNER_PROJECT`: The GCP project ID.
* `SPANNER_INSTANCE`: The Spanner instance ID.
* `SPANNER_DATABASE`: The Spanner database ID.
* **Permissions:**
* **Cloud Spanner Database Reader** (`roles/spanner.databaseReader`) to execute DQL queries and list tables.
* **Cloud Spanner Database User** (`roles/spanner.databaseUser`) to execute DML queries.
* **Tools:**
* `execute_sql`: Executes a DML SQL query using the PostgreSQL interface for Spanner.
* `execute_sql_dql`: Executes a DQL SQL query using the PostgreSQL interface for Spanner.
* `list_tables`: Lists tables in the database.
## Neo4j
* `--prebuilt` value: `neo4j`
* **Environment Variables:**
* `NEO4J_URI`: The URI of the Neo4j instance (e.g., `bolt://localhost:7687`).
* `NEO4J_DATABASE`: The name of the Neo4j database to connect to.
* `NEO4J_USERNAME`: The username for the Neo4j instance.
* `NEO4J_PASSWORD`: The password for the Neo4j instance.
* **Permissions:**
* **Database-level permissions** are required to execute Cypher queries.
* **Tools:**
* `execute_cypher`: Executes a Cypher query.
* `get_schema`: Retrieves the schema of the Neo4j database.

View File

@@ -36,12 +36,15 @@ avoiding full table scans or complex filters.
## Available Tools
- [`bigquery-sql`](../tools/bigquery/bigquery-sql.md)
Run SQL queries directly against BigQuery datasets.
- [`bigquery-conversational-analytics`](../tools/bigquery/bigquery-conversational-analytics.md)
Allows conversational interaction with a BigQuery source.
- [`bigquery-execute-sql`](../tools/bigquery/bigquery-execute-sql.md)
Execute structured queries using parameters.
- [`bigquery-forecast`](../tools/bigquery/bigquery-forecast.md)
Forecasts time series data in BigQuery.
- [`bigquery-get-dataset-info`](../tools/bigquery/bigquery-get-dataset-info.md)
Retrieve metadata for a specific dataset.
@@ -54,6 +57,9 @@ avoiding full table scans or complex filters.
- [`bigquery-list-table-ids`](../tools/bigquery/bigquery-list-table-ids.md)
List tables in a given dataset.
- [`bigquery-sql`](../tools/bigquery/bigquery-sql.md)
Run SQL queries directly against BigQuery datasets.
### Pre-built Configurations
- [BigQuery using MCP](https://googleapis.github.io/genai-toolbox/how-to/connect-ide/bigquery_mcp/)
@@ -65,34 +71,64 @@ Connect your IDE to BigQuery using Toolbox.
BigQuery uses [Identity and Access Management (IAM)][iam-overview] to control
user and group access to BigQuery resources like projects, datasets, and tables.
Toolbox will use your [Application Default Credentials (ADC)][adc] to authorize
and authenticate when interacting with [BigQuery][bigquery-docs].
In addition to [setting the ADC for your server][set-adc], you need to ensure
the IAM identity has been given the correct IAM permissions for the queries
you intend to run. Common roles include `roles/bigquery.user` (which includes
permissions to run jobs and read data) or `roles/bigquery.dataViewer`. See
[Introduction to BigQuery IAM][grant-permissions] for more information on
applying IAM permissions and roles to an identity.
### Authentication via Application Default Credentials (ADC)
By **default**, Toolbox will use your [Application Default Credentials (ADC)][adc] to authorize and authenticate when interacting with [BigQuery][bigquery-docs].
When using this method, you need to ensure the IAM identity associated with your
ADC (such as a service account) has the correct permissions for the queries you
intend to run. Common roles include `roles/bigquery.user` (which includes
permissions to run jobs and read data) or `roles/bigquery.dataViewer`.
Follow this [guide][set-adc] to set up your ADC.
### Authentication via User's OAuth Access Token
If the `useClientOAuth` parameter is set to `true`, Toolbox will instead use the
OAuth access token for authentication. This token is parsed from the
`Authorization` header passed in with the tool invocation request. This method
allows Toolbox to make queries to [BigQuery][bigquery-docs] on behalf of the
client or the end-user.
When using this on-behalf-of authentication, you must ensure that the
identity used has been granted the correct IAM permissions. Currently,
this option is only supported by the following BigQuery tools:
- [`bigquery-sql`](../tools/bigquery/bigquery-sql.md)
Run SQL queries directly against BigQuery datasets.
[iam-overview]: https://cloud.google.com/bigquery/docs/access-control
[adc]: https://cloud.google.com/docs/authentication#adc
[set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
[grant-permissions]: https://cloud.google.com/bigquery/docs/access-control
## Example
Initialize a BigQuery source that uses ADC:
```yaml
sources:
my-bigquery-source:
kind: "bigquery"
project: "my-project-id"
# location: "US" # Optional: Specifies the location for query jobs.
```
Initialize a BigQuery source that uses the client's access token:
```yaml
sources:
my-bigquery-client-auth-source:
kind: "bigquery"
project: "my-project-id"
useClientOAuth: true
# location: "US" # Optional: Specifies the location for query jobs.
```
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| location | string | false | Specifies the location (e.g., 'us', 'asia-northeast1') in which to run the query job. This location must match the location of any tables referenced in the query. The default behavior is for it to be executed in the US multi-region |
| **field** | **type** | **required** | **description** |
|----------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery". |
| project | string | true | Id of the Google Cloud project to use for billing and as the default project for BigQuery resources. |
| location | string | false | Specifies the location (e.g., 'us', 'asia-northeast1') in which to run the query job. This location must match the location of any tables referenced in the query. Defaults to the table's location or 'US' if the location cannot be determined. [Learn More](https://cloud.google.com/bigquery/docs/locations) |
| useClientOAuth | bool | false | If true, forwards the client's OAuth access token from the "Authorization" header to downstream queries. |

View File

@@ -0,0 +1,91 @@
---
title: "ClickHouse"
type: docs
weight: 1
description: >
ClickHouse is an open-source, column-oriented OLAP database.
---
## About
[ClickHouse][clickhouse-docs] is a fast, open-source, column-oriented database built for online analytical processing (OLAP).
[clickhouse-docs]: https://clickhouse.com/docs
## Available Tools
- [`clickhouse-execute-sql`](../tools/clickhouse/clickhouse-execute-sql.md)
Execute parameterized SQL queries in ClickHouse with query logging.
- [`clickhouse-sql`](../tools/clickhouse/clickhouse-sql.md)
Execute SQL queries as prepared statements in ClickHouse.
## Requirements
### Database User
This source uses standard ClickHouse authentication. You will need to [create a
ClickHouse user][clickhouse-users] (or obtain credentials from [ClickHouse Cloud][clickhouse-cloud]) to connect to the database. The user
should have appropriate permissions for the operations you plan to perform.
[clickhouse-cloud]: https://clickhouse.com/docs/getting-started/quick-start/cloud#connect-with-your-app
[clickhouse-users]: https://clickhouse.com/docs/en/sql-reference/statements/create/user
### Network Access
ClickHouse supports multiple protocols:
- **HTTPS protocol** (default port 8443) - Secure HTTP access (default)
- **HTTP protocol** (default port 8123) - Good for web-based access
## Example
### Secure Connection Example
```yaml
sources:
secure-clickhouse-source:
kind: clickhouse
host: clickhouse.example.com
port: "8443"
database: analytics
user: ${CLICKHOUSE_USER}
password: ${CLICKHOUSE_PASSWORD}
protocol: https
secure: true
```
### HTTP Protocol Example
```yaml
sources:
http-clickhouse-source:
kind: clickhouse
host: localhost
port: "8123"
database: logs
user: ${CLICKHOUSE_USER}
password: ${CLICKHOUSE_PASSWORD}
protocol: http
secure: false
```
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|------------------------------------------------------------------------------------|
| kind | string | true | Must be "clickhouse". |
| host | string | true | IP address or hostname to connect to (e.g. "127.0.0.1" or "clickhouse.example.com") |
| port | string | true | Port to connect to (e.g. "8443" for HTTPS, "8123" for HTTP) |
| database | string | true | Name of the ClickHouse database to connect to (e.g. "my_database"). |
| user | string | true | Name of the ClickHouse user to connect as (e.g. "analytics_user"). |
| password | string | false | Password of the ClickHouse user (e.g. "my-password"). |
| protocol | string | false | Connection protocol: "https" (default) or "http". |
| secure | boolean | false | Whether to use a secure connection (TLS). Default: false. |

View File

@@ -22,13 +22,17 @@ allowing tools to execute SQL queries against it.
sources:
my-couchbase-instance:
kind: couchbase
connectionString: couchbase://localhost:8091
connectionString: couchbase://localhost
bucket: travel-sample
scope: inventory
username: Administrator
password: password
```
{{< notice note >}}
For more details about alternate addresses and custom ports refer to [Managing Connections](https://docs.couchbase.com/java-sdk/current/howtos/managing-connections.html).
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |

View File

@@ -0,0 +1,59 @@
---
title: "Firebird"
type: docs
weight: 1
description: >
Firebird is a powerful, cross-platform, and open-source relational database.
---
## About
[Firebird][fb-docs] is a relational database management system offering many ANSI SQL standard features that runs on Linux, Windows, and a variety of Unix platforms. It is known for its small footprint, powerful features, and easy maintenance.
[fb-docs]: https://firebirdsql.org/
## Available Tools
- [`firebird-sql`](../tools/firebird/firebird-sql.md)
Execute SQL queries as prepared statements in Firebird.
- [`firebird-execute-sql`](../tools/firebird/firebird-execute-sql.md)
Run parameterized SQL statements in Firebird.
## Requirements
### Database User
This source uses standard authentication. You will need to [create a Firebird user][fb-users] to log in to the database.
[fb-users]: https://firebirdsql.org/refdocs/langrefupd25-sql-create-user.html
## Example
```yaml
sources:
my_firebird_db:
kind: firebird
host: "localhost"
port: 3050
database: "/path/to/your/database.fdb"
user: ${FIREBIRD_USER}
password: ${FIREBIRD_PASS}
```
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|------------------------------------------------------------------------|
| kind | string | true | Must be "firebird". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1") |
| port | string | true | Port to connect to (e.g. "3050") |
| database | string | true | Path to the Firebird database file (e.g. "/var/lib/firebird/data/test.fdb"). |
| user | string | true | Name of the Firebird user to connect as (e.g. "SYSDBA"). |
| password | string | true | Password of the Firebird user (e.g. "masterkey"). |

View File

@@ -47,7 +47,8 @@ a self-signed ssl certificate for the Looker server. Anything other than "true"
will be interpreted as false.
The client id and client secret are seemingly random character sequences
assigned by the looker server.
assigned by the looker server. If you are using Looker OAuth, you don't need
these settings.
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
@@ -56,11 +57,15 @@ instead of hardcoding your secrets into the configuration file.
## Reference
| **field** | **type** | **required** | **description** |
| ------------- | :------: | :----------: | ----------------------------------------------------------------------------------------- |
| kind | string | true | Must be "looker". |
| base_url | string | true | The URL of your Looker server with no trailing /). |
| client_id | string | true | The client id assigned by Looker. |
| client_secret | string | true | The client secret assigned by Looker. |
| verify_ssl | string | true | Whether to check the ssl certificate of the server. |
| timeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, 120s is applied. |
| **field** | **type** | **required** | **description** |
| -------------------- | :------: | :----------: | ----------------------------------------------------------------------------------------- |
| kind | string | true | Must be "looker". |
| base_url | string | true | The URL of your Looker server, with no trailing `/`. |
| client_id | string | false | The client id assigned by Looker. |
| client_secret | string | false | The client secret assigned by Looker. |
| verify_ssl | string | false | Whether to check the ssl certificate of the server. |
| timeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, 120s is applied. |
| use_client_oauth | string | false | Use OAuth tokens instead of client_id and client_secret. (default: false) |
| show_hidden_models | string | false | Show or hide hidden models. (default: true) |
| show_hidden_explores | string | false | Show or hide hidden explores. (default: true) |
| show_hidden_fields | string | false | Show or hide hidden fields. (default: true) |

View File

@@ -20,9 +20,8 @@ flexible, JSON-like documents, making it easy to develop and scale applications.
sources:
my-mongodb:
kind: mongodb
uri: "mongodb+srv://username:password@host.mongodb.net"
database: sample_mflix
uri: "mongodb+srv://username:password@host.mongodb.net"
```
## Reference
@@ -31,4 +30,3 @@ sources:
|-----------|:--------:|:------------:|-------------------------------------------------------------------|
| kind | string | true | Must be "mongodb". |
| uri | string | true | connection string to connect to MongoDB |
| database | string | true | Name of the mongodb database to connect to (e.g. "sample_mflix"). |

View File

@@ -42,6 +42,9 @@ sources:
database: my_db
user: ${USER_NAME}
password: ${PASSWORD}
# Optional TLS and other driver parameters. For example, enable preferred TLS:
# queryParams:
# tls: preferred
queryTimeout: 30s # Optional: query timeout duration
```
@@ -61,3 +64,4 @@ instead of hardcoding your secrets into the configuration file.
| user | string | true | Name of the MySQL user to connect as (e.g. "my-mysql-user"). |
| password | string | true | Password of the MySQL user (e.g. "my-password"). |
| queryTimeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, no timeout is applied. |
| queryParams | map<string,string> | false | Arbitrary DSN parameters passed to the driver (e.g. `tls: preferred`, `charset: utf8mb4`). Useful for enabling TLS or other connection options. |

View File

@@ -0,0 +1,72 @@
---
title: "OceanBase"
type: docs
weight: 1
description: >
OceanBase is a distributed relational database that provides high availability, scalability, and compatibility with MySQL.
---
## About
[OceanBase][oceanbase-docs] is a distributed relational database management system (RDBMS) that provides high availability, scalability, and strong consistency. It's designed to handle large-scale data processing and is compatible with MySQL, making it easy for developers to migrate from MySQL to OceanBase.
[oceanbase-docs]: https://www.oceanbase.com/
## Requirements
### Database User
This source only uses standard authentication. You will need to create an OceanBase user to login to the database with. OceanBase supports MySQL-compatible user management syntax.
### Network Connectivity
Ensure that your application can connect to the OceanBase cluster. OceanBase typically runs on ports 2881 (for MySQL protocol) or 3881 (for MySQL protocol with SSL).
## Example
```yaml
sources:
my-oceanbase-source:
kind: oceanbase
host: 127.0.0.1
port: 2881
database: my_db
user: ${USER_NAME}
password: ${PASSWORD}
queryTimeout: 30s # Optional: query timeout duration
```
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |
| ------------ | :------: | :----------: |-------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "oceanbase". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "2881"). |
| database | string | true | Name of the OceanBase database to connect to (e.g. "my_db"). |
| user | string | true | Name of the OceanBase user to connect as (e.g. "my-oceanbase-user"). |
| password | string | true | Password of the OceanBase user (e.g. "my-password"). |
| queryTimeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, no timeout is applied. |
## Features
### MySQL Compatibility
OceanBase is highly compatible with MySQL, supporting most MySQL SQL syntax, data types, and functions. This makes it easy to migrate existing MySQL applications to OceanBase.
### High Availability
OceanBase provides automatic failover and data replication across multiple nodes, ensuring high availability and data durability.
### Scalability
OceanBase can scale horizontally by adding more nodes to the cluster, making it suitable for large-scale applications.
### Strong Consistency
OceanBase provides strong consistency guarantees, ensuring that all transactions are ACID compliant.

View File

@@ -0,0 +1,62 @@
---
title: "Trino"
type: docs
weight: 1
description: >
Trino is a distributed SQL query engine for big data analytics.
---
## About
[Trino][trino-docs] is a distributed SQL query engine designed for fast analytic queries against data of any size. It allows you to query data where it lives, including Hive, Cassandra, relational databases or even proprietary data stores.
[trino-docs]: https://trino.io/docs/
## Available Tools
- [`trino-sql`](../tools/trino/trino-sql.md)
Execute parameterized SQL queries against Trino.
- [`trino-execute-sql`](../tools/trino/trino-execute-sql.md)
Execute arbitrary SQL queries against Trino.
## Requirements
### Trino Cluster
You need access to a running Trino cluster with appropriate user permissions for the catalogs and schemas you want to query.
## Example
```yaml
sources:
my-trino-source:
kind: trino
host: trino.example.com
port: "8080"
user: ${TRINO_USER} # Optional for anonymous access
password: ${TRINO_PASSWORD} # Optional
catalog: hive
schema: default
```
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------:|:------------:|------------------------------------------------------------------------|
| kind | string | true | Must be "trino". |
| host | string | true | Trino coordinator hostname (e.g. "trino.example.com") |
| port | string | true | Trino coordinator port (e.g. "8080", "8443") |
| user | string | false | Username for authentication (e.g. "analyst"). Optional for anonymous access. |
| password | string | false | Password for basic authentication |
| catalog | string | true | Default catalog to use for queries (e.g. "hive") |
| schema | string | true | Default schema to use for queries (e.g. "default") |
| queryTimeout| string | false | Query timeout duration (e.g. "30m", "1h") |
| accessToken | string | false | JWT access token for authentication |
| kerberosEnabled | boolean | false | Enable Kerberos authentication (default: false) |
| sslEnabled | boolean | false | Enable SSL/TLS (default: false) |

View File

@@ -0,0 +1,44 @@
---
title: "YugabyteDB"
type: docs
weight: 1
description: >
YugabyteDB is a high-performance, distributed SQL database.
---
## About
[YugabyteDB][yugabytedb] is a high-performance, distributed SQL database designed for global, internet-scale applications, with full PostgreSQL compatibility.
[yugabytedb]: https://www.yugabyte.com/
## Example
```yaml
sources:
my-yb-source:
kind: yugabytedb
host: 127.0.0.1
port: 5433
database: yugabyte
user: ${USER_NAME}
password: ${PASSWORD}
loadBalance: true
topologyKeys: cloud.region.zone1:1,cloud.region.zone2:2
```
## Reference
| **field** | **type** | **required** | **description** |
|-----------------------------------|:--------:|:------------:|------------------------------------------------------------------------|
| kind | string | true | Must be "yugabytedb". |
| host | string | true | IP address to connect to. |
| port | integer | true | Port to connect to. The default port is 5433. |
| database | string | true | Name of the YugabyteDB database to connect to. The default database name is yugabyte. |
| user | string | true | Name of the YugabyteDB user to connect as. The default user is yugabyte. |
| password | string | true | Password of the YugabyteDB user. The default password is yugabyte. |
| loadBalance | boolean | false | If true, enable uniform load balancing. The default loadBalance value is false. |
| topologyKeys | string | false | Comma-separated geo-locations in the form cloud.region.zone:priority to enable topology-aware load balancing. Ignored if loadBalance is false. It is null by default. |
| ybServersRefreshInterval | integer | false | The interval (in seconds) to refresh the servers list; ignored if loadBalance is false. The default value of ybServersRefreshInterval is 300. |
| fallbackToTopologyKeysOnly | boolean | false | If set to true and topologyKeys are specified, only connect to nodes specified in topologyKeys. By default, this is set to false. |
| failedHostReconnectDelaySecs | integer | false | Time (in seconds) to wait before trying to connect to failed nodes. The default value is 5. |

View File

@@ -0,0 +1,54 @@
---
title: "bigquery-conversational-analytics"
type: docs
weight: 1
description: >
A "bigquery-conversational-analytics" tool allows conversational interaction with a BigQuery source.
aliases:
- /resources/tools/bigquery-conversational-analytics
---
## About
A `bigquery-conversational-analytics` tool allows you to ask questions about your data in natural language.
This function takes a user's question (which can include conversational history for context)
and references to specific BigQuery tables, and sends them to a stateless conversational API.
The API uses a GenAI agent to understand the question, generate and execute SQL queries
and Python code, and formulate an answer. This function returns a detailed, sequential
log of this entire process, which includes any generated SQL or Python code, the data
retrieved, and the final text answer.
**Note**: This tool requires additional setup in your project. Please refer to the
official [Conversational Analytics API documentation](https://cloud.google.com/gemini/docs/conversational-analytics-api/overview)
for instructions.
It's compatible with the following sources:
- [bigquery](../sources/bigquery.md)
The tool takes the following input parameters:
* `user_query_with_context`: The user's question, potentially including conversation history and system instructions for context.
* `table_references`: A JSON string of a list of BigQuery tables to use as context. Each object in the list must contain `projectId`, `datasetId`, and `tableId`. Example: `'[{"projectId": "my-gcp-project", "datasetId": "my_dataset", "tableId": "my_table"}]'`
## Example
```yaml
tools:
ask_data_insights:
kind: bigquery-conversational-analytics
source: my-bigquery-source
description: |
Use this tool to perform data analysis, get insights, or answer complex
questions about the contents of specific BigQuery tables.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-conversational-analytics". |
| source | string | true | Name of the source for chat. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,49 @@
---
title: "bigquery-forecast"
type: docs
weight: 1
description: >
A "bigquery-forecast" tool forecasts time series data in BigQuery.
aliases:
- /resources/tools/bigquery-forecast
---
## About
A `bigquery-forecast` tool forecasts time series data in BigQuery.
It's compatible with the following sources:
- [bigquery](../../sources/bigquery.md)
`bigquery-forecast` constructs and executes a `SELECT * FROM AI.FORECAST(...)` query based on the provided parameters:
- **history_data** (string, required): This specifies the source of the historical time series data. It can be either a fully qualified BigQuery table ID (e.g., my-project.my_dataset.my_table) or a SQL query that returns the data.
- **timestamp_col** (string, required): The name of the column in your history_data that contains the timestamps.
- **data_col** (string, required): The name of the column in your history_data that contains the numeric values to be forecasted.
- **id_cols** (array of strings, optional): If you are forecasting multiple time series at once (e.g., sales for different products), this parameter takes an array of column names that uniquely identify each series. It defaults to an empty array if not provided.
- **horizon** (integer, optional): The number of future time steps you want to predict. It defaults to 10 if not specified.
## Example
```yaml
tools:
forecast_tool:
kind: bigquery-forecast
source: my-bigquery-source
description: Use this tool to forecast time series data in BigQuery.
```
## Sample Prompt
You can use the following sample prompts to call this tool:
- Can you forecast the historical time series data in the BigQuery table `bqml_tutorial.google_analytic`? Use project_id `myproject`.
- What are the future `total_visits` in the BigQuery table `bqml_tutorial.google_analytic`?
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-forecast". |
| source | string | true | Name of the source the forecast tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,7 @@
---
title: "ClickHouse"
type: docs
weight: 1
description: >
Tools for interacting with ClickHouse databases and tables.
---

View File

@@ -0,0 +1,46 @@
---
title: "clickhouse-execute-sql"
type: docs
weight: 1
description: >
A "clickhouse-execute-sql" tool executes a SQL statement against a ClickHouse
database.
aliases:
- /resources/tools/clickhouse-execute-sql
---
## About
A `clickhouse-execute-sql` tool executes a SQL statement against a ClickHouse
database. It's compatible with the [clickhouse](../../sources/clickhouse.md) source.
`clickhouse-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the specified `source`. This tool includes query logging
capabilities for monitoring and debugging purposes.
> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.
## Example
```yaml
tools:
execute_sql_tool:
kind: clickhouse-execute-sql
source: my-clickhouse-instance
description: Use this tool to execute SQL statements against ClickHouse.
```
## Parameters
| **parameter** | **type** | **required** | **description** |
|---------------|:--------:|:------------:|----------------------------------------------------|
| sql | string | true | The SQL statement to execute against the database |
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|---------------------------------------------------------|
| kind | string | true | Must be "clickhouse-execute-sql". |
| source | string | true | Name of the ClickHouse source to execute SQL against. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,81 @@
---
title: "clickhouse-sql"
type: docs
weight: 2
description: >
A "clickhouse-sql" tool executes SQL queries as prepared statements in ClickHouse.
aliases:
- /resources/tools/clickhouse-sql
---
## About
A `clickhouse-sql` tool executes SQL queries as prepared statements against a
ClickHouse database. It's compatible with the [clickhouse](../../sources/clickhouse.md) source.
This tool supports both template parameters (for SQL statement customization)
and regular parameters (for prepared statement values), providing flexible
query execution capabilities.
## Example
```yaml
tools:
my_analytics_query:
kind: clickhouse-sql
source: my-clickhouse-instance
description: Get user analytics for a specific date range
statement: |
SELECT
user_id,
count(*) as event_count,
max(timestamp) as last_event
FROM events
WHERE date >= ? AND date <= ?
GROUP BY user_id
ORDER BY event_count DESC
LIMIT ?
parameters:
- name: start_date
description: Start date for the query (YYYY-MM-DD format)
- name: end_date
description: End date for the query (YYYY-MM-DD format)
- name: limit
description: Maximum number of results to return
```
## Template Parameters Example
```yaml
tools:
flexible_table_query:
kind: clickhouse-sql
source: my-clickhouse-instance
description: Query any table with flexible columns
statement: |
SELECT {{columns}}
FROM {{table_name}}
WHERE created_date >= ?
LIMIT ?
templateParameters:
- name: columns
description: Comma-separated list of columns to select
- name: table_name
description: Name of the table to query
parameters:
- name: start_date
description: Start date filter
- name: limit
description: Maximum number of results
```
## Reference
| **field** | **type** | **required** | **description** |
|--------------------|:------------------:|:------------:|-----------------------------------------------------------|
| kind | string | true | Must be "clickhouse-sql". |
| source | string | true | Name of the ClickHouse source to execute SQL against. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | The SQL statement template to execute. |
| parameters | array of Parameter | false | Parameters for prepared statement values. |
| templateParameters | array of Parameter | false | Parameters for SQL statement template customization. |

View File

@@ -0,0 +1,7 @@
---
title: "Cloud SQL"
type: docs
weight: 1
description: >
Tools that work with Cloud SQL Control Plane.
---

View File

@@ -0,0 +1,43 @@
---
title: "cloud-sql-wait-for-operation"
type: docs
weight: 10
description: >
Wait for a long-running Cloud SQL operation to complete.
---
The `cloud-sql-wait-for-operation` tool is a utility that waits for a
long-running Cloud SQL operation to complete. It does this by polling the Cloud
SQL Admin API operation status endpoint until the operation is finished, using
exponential backoff: with the default settings, the delay between polls starts
at 3 seconds and doubles after each attempt, capped at 4 minutes and at most 10
attempts.
{{< notice info >}}
This tool is intended for developer assistant workflows with human-in-the-loop
and shouldn't be used for production agents.
{{< /notice >}}
## Example
```yaml
tools:
cloudsql-operations-get:
kind: cloud-sql-wait-for-operation
source: some-http-source
description: "This will poll on operations API until the operation is done. For checking operation status we need projectId and operationId. Once instance is created give follow up steps on how to use the variables to bring data plane MCP server up in local and remote setup."
delay: 1s
maxDelay: 4m
multiplier: 2
maxRetries: 10
```
## Reference
| **field** | **type** | **required** | **description** |
| ----------- | :------: | :----------: | ---------------------------------------------------------------------------------------------------------------- |
| kind | string | true | Must be "cloud-sql-wait-for-operation". |
| source | string | true | The name of an `http` source to use for authentication. |
| description | string | true | A description of the tool. |
| delay | duration | false | The initial delay between polling requests (e.g., `3s`). Defaults to 3 seconds. |
| maxDelay | duration | false | The maximum delay between polling requests (e.g., `4m`). Defaults to 4 minutes. |
| multiplier | float | false | The multiplier for the polling delay. The delay is multiplied by this value after each request. Defaults to 2.0. |
| maxRetries | int | false | The maximum number of polling attempts before giving up. Defaults to 10. |

View File

@@ -0,0 +1,7 @@
---
title: "Firebird"
type: docs
weight: 1
description: >
Tools that work with Firebird Sources.
---

View File

@@ -0,0 +1,41 @@
---
title: "firebird-execute-sql"
type: docs
weight: 1
description: >
A "firebird-execute-sql" tool executes a SQL statement against a Firebird
database.
aliases:
- /resources/tools/firebird-execute-sql
---
## About
A `firebird-execute-sql` tool executes a SQL statement against a Firebird
database. It's compatible with the following source:
- [firebird](../sources/firebird.md)
`firebird-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`.
> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.
## Example
```yaml
tools:
execute_sql_tool:
kind: firebird-execute-sql
source: my_firebird_db
description: Use this tool to execute a SQL statement against the Firebird database.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "firebird-execute-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,135 @@
---
title: "firebird-sql"
type: docs
weight: 1
description: >
A "firebird-sql" tool executes a pre-defined SQL statement against a Firebird
database.
aliases:
- /resources/tools/firebird-sql
---
## About
A `firebird-sql` tool executes a pre-defined SQL statement against a Firebird
database. It's compatible with the following source:
- [firebird](../sources/firebird.md)
The specified SQL statement is executed as a [prepared statement][fb-prepare],
and supports both positional parameters (`?`) and named parameters (`:param_name`).
Parameters will be inserted according to their position or name. If template
parameters are included, they will be resolved before the execution of the
prepared statement.
[fb-prepare]: https://firebirdsql.org/refdocs/langrefupd25-psql-execstat.html
## Example
> **Note:** This tool uses parameterized queries to prevent SQL injections.
> Query parameters can be used as substitutes for arbitrary expressions.
> Parameters cannot be used as substitutes for identifiers, column names, table
> names, or other parts of the query.
```yaml
tools:
search_flights_by_number:
kind: firebird-sql
source: my_firebird_db
statement: |
SELECT * FROM flights
WHERE airline = ?
AND flight_number = ?
LIMIT 10
description: |
Use this tool to get information for a specific flight.
Takes an airline code and flight number and returns info on the flight.
Do NOT use this tool with a flight id. Do NOT guess an airline code or flight number.
An airline code is a code for an airline service consisting of a two-character
airline designator followed by a flight number, which is a 1 to 4 digit number.
For example, if given CY 0123, the airline is "CY", and flight_number is "123".
Another example for this is DL 1234, the airline is "DL", and flight_number is "1234".
If the tool returns more than one option choose the date closest to today.
Example:
{{
"airline": "CY",
"flight_number": "888",
}}
Example:
{{
"airline": "DL",
"flight_number": "1234",
}}
parameters:
- name: airline
type: string
description: Airline unique 2 letter identifier
- name: flight_number
type: string
description: 1 to 4 digit number
```
### Example with Named Parameters
```yaml
tools:
search_flights_by_airline:
kind: firebird-sql
source: my_firebird_db
statement: |
SELECT * FROM flights
WHERE airline = :airline
AND departure_date >= :start_date
AND departure_date <= :end_date
ORDER BY departure_date
description: |
Search for flights by airline within a date range using named parameters.
parameters:
- name: airline
type: string
description: Airline unique 2 letter identifier
- name: start_date
type: string
description: Start date in YYYY-MM-DD format
- name: end_date
type: string
description: End date in YYYY-MM-DD format
```
### Example with Template Parameters
> **Note:** This tool allows direct modifications to the SQL statement,
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
```yaml
tools:
list_table:
kind: firebird-sql
source: my_firebird_db
statement: |
SELECT * FROM {{.tableName}}
description: |
Use this tool to list all information from a specific table.
Example:
{{
"tableName": "flights",
}}
templateParameters:
- name: tableName
type: string
description: Table to select from
```
## Reference
| **field** | **type** | **required** | **description** |
|---------------------|:---------------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "firebird-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

View File

@@ -0,0 +1,274 @@
---
title: "firestore-add-documents"
type: docs
weight: 1
description: >
A "firestore-add-documents" tool adds document to a given collection path.
aliases:
- /resources/tools/firestore-add-documents
---
## Description
The `firestore-add-documents` tool allows you to add new documents to a Firestore collection. It supports all Firestore data types using Firestore's native JSON format. The tool automatically generates a unique document ID for each new document.
## Parameters
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `collectionPath` | string | Yes | The path of the collection where the document will be added |
| `documentData` | map | Yes | The data to be added as a document to the given collection. Must use [Firestore's native JSON format](https://cloud.google.com/firestore/docs/reference/rest/Shared.Types/ArrayValue#Value) with typed values |
| `returnData` | boolean | No | If set to true, the output will include the data of the created document. Defaults to false to help avoid overloading the context |
## Output
The tool returns a map containing:
| Field | Type | Description |
|-------|------|-------------|
| `documentPath` | string | The full resource name of the created document (e.g., `projects/{projectId}/databases/{databaseId}/documents/{document_path}`) |
| `createTime` | string | The timestamp when the document was created |
| `documentData` | map | The data that was added (only included when `returnData` is true) |
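For example, a successful call might return a response along these lines (all values are illustrative):
```json
{
  "documentPath": "projects/my-project/databases/(default)/documents/companies/AbC123xyz",
  "createTime": "2025-01-07T10:00:05Z"
}
```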
## Data Type Format
The tool requires Firestore's native JSON format for document data. Each field must be wrapped with its type indicator:
### Basic Types
- **String**: `{"stringValue": "your string"}`
- **Integer**: `{"integerValue": "123"}` or `{"integerValue": 123}`
- **Double**: `{"doubleValue": 123.45}`
- **Boolean**: `{"booleanValue": true}`
- **Null**: `{"nullValue": null}`
- **Bytes**: `{"bytesValue": "base64EncodedString"}`
- **Timestamp**: `{"timestampValue": "2025-01-07T10:00:00Z"}` (RFC3339 format)
### Complex Types
- **GeoPoint**: `{"geoPointValue": {"latitude": 34.052235, "longitude": -118.243683}}`
- **Array**: `{"arrayValue": {"values": [{"stringValue": "item1"}, {"integerValue": "2"}]}}`
- **Map**: `{"mapValue": {"fields": {"key1": {"stringValue": "value1"}, "key2": {"booleanValue": true}}}}`
- **Reference**: `{"referenceValue": "collection/document"}`
## Examples
### Basic Document Creation
```yaml
tools:
add-company-doc:
kind: firestore-add-documents
source: my-firestore
description: Add a new company document
```
Usage:
```json
{
"collectionPath": "companies",
"documentData": {
"name": {
"stringValue": "Acme Corporation"
},
"establishmentDate": {
"timestampValue": "2000-01-15T10:30:00Z"
},
"location": {
"geoPointValue": {
"latitude": 34.052235,
"longitude": -118.243683
}
},
"active": {
"booleanValue": true
},
"employeeCount": {
"integerValue": "1500"
},
"annualRevenue": {
"doubleValue": 1234567.89
}
}
}
```
### With Nested Maps and Arrays
```json
{
"collectionPath": "companies",
"documentData": {
"name": {
"stringValue": "Tech Innovations Inc"
},
"contactInfo": {
"mapValue": {
"fields": {
"email": {
"stringValue": "info@techinnovations.com"
},
"phone": {
"stringValue": "+1-555-123-4567"
},
"address": {
"mapValue": {
"fields": {
"street": {
"stringValue": "123 Innovation Drive"
},
"city": {
"stringValue": "San Francisco"
},
"state": {
"stringValue": "CA"
},
"zipCode": {
"stringValue": "94105"
}
}
}
}
}
}
},
"products": {
"arrayValue": {
"values": [
{
"stringValue": "Product A"
},
{
"stringValue": "Product B"
},
{
"mapValue": {
"fields": {
"productName": {
"stringValue": "Product C Premium"
},
"version": {
"integerValue": "3"
},
"features": {
"arrayValue": {
"values": [
{
"stringValue": "Advanced Analytics"
},
{
"stringValue": "Real-time Sync"
}
]
}
}
}
}
}
]
}
}
},
"returnData": true
}
```
### Complete Example with All Data Types
```json
{
"collectionPath": "test-documents",
"documentData": {
"stringField": {
"stringValue": "Hello World"
},
"integerField": {
"integerValue": "42"
},
"doubleField": {
"doubleValue": 3.14159
},
"booleanField": {
"booleanValue": true
},
"nullField": {
"nullValue": null
},
"timestampField": {
"timestampValue": "2025-01-07T15:30:00Z"
},
"geoPointField": {
"geoPointValue": {
"latitude": 37.7749,
"longitude": -122.4194
}
},
"bytesField": {
"bytesValue": "SGVsbG8gV29ybGQh"
},
"arrayField": {
"arrayValue": {
"values": [
{
"stringValue": "item1"
},
{
"integerValue": "2"
},
{
"booleanValue": false
}
]
}
},
"mapField": {
"mapValue": {
"fields": {
"nestedString": {
"stringValue": "nested value"
},
"nestedNumber": {
"doubleValue": 99.99
}
}
}
}
}
}
```
## Authentication
The tool can be configured to require authentication:
```yaml
tools:
secure-add-docs:
kind: firestore-add-documents
source: prod-firestore
description: Add documents with authentication required
authRequired:
- google-oauth
- api-key
```
## Error Handling
Common errors include:
- Invalid collection path
- Missing or invalid document data
- Permission denied (if Firestore security rules block the operation)
- Invalid data type conversions
## Best Practices
1. **Always use typed values**: Every field must be wrapped with its appropriate type indicator (e.g., `{"stringValue": "text"}`)
2. **Integer values can be strings**: The tool accepts integer values as strings (e.g., `{"integerValue": "1500"}`)
3. **Use returnData sparingly**: Only set to true when you need to verify the exact data that was written
4. **Validate data before sending**: Ensure your data matches Firestore's native JSON format
5. **Handle timestamps properly**: Use RFC3339 format for timestamp strings
6. **Base64 encode binary data**: Binary data must be base64 encoded in the `bytesValue` field
7. **Consider security rules**: Ensure your Firestore security rules allow document creation in the target collection
## Related Tools
- [`firestore-get-documents`](firestore-get-documents.md) - Retrieve documents by their paths
- [`firestore-query-collection`](firestore-query-collection.md) - Query documents in a collection
- [`firestore-delete-documents`](firestore-delete-documents.md) - Delete documents from Firestore

View File

@@ -0,0 +1,412 @@
---
title: "firestore-query"
type: docs
weight: 1
description: >
Query a Firestore collection with parameterizable filters and Firestore native JSON value types
aliases:
- /resources/tools/firestore-query
---
## Overview
The `firestore-query` tool allows you to query Firestore collections with dynamic, parameterizable filters that support Firestore's native JSON value types. This tool is designed for querying a single collection, which is the standard pattern in Firestore. The collection path itself can be parameterized, making it flexible for various use cases. This tool is particularly useful when you need to create reusable query templates with parameters that can be substituted at runtime.
**Developer Note**: This tool serves as the general querying foundation that developers can use to create custom tools with specific query patterns.
## Key Features
- **Parameterizable Queries**: Use Go template syntax to create dynamic queries
- **Dynamic Collection Paths**: The collection path can be parameterized for flexibility
- **Native JSON Value Types**: Support for Firestore's typed values (stringValue, integerValue, doubleValue, etc.)
- **Complex Filter Logic**: Support for AND/OR logical operators in filters
- **Template Substitution**: Dynamic collection paths, filters, and ordering
- **Query Analysis**: Optional query performance analysis with explain metrics (non-parameterizable)
## Configuration
### Basic Configuration
```yaml
tools:
query_countries:
kind: firestore-query
source: my-firestore-source
description: Query countries with dynamic filters
collectionPath: "countries"
filters: |
{
"field": "continent",
"op": "==",
"value": {"stringValue": "{{.continent}}"}
}
parameters:
- name: continent
type: string
description: Continent to filter by
required: true
```
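Invoking the `query_countries` tool defined above could then be as simple as passing the templated parameter (a minimal sketch; see the Usage section below for the full HTTP call):
```json
{
  "continent": "Europe"
}
```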
### Advanced Configuration with Complex Filters
```yaml
tools:
advanced_query:
kind: firestore-query
source: my-firestore-source
description: Advanced query with complex filters
collectionPath: "{{.collection}}"
filters: |
{
"or": [
{"field": "status", "op": "==", "value": {"stringValue": "{{.status}}"}},
{
"and": [
{"field": "priority", "op": ">", "value": {"integerValue": "{{.priority}}"}},
{"field": "area", "op": "<", "value": {"doubleValue": {{.maxArea}}}},
{"field": "active", "op": "==", "value": {"booleanValue": {{.isActive}}}}
]
}
]
}
select:
- name
- status
- priority
orderBy:
field: "{{.sortField}}"
direction: "{{.sortDirection}}"
limit: 100
analyzeQuery: true
parameters:
- name: collection
type: string
description: Collection to query
required: true
- name: status
type: string
description: Status to filter by
required: true
- name: priority
type: string
description: Minimum priority value
required: true
- name: maxArea
type: float
description: Maximum area value
required: true
- name: isActive
type: boolean
description: Filter by active status
required: true
- name: sortField
type: string
description: Field to sort by
required: false
default: "createdAt"
- name: sortDirection
type: string
description: Sort direction (ASCENDING or DESCENDING)
required: false
default: "DESCENDING"
```
## Parameters
### Configuration Parameters
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `kind` | string | Yes | Must be `firestore-query` |
| `source` | string | Yes | Name of the Firestore source to use |
| `description` | string | Yes | Description of what this tool does |
| `collectionPath` | string | Yes | Path to the collection to query (supports templates) |
| `filters` | string | No | JSON string defining query filters (supports templates) |
| `select` | array | No | Fields to select from documents (supports templates; string or array) |
| `orderBy` | object | No | Ordering configuration with `field` and `direction` (supports templates for the value of `field` or `direction`) |
| `limit` | integer | No | Maximum number of documents to return (default: 100) (supports templates) |
| `analyzeQuery` | boolean | No | Whether to analyze query performance (default: false) |
| `parameters` | array | Yes | Parameter definitions for template substitution |
### Runtime Parameters
Runtime parameters are defined in the `parameters` array and can be used in templates throughout the configuration.
## Filter Format
### Simple Filter
```json
{
"field": "age",
"op": ">",
"value": {"integerValue": "25"}
}
```
### AND Filter
```json
{
"and": [
{"field": "status", "op": "==", "value": {"stringValue": "active"}},
{"field": "age", "op": ">=", "value": {"integerValue": "18"}}
]
}
```
### OR Filter
```json
{
"or": [
{"field": "role", "op": "==", "value": {"stringValue": "admin"}},
{"field": "role", "op": "==", "value": {"stringValue": "moderator"}}
]
}
```
### Nested Filters
```json
{
"or": [
{"field": "type", "op": "==", "value": {"stringValue": "premium"}},
{
"and": [
{"field": "type", "op": "==", "value": {"stringValue": "standard"}},
{"field": "credits", "op": ">", "value": {"integerValue": "1000"}}
]
}
]
}
```
## Firestore Native Value Types
The tool supports all Firestore native JSON value types:
| Type | Format | Example |
|------|--------|---------|
| String | `{"stringValue": "text"}` | `{"stringValue": "{{.name}}"}` |
| Integer | `{"integerValue": "123"}` or `{"integerValue": 123}` | `{"integerValue": "{{.age}}"}` or `{"integerValue": {{.age}}}` |
| Double | `{"doubleValue": 45.67}` | `{"doubleValue": {{.price}}}` |
| Boolean | `{"booleanValue": true}` | `{"booleanValue": {{.active}}}` |
| Null | `{"nullValue": null}` | `{"nullValue": null}` |
| Timestamp | `{"timestampValue": "RFC3339"}` | `{"timestampValue": "{{.date}}"}` |
| GeoPoint | `{"geoPointValue": {"latitude": 0, "longitude": 0}}` | See below |
| Array | `{"arrayValue": {"values": [...]}}` | See below |
| Map | `{"mapValue": {"fields": {...}}}` | See below |
### Complex Type Examples
**GeoPoint:**
```json
{
"field": "location",
"op": "==",
"value": {
"geoPointValue": {
"latitude": 37.7749,
"longitude": -122.4194
}
}
}
```
**Array:**
```json
{
"field": "tags",
"op": "array-contains",
"value": {"stringValue": "{{.tag}}"}
}
```
## Supported Operators
- `<` - Less than
- `<=` - Less than or equal
- `>` - Greater than
- `>=` - Greater than or equal
- `==` - Equal
- `!=` - Not equal
- `array-contains` - Array contains value
- `array-contains-any` - Array contains any of the values
- `in` - Value is in array
- `not-in` - Value is not in array
## Examples
### Example 1: Query with Dynamic Collection Path
```yaml
tools:
user_documents:
kind: firestore-query
source: my-firestore
description: Query user-specific documents
collectionPath: "users/{{.userId}}/documents"
filters: |
{
"field": "type",
"op": "==",
"value": {"stringValue": "{{.docType}}"}
}
parameters:
- name: userId
type: string
description: User ID
required: true
- name: docType
type: string
description: Document type to filter
required: true
```
### Example 2: Complex Geographic Query
```yaml
tools:
location_search:
kind: firestore-query
source: my-firestore
description: Search locations by area and population
collectionPath: "cities"
filters: |
{
"and": [
{"field": "country", "op": "==", "value": {"stringValue": "{{.country}}"}},
{"field": "population", "op": ">", "value": {"integerValue": "{{.minPopulation}}"}},
{"field": "area", "op": "<", "value": {"doubleValue": {{.maxArea}}}}
]
}
orderBy:
field: "population"
direction: "DESCENDING"
limit: 50
parameters:
- name: country
type: string
description: Country code
required: true
- name: minPopulation
type: string
description: Minimum population (as string for large numbers)
required: true
- name: maxArea
type: float
description: Maximum area in square kilometers
required: true
```
### Example 3: Time-based Query with Analysis
```yaml
tools:
activity_log:
kind: firestore-query
source: my-firestore
description: Query activity logs within time range
collectionPath: "logs"
filters: |
{
"and": [
{"field": "timestamp", "op": ">=", "value": {"timestampValue": "{{.startTime}}"}},
{"field": "timestamp", "op": "<=", "value": {"timestampValue": "{{.endTime}}"}},
{"field": "severity", "op": "in", "value": {"arrayValue": {"values": [
{"stringValue": "ERROR"},
{"stringValue": "CRITICAL"}
]}}}
]
}
select:
- timestamp
- message
- severity
- userId
orderBy:
field: "timestamp"
direction: "DESCENDING"
analyzeQuery: true
parameters:
- name: startTime
type: string
description: Start time in RFC3339 format
required: true
- name: endTime
type: string
description: End time in RFC3339 format
required: true
```
## Usage
### Invoking the Tool
```bash
# Using curl
curl -X POST http://localhost:5000/api/tool/your-tool-name/invoke \
-H "Content-Type: application/json" \
-d '{
"continent": "Europe",
"minPopulation": "1000000",
"maxArea": 500000.5,
"isActive": true
}'
```
### Response Format
**Without analyzeQuery:**
```json
[
{
"id": "doc1",
"path": "countries/doc1",
"data": {
"name": "France",
"continent": "Europe",
"population": 67000000,
"area": 551695
},
"createTime": "2024-01-01T00:00:00Z",
"updateTime": "2024-01-15T10:30:00Z"
}
]
```
**With analyzeQuery:**
```json
{
"documents": [...],
"explainMetrics": {
"planSummary": {
"indexesUsed": [...]
},
"executionStats": {
"resultsReturned": 10,
"executionDuration": "15ms",
"readOperations": 10
}
}
}
```
## Best Practices
1. **Use Typed Values**: Always use Firestore's native JSON value types for proper type handling
2. **String Numbers for Large Integers**: Use string representation for large integers to avoid precision loss
3. **Template Security**: Validate all template parameters to prevent injection attacks
4. **Index Optimization**: Use `analyzeQuery` to identify missing indexes
5. **Limit Results**: Always set a reasonable `limit` to prevent excessive data retrieval
6. **Field Selection**: Use `select` to retrieve only necessary fields
## Technical Notes
- Queries operate on a single collection (the standard Firestore pattern)
- Maximum of 100 filters per query (configurable)
- Template parameters must be properly escaped in JSON contexts
- Complex nested queries may require composite indexes
## See Also
- [firestore-query-collection](firestore-query-collection.md) - Non-parameterizable query tool
- [Firestore Source Configuration](../../sources/firestore.md)
- [Firestore Query Documentation](https://firebase.google.com/docs/firestore/query-data/queries)

View File

@@ -0,0 +1,327 @@
---
title: "firestore-update-document"
type: docs
weight: 1
description: >
A "firestore-update-document" tool updates an existing document in Firestore.
aliases:
- /resources/tools/firestore-update-document
---
## Description
The `firestore-update-document` tool allows you to update existing documents in Firestore. It supports all Firestore data types using Firestore's native JSON format. The tool can perform either full document updates (replacing all fields) or selective field updates using an update mask. When using an update mask, fields referenced in the mask but not present in the document data will be deleted from the document, following Firestore's native behavior.
## Parameters
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `documentPath` | string | Yes | The path of the document which needs to be updated |
| `documentData` | map | Yes | The data to update in the document. Must use [Firestore's native JSON format](https://cloud.google.com/firestore/docs/reference/rest/Shared.Types/ArrayValue#Value) with typed values |
| `updateMask` | array | No | The selective fields to update. If not provided, all fields in documentData will be updated. When provided, only the specified fields will be updated. Fields referenced in the mask but not present in documentData will be deleted from the document |
| `returnData` | boolean | No | If set to true, the output will include the data of the updated document. Defaults to false to help avoid overloading the context |
## Output
The tool returns a map containing:
| Field | Type | Description |
|-------|------|-------------|
| `documentPath` | string | The full path of the updated document |
| `updateTime` | string | The timestamp when the document was updated |
| `documentData` | map | The current data of the document after the update (only included when `returnData` is true) |
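A successful update might, for example, return a response like this (values are illustrative):
```json
{
  "documentPath": "users/user123",
  "updateTime": "2025-01-15T10:30:05Z"
}
```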
## Data Type Format
The tool requires Firestore's native JSON format for document data. Each field must be wrapped with its type indicator:
### Basic Types
- **String**: `{"stringValue": "your string"}`
- **Integer**: `{"integerValue": "123"}` or `{"integerValue": 123}`
- **Double**: `{"doubleValue": 123.45}`
- **Boolean**: `{"booleanValue": true}`
- **Null**: `{"nullValue": null}`
- **Bytes**: `{"bytesValue": "base64EncodedString"}`
- **Timestamp**: `{"timestampValue": "2025-01-07T10:00:00Z"}` (RFC3339 format)
### Complex Types
- **GeoPoint**: `{"geoPointValue": {"latitude": 34.052235, "longitude": -118.243683}}`
- **Array**: `{"arrayValue": {"values": [{"stringValue": "item1"}, {"integerValue": "2"}]}}`
- **Map**: `{"mapValue": {"fields": {"key1": {"stringValue": "value1"}, "key2": {"booleanValue": true}}}}`
- **Reference**: `{"referenceValue": "collection/document"}`
## Update Modes
### Full Document Update (Merge All)
When `updateMask` is not provided, the tool performs a merge operation that updates all fields specified in `documentData` while preserving other existing fields in the document.
### Selective Field Update
When `updateMask` is provided, only the fields listed in the mask are updated. This allows for precise control over which fields are modified, added, or deleted. To delete a field, include it in the `updateMask` but omit it from `documentData`.
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------------:|:------------:|----------------------------------------------------------|
| kind | string | true | Must be "firestore-update-document". |
| source | string | true | Name of the Firestore source to update documents in. |
| description | string | true | Description of the tool that is passed to the LLM. |
## Examples
### Basic Document Update (Full Merge)
```yaml
tools:
update-user-doc:
kind: firestore-update-document
source: my-firestore
description: Update a user document
```
Usage:
```json
{
"documentPath": "users/user123",
"documentData": {
"name": {
"stringValue": "Jane Doe"
},
"lastUpdated": {
"timestampValue": "2025-01-15T10:30:00Z"
},
"status": {
"stringValue": "active"
},
"score": {
"integerValue": "150"
}
}
}
```
### Selective Field Update with Update Mask
```json
{
"documentPath": "users/user123",
"documentData": {
"email": {
"stringValue": "newemail@example.com"
},
"profile": {
"mapValue": {
"fields": {
"bio": {
"stringValue": "Updated bio text"
},
"avatar": {
"stringValue": "https://example.com/new-avatar.jpg"
}
}
}
}
},
"updateMask": ["email", "profile.bio", "profile.avatar"]
}
```
### Update with Field Deletion
To delete fields, include them in the `updateMask` but omit them from `documentData`:
```json
{
"documentPath": "users/user123",
"documentData": {
"name": {
"stringValue": "John Smith"
}
},
"updateMask": ["name", "temporaryField", "obsoleteData"],
"returnData": true
}
```
In this example:
- `name` will be updated to "John Smith"
- `temporaryField` and `obsoleteData` will be deleted from the document (they are in the mask but not in the data)
### Complex Update with Nested Data
```json
{
"documentPath": "companies/company456",
"documentData": {
"metadata": {
"mapValue": {
"fields": {
"lastModified": {
"timestampValue": "2025-01-15T14:30:00Z"
},
"modifiedBy": {
"stringValue": "admin@company.com"
}
}
}
},
"locations": {
"arrayValue": {
"values": [
{
"mapValue": {
"fields": {
"city": {
"stringValue": "San Francisco"
},
"coordinates": {
"geoPointValue": {
"latitude": 37.7749,
"longitude": -122.4194
}
}
}
}
},
{
"mapValue": {
"fields": {
"city": {
"stringValue": "New York"
},
"coordinates": {
"geoPointValue": {
"latitude": 40.7128,
"longitude": -74.0060
}
}
}
}
}
]
}
},
"revenue": {
"doubleValue": 5678901.23
}
},
"updateMask": ["metadata", "locations", "revenue"]
}
```
### Update with All Data Types
```json
{
"documentPath": "test-documents/doc789",
"documentData": {
"stringField": {
"stringValue": "Updated string"
},
"integerField": {
"integerValue": "999"
},
"doubleField": {
"doubleValue": 2.71828
},
"booleanField": {
"booleanValue": false
},
"nullField": {
"nullValue": null
},
"timestampField": {
"timestampValue": "2025-01-15T16:45:00Z"
},
"geoPointField": {
"geoPointValue": {
"latitude": 51.5074,
"longitude": -0.1278
}
},
"bytesField": {
"bytesValue": "VXBkYXRlZCBkYXRh"
},
"arrayField": {
"arrayValue": {
"values": [
{
"stringValue": "updated1"
},
{
"integerValue": "200"
},
{
"booleanValue": true
}
]
}
},
"mapField": {
"mapValue": {
"fields": {
"nestedString": {
"stringValue": "updated nested value"
},
"nestedNumber": {
"doubleValue": 88.88
}
}
}
},
"referenceField": {
"referenceValue": "users/updatedUser"
}
},
"returnData": true
}
```
## Authentication
The tool can be configured to require authentication:
```yaml
tools:
secure-update-doc:
kind: firestore-update-document
source: prod-firestore
description: Update documents with authentication required
authRequired:
- google-oauth
- api-key
```
## Error Handling
Common errors include:
- Document not found (when using update with a non-existent document)
- Invalid document path
- Missing or invalid document data
- Permission denied (if Firestore security rules block the operation)
- Invalid data type conversions
## Best Practices
1. **Use update masks for precision**: When you only need to update specific fields, use the `updateMask` parameter to avoid unintended changes
2. **Always use typed values**: Every field must be wrapped with its appropriate type indicator (e.g., `{"stringValue": "text"}`)
3. **Integer values can be strings**: The tool accepts integer values as strings (e.g., `{"integerValue": "1500"}`)
4. **Use returnData sparingly**: Only set to true when you need to verify the exact data after the update
5. **Validate data before sending**: Ensure your data matches Firestore's native JSON format
6. **Handle timestamps properly**: Use RFC3339 format for timestamp strings
7. **Base64 encode binary data**: Binary data must be base64 encoded in the `bytesValue` field
8. **Consider security rules**: Ensure your Firestore security rules allow document updates
9. **Delete fields using update mask**: To delete fields, include them in the `updateMask` but omit them from `documentData`
10. **Test with non-production data first**: Always test your updates on non-critical documents first
## Differences from Add Documents
- **Purpose**: Updates existing documents vs. creating new ones
- **Document must exist**: For standard updates, the document must already exist (though an update without an `updateMask` will create the document with the given document ID if it is missing)
- **Update mask support**: Allows selective field updates
- **Field deletion**: Supports removing specific fields by including them in the mask but not in the data
- **Returns updateTime**: Instead of createTime
## Related Tools
- [`firestore-add-documents`](firestore-add-documents.md) - Add new documents to Firestore
- [`firestore-get-documents`](firestore-get-documents.md) - Retrieve documents by their paths
- [`firestore-query-collection`](firestore-query-collection.md) - Query documents in a collection
- [`firestore-delete-documents`](firestore-delete-documents.md) - Delete documents from Firestore

View File

@@ -33,6 +33,29 @@ tools:
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
If this returns a suggestions field for a dimension, the contents of suggestions
can be used as filters for this field. If this returns a suggest_explore and
suggest_dimension, a query against that explore and dimension can be used to find
valid filters for this field.
```
The response is a json array with the following elements:
```json
{
"name": "field name",
"description": "field description",
"type": "field type",
"label": "field label",
"label_short": "field short label",
"tags": ["tags", ...],
"synonyms": ["synonyms", ...],
"suggestions": ["suggestion", ...],
"suggest_explore": "explore",
"suggest_dimension": "dimension"
}
```
## Reference

View File

@@ -21,6 +21,16 @@ It's compatible with the following sources:
`looker-get-explores` accepts one parameter, the
`model` id.
The return type is an array of maps, each map is formatted like:
```json
{
"name": "explore name",
"description": "explore description",
"label": "explore label",
"group_label": "group label"
}
```
## Example
```yaml

View File

@@ -35,6 +35,24 @@ tools:
explore_name looked up from get_explores.
```
The response is a json array with the following elements:
```json
{
"name": "field name",
"description": "field description",
"type": "field type",
"label": "field label",
"label_short": "field short label",
"tags": ["tags", ...],
"synonyms": ["synonyms", ...],
"suggestions": ["suggestion", ...],
"suggest_explore": "explore",
"suggest_dimension": "dimension"
}
```
## Reference
| **field** | **type** | **required** | **description** |

View File

@@ -33,8 +33,32 @@ tools:
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
If this returns a suggestions field for a measure, the contents of suggestions
can be used as filters for this field. If this returns a suggest_explore and
suggest_dimension, a query against that explore and dimension can be used to find
valid filters for this field.
```
The response is a json array with the following elements:
```json
{
"name": "field name",
"description": "field description",
"type": "field type",
"label": "field label",
"label_short": "field short label",
"tags": ["tags", ...],
"synonyms": ["synonyms", ...],
"suggestions": ["suggestion", ...],
"suggest_explore": "explore",
"suggest_dimension": "dimension"
}
```
## Reference
| **field** | **type** | **required** | **description** |

View File

@@ -35,6 +35,24 @@ tools:
explore_name looked up from get_explores.
```
The response is a json array with the following elements:
```json
{
"name": "field name",
"description": "field description",
"type": "field type",
"label": "field label",
"label_short": "field short label",
"tags": ["tags", ...],
"synonyms": ["synonyms", ...],
"suggestions": ["suggestion", ...],
"suggest_explore": "explore",
"suggest_dimension": "dimension"
}
```
## Reference
| **field** | **type** | **required** | **description** |

View File

@@ -0,0 +1,7 @@
---
title: "OceanBase"
type: docs
weight: 1
description: >
Tools that work with OceanBase Sources.
---

View File

@@ -0,0 +1,37 @@
---
title: "oceanbase-execute-sql"
type: docs
weight: 1
description: >
An "oceanbase-execute-sql" tool executes a SQL statement against an OceanBase database.
aliases:
- /resources/tools/oceanbase-execute-sql
---
## About
An `oceanbase-execute-sql` tool executes a SQL statement against an OceanBase database. It's compatible with the following source:
- [oceanbase](../sources/oceanbase.md)
`oceanbase-execute-sql` takes one input parameter `sql` and runs the SQL statement against the `source`.
> **Note:** This tool is intended for developer assistant workflows with human-in-the-loop and shouldn't be used for production agents.
## Example
```yaml
tools:
execute_sql_tool:
kind: oceanbase-execute-sql
source: my-oceanbase-instance
description: Use this tool to execute a SQL statement.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:----------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "oceanbase-execute-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,122 @@
---
title: "oceanbase-sql"
type: docs
weight: 1
description: >
An "oceanbase-sql" tool executes a pre-defined SQL statement against an OceanBase database.
aliases:
- /resources/tools/oceanbase-sql
---
## About
An `oceanbase-sql` tool executes a pre-defined SQL statement against an OceanBase database. It's compatible with the following source:
- [oceanbase](../sources/oceanbase.md)
The specified SQL statement is executed as a [prepared statement][mysql-prepare], and expects parameters in the SQL query to be in the form of placeholders `?`.
[mysql-prepare]: https://dev.mysql.com/doc/refman/8.4/en/sql-prepared-statements.html
## Example
> **Note:** This tool uses parameterized queries to prevent SQL injections.
> Query parameters can be used as substitutes for arbitrary expressions.
> Parameters cannot be used as substitutes for identifiers, column names, table names, or other parts of the query.
```yaml
tools:
search_flights_by_number:
kind: oceanbase-sql
source: my-oceanbase-instance
statement: |
SELECT * FROM flights
WHERE airline = ?
AND flight_number = ?
LIMIT 10
description: |
Use this tool to get information for a specific flight.
Takes an airline code and flight number and returns info on the flight.
Do NOT use this tool with a flight id. Do NOT guess an airline code or flight number.
Example:
{{
"airline": "CY",
"flight_number": "888",
}}
parameters:
- name: airline
type: string
description: Airline unique 2 letter identifier
- name: flight_number
type: string
description: 1 to 4 digit number
```
### Example with Template Parameters
> **Note:** This tool allows direct modifications to the SQL statement, including identifiers, column names, and table names. **This makes it more vulnerable to SQL injections**. Using basic parameters only (see above) is recommended for performance and safety reasons.
```yaml
tools:
list_table:
kind: oceanbase-sql
source: my-oceanbase-instance
statement: |
SELECT * FROM {{.tableName}};
description: |
Use this tool to list all information from a specific table.
Example:
{{
"tableName": "flights",
}}
templateParameters:
- name: tableName
type: string
description: Table to select from
```
### Example with Array Parameters
```yaml
tools:
search_flights_by_ids:
kind: oceanbase-sql
source: my-oceanbase-instance
statement: |
SELECT * FROM flights
WHERE id IN (?)
AND status IN (?)
description: |
Use this tool to get information for multiple flights by their IDs and statuses.
Example:
{{
"flight_ids": [1, 2, 3],
"statuses": ["active", "scheduled"]
}}
parameters:
- name: flight_ids
type: array
description: List of flight IDs to search for
items:
name: flight_id
type: integer
description: Individual flight ID
- name: statuses
type: array
description: List of flight statuses to filter by
items:
name: status
type: string
description: Individual flight status
```
## Reference
| **field** | **type** | **required** | **description** |
|--------------------|:------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "oceanbase-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

View File

@@ -0,0 +1,7 @@
---
title: "Trino"
type: docs
weight: 1
description: >
Tools that work with Trino Sources, such as distributed SQL query engine.
---

View File

@@ -0,0 +1,41 @@
---
title: "trino-execute-sql"
type: docs
weight: 1
description: >
A "trino-execute-sql" tool executes a SQL statement against a Trino
database.
aliases:
- /resources/tools/trino-execute-sql
---
## About
A `trino-execute-sql` tool executes a SQL statement against a Trino
database. It's compatible with any of the following sources:
- [trino](../../sources/trino.md)
`trino-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`.
> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.
## Example
```yaml
tools:
execute_sql_tool:
kind: trino-execute-sql
source: my-trino-instance
description: Use this tool to execute a SQL statement.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "trino-execute-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,108 @@
---
title: "trino-sql"
type: docs
weight: 1
description: >
A "trino-sql" tool executes a pre-defined SQL statement against a Trino
database.
aliases:
- /resources/tools/trino-sql
---
## About
A `trino-sql` tool executes a pre-defined SQL statement against a Trino
database. It's compatible with any of the following sources:
- [trino](../../sources/trino.md)
The specified SQL statement is executed as a [prepared statement][trino-prepare],
and specified parameters will be inserted according to their position: e.g. `$1`
will be the first parameter specified, `$2` will be the second parameter, and so
on. If template parameters are included, they will be resolved before execution
of the prepared statement.
[trino-prepare]: https://trino.io/docs/current/sql/prepare.html
## Example
> **Note:** This tool uses parameterized queries to prevent SQL injections.
> Query parameters can be used as substitutes for arbitrary expressions.
> Parameters cannot be used as substitutes for identifiers, column names, table
> names, or other parts of the query.
```yaml
tools:
search_orders_by_region:
kind: trino-sql
source: my-trino-instance
statement: |
SELECT * FROM hive.sales.orders
WHERE region = $1
AND order_date >= DATE($2)
LIMIT 10
description: |
Use this tool to get information for orders in a specific region.
Takes a region code and date and returns info on the orders.
Do NOT use this tool with an order id. Do NOT guess a region code or date.
A region code is a code for a geographic region consisting of two-character
region designator and followed by optional subregion.
For example, if given US-WEST, the region is "US-WEST".
Another example for this is EU-CENTRAL, the region is "EU-CENTRAL".
If the tool returns more than one option choose the date closest to today.
Example:
{{
"region": "US-WEST",
"order_date": "2024-01-01",
}}
Example:
{{
"region": "EU-CENTRAL",
"order_date": "2024-01-15",
}}
parameters:
- name: region
type: string
description: Region unique identifier
- name: order_date
type: string
description: Order date in YYYY-MM-DD format
```
### Example with Template Parameters
> **Note:** This tool allows direct modifications to the SQL statement,
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](..#template-parameters).
```yaml
tools:
list_table:
kind: trino-sql
source: my-trino-instance
statement: |
SELECT * FROM {{.tableName}}
description: |
Use this tool to list all information from a specific table.
Example:
{{
"tableName": "hive.sales.orders",
}}
templateParameters:
- name: tableName
type: string
description: Table to select from
```
## Reference
| **field** | **type** | **required** | **description** |
|---------------------|:---------------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "trino-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](..#template-parameters) | false | List of [templateParameters](..#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

View File

@@ -16,21 +16,17 @@ This tool is intended for developer assistant workflows with human-in-the-loop
and shouldn't be used for production agents.
{{< /notice >}}
{{< notice info >}}
This tool does not have a `source` and authenticates using the environment's
[Application Default Credentials](https://cloud.google.com/docs/authentication/application-default-credentials).
{{< /notice >}}
## Example
```yaml
sources:
alloydb-api-source:
kind: http
baseUrl: https://alloydb.googleapis.com
headers:
Authorization: Bearer ${API_KEY}
Content-Type: application/json
tools:
alloydb-operations-get:
kind: alloydb-wait-for-operation
source: alloydb-api-source
description: "This will poll on operations API until the operation is done. For checking operation status we need projectId, locationID and operationId. Once instance is created give follow up steps on how to use the variables to bring data plane MCP server up in local and remote setup."
delay: 1s
maxDelay: 4m
@@ -43,7 +39,6 @@ tools:
| **field** | **type** | **required** | **description** |
| ----------- | :------: | :----------: | ---------------------------------------------------------------------------------------------------------------- |
| kind | string | true | Must be "alloydb-wait-for-operation". |
| source | string | true | Name of the source the HTTP request should be sent to. |
| description | string | true | A description of the tool. |
| delay | duration | false | The initial delay between polling requests (e.g., `3s`). Defaults to 3 seconds. |
| maxDelay | duration | false | The maximum delay between polling requests (e.g., `4m`). Defaults to 4 minutes. |

View File

@@ -0,0 +1,101 @@
---
title: "yugabytedb-sql"
type: docs
weight: 1
description: >
A "yugabytedb-sql" tool executes a pre-defined SQL statement against a YugabyteDB
database.
---
## About
A `yugabytedb-sql` tool executes a pre-defined SQL statement against a YugabyteDB
database.
The specified SQL statement is executed as a prepared statement,
and specified parameters will be inserted according to their position: e.g. `$1`
will be the first parameter specified, `$2` will be the second parameter, and so
on. If template parameters are included, they will be resolved before execution
of the prepared statement.
## Example
> **Note:** This tool uses parameterized queries to prevent SQL injections.
> Query parameters can be used as substitutes for arbitrary expressions.
> Parameters cannot be used as substitutes for identifiers, column names, table
> names, or other parts of the query.
```yaml
tools:
search_flights_by_number:
kind: yugabytedb-sql
source: my-yb-instance
statement: |
SELECT * FROM flights
WHERE airline = $1
AND flight_number = $2
LIMIT 10
description: |
Use this tool to get information for a specific flight.
Takes an airline code and flight number and returns info on the flight.
Do NOT use this tool with a flight id. Do NOT guess an airline code or flight number.
An airline code is a code for an airline service consisting of a two-character
airline designator followed by a flight number, which is a 1 to 4 digit number.
For example, if given CY 0123, the airline is "CY", and flight_number is "123".
Another example for this is DL 1234, the airline is "DL", and flight_number is "1234".
If the tool returns more than one option choose the date closest to today.
Example:
{{
"airline": "CY",
"flight_number": "888",
}}
Example:
{{
"airline": "DL",
"flight_number": "1234",
}}
parameters:
- name: airline
type: string
description: Airline unique 2 letter identifier
- name: flight_number
type: string
description: 1 to 4 digit number
```
### Example with Template Parameters
> **Note:** This tool allows direct modifications to the SQL statement, including identifiers, column names,
> and table names. **This makes it more vulnerable to SQL injections**. Using basic parameters
> only (see above) is recommended for performance and safety reasons. For more details, please
> check [templateParameters](_index#template-parameters).
```yaml
tools:
list_table:
kind: yugabytedb-sql
source: my-yb-instance
statement: |
SELECT * FROM {{.tableName}}
description: |
Use this tool to list all information from a specific table.
Example:
{{
"tableName": "flights",
}}
templateParameters:
- name: tableName
type: string
description: Table to select from
```
## Reference
| **field** | **type** | **required** | **description** |
|---------------------|:---------------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "yugabytedb-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

View File

@@ -1,340 +1,7 @@
---
title: "AlloyDB"
type: docs
weight: 2
weight: 1
description: >
How to get started running Toolbox with MCP Inspector and AlloyDB as the source.
How to get started with Toolbox using AlloyDB.
---
## Overview
[Model Context Protocol](https://modelcontextprotocol.io) is an open protocol
that standardizes how applications provide context to LLMs. Check out this page
on how to [connect to Toolbox via MCP](../../how-to/connect_via_mcp.md).
## Before you begin
This guide assumes you have already done the following:
1. [Create an AlloyDB cluster and instance](https://cloud.google.com/alloydb/docs/cluster-create) with a database and user.
1. Connect to the instance using [AlloyDB Studio](https://cloud.google.com/alloydb/docs/manage-data-using-studio), [`psql` command-line tool](https://www.postgresql.org/download/), or any other PostgreSQL client.
2. Enable the `pgvector` and `google_ml_integration` [extensions](https://cloud.google.com/alloydb/docs/ai). These are required for Semantic Search and Natural Language to SQL tools. Run the following SQL commands:
```sql
CREATE EXTENSION IF NOT EXISTS "vector";
CREATE EXTENSION IF NOT EXISTS "google_ml_integration";
CREATE EXTENSION IF NOT EXISTS alloydb_ai_nl cascade;
CREATE EXTENSION IF NOT EXISTS parameterized_views;
```
## Step 1: Set up your AlloyDB database
In this section, we will create the necessary tables and functions in your AlloyDB instance.
1. Create tables using the following commands:
```sql
CREATE TABLE products (
  product_id SERIAL PRIMARY KEY,
  name VARCHAR(255) NOT NULL,
  description TEXT,
  price DECIMAL(10, 2) NOT NULL,
  category_id INT,
  embedding vector(3072) -- Vector size for model (gemini-embedding-001)
);
CREATE TABLE customers (
  customer_id SERIAL PRIMARY KEY,
  name VARCHAR(255) NOT NULL,
  email VARCHAR(255) UNIQUE NOT NULL
);
CREATE TABLE cart (
  cart_id SERIAL PRIMARY KEY,
  customer_id INT UNIQUE NOT NULL,
  created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
  FOREIGN KEY (customer_id) REFERENCES customers(customer_id)
);
CREATE TABLE cart_items (
  cart_item_id SERIAL PRIMARY KEY,
  cart_id INT NOT NULL,
  product_id INT NOT NULL,
  quantity INT NOT NULL,
  price DECIMAL(10, 2) NOT NULL,
  FOREIGN KEY (cart_id) REFERENCES cart(cart_id),
  FOREIGN KEY (product_id) REFERENCES products(product_id)
);
CREATE TABLE categories (
  category_id SERIAL PRIMARY KEY,
  name VARCHAR(255) NOT NULL
);
```
2. Insert sample data into the tables:
```sql
INSERT INTO categories (category_id, name) VALUES
(1, 'Flowers'),
(2, 'Vases');
INSERT INTO products (product_id, name, description, price, category_id, embedding) VALUES
(1, 'Rose', 'A beautiful red rose', 2.50, 1, embedding('gemini-embedding-001', 'A beautiful red rose')),
(2, 'Tulip', 'A colorful tulip', 1.50, 1, embedding('gemini-embedding-001', 'A colorful tulip')),
(3, 'Glass Vase', 'A transparent glass vase', 10.00, 2, embedding('gemini-embedding-001', 'A transparent glass vase')),
(4, 'Ceramic Vase', 'A handmade ceramic vase', 15.00, 2, embedding('gemini-embedding-001', 'A handmade ceramic vase'));
INSERT INTO customers (customer_id, name, email) VALUES
(1, 'John Doe', 'john.doe@example.com'),
(2, 'Jane Smith', 'jane.smith@example.com');
INSERT INTO cart (cart_id, customer_id) VALUES
(1, 1),
(2, 2);
INSERT INTO cart_items (cart_id, product_id, quantity, price) VALUES
(1, 1, 2, 2.50),
(1, 3, 1, 10.00),
(2, 2, 5, 1.50);
```
## Step 2: Install Toolbox
In this section, we will download and install the Toolbox binary.
1. Download the latest version of Toolbox as a binary:
{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
export VERSION="0.12.0"
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/$OS/toolbox
```
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
## Step 3: Configure the tools
Create a `tools.yaml` file and add the following content. You must replace the placeholders with your actual AlloyDB configuration.
First, define the data source for your tools. This tells Toolbox how to connect to your AlloyDB instance.
```yaml
sources:
  alloydb-pg-source:
    kind: alloydb-postgres
    project: YOUR_PROJECT_ID
    region: YOUR_REGION
    cluster: YOUR_CLUSTER
    instance: YOUR_INSTANCE
    database: YOUR_DATABASE
    user: YOUR_USER
    password: YOUR_PASSWORD
```
Next, define the tools the agent can use. We will categorize them into three types:
### 1. Structured Queries Tools
These tools execute predefined SQL statements. They are ideal for common, structured queries like managing a shopping cart. Add the following to your `tools.yaml` file:
```yaml
tools:
  access-cart-information:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      List items in customer cart.
      Use this tool to list items in a customer cart. This tool requires the cart ID.
    parameters:
      - name: cart_id
        type: integer
        description: The id of the cart.
    statement: |
      SELECT
        p.name AS product_name,
        ci.quantity,
        ci.price AS item_price,
        (ci.quantity * ci.price) AS total_item_price,
        c.created_at AS cart_created_at,
        ci.product_id AS product_id
      FROM
        cart_items ci JOIN cart c ON ci.cart_id = c.cart_id
        JOIN products p ON ci.product_id = p.product_id
      WHERE
        c.cart_id = $1;
  add-to-cart:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      Add items to customer cart using the product ID and product prices from the product list.
      Use this tool to add items to a customer cart.
      This tool requires the cart ID, product ID, quantity, and price.
    parameters:
      - name: cart_id
        type: integer
        description: The id of the cart.
      - name: product_id
        type: integer
        description: The id of the product.
      - name: quantity
        type: integer
        description: The quantity of items to add.
      - name: price
        type: float
        description: The price of items to add.
    statement: |
      INSERT INTO
        cart_items (cart_id, product_id, quantity, price)
      VALUES($1,$2,$3,$4);
  delete-from-cart:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      Remove products from customer cart.
      Use this tool to remove products from a customer cart.
      This tool requires the cart ID and product ID.
    parameters:
      - name: cart_id
        type: integer
        description: The id of the cart.
      - name: product_id
        type: integer
        description: The id of the product.
    statement: |
      DELETE FROM
        cart_items
      WHERE
        cart_id = $1 AND product_id = $2;
```
### 2. Semantic Search Tools
These tools use vector embeddings to find the most relevant results based on the meaning of a user's query, rather than just keywords. Append the following tools to the `tools` section in your `tools.yaml`:
```yaml
  search-product-recommendations:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      Search for products based on user needs.
      Use this tool to search for products. This tool requires the user's needs.
    parameters:
      - name: query
        type: string
        description: The product characteristics
    statement: |
      SELECT
        product_id,
        name,
        description,
        ROUND(CAST(price AS numeric), 2) as price
      FROM
        products
      ORDER BY
        embedding('gemini-embedding-001', $1)::vector <=> embedding
      LIMIT 5;
```
### 3. Natural Language to SQL (NL2SQL) Tools
1. Create a [natural language configuration](https://cloud.google.com/alloydb/docs/ai/use-natural-language-generate-sql-queries#create-config) for your AlloyDB cluster.
{{< notice tip >}}Before using NL2SQL tools,
you must first install the `alloydb_ai_nl` extension and
create the [semantic layer](https://cloud.google.com/alloydb/docs/ai/natural-language-overview) under a configuration named `flower_shop`.
{{< /notice >}}
2. Configure your NL2SQL tool to use your configuration. These tools translate natural language questions into SQL queries, allowing users to interact with the database conversationally. Append the following tool to the `tools` section:
```yaml
  ask-questions-about-products:
    kind: alloydb-ai-nl
    source: alloydb-pg-source
    nlConfig: flower_shop
    description: >-
      Ask questions related to products or brands.
      Use this tool to ask questions about products or brands.
      Always SELECT the IDs of objects when generating queries.
```
Finally, group the tools into a `toolset` to make them available to the model. Add the following to the end of your `tools.yaml` file:
```yaml
toolsets:
  flower_shop:
    - access-cart-information
    - search-product-recommendations
    - ask-questions-about-products
    - add-to-cart
    - delete-from-cart
```
For more info on tools, check out the
[Tools](../../resources/tools/) section.
## Step 4: Run the Toolbox server
Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
```bash
./toolbox --tools-file "tools.yaml"
```
## Step 5: Connect to MCP Inspector
1. Run the MCP Inspector:
```bash
npx @modelcontextprotocol/inspector
```
1. Type `y` when it asks to install the inspector package.
1. It should show the following when the MCP Inspector is up and running (please take note of `<YOUR_SESSION_TOKEN>`):
```bash
Starting MCP inspector...
⚙️ Proxy server listening on localhost:6277
🔑 Session token: <YOUR_SESSION_TOKEN>
Use this token to authenticate requests or set DANGEROUSLY_OMIT_AUTH=true to disable auth
🚀 MCP Inspector is up and running at:
http://localhost:6274/?MCP_PROXY_AUTH_TOKEN=<YOUR_SESSION_TOKEN>
```
1. Open the above link in your browser.
1. For `Transport Type`, select `Streamable HTTP`.
1. For `URL`, type in `http://127.0.0.1:5000/mcp`.
1. For `Configuration` -> `Proxy Session Token`, make sure `<YOUR_SESSION_TOKEN>` is present.
1. Click Connect.
1. Select `List Tools` to see the list of tools configured in `tools.yaml`.
1. Test out your tools here!
## What's next
- Learn more about [MCP Inspector](../../how-to/connect_via_mcp.md).
- Learn more about [Toolbox Resources](../../resources/).
- Learn more about [Toolbox How-to guides](../../how-to/).

File diff suppressed because it is too large


@@ -0,0 +1,9 @@
---
title: "Getting started with alloydb-ai-nl tool"
type: docs
weight: 30
description: >
An end-to-end tutorial for building an ADK agent using the AlloyDB AI NL tool.
---
{{< ipynb "alloydb_ai_nl.ipynb" >}}


@@ -0,0 +1,339 @@
---
title: "Quickstart (MCP with AlloyDB)"
type: docs
weight: 1
description: >
How to get started running Toolbox with MCP Inspector and AlloyDB as the source.
---
## Overview
[Model Context Protocol](https://modelcontextprotocol.io) is an open protocol
that standardizes how applications provide context to LLMs. Check out this page
on how to [connect to Toolbox via MCP](../../how-to/connect_via_mcp.md).
## Before you begin
This guide assumes you have already done the following:
1. [Create an AlloyDB cluster and instance](https://cloud.google.com/alloydb/docs/cluster-create) with a database and user.
2. Connect to the instance using [AlloyDB Studio](https://cloud.google.com/alloydb/docs/manage-data-using-studio), the [`psql` command-line tool](https://www.postgresql.org/download/), or any other PostgreSQL client.
3. Enable the `pgvector` and `google_ml_integration` [extensions](https://cloud.google.com/alloydb/docs/ai), along with `alloydb_ai_nl` and `parameterized_views`. These are required for the Semantic Search and Natural Language to SQL tools. Run the following SQL commands:
```sql
CREATE EXTENSION IF NOT EXISTS "vector";
CREATE EXTENSION IF NOT EXISTS "google_ml_integration";
CREATE EXTENSION IF NOT EXISTS alloydb_ai_nl cascade;
CREATE EXTENSION IF NOT EXISTS parameterized_views;
```
## Step 1: Set up your AlloyDB database
In this section, we will create the necessary tables and functions in your AlloyDB instance.
1. Create tables using the following commands:
```sql
CREATE TABLE products (
  product_id SERIAL PRIMARY KEY,
  name VARCHAR(255) NOT NULL,
  description TEXT,
  price DECIMAL(10, 2) NOT NULL,
  category_id INT,
  embedding vector(3072) -- Vector size for model (gemini-embedding-001)
);
CREATE TABLE customers (
  customer_id SERIAL PRIMARY KEY,
  name VARCHAR(255) NOT NULL,
  email VARCHAR(255) UNIQUE NOT NULL
);
CREATE TABLE cart (
  cart_id SERIAL PRIMARY KEY,
  customer_id INT UNIQUE NOT NULL,
  created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
  FOREIGN KEY (customer_id) REFERENCES customers(customer_id)
);
CREATE TABLE cart_items (
  cart_item_id SERIAL PRIMARY KEY,
  cart_id INT NOT NULL,
  product_id INT NOT NULL,
  quantity INT NOT NULL,
  price DECIMAL(10, 2) NOT NULL,
  FOREIGN KEY (cart_id) REFERENCES cart(cart_id),
  FOREIGN KEY (product_id) REFERENCES products(product_id)
);
CREATE TABLE categories (
  category_id SERIAL PRIMARY KEY,
  name VARCHAR(255) NOT NULL
);
```
2. Insert sample data into the tables:
```sql
INSERT INTO categories (category_id, name) VALUES
(1, 'Flowers'),
(2, 'Vases');
INSERT INTO products (product_id, name, description, price, category_id, embedding) VALUES
(1, 'Rose', 'A beautiful red rose', 2.50, 1, embedding('gemini-embedding-001', 'A beautiful red rose')),
(2, 'Tulip', 'A colorful tulip', 1.50, 1, embedding('gemini-embedding-001', 'A colorful tulip')),
(3, 'Glass Vase', 'A transparent glass vase', 10.00, 2, embedding('gemini-embedding-001', 'A transparent glass vase')),
(4, 'Ceramic Vase', 'A handmade ceramic vase', 15.00, 2, embedding('gemini-embedding-001', 'A handmade ceramic vase'));
INSERT INTO customers (customer_id, name, email) VALUES
(1, 'John Doe', 'john.doe@example.com'),
(2, 'Jane Smith', 'jane.smith@example.com');
INSERT INTO cart (cart_id, customer_id) VALUES
(1, 1),
(2, 2);
INSERT INTO cart_items (cart_id, product_id, quantity, price) VALUES
(1, 1, 2, 2.50),
(1, 3, 1, 10.00),
(2, 2, 5, 1.50);
```
## Step 2: Install Toolbox
In this section, we will download and install the Toolbox binary.
1. Download the latest version of Toolbox as a binary:
{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
export VERSION="0.14.0"
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/$OS/toolbox
```
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
## Step 3: Configure the tools
Create a `tools.yaml` file and add the following content. You must replace the placeholders with your actual AlloyDB configuration.
First, define the data source for your tools. This tells Toolbox how to connect to your AlloyDB instance.
```yaml
sources:
  alloydb-pg-source:
    kind: alloydb-postgres
    project: YOUR_PROJECT_ID
    region: YOUR_REGION
    cluster: YOUR_CLUSTER
    instance: YOUR_INSTANCE
    database: YOUR_DATABASE
    user: YOUR_USER
    password: YOUR_PASSWORD
```
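Hard-coding `YOUR_PASSWORD` is convenient for local testing, but you may prefer to keep credentials out of the file. If your version of Toolbox supports environment-variable substitution in `tools.yaml` (check the configuration reference before relying on it), the source could be written as the following sketch:
```yaml
sources:
  alloydb-pg-source:
    kind: alloydb-postgres
    project: YOUR_PROJECT_ID
    region: YOUR_REGION
    cluster: YOUR_CLUSTER
    instance: YOUR_INSTANCE
    database: YOUR_DATABASE
    # Assumes ${...} substitution is available; otherwise keep literal values.
    user: ${ALLOYDB_USER}
    password: ${ALLOYDB_PASSWORD}
```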
Next, define the tools the agent can use. We will categorize them into three types:
### 1. Structured Queries Tools
These tools execute predefined SQL statements. They are ideal for common, structured queries like managing a shopping cart. Add the following to your `tools.yaml` file:
```yaml
tools:
  access-cart-information:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      List items in customer cart.
      Use this tool to list items in a customer cart. This tool requires the cart ID.
    parameters:
      - name: cart_id
        type: integer
        description: The id of the cart.
    statement: |
      SELECT
        p.name AS product_name,
        ci.quantity,
        ci.price AS item_price,
        (ci.quantity * ci.price) AS total_item_price,
        c.created_at AS cart_created_at,
        ci.product_id AS product_id
      FROM
        cart_items ci JOIN cart c ON ci.cart_id = c.cart_id
        JOIN products p ON ci.product_id = p.product_id
      WHERE
        c.cart_id = $1;
  add-to-cart:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      Add items to customer cart using the product ID and product prices from the product list.
      Use this tool to add items to a customer cart.
      This tool requires the cart ID, product ID, quantity, and price.
    parameters:
      - name: cart_id
        type: integer
        description: The id of the cart.
      - name: product_id
        type: integer
        description: The id of the product.
      - name: quantity
        type: integer
        description: The quantity of items to add.
      - name: price
        type: float
        description: The price of items to add.
    statement: |
      INSERT INTO
        cart_items (cart_id, product_id, quantity, price)
      VALUES($1,$2,$3,$4);
  delete-from-cart:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      Remove products from customer cart.
      Use this tool to remove products from a customer cart.
      This tool requires the cart ID and product ID.
    parameters:
      - name: cart_id
        type: integer
        description: The id of the cart.
      - name: product_id
        type: integer
        description: The id of the product.
    statement: |
      DELETE FROM
        cart_items
      WHERE
        cart_id = $1 AND product_id = $2;
```
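The same pattern extends to other cart operations. As a sketch only (the `update-cart-item-quantity` tool is not part of this quickstart), an UPDATE with positional parameters could be appended under the `tools:` section:
```yaml
  update-cart-item-quantity:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      Update the quantity of a product already in a customer cart.
      This tool requires the cart ID, product ID, and the new quantity.
    parameters:
      - name: cart_id
        type: integer
        description: The id of the cart.
      - name: product_id
        type: integer
        description: The id of the product.
      - name: quantity
        type: integer
        description: The new quantity for the item.
    statement: |
      UPDATE cart_items
      SET quantity = $3
      WHERE cart_id = $1 AND product_id = $2;
```
Parameters are bound in the order they are declared, so `cart_id`, `product_id`, and `quantity` map to `$1`, `$2`, and `$3` respectively.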
### 2. Semantic Search Tools
These tools use vector embeddings to find the most relevant results based on the meaning of a user's query, rather than just keywords. Append the following tools to the `tools` section in your `tools.yaml`:
```yaml
  search-product-recommendations:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      Search for products based on user needs.
      Use this tool to search for products. This tool requires the user's needs.
    parameters:
      - name: query
        type: string
        description: The product characteristics
    statement: |
      SELECT
        product_id,
        name,
        description,
        ROUND(CAST(price AS numeric), 2) as price
      FROM
        products
      ORDER BY
        embedding('gemini-embedding-001', $1)::vector <=> embedding
      LIMIT 5;
```
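If you want the agent to decide how many recommendations to return, the row limit can itself be a bound parameter. This variant is a sketch (the tool name and the `top_k` parameter are assumptions, not part of this quickstart):
```yaml
  search-product-recommendations-top-k:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      Search for products based on user needs, returning a caller-specified
      number of results. This tool requires the user's needs and a result count.
    parameters:
      - name: query
        type: string
        description: The product characteristics
      - name: top_k
        type: integer
        description: Maximum number of products to return
    statement: |
      SELECT
        product_id,
        name,
        description,
        ROUND(CAST(price AS numeric), 2) AS price
      FROM
        products
      ORDER BY
        embedding('gemini-embedding-001', $1)::vector <=> embedding
      LIMIT $2;
```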
### 3. Natural Language to SQL (NL2SQL) Tools
1. Create a [natural language configuration](https://cloud.google.com/alloydb/docs/ai/use-natural-language-generate-sql-queries#create-config) for your AlloyDB cluster.
{{< notice tip >}}Before using NL2SQL tools,
you must first install the `alloydb_ai_nl` extension and
create the [semantic layer](https://cloud.google.com/alloydb/docs/ai/natural-language-overview) under a configuration named `flower_shop`.
{{< /notice >}}
2. Configure your NL2SQL tool to use your configuration. These tools translate natural language questions into SQL queries, allowing users to interact with the database conversationally. Append the following tool to the `tools` section:
```yaml
  ask-questions-about-products:
    kind: alloydb-ai-nl
    source: alloydb-pg-source
    nlConfig: flower_shop
    description: >-
      Ask questions related to products or brands.
      Use this tool to ask questions about products or brands.
      Always SELECT the IDs of objects when generating queries.
```
Finally, group the tools into a `toolset` to make them available to the model. Add the following to the end of your `tools.yaml` file:
```yaml
toolsets:
  flower_shop:
    - access-cart-information
    - search-product-recommendations
    - ask-questions-about-products
    - add-to-cart
    - delete-from-cart
```
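Because a toolset is just a named list of tools, the same `tools.yaml` can expose different subsets to different agents. For example, a read-only grouping (the `flower_shop_readonly` name is illustrative) could sit alongside the full toolset:
```yaml
toolsets:
  flower_shop:
    - access-cart-information
    - search-product-recommendations
    - ask-questions-about-products
    - add-to-cart
    - delete-from-cart
  flower_shop_readonly:
    - access-cart-information
    - search-product-recommendations
    - ask-questions-about-products
```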
For more info on tools, check out the
[Tools](../../resources/tools/) section.
## Step 4: Run the Toolbox server
Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
```bash
./toolbox --tools-file "tools.yaml"
```
## Step 5: Connect to MCP Inspector
1. Run the MCP Inspector:
```bash
npx @modelcontextprotocol/inspector
```
1. Type `y` when it asks to install the inspector package.
1. It should show the following when the MCP Inspector is up and running (please take note of `<YOUR_SESSION_TOKEN>`):
```bash
Starting MCP inspector...
⚙️ Proxy server listening on localhost:6277
🔑 Session token: <YOUR_SESSION_TOKEN>
Use this token to authenticate requests or set DANGEROUSLY_OMIT_AUTH=true to disable auth
🚀 MCP Inspector is up and running at:
http://localhost:6274/?MCP_PROXY_AUTH_TOKEN=<YOUR_SESSION_TOKEN>
```
1. Open the above link in your browser.
1. For `Transport Type`, select `Streamable HTTP`.
1. For `URL`, type in `http://127.0.0.1:5000/mcp`.
1. For `Configuration` -> `Proxy Session Token`, make sure `<YOUR_SESSION_TOKEN>` is present.
1. Click Connect.
1. Select `List Tools` to see the list of tools configured in `tools.yaml`.
1. Test out your tools here!
## What's next
- Learn more about [MCP Inspector](../../how-to/connect_via_mcp.md).
- Learn more about [Toolbox Resources](../../resources/).
- Learn more about [Toolbox How-to guides](../../how-to/).


@@ -220,7 +220,7 @@
},
"outputs": [],
"source": [
"version = \"0.12.0\" # x-release-please-version\n",
"version = \"0.14.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",


@@ -179,7 +179,7 @@ to use BigQuery, and then run the Toolbox server.
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/$OS/toolbox
```
<!-- {x-release-please-end} -->


@@ -98,7 +98,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.12.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.14.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

Some files were not shown because too many files have changed in this diff