Compare commits

...

142 Commits
blog ... update

Author SHA1 Message Date
Mend Renovate
f693f75f38 chore(deps): update module google.golang.org/api to v0.244.0 (#1039)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [google.golang.org/api](https://redirect.github.com/googleapis/google-api-go-client) | `v0.243.0` -> `v0.244.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fapi/v0.244.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fapi/v0.243.0/v0.244.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googleapis/google-api-go-client
(google.golang.org/api)</summary>

### [`v0.244.0`](https://redirect.github.com/googleapis/google-api-go-client/releases/tag/v0.244.0)

[Compare
Source](https://redirect.github.com/googleapis/google-api-go-client/compare/v0.243.0...v0.244.0)

##### Features

- **all:** Auto-regenerate discovery clients
([#&#8203;3241](https://redirect.github.com/googleapis/google-api-go-client/issues/3241))
([2c20485](2c204857ee))
- **all:** Auto-regenerate discovery clients
([#&#8203;3243](https://redirect.github.com/googleapis/google-api-go-client/issues/3243))
([cac72a1](cac72a1458))
- **all:** Auto-regenerate discovery clients
([#&#8203;3244](https://redirect.github.com/googleapis/google-api-go-client/issues/3244))
([e6b1c87](e6b1c8715f))
- **all:** Auto-regenerate discovery clients
([#&#8203;3245](https://redirect.github.com/googleapis/google-api-go-client/issues/3245))
([2c1ff18](2c1ff18dfc))
- **all:** Auto-regenerate discovery clients
([#&#8203;3247](https://redirect.github.com/googleapis/google-api-go-client/issues/3247))
([09e5c07](09e5c0743d))
- **all:** Auto-regenerate discovery clients
([#&#8203;3249](https://redirect.github.com/googleapis/google-api-go-client/issues/3249))
([214eb4e](214eb4ea56))
- **all:** Auto-regenerate discovery clients
([#&#8203;3250](https://redirect.github.com/googleapis/google-api-go-client/issues/3250))
([ce50789](ce50789a30))
- **all:** Auto-regenerate discovery clients
([#&#8203;3251](https://redirect.github.com/googleapis/google-api-go-client/issues/3251))
([e5c3e18](e5c3e1801e))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-31 21:18:09 +00:00
Dr. Strangelove
327ddf0439 feat: new tool - looker-query-url (#1015) 2025-07-31 17:45:45 +00:00
Twisha Bansal
330b14e518 docs: fix method name (#1042) 2025-07-31 10:29:32 +05:30
Shobhit Singh
5bdab8c83b fix: template parameters link in bigquery-sql tool (#1032) 2025-07-31 02:57:09 +00:00
Mend Renovate
dbf355d31a chore(deps): update module modernc.org/sqlite to v1.38.2 (#1006)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [modernc.org/sqlite](https://gitlab.com/cznic/sqlite) | `v1.38.0` -> `v1.38.2` | [![age](https://developer.mend.io/api/mc/badges/age/go/modernc.org%2fsqlite/v1.38.2?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/modernc.org%2fsqlite/v1.38.0/v1.38.2?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>cznic/sqlite (modernc.org/sqlite)</summary>

### [`v1.38.2`](https://gitlab.com/cznic/sqlite/compare/v1.38.1...v1.38.2)

[Compare
Source](https://gitlab.com/cznic/sqlite/compare/v1.38.1...v1.38.2)

### [`v1.38.1`](https://gitlab.com/cznic/sqlite/compare/v1.38.0...v1.38.1)

[Compare
Source](https://gitlab.com/cznic/sqlite/compare/v1.38.0...v1.38.1)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-30 01:44:47 +00:00
Mend Renovate
3bdb12f7a7 chore(deps): update module github.com/marcboeker/go-duckdb/v2 to v2.3.4 (#1029)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [github.com/marcboeker/go-duckdb/v2](https://redirect.github.com/marcboeker/go-duckdb) | `v2.3.3` -> `v2.3.4` | [![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fmarcboeker%2fgo-duckdb%2fv2/v2.3.4?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fmarcboeker%2fgo-duckdb%2fv2/v2.3.3/v2.3.4?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>marcboeker/go-duckdb
(github.com/marcboeker/go-duckdb/v2)</summary>

### [`v2.3.4`](https://redirect.github.com/marcboeker/go-duckdb/compare/v2.3.3...v2.3.4)

[Compare
Source](https://redirect.github.com/marcboeker/go-duckdb/compare/v2.3.3...v2.3.4)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-30 01:29:06 +00:00
Pranava B
fd149337e9 feat: add support for DuckDB (#879)
Fixes #861 
This PR adds support for DuckDB, which is a free, open-source, embedded,
in-process relational database management system (RDBMS) designed for
analytical processing (OLAP).

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-07-29 14:31:22 -07:00
Averi Kitsch
c1305b5ab4 chore: remove link checker due to flakiness (#1027) 2025-07-29 13:19:52 -07:00
Averi Kitsch
3003b45256 chore: Update blunderbuss.yml (#1026) 2025-07-29 13:00:40 -07:00
Mend Renovate
1a5fda34b1 chore(deps): update actions/checkout action to v4 (#1018)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/checkout](https://redirect.github.com/actions/checkout) | action | major | `v2` -> `v4` |

---

### Release Notes

<details>
<summary>actions/checkout (actions/checkout)</summary>

### [`v4`](https://redirect.github.com/actions/checkout/blob/HEAD/CHANGELOG.md#v422)

[Compare
Source](https://redirect.github.com/actions/checkout/compare/v3...v4)

- `url-helper.ts` now leverages well-known environment variables by
[@&#8203;jww3](https://redirect.github.com/jww3) in
[https://github.com/actions/checkout/pull/1941](https://redirect.github.com/actions/checkout/pull/1941)
- Expand unit test coverage for `isGhes` by
[@&#8203;jww3](https://redirect.github.com/jww3) in
[https://github.com/actions/checkout/pull/1946](https://redirect.github.com/actions/checkout/pull/1946)

### [`v3`](https://redirect.github.com/actions/checkout/blob/HEAD/CHANGELOG.md#v360)

[Compare
Source](https://redirect.github.com/actions/checkout/compare/v2...v3)

- [Fix: Mark test scripts with Bash'isms to be run via
Bash](https://redirect.github.com/actions/checkout/pull/1377)
- [Add option to fetch tags even if fetch-depth >
0](https://redirect.github.com/actions/checkout/pull/579)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-07-28 16:19:15 -07:00
Mend Renovate
3353085265 chore(deps): pin dependencies (#1017)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [JustinBeckwith/linkinator-action](https://redirect.github.com/JustinBeckwith/linkinator-action) | action | pinDigest | -> `3d5ba09` |
| [actions/checkout](https://redirect.github.com/actions/checkout) | action | pinDigest | -> `ee0669b` |

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-28 15:35:53 -07:00
Averi Kitsch
a279d32c57 docs: add link checker and fix broken links (#1014) 2025-07-28 14:51:18 -07:00
Dr. Strangelove
0568423e33 docs: Fix looker source links from looker tools. (#1013)
The relative links to the Looker source documentation page weren't
resolving correctly on the documentation website.

I updated the links in all of the Looker tool markdown files to use a
different relative path that should be correctly interpreted by the Hugo
static site generator. I changed the link from `../sources/looker.md` to
`../../sources/looker/`.

Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
2025-07-28 09:18:01 -07:00
Matt Cornillon
92845c943a docs(samples): Adding AlloyDB samples (#987)
Co-authored-by: Matt Cornillon <cornillon@google.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-07-25 11:38:19 -07:00
Yuan Teoh
90d4558a8e docs: update docs lint (#995) 2025-07-25 17:26:28 +00:00
prernakakkar-google
7791c6f87e docs: Minor documentation fixes for AlloyDB Admin API using MCP (#1003) 2025-07-25 09:17:55 -07:00
Pranava B
8ff60ca430 feat: add homebrew installation support for toolbox (#936)
Fixes #820 

- Added installation and upgrade support for toolbox via Homebrew:
https://github.com/Homebrew/homebrew-core/pull/231149,
https://github.com/Homebrew/homebrew-core/pull/230590
- This PR updates the documentation files to include the same.

Install toolbox via homebrew with:
```
brew install mcp-toolbox
```

Start the server using the command:
```
toolbox --tools-file "tools.yaml"
```
2025-07-25 14:17:22 +00:00
release-please[bot]
c45390e6f7 chore(main): release 0.10.0 (#886)
🤖 I have created a release *beep* *boop*
---


## [0.10.0](https://github.com/googleapis/genai-toolbox/compare/v0.9.0...v0.10.0) (2025-07-25)


### Features

* Add `Map` parameters support
([#928](https://github.com/googleapis/genai-toolbox/issues/928))
([4468bc9](4468bc920b))
* Add Dataplex source and tool
([#847](https://github.com/googleapis/genai-toolbox/issues/847))
([30c16a5](30c16a559e))
* Add Looker source and tool
([#923](https://github.com/googleapis/genai-toolbox/issues/923))
([c67e01b](c67e01bcf9))
* Add support for null optional parameter
([#802](https://github.com/googleapis/genai-toolbox/issues/802))
([a817b12](a817b120ca)),
closes [#736](https://github.com/googleapis/genai-toolbox/issues/736)
* **prebuilt/alloydb-admin-config:** Add alloydb control plane as a
prebuilt config
([#937](https://github.com/googleapis/genai-toolbox/issues/937))
([0b28b72](0b28b72aa0))
* **prebuilt/mysql,prebuilt/mssql:** Add generic mysql and mssql
prebuilt tools
([#983](https://github.com/googleapis/genai-toolbox/issues/983))
([c600c30](c600c30374))
* **server/mcp:** Support MCP version 2025-06-18
([#898](https://github.com/googleapis/genai-toolbox/issues/898))
([313d3ca](313d3ca0d0))
* **sources/mssql:** Add support for encrypt connection parameter
([#874](https://github.com/googleapis/genai-toolbox/issues/874))
([14a868f](14a868f2a0))
* **sources/firestore:** Add Firestore as Source
([#786](https://github.com/googleapis/genai-toolbox/issues/786))
([2bb790e](2bb790e4f8))
* **sources/mongodb:** Add MongoDB Source
([#969](https://github.com/googleapis/genai-toolbox/issues/969))
([74dbd61](74dbd6124d))
* **tools/alloydb-wait-for-operation:** Add wait for operation tool with
exponential backoff
([#920](https://github.com/googleapis/genai-toolbox/issues/920))
([3f6ec29](3f6ec2944e))
* **tools/mongodb-aggregate:** Add MongoDB `aggregate` Tools
([#977](https://github.com/googleapis/genai-toolbox/issues/977))
([bd399bb](bd399bb0fb))
* **tools/mongodb-delete:** Add MongoDB `delete` Tools
([#974](https://github.com/googleapis/genai-toolbox/issues/974))
([78e9752](78e9752f62))
* **tools/mongodb-find:** Add MongoDB `find` Tools
([#970](https://github.com/googleapis/genai-toolbox/issues/970))
([a747475](a7474752d8))
* **tools/mongodb-insert:** Add MongoDB `insert` Tools
([#975](https://github.com/googleapis/genai-toolbox/issues/975))
([4c63f0c](4c63f0c1e4))
* **tools/mongodb-update:** Add MongoDB `update` Tools
([#972](https://github.com/googleapis/genai-toolbox/issues/972))
([dfde52c](dfde52ca9a))
* **tools/neo4j-execute-cypher:** Add neo4j-execute-cypher for Neo4j
sources ([#946](https://github.com/googleapis/genai-toolbox/issues/946))
([81d0505](81d05053b2))
* **tools/neo4j-schema:** Add neo4j-schema tool
([#978](https://github.com/googleapis/genai-toolbox/issues/978))
([be7db3d](be7db3dff2))
* **tools/wait:** Create wait for tool
([#885](https://github.com/googleapis/genai-toolbox/issues/885))
([ed5ef4c](ed5ef4caea))


### Bug Fixes

* Fix document preview pipeline for forked PRs
([#950](https://github.com/googleapis/genai-toolbox/issues/950))
([481cc60](481cc608ba))
* **prebuilt/firestore:** Mark database field as required in the
firestore prebuilt tools
([#959](https://github.com/googleapis/genai-toolbox/issues/959))
([15417d4](15417d4e0c))
* **prebuilt/cloud-sql-mssql:** Correct source reference for execute_sql
tool in cloud-sql-mssql.yaml prebuilt config
([#938](https://github.com/googleapis/genai-toolbox/issues/938))
([d16728e](d16728e5c6))
* **prebuilt/cloud-sql-mysql:** Update list_table tool
([#924](https://github.com/googleapis/genai-toolbox/issues/924))
([2083ba5](2083ba5048))
* Replace 'float' with 'number' in McpManifest
([#985](https://github.com/googleapis/genai-toolbox/issues/985))
([59e23e1](59e23e1725))
* **server/api:** Add logger to context in tool invoke handler
([#891](https://github.com/googleapis/genai-toolbox/issues/891))
([8ce311f](8ce311f256))
* **sources/looker:** Add agent tag to Looker API calls.
([#966](https://github.com/googleapis/genai-toolbox/issues/966))
([f55dd6f](f55dd6fcd0))
* **tools/bigquery-execute-sql:** Ensure invoke always returns a
non-null value
([#925](https://github.com/googleapis/genai-toolbox/issues/925))
([9a55b80](9a55b80482))
* **tools/mysqlsql:** Unmarshal json data from database during invoke
([#979](https://github.com/googleapis/genai-toolbox/issues/979))
([ccc3498](ccc3498cf0)),
closes [#840](https://github.com/googleapis/genai-toolbox/issues/840)

---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-24 17:58:37 -07:00
nester-neo4j
be7db3dff2 feat(tools/neo4j-schema): add neo4j-schema tool (#978)
This pull request introduces a new tool, `neo4j-schema`, for extracting
and processing comprehensive schema information from Neo4j databases. It
includes updates to the documentation, implementation of caching
mechanisms, helper utilities for schema transformation, and
corresponding unit tests. The most important changes are grouped by
theme below:

### Tool Integration:
- **`cmd/root.go`**: Added import for the new `neo4j-schema` tool to
integrate it into the application.

### Documentation:
- **`docs/en/resources/tools/neo4j/neo4j-schema.md`**: Added detailed
documentation for the `neo4j-schema` tool, describing its functionality,
caching behavior, and usage examples.

### Caching Implementation:
- **`internal/tools/neo4j/neo4jschema/cache/cache.go`**: Implemented a
thread-safe, in-memory cache with expiration and an optional janitor for
cleaning expired items (sketched below).
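
A minimal sketch of such a TTL cache (type and method names here are illustrative, not taken from the actual implementation):

```
package main

import (
    "fmt"
    "sync"
    "time"
)

type item struct {
    value     any
    expiresAt time.Time
}

// Cache is a thread-safe in-memory cache with per-entry expiration.
// The real implementation additionally runs an optional janitor goroutine
// that periodically evicts expired entries.
type Cache struct {
    mu    sync.RWMutex
    items map[string]item
}

func New() *Cache { return &Cache{items: make(map[string]item)} }

func (c *Cache) Set(key string, v any, ttl time.Duration) {
    c.mu.Lock()
    defer c.mu.Unlock()
    c.items[key] = item{value: v, expiresAt: time.Now().Add(ttl)}
}

func (c *Cache) Get(key string) (any, bool) {
    c.mu.RLock()
    defer c.mu.RUnlock()
    it, ok := c.items[key]
    if !ok || time.Now().After(it.expiresAt) {
        return nil, false
    }
    return it.value, true
}

func main() {
    c := New()
    c.Set("schema", "cached-schema", 2*time.Minute)
    if v, ok := c.Get("schema"); ok {
        fmt.Println(v)
    }
}
```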

### Unit Tests:
- **`internal/tools/neo4j/neo4jschema/cache/cache_test.go`**: Added
comprehensive tests for the caching system, including functionality for
setting, retrieving, expiration, janitor cleanup, and concurrent access.

### Helper Utilities:
- **`internal/tools/neo4j/neo4jschema/helpers/helpers.go`**: Added
utility functions for processing schema data, including support for APOC
and native Cypher queries, and converting raw query results into
structured formats.

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-25 00:40:16 +00:00
Yuan Teoh
7e7d55c5d1 chore: add new docs to release please extraFiles (#994)
Add additional docs files and sort extraFiles list in alphabetical
order.
2025-07-24 17:10:09 -07:00
Anuj Jhunjhunwala
30c16a559e feat: add Dataplex source and tool (#847)
- Users can choose their preferred client. The example below uses Gemini CLI.

- Users can use the pre-built Dataplex tools by creating a settings.json
file under the .gemini directory. The contents of settings.json would be as
follows:

```
{
  "mcpServers": {
    "dataplex": {
      "command": "./toolbox",
      "args": ["--prebuilt","dataplex","--stdio"],
      "env": {
          "DATAPLEX_PROJECT": "test-project"
      }
    }
  }
}
```

Fixes #831

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
Co-authored-by: Mateusz Nowak <matnow@google.com>
Co-authored-by: Mateusz Nowak <kontakt@mateusznowak.pl>
2025-07-24 15:31:35 -07:00
ShanQincheng
14a868f2a0 feat(sources/mssql): add support for encrypt connection parameter (#874)
## 1. Why do we need to support the `encrypt` parameter?
MSSQL databases that `genai-toolbox` attempts to connect to may have
their encryption levels set differently. For example, a testing/demo
purpose MSSQL database may not require encryption at all. However,
`genai-toolbox` currently uses the default encryption parameter
(`encrypt=false`) to connect to this type of database and will throw an
error:
```
ERROR "toolbox failed to initialize: unable to initialize configs: unable to initialize source "my-mssql-source": unable to connect successfully: TLS Handshake failed: cannot read handshake packet: EOF"
```
> In this case, the encryption parameter should be set to
`encrypt=disable`.


## 2. Is this a necessary feature?
`genai-toolbox` uses the `github.com/microsoft/go-mssqldb` package as a
dependency to connect to MSSQL databases. According to the
[README](https://github.com/microsoft/go-mssqldb?tab=readme-ov-file#common-parameters)
of the `github.com/microsoft/go-mssqldb` package, `encrypt` is one of
the common parameters. Therefore, I believe supporting the `encrypt`
parameter in `genai-toolbox` is necessary.

## 3. How to replicate the error mentioned above?
### 3.1 Use this `docker-compose.yaml` file to start a demo MSSQL
instance
```
services:
  demo-mssql-database:
    image: mcr.microsoft.com/mssql/server:2017-CU1-ubuntu
    ports:
      - "20256:1433"
    environment:
      ACCEPT_EULA: "Y"
      MSSQL_SA_PASSWORD: "hellopassword!"
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "/opt/mssql-tools/bin/sqlcmd", "-S", "localhost", "-U", "sa", "-P", "hellopassword!", "-Q", "SELECT 1"]
      interval: 5s
      retries: 10

  demo-mssql-database-init:
    image: mcr.microsoft.com/mssql/server:2017-CU1-ubuntu
    network_mode: service:demo-mssql-database
    command: >
      /bin/bash -c "/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P hellopassword! -d master -Q 'CREATE DATABASE DemoDatabase;'"
    depends_on:
      demo-mssql-database:
        condition: service_healthy
```

### 3.2 Use `genai-toolbox` to connect to the above demo MSSQL database
with this `tools.yaml` configuration file:
```
sources:
       my-mssql-source:
                kind: mssql
                host: localhost
                port: 20256
                database: master
                user: sa
                password: 'hellopassword!'
```

### 3.3 We shall see the error:
```
ERROR "toolbox failed to initialize: unable to initialize configs: unable to initialize source "my-mssql-source": unable to connect successfully: TLS Handshake failed: cannot read handshake packet: EOF"
```
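
For reference, a minimal Go sketch of how the underlying go-mssqldb driver accepts this parameter (connection-string form taken from its README; the demo credentials from the compose file above are reused):

```
package main

import (
    "database/sql"
    "log"

    _ "github.com/microsoft/go-mssqldb" // registers the "sqlserver" driver
)

func main() {
    // encrypt=disable turns off TLS negotiation, avoiding the
    // "TLS Handshake failed: cannot read handshake packet: EOF" error above.
    dsn := "sqlserver://sa:hellopassword!@localhost:20256?database=master&encrypt=disable"
    db, err := sql.Open("sqlserver", dsn)
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
    if err := db.Ping(); err != nil {
        log.Fatal(err)
    }
    log.Println("connected without encryption")
}
```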

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-24 21:51:25 +00:00
Averi Kitsch
3746dbae65 docs: fix typos in MCP docs for Postgres (#991)
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-24 21:38:55 +00:00
Averi Kitsch
25a0bb7a37 docs: fix typos in MCP docs (#990)
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-24 21:14:37 +00:00
Wenxin Du
bd399bb0fb ci: Add MongoDB aggregate Tool and integration test (#977)
Co-authored-by: Author: Dennis Geurts <dennisg@dennisg.nl>
2025-07-24 16:49:41 -04:00
Wenxin Du
4c63f0c1e4 feat: Add MongoDB insert Tools (#975)
Add MongoDB `insert` Tools:

- mongodb-insert-one
- mongodb-insert-many

---------
Co-authored-by: Author: Dennis Geurts <dennisg@dennisg.nl>
2025-07-24 15:54:12 -04:00
Wenxin Du
78e9752f62 feat: Add MongoDB delete Tools (#974)
Add MongoDB `delete` Tools:

- mongodb-delete-one
- mongodb-delete-many

---------

Co-authored-by: Author: Venkatesh Shanbhag <91714892+theshanbhag@users.noreply.github.com>
Co-authored-by: Author: Dennis Geurts <dennisg@dennisg.nl>
2025-07-24 15:24:24 -04:00
Wenxin Du
dfde52ca9a feat: Add MongoDB update Tools (#972)
Add MongoDB Tools:
- mongodb-update-one
- mongodb-update-many

---------
Co-authored-by: Author: Dennis Geurts <dennisg@dennisg.nl>
2025-07-24 15:08:27 -04:00
Wenxin Du
a7474752d8 feat: Add MongoDB find Tools (#970)
Add MongoDB Tools:
- mongodb-find
- mongodb-find-one

---------
Co-authored-by: Author: Dennis Geurts <dennisg@dennisg.nl>
Co-authored-by: Kurtis Van Gent <31518063+kurtisvg@users.noreply.github.com>
2025-07-24 14:46:29 -04:00
prernakakkar-google
be65924aa6 docs: Add documentation for alloydb control plane tools (#988)
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-07-24 11:20:46 -07:00
bitsark
59e23e1725 fix: replace 'float' with 'number' in McpManifest (#985)
According to the JSON Schema spec, there are two numeric types: integer and
number, so 'float' does not work with mcpcurl and mcphost.

Fixes #984, #797

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-24 18:03:34 +00:00
Wenxin Du
74dbd6124d feat: Add MongoDB Source (#969)
Add MongoDB Source

---------

Co-authored-by: Author: Venkatesh Shanbhag <91714892+theshanbhag@users.noreply.github.com>
Co-authored-by: Kurtis Van Gent <31518063+kurtisvg@users.noreply.github.com>
2025-07-24 13:48:58 -04:00
prernakakkar-google
c600c30374 feat(prebuilt/mysql,prebuilt/mssql): add generic mysql and mssql prebuilt tools (#983)
Co-authored-by: Averi Kitsch <akitsch@google.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-24 22:07:07 +05:30
Yuan Teoh
ccc3498cf0 fix(tools/mysqlsql): unmarshal json data from database during invoke (#979)
The mysql driver returns []uint8 values for JSON columns, so we need to
cast them.

Toolbox now unmarshals JSON data retrieved from the database during
invocation. If the results were converted directly into strings, they would
be marshaled again when returning results to the user (causing double
marshaling).
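
A minimal sketch of the cast-and-unmarshal step (illustrative only, not the actual code):

```
package main

import (
    "encoding/json"
    "fmt"
    "log"
)

func main() {
    // A JSON column value as the mysql driver hands it back: []uint8 ([]byte).
    var raw any = []uint8(`{"a": 1, "b": ["x", "y"]}`)

    if b, ok := raw.([]byte); ok {
        var v any
        if err := json.Unmarshal(b, &v); err != nil {
            log.Fatal(err)
        }
        // v is now a decoded map/slice, so it gets marshaled only once when
        // the tool result is returned, avoiding double marshaling.
        fmt.Printf("%#v\n", v)
    }
}
```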

Fixes #840
2025-07-24 16:25:53 +00:00
Dr. Strangelove
f55dd6fcd0 fix: add agent tag to Looker API calls. (#966) 2025-07-24 09:19:25 -06:00
trehanshakuntG
1526b8ab8c docs: update firestore documentation (#982)
Co-authored-by: Anubhav Dhawan <anubhavdhawan@google.com>
2025-07-24 08:56:33 +00:00
prernakakkar-google
2a1b1ff787 feat: Add indexing to docs for utility (#981) 2025-07-24 13:26:20 +05:30
Averi Kitsch
86ccff0b43 docs: fix wait tool notice tag (#968) 2025-07-23 14:20:28 -07:00
Dr. Strangelove
d61e552ead chore: refactor some attributes in Looker (#965) 2025-07-23 15:10:19 -04:00
Yuan Teoh
9c289da638 chore: update instructions for adding integration test coverage (#964) 2025-07-23 18:59:11 +00:00
prernakakkar-google
0b28b72aa0 feat: Add alloydb control plane as a prebuilt config (#937) 2025-07-24 00:14:43 +05:30
prernakakkar-google
3f6ec2944e feat: Add wait for operation tool with exponential backoff (#920)
Example:

```
  alloydb-operations-get:
    kind: wait-for-operation
    source: alloydb-api-source
    method: GET
    path: /v1/projects/{{.projectId}}/locations/{{.locationId}}/operations/{{.operationId}}
    description: "Makes API call to check whether operation is done or not. This tool is run first then wait for tool. if its still in create phase trigger it after 3 minutes.  Print a message saying still not done. We will notify once its done."
    pathParams:
      - name: projectId
        type: string
        description: The dynamic path parameter
      - name: locationId
        type: string
        description: The dynamic path parameter
        default: us-central1
      - name: operationId
        type: string
        description: Operation status check for previous task

```
2025-07-24 00:04:36 +05:30
Dr. Strangelove
c67e01bcf9 feat: Looker MCP Server (#923)
Add support for Looker with the following prebuilt tools:
* get_models
* get_explores
* get_dimensions
* get_measures
* get_filters
* get_parameters
* query
* query_sql
* get_looks
* run_look

---------

Co-authored-by: duwenxin <duwenxin@google.com>
2025-07-23 18:12:06 +00:00
nester-neo4j
81d05053b2 feat: Neo4j tools enhancements - neo4j-execute-cypher (#946)
This pull request introduces a new tool for executing arbitrary Cypher
queries against a Neo4j database (`neo4j-execute-cypher`) and implements
robust query classification functionality to distinguish between read
and write operations. The changes include updates to documentation, the
addition of a query classifier, and comprehensive test coverage for the
classifier.

### Addition of `neo4j-execute-cypher` tool:

- **Documentation**: Added a new markdown file `neo4j-execute-cypher.md`
that explains the tool's functionality, usage, and configuration
options, including the ability to enforce read-only mode for security.
- **Import statement**: Registered the new tool in the `cmd/root.go`
file to make it available in the toolbox.

### Query classification functionality:

- **Query classifier implementation**: Added `QueryClassifier` in
`classifier.go`, which classifies Cypher queries into read or write
operations based on keywords, procedures, and subquery analysis. It
handles edge cases like nested subqueries, multi-word keywords, and
invalid syntax (a toy sketch of the keyword check follows after this list).
- **Test coverage**: Created extensive tests in `classifier_test.go` to
validate the classifier's behavior across various query types, including
abuse cases, subqueries, and procedure calls. Tests ensure the
classifier is robust and does not panic on malformed queries.
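
A toy illustration of the keyword-based part of that classification (heavily simplified; the real classifier also inspects procedure calls and subqueries):

```
package main

import (
    "fmt"
    "strings"
)

// writeKeywords is a simplified list of Cypher clauses that mutate data.
var writeKeywords = []string{"CREATE", "MERGE", "DELETE", "SET", "REMOVE", "DROP"}

// isWriteQuery reports whether the query contains a write clause.
func isWriteQuery(cypher string) bool {
    q := strings.ToUpper(cypher)
    for _, kw := range writeKeywords {
        if strings.Contains(q, kw) {
            return true
        }
    }
    return false
}

func main() {
    fmt.Println(isWriteQuery("MATCH (n:Person) RETURN n LIMIT 5")) // false
    fmt.Println(isWriteQuery("MERGE (n:Person {name: 'Ada'})"))    // true
}
```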

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-23 17:33:44 +00:00
trehanshakuntG
15417d4e0c fix: mark database field as required in the firestore prebuilt tools (#959)
Env variables cannot be used as optional parameters, hence marking
database field as required.
2025-07-23 10:13:12 -07:00
dishaprakash
9aa6aa079d docs: Update sample for Genkit Go using tbgenkit package (#943)
Previously, code snippets on how to use the new Toolbox Go Core SDK were
added to the README.
Recently we released a `tbgenkit` client-side package to make
integration with Genkit Go easier.
This PR updates the code snippet to use the newer package.

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-23 11:59:47 +05:30
Anubhav Dhawan
fa3e9ac04b docs: fix type guidance for map parameter (#951)
`number` is now changed to `integer` and `float`.
2025-07-22 22:19:02 +00:00
Yuan Teoh
4ee8cfa1f4 chore: fix labels description (#957)
The `sync label` job failed during PR merge due to a parsing failure.
2025-07-22 21:53:41 +00:00
Mend Renovate
552e86bc43 chore(deps): update module google.golang.org/api to v0.243.0 (#956)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [google.golang.org/api](https://redirect.github.com/googleapis/google-api-go-client) | `v0.242.0` -> `v0.243.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fapi/v0.243.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fapi/v0.242.0/v0.243.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googleapis/google-api-go-client
(google.golang.org/api)</summary>

### [`v0.243.0`](https://redirect.github.com/googleapis/google-api-go-client/releases/tag/v0.243.0)

[Compare
Source](https://redirect.github.com/googleapis/google-api-go-client/compare/v0.242.0...v0.243.0)

##### Features

- **all:** Auto-regenerate discovery clients
([#&#8203;3233](https://redirect.github.com/googleapis/google-api-go-client/issues/3233))
([a269dca](a269dca39e))
- **all:** Auto-regenerate discovery clients
([#&#8203;3235](https://redirect.github.com/googleapis/google-api-go-client/issues/3235))
([b656000](b656000d19))
- **all:** Auto-regenerate discovery clients
([#&#8203;3236](https://redirect.github.com/googleapis/google-api-go-client/issues/3236))
([971135a](971135a022))
- **all:** Auto-regenerate discovery clients
([#&#8203;3237](https://redirect.github.com/googleapis/google-api-go-client/issues/3237))
([be7e601](be7e601ced))
- **all:** Auto-regenerate discovery clients
([#&#8203;3239](https://redirect.github.com/googleapis/google-api-go-client/issues/3239))
([b2202ca](b2202ca571))
- **all:** Auto-regenerate discovery clients
([#&#8203;3240](https://redirect.github.com/googleapis/google-api-go-client/issues/3240))
([ceceb79](ceceb79c86))

##### Bug Fixes

- **gensupport:** Update chunk upload logic for robust timeout handling.
([#&#8203;3208](https://redirect.github.com/googleapis/google-api-go-client/issues/3208))
([93865aa](93865aac32))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-22 21:21:50 +00:00
Kurtis Van Gent
1387f858b6 chore: update labels.yaml (#901) 2025-07-22 19:27:45 +00:00
Twisha Bansal
481cc608ba fix: fix document preview pipeline for forked PRs (#950)
The checkout action uses `github.sha` as the ref by default. For forked
repos, this refers to the last commit on the main branch
([ref](https://docs.github.com/en/actions/reference/events-that-trigger-workflows#fork)).

This means that the docs preview pipeline, when run on forked PRs, does
not let us preview the changes made in the PR.

`github.event.pull_request.head.sha` allows us to check out the latest
commit on the forked PR.
2025-07-22 21:59:44 +05:30
Niraj Nandre
d16728e5c6 fix: correct source reference for execute_sql tool in internal/prebuiltconfigs/tools/cloud-sql-mssql.yaml (#938)
### What this PR does:

This Pull Request resolves a critical configuration error in the
`cloud-sql-mssql.yaml` prebuilt tool definition, which was preventing
the `execute_sql` tool from correctly initializing and connecting to a
Cloud SQL for SQL Server instance.

### Problem:

Previously, the `source` field in
`internal/prebuiltconfigs/tools/cloud-sql-mssql.yaml` was incorrectly
configured as `cloud-sql-mssql-source`. This led to the `genai-toolbox`
server failing to find the corresponding data source, resulting in
errors like:
2025-07-22 15:09:09 +00:00
Anushka Saxena
49b1562f73 docs: add llamaindex integration to js quickstart (#931)
### Description

Enhance the [JavaScript
Quickstart](https://googleapis.github.io/genai-toolbox/getting-started/local_quickstart_js/)
by adding a new tab dedicated to integrating the MCP Toolbox with
LlamaIndex. This provides users with a direct, runnable example of how
to utilize Toolbox's capabilities within a LlamaIndex-powered LLM agent
for hotel search and booking operations.

The changes include:
- Addition of a new `LlamaIndex` tab in `docs/quickstarts/js-local.md`.
- Corresponding `hotelAgent.js` code snippet for LlamaIndex integration.

### Related Issue

This PR addresses and resolves #930.

---------

Signed-off-by: Anushka Saxena <anushka.saxena-it-2k19-05@iips.edu.in>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-07-22 16:20:33 +05:30
Anushka Saxena
a1def43b35 docs: add available tools for each source (#914)
- This PR adds an `"Available Tools"` section under each source page in
the
[documentation](https://googleapis.github.io/genai-toolbox/resources/sources/).
- The purpose is to help users quickly identify relevant tools
compatible with each data source, improving discoverability and
developer experience.

---------

Signed-off-by: Anushka Saxena <anushkasaxenaa@google.com>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-07-22 15:53:39 +05:30
dishaprakash
480d76dfff docs: add Toolbox Go SDK quickstart (#941)
docs: Add Toolbox Go SDK Quickstart
2025-07-22 13:51:49 +05:30
Huan Chen
9334368a42 chore: fix dry run location (#947)
Updated the dry run in execute sql to also include a location, fixing the
potential issue in PR #925.
2025-07-21 20:53:19 -07:00
nester-neo4j
2a650349cb chore: rename neo4j package to neo4jcypher (#945)
**Package Renaming**: The `neo4j` package has been renamed to
`neo4jcypher` to better reflect its functionality. This change affects
the tool's implementation
(`internal/tools/neo4j/neo4jcypher/neo4jcypher.go`) and its associated
tests (`internal/tools/neo4j/neo4jcypher/neo4jcypher_test.go`).
2025-07-21 22:09:12 +00:00
Kurtis Van Gent
5f7cc32127 chore: update blunderbuss for kvg-ooo (#942) 2025-07-21 18:21:44 +00:00
Mend Renovate
abdab54503 chore(deps): update module github.com/couchbase/gocb/v2 to v2.10.1 (#939)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [github.com/couchbase/gocb/v2](https://redirect.github.com/couchbase/gocb) | `v2.10.0` -> `v2.10.1` | [![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fcouchbase%2fgocb%2fv2/v2.10.1?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fcouchbase%2fgocb%2fv2/v2.10.0/v2.10.1?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>couchbase/gocb (github.com/couchbase/gocb/v2)</summary>

### [`v2.10.1`](https://redirect.github.com/couchbase/gocb/compare/v2.10.0...v2.10.1)

[Compare
Source](https://redirect.github.com/couchbase/gocb/compare/v2.10.0...v2.10.1)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-21 11:14:43 -07:00
Yuan Teoh
e78bce32dc style(tools/firestore): fix linting (#944) 2025-07-21 17:49:45 +00:00
Yuan Teoh
3727b1d053 docs: add parameter required field to docs (#917)
add documentation for the `required` field.
2025-07-21 17:41:58 +00:00
Yuan Teoh
7eff0f9ac7 test: integration tests for null optional parameters (#889)
Note: null optional parameters currently don't work with BigQuery.
2025-07-21 17:32:44 +00:00
Wenxin Du
4468bc920b feat: Add Map parameters support (#928)
Support both generic and typed maps. Config example:
```
 parameters:
      - name: user_scores
        type: map
        description: A map of user IDs to their scores. All scores must be integers.
        valueType: integer # This enforces the value type for all entries. Leave it blank for generic map

```
Maps are represented as `Object` with `additionalProperties` in manifests.

Added a util function to convert json.Number (which is backed by a string)
to int/float types, addressing the problem where int/float values in a
generic map were returned as strings.
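
A rough sketch of that conversion (the helper name and the decoding demo are illustrative, not from the actual code):

```
package main

import (
    "encoding/json"
    "fmt"
    "log"
    "strings"
)

// toGoNumber is a hypothetical helper: json.Number is backed by a string,
// so try int64 first and fall back to float64.
func toGoNumber(n json.Number) (any, error) {
    if i, err := n.Int64(); err == nil {
        return i, nil
    }
    f, err := n.Float64()
    if err != nil {
        return nil, err
    }
    return f, nil
}

func main() {
    dec := json.NewDecoder(strings.NewReader(`{"alice": 10, "bob": 9.5}`))
    dec.UseNumber() // decode numbers as json.Number instead of float64
    var scores map[string]any
    if err := dec.Decode(&scores); err != nil {
        log.Fatal(err)
    }
    for k, v := range scores {
        conv, _ := toGoNumber(v.(json.Number))
        fmt.Printf("%s: %v (%T)\n", k, conv, conv)
    }
}
```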
2025-07-18 17:19:09 -04:00
Huan Chen
9a55b80482 fix(tools/bigquery-execute-sql): ensure invoke always returns a non-null value (#925)
- Added a dry run step to identify the query type (e.g., SELECT, DML),
which allows the tool to correctly handle the query's output.
- The recommended high-level client, cloud.google.com/go/bigquery, does
not expose the statement type from a dry run. To circumvent this
limitation, the low-level BigQuery REST API client
(google.golang.org/api/bigquery/v2) was added to gain access to these
necessary details (a sketch using that client follows below).
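
A rough sketch of such a dry run with the low-level client (field names follow the BigQuery REST API; the project ID and query are placeholders, not from the actual tool):

```
package main

import (
    "context"
    "fmt"
    "log"

    "google.golang.org/api/bigquery/v2"
)

func main() {
    ctx := context.Background()
    srv, err := bigquery.NewService(ctx)
    if err != nil {
        log.Fatal(err)
    }
    job := &bigquery.Job{
        Configuration: &bigquery.JobConfiguration{
            DryRun: true, // validate and classify the query without running it
            Query:  &bigquery.JobConfigurationQuery{Query: "SELECT 1"},
        },
    }
    res, err := srv.Jobs.Insert("my-project", job).Do() // placeholder project ID
    if err != nil {
        log.Fatal(err)
    }
    if res.Statistics != nil && res.Statistics.Query != nil {
        // e.g. "SELECT", "INSERT" — used to decide how to shape the result.
        fmt.Println(res.Statistics.Query.StatementType)
    }
}
```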

fixes: #915

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-18 17:17:45 +00:00
Twisha Bansal
e5ac5ba9ee docs: fix to include correct way to authenticate to cloud run instances (#896)
Corresponding to
https://github.com/googleapis/mcp-toolbox-sdk-python/pull/313
2025-07-18 14:22:13 +05:30
trehanshakuntG
2bb790e4f8 feat: Add Firestore as Source (#786)
Firestore is a NoSQL document database built for automatic scaling, high
performance, and ease of application development. It's a fully managed,
serverless database that supports mobile, web, and server development.

This change adds Firestore as a source in toolbox
2025-07-18 11:42:07 +05:30
Yuan Teoh
2083ba5048 fix(prebuilt/cloud-sql-mysql): update list_table tool (#924)
This PR consists of two fixes:
1. Added `default` to the table_names parameter since the default value is
an empty string.
2. Updated the SQL statement to reuse the parameter within the statement.

Partially fixes #877
2025-07-17 13:40:25 -07:00
Yuan Teoh
53afed5b76 chore(tools): invoke return type any instead of []any (#904)
Update `tool.Invoke()` to return type `any` instead of `[]any`.

Toolbox returns a map with the `results` key, and the SDK reads the
string from that key, so this won't break existing SDK implementations.

Fixes #870
2025-07-17 11:03:54 -07:00
Harsh Jha
2b2732ec39 docs: fix valkey doc broken link (#916)
On the
[genai-toolbox Valkey page](https://googleapis.github.io/genai-toolbox/resources/sources/valkey/),
the link to the official Valkey documentation is broken.

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-17 06:23:19 +00:00
Mend Renovate
9b1505e4bd chore(deps): update module google.golang.org/api to v0.242.0 (#913)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [google.golang.org/api](https://redirect.github.com/googleapis/google-api-go-client) | `v0.241.0` -> `v0.242.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fapi/v0.242.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fapi/v0.241.0/v0.242.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googleapis/google-api-go-client
(google.golang.org/api)</summary>

### [`v0.242.0`](https://redirect.github.com/googleapis/google-api-go-client/releases/tag/v0.242.0)

[Compare
Source](https://redirect.github.com/googleapis/google-api-go-client/compare/v0.241.0...v0.242.0)

##### Features

- **all:** Auto-regenerate discovery clients
([#&#8203;3226](https://redirect.github.com/googleapis/google-api-go-client/issues/3226))
([9bd47c4](9bd47c484b))
- **all:** Auto-regenerate discovery clients
([#&#8203;3228](https://redirect.github.com/googleapis/google-api-go-client/issues/3228))
([2ee2e31](2ee2e31870))
- **all:** Auto-regenerate discovery clients
([#&#8203;3229](https://redirect.github.com/googleapis/google-api-go-client/issues/3229))
([6fdc3eb](6fdc3ebb20))
- **all:** Auto-regenerate discovery clients
([#&#8203;3230](https://redirect.github.com/googleapis/google-api-go-client/issues/3230))
([d5fa61e](d5fa61e954))
- **all:** Auto-regenerate discovery clients
([#&#8203;3231](https://redirect.github.com/googleapis/google-api-go-client/issues/3231))
([96d4d98](96d4d98a3d))
- **all:** Auto-regenerate discovery clients
([#&#8203;3232](https://redirect.github.com/googleapis/google-api-go-client/issues/3232))
([2ab275b](2ab275bbcb))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-17 01:11:11 +00:00
Mend Renovate
b7795c8857 chore(deps): update module github.com/valkey-io/valkey-go to v1.0.63 (#907)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [github.com/valkey-io/valkey-go](https://redirect.github.com/valkey-io/valkey-go) | `v1.0.62` -> `v1.0.63` | [![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fvalkey-io%2fvalkey-go/v1.0.63?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fvalkey-io%2fvalkey-go/v1.0.62/v1.0.63?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>valkey-io/valkey-go (github.com/valkey-io/valkey-go)</summary>

### [`v1.0.63`](https://redirect.github.com/valkey-io/valkey-go/releases/tag/v1.0.63): 1.0.63

[Compare
Source](https://redirect.github.com/valkey-io/valkey-go/compare/v1.0.62...v1.0.63)

### Changes

- feat: Add XDELEX command
- feat: add XACKDEL command
- feat: add KEEPREF, DELREF, and ACKED options to XTRIM command
- feat: Add KEEPREF, DELREF, and ACKED options to XADD command
- feat: add WITHATTRIBS option to command VSIM
- feat: add DIFF, DIFF1, ANDOR, and ONE options to BITOP command
- feat: add HStrLen to valkeycompat
- feat: Add TotalNetIn, TotalNetOut, and TotalCmds fields to
valkeycompat.ClientInfo
- feat: add Scanner implementation with Iter and Iter2 methods for XSCAN
- feat: allow non-blocking client initialization when ForceSingleClient
is set
- perf: replace json.NewDecoder with json.Unmarshal
- perf: reduce mux size by consolidating wire, sc, mu into one struct
- perf: allocate fields for RESP2 PubSub only when necessary

#### Contributors

We'd like to thank all the contributors who worked on this release!

[@&#8203;Aakkash-Suresh](https://redirect.github.com/Aakkash-Suresh),
[@&#8203;Ryan2327](https://redirect.github.com/Ryan2327),
[@&#8203;arbhalerao](https://redirect.github.com/arbhalerao),
[@&#8203;ash2k](https://redirect.github.com/ash2k),
[@&#8203;dalaoqi](https://redirect.github.com/dalaoqi),
[@&#8203;davidlin-tv2](https://redirect.github.com/davidlin-tv2),
[@&#8203;mingdaoy](https://redirect.github.com/mingdaoy),
[@&#8203;rueian](https://redirect.github.com/rueian),
[@&#8203;sugymt](https://redirect.github.com/sugymt) and
[@&#8203;yhc9311](https://redirect.github.com/yhc9311)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-17 00:46:05 +00:00
Yuan Teoh
000831c15b docs: update transport protocol in mcp instructions (#893)
The current default transport protocol shown in the docs is `sse`.
Updated the docs to use `Streamable HTTP`.

Fixes #876
2025-07-16 23:36:59 +00:00
Yuan Teoh
5c54cc973d chore(tools/http): replace nil query parameter with empty string (#892)
Replace `nil` values with empty string for query parameter.
2025-07-16 22:26:53 +00:00
Wenxin Du
8cc91ee3f7 fix: Redis and Valkey docs (#912)
Add port and make sure address is an array.

fix: https://github.com/googleapis/genai-toolbox/issues/883
2025-07-16 18:19:33 -04:00
prernakakkar-google
ed5ef4caea feat: Create wait for tool (#885)
```
tools:
  wait_for_tool:
    kind: wait-for
    description: "Tool puts chat to sleep for given timeout example 3m, 10s, 5m etc. For tasks such as cluster creation no need to wait. It takes longer time such as 8m for instance creation. Use timeout value as default if nothing is provided"
    timeout: 10s
```

Output:

```
prernakakkar@prernakakkar:~/senseai/genai-toolbox$ curl -X POST \
  -H "Content-Type: application/json" -d '{"duration": "5m"}' \
  http://127.0.0.1:5000/api/tool/wait_for_tool/invoke
{"result":"[\"Wait for 5m0s completed successfully.\"]"}
```
2025-07-16 17:18:24 +05:30
Yuan Teoh
208df0a428 docs: add mcp v2025-06-18 to supported mcp version (#899)
Add `2025-06-18` to list of supported MCP versions. Sort list to latest
version first.
2025-07-15 22:16:51 +00:00
Yuan Teoh
313d3ca0d0 feat(server/mcp): support MCP version 2025-06-18 (#898)
This PR adds support for MCP version 2025-06-18, defined
[here](https://modelcontextprotocol.io/specification/2025-06-18).

The main updates include:
* Retrieving the protocol version from the `MCP-Protocol-Version` header.
* Returning `400 Bad Request` when an invalid version is received (a sketch of this check follows below).
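
A minimal sketch of that check (the handler shape and the supported-version set are assumptions for illustration, not the server's actual code):

```
package main

import (
    "log"
    "net/http"
)

// supportedVersions is an assumed set for illustration.
var supportedVersions = map[string]bool{
    "2025-06-18": true,
    "2025-03-26": true,
}

func mcpHandler(w http.ResponseWriter, r *http.Request) {
    // Clients advertise the negotiated version via this header.
    v := r.Header.Get("MCP-Protocol-Version")
    if v != "" && !supportedVersions[v] {
        http.Error(w, "unsupported MCP-Protocol-Version", http.StatusBadRequest)
        return
    }
    // ... handle the MCP request ...
    w.WriteHeader(http.StatusOK)
}

func main() {
    http.HandleFunc("/mcp", mcpHandler)
    log.Fatal(http.ListenAndServe(":5000", nil))
}
```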
2025-07-15 15:07:27 -07:00
Jack Wotherspoon
4dae5a6ed7 chore: add medium badge to README that links to Toolbox Publication (#897)
Link to [Medium Publication for
Toolbox](https://medium.com/@mcp_toolbox) from badge.

<img width="819" height="216" alt="image"
src="https://github.com/user-attachments/assets/b6631c64-7bfa-4136-8b1c-3b29aeb118b3"
/>
2025-07-15 11:52:29 -06:00
Twisha Bansal
26bdba46ca docs: add links to quickstart (#888) 2025-07-15 11:47:59 +05:30
Yuan Teoh
a817b120ca feat: add support for null optional parameter (#802)
Added an option for users to indicate whether a parameter is required.

Example:
```
parameters:
  - name: foo
    type: string
    description: foo
    required: false
```

If the `required` field is not provided, it will be defaulted to `true`.


If a `default` value is provided, `required` is treated as `false`
regardless of whether the field is set.
```
parameters:
  - name: foo
    type: string
    description: foo
    default: hello world
```

Fixes #736
2025-07-14 21:41:49 -07:00
Dr. Strangelove
8ce311f256 fix(server/api): add logger to context in tool invoke handler (#891) 2025-07-14 21:02:10 -07:00
Yuan Teoh
86227e3104 docs: update mcp docs to include v2025-03-26 (#868) 2025-07-14 15:38:30 -07:00
Nick Acosta
206bea4575 docs: update MCP inspector instructions (#867)
This PR updates the docs, specifically the quickstart for working with MCP
Inspector. The tool recently added a Session Token for authorization
that can easily be missed; this updates the guide to make it easy to
find and use this Session Token.

---------

Co-authored-by: Nick Acosta <nick.acosta@getcollate.io>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-14 20:11:28 +00:00
Harsh Jha
65843621c5 docs: Add JS SDK quickstart (#845)
Co-authored-by: Anmol Shukla <shuklaanmol@google.com>
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
Co-authored-by: Twisha Bansal <twishabansal@google.com>
2025-07-15 01:09:18 +05:30
Harsh Jha
e681a7e36c docs: clarify Bigtable GoogleSQL DML support (#863)
feat(docs): Enhance Bigtable GoogleSQL DML clarity

**Description:**
This PR updates the `bigtable-sql` tool documentation to provide a
clearer understanding of GoogleSQL's DML capabilities within Bigtable.

**Changes Made:**
- Added a note indicating that Bigtable's GoogleSQL dialect only
supports `SELECT` DML operations.
- Explicitly states that `INSERT`, `UPDATE`, and `DELETE` DML statements
are not supported.
- Updated the external link to directly reference the "Use cases"
section in the official Google Cloud Bigtable GoogleSQL overview, where
this limitation is detailed.

**Reasoning:**
The previous documentation, while linking to the overview, did not
explicitly highlight the DML limitations. This clarification will
prevent potential confusion for users expecting full DML support,
ensuring the documentation accurately reflects Bigtable's capabilities
with GoogleSQL.

---------

Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-07-14 23:32:05 +05:30
Jack Wotherspoon
0d1cadb245 chore: add docs readme badge and update discord to match style (#884)
Adding badge that links to docs to make them more easily discoverable
from README.
2025-07-14 10:54:02 -04:00
Harsh Jha
6d27dabfb2 docs: add optional cloud setup section for Vertex AI (#790)
Co-authored-by: Twisha Bansal <58483338+twishabansal@users.noreply.github.com>
2025-07-14 18:38:25 +05:30
dishaprakash
f312fc01b2 docs: Add Toolbox Go SDK snippets to the docsite (#851)
This PR adds the Go SDK code snippets to the docsite, following the same
pattern as Python and JS.

The code snippets are for:
- Core usage
- LangChain Go
- Genkit Go
- Go GenAI
- OpenAI Go
2025-07-14 02:28:45 +05:30
dishaprakash
4998cae260 docs: Add redirection links to client SDK content pages (#856)
This PR adds redirection links to the actual SDK content pages.
This is added because Hugo still generates a regular HTML site page for
each of the client SDK files, even though they point to the external
GitHub links.
If a user lands on the internal URL pointing to one of these files, this
will still redirect them to the specific GitHub repo.
2025-07-14 01:21:25 +05:30
release-please[bot]
2b69700c5e chore(main): release 0.9.0 (#789)
🤖 I have created a release *beep* *boop*
---


## [0.9.0](https://github.com/googleapis/genai-toolbox/compare/v0.8.0...v0.9.0) (2025-07-11)


### Features

* Dynamic reloading for toolbox config
([#800](https://github.com/googleapis/genai-toolbox/issues/800))
([4c240ac](4c240ac3c9))
* **sources/mysql:** Add queryTimeout support to MySQL source
([#830](https://github.com/googleapis/genai-toolbox/issues/830))
([391cb5b](391cb5bfe8))
* **tools/bigquery:** Add optional projectID parameter to bigquery tools
([#799](https://github.com/googleapis/genai-toolbox/issues/799))
([c6ab74c](c6ab74c5da))


### Bug Fixes

* Cleanup unassigned err log
([#857](https://github.com/googleapis/genai-toolbox/issues/857))
([c081ace](c081ace46b))
* Fix docs preview deployment pipeline
([#787](https://github.com/googleapis/genai-toolbox/issues/787))
([0a93b04](0a93b0482c))
* **tools:** Nil parameter error when arrays are used
([#801](https://github.com/googleapis/genai-toolbox/issues/801))
([2bdcc08](2bdcc0841a))
* Trigger reload on additional fsnotify operations
([#854](https://github.com/googleapis/genai-toolbox/issues/854))
([aa8dbec](aa8dbec970))

---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-10 17:46:05 -07:00
AlexTalreja
c081ace46b fix: cleanup unassigned err log (#857)
Ref: #843 

Addressed one of the bug fixes needed to merge before release.
2025-07-11 00:28:35 +00:00
Mend Renovate
edf32abd84 chore(deps): update module cloud.google.com/go/cloudsqlconn to v1.17.3 (#853)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [cloud.google.com/go/cloudsqlconn](https://redirect.github.com/googlecloudplatform/cloud-sql-go-connector) | `v1.17.2` -> `v1.17.3` | [![age](https://developer.mend.io/api/mc/badges/age/go/cloud.google.com%2fgo%2fcloudsqlconn/v1.17.3?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/cloud.google.com%2fgo%2fcloudsqlconn/v1.17.2/v1.17.3?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googlecloudplatform/cloud-sql-go-connector
(cloud.google.com/go/cloudsqlconn)</summary>

### [`v1.17.3`](https://redirect.github.com/GoogleCloudPlatform/cloud-sql-go-connector/releases/tag/v1.17.3)

[Compare
Source](https://redirect.github.com/googlecloudplatform/cloud-sql-go-connector/compare/v1.17.2...v1.17.3)

##### Bug Fixes

- bump dependencies to latest
([#&#8203;993](https://redirect.github.com/GoogleCloudPlatform/cloud-sql-go-connector/issues/993))
([c0e5f9c](c0e5f9cad6))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-10 17:14:11 -07:00
AlexTalreja
aa8dbec970 fix: trigger reload on additional fsnotify operations (#854)
Allow fsnotify `CREATE` and `RENAME` events to trigger dynamic reload,
instead of requiring `WRITE`.

This is because, with some editor/OS combinations (e.g. vim on macOS),
file writes can occur without ever triggering a `WRITE` operation on the
watched tools file, happening solely through the creation and renaming
of .swp files.
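For illustration only, a minimal sketch of the kind of event filter this implies, assuming fsnotify v1 and a hypothetical `reload` callback; this is not the project's actual implementation:

```go
package reloadsketch

import (
	"log"

	"github.com/fsnotify/fsnotify"
)

// watchToolsFile reloads the configuration whenever the watched file is
// written, created, or renamed, since some editors replace the file
// rather than writing to it in place.
func watchToolsFile(path string, reload func() error) error {
	watcher, err := fsnotify.NewWatcher()
	if err != nil {
		return err
	}
	defer watcher.Close()

	if err := watcher.Add(path); err != nil {
		return err
	}

	for {
		select {
		case ev, ok := <-watcher.Events:
			if !ok {
				return nil
			}
			// Trigger on WRITE, CREATE, or RENAME of the watched path.
			if ev.Op&(fsnotify.Write|fsnotify.Create|fsnotify.Rename) != 0 {
				if err := reload(); err != nil {
					log.Printf("reload failed: %v", err)
				}
			}
		case err, ok := <-watcher.Errors:
			if !ok {
				return nil
			}
			log.Printf("watch error: %v", err)
		}
	}
}
```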
2025-07-10 16:39:26 -07:00
dishaprakash
a7963c5a83 docs: Add Go SDK samples to the README (#850)
This PR adds the Go SDK code snippets to the README, following the same
pattern as Python and JS.

The code snippets are for:
 - Core usage
 - LangChain Go
 - Genkit Go
 - Go GenAI
 - OpenAI Go
2025-07-11 04:27:34 +05:30
Kurtis Van Gent
ea3c805467 chore: add akitsch to rotation (#852) 2025-07-10 20:21:47 +00:00
Twisha Bansal
ebbbe4c409 docs: update readme with discord link (#849) 2025-07-10 23:39:49 +05:30
Twisha Bansal
f9743ecf7e docs: add a docsy include shortcode (#824)
This shortcode allows us to pull in code snippets from code files into
the docsite.
Tested in https://github.com/googleapis/genai-toolbox/pull/825
2025-07-10 22:00:34 +05:30
Twisha Bansal
d3693c0d6b Revert "fix: fix docs preview deployment pipeline" (#823)
Reverts googleapis/genai-toolbox#787
2025-07-10 21:47:16 +05:30
dishaprakash
1a1815d822 docs: Add links to the Client SDKs (#837)
This PR adds links to the Client SDKs to the docsite.

All 3 Client SDKs are added:
 - Go
 - JS
 - Python
2025-07-10 05:18:49 +05:30
Kamal Muradov
391cb5bfe8 feat: add queryTimeout support to MySQL source (#830)
## Summary
- Added configurable query timeout to MySQL source configuration
- Updated connection DSN to include the readTimeout parameter (a rough sketch follows below)
- Added documentation and example usage
- Added test coverage for queryTimeout configuration
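For reference, a rough sketch of wiring such a timeout into a go-sql-driver/mysql DSN via `readTimeout` (the function and parameter names here are illustrative, not the toolbox's actual code):

```go
package dsnsketch

import (
	"database/sql"
	"fmt"
	"time"

	_ "github.com/go-sql-driver/mysql"
)

// openWithTimeout appends readTimeout to the DSN so a slow query is cut
// off once the driver has waited queryTimeout for the server's response.
func openWithTimeout(user, pass, host, db string, queryTimeout time.Duration) (*sql.DB, error) {
	dsn := fmt.Sprintf("%s:%s@tcp(%s)/%s", user, pass, host, db)
	if queryTimeout > 0 {
		dsn += fmt.Sprintf("?readTimeout=%s", queryTimeout)
	}
	return sql.Open("mysql", dsn)
}
```

When the read deadline is hit the driver drops the connection, which seems consistent with the obscure `invalid connection` error described in the caveat below.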

## Test plan
- [x] Added unit test for queryTimeout configuration parsing
- [x] Updated documentation with queryTimeout field description
- [x] Verified timeout parameter is correctly added to DSN when
specified

## Caveat

When queryTimeout is exceeded, we get an obscure error message
([screenshot](https://github.com/user-attachments/assets/fd292f91-328d-4ebc-9a87-2d92e9887300)):
```
unable to execute query: invalid connection
```

This seems to be a problem with the mysql-go library:
https://stackoverflow.com/q/65253798/10720618

I tried to use
[MAX_EXECUTION_TIME](https://dev.mysql.com/doc/refman/8.0/en/optimizer-hints.html#optimizer-hints-execution-time)
but it didn't work as expected (my `sleep(MAX_EXECUTION_TIME+3)` query
finished successfully after MAX_EXECUTION_TIME milliseconds)

Any ideas on what can be done here? The error message is very
misleading. My goal with adding timeouts is to communicate to the LLM
when it has issued a slow query and force it to adjust (e.g. query
indexes and write a more optimized query) but this defeats the purpose.

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-07-09 18:11:15 -04:00
Wenxin Du
2bdcc0841a fix: nil parameter error when arrays are used (#801)
- Spanner: convert arrays to typed slices before querying, as Spanner does not
accept an untyped `[]any` array (a sketch follows after the lists below)
- BigQuery: fix https://github.com/googleapis/genai-toolbox/issues/793
- Bigtable: add the required `ElemType` for array-type params
- Redis/Valkey: change indexing to append to avoid extra spaces

Add integration tests for array parameters, skipped for the sources not
supporting arrays:
- SQLite
- Cloud SQL MSSQL
- Cloud SQL MySQL
- MSSQL
- MySQL
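As a sketch of the Spanner point above, converting an untyped `[]any` parameter into a typed slice before it is bound (the helper and its item-type coverage are assumptions, not the project's actual code):

```go
package arraysketch

import "fmt"

// toTypedSlice converts a []any holding homogeneous items into a typed
// slice that drivers such as Spanner can bind; only the string case is
// shown here, other item types would follow the same pattern.
func toTypedSlice(vals []any) (any, error) {
	if len(vals) == 0 {
		return vals, nil // nothing to convert
	}
	switch vals[0].(type) {
	case string:
		out := make([]string, 0, len(vals))
		for _, v := range vals {
			s, ok := v.(string)
			if !ok {
				return nil, fmt.Errorf("mixed item types in array parameter")
			}
			out = append(out, s)
		}
		return out, nil
	default:
		return nil, fmt.Errorf("unsupported array item type %T", vals[0])
	}
}
```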
2025-07-09 17:40:49 -04:00
Mend Renovate
a6693ab8b0 chore(deps): update module go.opentelemetry.io/contrib/propagators/autoprop to v0.62.0 (#836)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
|
[go.opentelemetry.io/contrib/propagators/autoprop](https://redirect.github.com/open-telemetry/opentelemetry-go-contrib)
| `v0.61.0` -> `v0.62.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/go.opentelemetry.io%2fcontrib%2fpropagators%2fautoprop/v0.62.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/go.opentelemetry.io%2fcontrib%2fpropagators%2fautoprop/v0.61.0/v0.62.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-09 21:19:58 +00:00
Mend Renovate
e1325880d1 chore(deps): update dependency go to v1.24.5 (#808)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [go](https://go.dev/)
([source](https://redirect.github.com/golang/go)) | toolchain | patch |
`1.24.4` -> `1.24.5` |

---

### Release Notes

<details>
<summary>golang/go (go)</summary>

###
[`v1.24.5`](https://redirect.github.com/golang/go/compare/go1.24.4...go1.24.5)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-09 14:06:04 -07:00
Mend Renovate
b7230a93df chore(deps): update module cloud.google.com/go/alloydbconn to v1.15.4 (#809)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
|
[cloud.google.com/go/alloydbconn](https://redirect.github.com/googlecloudplatform/alloydb-go-connector)
| `v1.15.3` -> `v1.15.4` |
[![age](https://developer.mend.io/api/mc/badges/age/go/cloud.google.com%2fgo%2falloydbconn/v1.15.4?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/cloud.google.com%2fgo%2falloydbconn/v1.15.3/v1.15.4?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>googlecloudplatform/alloydb-go-connector
(cloud.google.com/go/alloydbconn)</summary>

###
[`v1.15.4`](https://redirect.github.com/GoogleCloudPlatform/alloydb-go-connector/releases/tag/v1.15.4)

[Compare
Source](https://redirect.github.com/googlecloudplatform/alloydb-go-connector/compare/v1.15.3...v1.15.4)

##### Bug Fixes

- update dependencies to latest
([#&#8203;693](https://redirect.github.com/GoogleCloudPlatform/alloydb-go-connector/issues/693))
([86a621d](86a621d0a7))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-09 20:58:59 +00:00
Mend Renovate
32712fa018 chore(deps): update module google.golang.org/api to v0.241.0 (#835)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
|
[google.golang.org/api](https://redirect.github.com/googleapis/google-api-go-client)
| `v0.240.0` -> `v0.241.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fapi/v0.241.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fapi/v0.240.0/v0.241.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>googleapis/google-api-go-client
(google.golang.org/api)</summary>

###
[`v0.241.0`](https://redirect.github.com/googleapis/google-api-go-client/releases/tag/v0.241.0)

[Compare
Source](https://redirect.github.com/googleapis/google-api-go-client/compare/v0.240.0...v0.241.0)

##### Features

- **all:** Auto-regenerate discovery clients
([#&#8203;3219](https://redirect.github.com/googleapis/google-api-go-client/issues/3219))
([987e4ab](987e4abe1e))
- **all:** Auto-regenerate discovery clients
([#&#8203;3221](https://redirect.github.com/googleapis/google-api-go-client/issues/3221))
([7e31abb](7e31abbe69))
- **all:** Auto-regenerate discovery clients
([#&#8203;3222](https://redirect.github.com/googleapis/google-api-go-client/issues/3222))
([3346ebb](3346ebb070))
- **all:** Auto-regenerate discovery clients
([#&#8203;3223](https://redirect.github.com/googleapis/google-api-go-client/issues/3223))
([f94c92c](f94c92cafe))
- **all:** Auto-regenerate discovery clients
([#&#8203;3224](https://redirect.github.com/googleapis/google-api-go-client/issues/3224))
([3f1f756](3f1f756570))
- **all:** Auto-regenerate discovery clients
([#&#8203;3225](https://redirect.github.com/googleapis/google-api-go-client/issues/3225))
([8799cd8](8799cd8e4c))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-07-09 13:51:55 -07:00
Yuan Teoh
35e0919184 chore(deps): update opentelemetry-go monorepo to v1.37.0 (#834)
This requires updating the semconv import to the
`go.opentelemetry.io/otel/semconv/v1.34.0` library.
2025-07-09 16:40:03 -04:00
Anubhav Dhawan
72a7282797 docs: Add Toolbox SDKs repo links to relevant doc snippets (#828)
This PR adds Toolbox SDK GitHub repo links to the relevant parts of the
`README` where these SDKs are introduced, for additional context.
2025-07-09 16:34:57 +05:30
Anmol Shukla
29fe3b93cd docs: fix copy to clipboard button visibility in light mode (#826)
This PR fixes issue #791 and updates the info box color so that tags
are also visible in dark mode on the docsite.
2025-07-09 14:13:12 +05:30
Anubhav Dhawan
fb3f66acf4 docs: Correct link for Cloud Run datasource setup (#794)
Updated the link in the Cloud Run deployment guide for `tools.yaml`
setup. The previous link incorrectly pointed to a `localhost` source
example, which caused confusion and deployment failures. The new link
directs users to the guide for configuring cloud-based sources, ensuring
a correct setup.
2025-07-09 06:11:18 +00:00
Yuan Teoh
1f95eb134b test: add more time to spanner integration test ctx (#819)
Occasionally the Spanner integration test's `context` would time out
before the `DROP` operation could finish.
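For illustration, a tiny sketch of the kind of change this implies; the 10-minute figure is an assumption, not the test's actual timeout:

```go
package ctxsketch

import (
	"context"
	"testing"
	"time"
)

// newTestContext gives slow cleanup steps such as DROP extra headroom;
// the caller should defer the returned cancel func.
func newTestContext(t *testing.T) (context.Context, context.CancelFunc) {
	t.Helper()
	return context.WithTimeout(context.Background(), 10*time.Minute)
}
```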
2025-07-09 01:21:22 +00:00
AlexTalreja
4c240ac3c9 feat: dynamic reloading for toolbox config (#800)
Allow the Toolbox server to automatically pick up changes when users
modify their tool configuration file(s), instead of requiring a restart.

This feature is enabled by default, but can be turned off with the
`--disable-reload` flag.
2025-07-08 17:28:12 -07:00
Huan Chen
c6ab74c5da feat: add optional projectID parameter to bigquery tools (#799)
Optional projectID parameter enables dynamic, cross-project resource
access in BigQuery tools.

This allows a single tool configuration to target different projects at
runtime, rather than being fixed to the project in its source
configuration.

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-07-08 18:02:42 -04:00
Yuan Teoh
04e2529ba9 test: add null column test case (#768)
Add integration tests to check for `null` columns. ref #757
2025-07-08 20:20:16 +00:00
Mend Renovate
53dd247e6e chore(deps): update module google.golang.org/api to v0.240.0 (#778)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
|
[google.golang.org/api](https://redirect.github.com/googleapis/google-api-go-client)
| `v0.239.0` -> `v0.240.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fapi/v0.240.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fapi/v0.239.0/v0.240.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>googleapis/google-api-go-client
(google.golang.org/api)</summary>

###
[`v0.240.0`](https://redirect.github.com/googleapis/google-api-go-client/releases/tag/v0.240.0)

[Compare
Source](https://redirect.github.com/googleapis/google-api-go-client/compare/v0.239.0...v0.240.0)

##### Features

- **all:** Auto-regenerate discovery clients
([#&#8203;3210](https://redirect.github.com/googleapis/google-api-go-client/issues/3210))
([c0efdb5](c0efdb50d5))
- **all:** Auto-regenerate discovery clients
([#&#8203;3212](https://redirect.github.com/googleapis/google-api-go-client/issues/3212))
([c699558](c699558a9c))
- **all:** Auto-regenerate discovery clients
([#&#8203;3214](https://redirect.github.com/googleapis/google-api-go-client/issues/3214))
([7b43598](7b43598833))
- **all:** Auto-regenerate discovery clients
([#&#8203;3215](https://redirect.github.com/googleapis/google-api-go-client/issues/3215))
([22e2c38](22e2c38068))
- **all:** Auto-regenerate discovery clients
([#&#8203;3216](https://redirect.github.com/googleapis/google-api-go-client/issues/3216))
([e8c3504](e8c3504399))
- **all:** Auto-regenerate discovery clients
([#&#8203;3217](https://redirect.github.com/googleapis/google-api-go-client/issues/3217))
([604190c](604190c29e))
- **all:** Auto-regenerate discovery clients
([#&#8203;3218](https://redirect.github.com/googleapis/google-api-go-client/issues/3218))
([0a46af7](0a46af7bb3))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan <45984206+Yuan325@users.noreply.github.com>
2025-07-07 16:08:01 -07:00
Twisha Bansal
648eede62b docs: add introduction snippets for JS SDK (#785)
Using nested tabs in Hugo is hard and flaky, so we're using different
headings for the multiple languages here.
2025-07-08 00:11:49 +05:30
Yuan
9b2dfcc553 chore: update int test variable name to be consistent (#766)
Update `_` variables to camelcase.
2025-07-04 05:21:39 +00:00
Twisha Bansal
cb514209b6 docs: add JS SDK to readme (#776)
Co-authored-by: Kurtis Van Gent <31518063+kurtisvg@users.noreply.github.com>
2025-07-03 22:35:13 +05:30
Twisha Bansal
0a93b0482c fix: fix docs preview deployment pipeline (#787)
The code for the preview build is now sourced from the target branch,
whereas it was previously sourced from the main branch.
2025-07-03 21:54:10 +05:30
release-please[bot]
f13e9635ba chore(main): release 0.8.0 (#689)
🤖 I have created a release *beep* *boop*
---


##
[0.8.0](https://github.com/googleapis/genai-toolbox/compare/v0.7.0...v0.8.0)
(2025-07-02)


### ⚠ BREAKING CHANGES

* **postgres,mssql,cloudsqlmssql:** encode source connection url for
sources ([#727](https://github.com/googleapis/genai-toolbox/issues/727))

### Features

* Add support for multiple YAML configuration files
([#760](https://github.com/googleapis/genai-toolbox/issues/760))
([40679d7](40679d700e))
* Add support for optional parameters
([#617](https://github.com/googleapis/genai-toolbox/issues/617))
([4827771](4827771b78)),
closes [#475](https://github.com/googleapis/genai-toolbox/issues/475)
* **mcp:** Support MCP version 2025-03-26
([#755](https://github.com/googleapis/genai-toolbox/issues/755))
([474df57](474df57d62))
* **sources/http:** Support disable SSL verification for HTTP Source
([#674](https://github.com/googleapis/genai-toolbox/issues/674))
([4055b0c](4055b0c356))
* **tools/bigquery:** Add templateParameters field for bigquery
([#699](https://github.com/googleapis/genai-toolbox/issues/699))
([f5f771b](f5f771b0f3))
* **tools/bigtable:** Add templateParameters field for bigtable
([#692](https://github.com/googleapis/genai-toolbox/issues/692))
([1c06771](1c067715fa))
* **tools/couchbase:** Add templateParameters field for couchbase
([#723](https://github.com/googleapis/genai-toolbox/issues/723))
([9197186](9197186b8b))
* **tools/http:** Add support for HTTP Tool pathParams
([#726](https://github.com/googleapis/genai-toolbox/issues/726))
([fd300dc](fd300dc606))
* **tools/redis:** Add Redis Source and Tool
([#519](https://github.com/googleapis/genai-toolbox/issues/519))
([f0aef29](f0aef29b0c))
* **tools/spanner:** Add templateParameters field for spanner
([#691](https://github.com/googleapis/genai-toolbox/issues/691))
([075dfa4](075dfa47e1))
* **tools/sqlitesql:** Add templateParameters field for sqlitesql
([#687](https://github.com/googleapis/genai-toolbox/issues/687))
([75e254c](75e254c0a4))
* **tools/valkey:** Add Valkey Source and Tool
([#532](https://github.com/googleapis/genai-toolbox/issues/532))
([054ec19](054ec198b9))


### Bug Fixes

* **bigquery,mssql:** Fix panic on tools with array param
([#722](https://github.com/googleapis/genai-toolbox/issues/722))
([7a6644c](7a6644cf0c))
* **postgres,mssql,cloudsqlmssql:** Encode source connection url for
sources ([#727](https://github.com/googleapis/genai-toolbox/issues/727))
([67964d9](67964d939f)),
closes [#717](https://github.com/googleapis/genai-toolbox/issues/717)
* Set default value to field's type during unmarshalling
([#774](https://github.com/googleapis/genai-toolbox/issues/774))
([fafed24](fafed24858)),
closes [#771](https://github.com/googleapis/genai-toolbox/issues/771)
* **server/mcp:** Do not listen from port for stdio
([#719](https://github.com/googleapis/genai-toolbox/issues/719))
([d51dbc7](d51dbc759b)),
closes [#711](https://github.com/googleapis/genai-toolbox/issues/711)
* **tools/mysqlexecutesql:** Handle nil panic and connection leak in
Invoke ([#757](https://github.com/googleapis/genai-toolbox/issues/757))
([7badba4](7badba42ee))
* **tools/mysqlsql:** Handle nil panic and connection leak in invoke
([#758](https://github.com/googleapis/genai-toolbox/issues/758))
([cbb4a33](cbb4a33351))

---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Yuan <45984206+Yuan325@users.noreply.github.com>
2025-07-02 09:30:33 -06:00
Yuan
fafed24858 fix: set default value to field's type during unmarshalling (#774)
When go-yaml decodes into CommonParameter with `Default` typed as `any`,
an int is converted into `[]uint64`.
This then fails `Parse()` when the value is used, since it does not
belong to any of the int types.

Unmarshal `default` value into each field's type directly.

Fixes #771
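A minimal sketch of the idea, decoding `default` straight into the field's concrete type through a typed shadow struct, assuming go-yaml's legacy unmarshal-callback interface; the type and field names are illustrative, not the toolbox's actual ones:

```go
package defaultsketch

// IntParameter is a simplified stand-in for a parameter whose default
// must end up as a plain int rather than a generic any.
type IntParameter struct {
	Name    string `yaml:"name"`
	Default int    `yaml:"default"`
}

// UnmarshalYAML decodes the node into a shadow struct whose Default
// field already has the concrete type, so no type assertion on an
// `any` value is needed afterwards.
func (p *IntParameter) UnmarshalYAML(unmarshal func(any) error) error {
	type raw IntParameter // avoid recursing into this method
	var r raw
	if err := unmarshal(&r); err != nil {
		return err
	}
	*p = IntParameter(r)
	return nil
}
```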
2025-07-02 14:58:42 +00:00
Mend Renovate
6337434623 chore(deps): update module github.com/go-playground/validator/v10 to v10.27.0 (#775)
This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
|
[github.com/go-playground/validator/v10](https://redirect.github.com/go-playground/validator)
| `v10.26.0` -> `v10.27.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fgo-playground%2fvalidator%2fv10/v10.27.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![adoption](https://developer.mend.io/api/mc/badges/adoption/go/github.com%2fgo-playground%2fvalidator%2fv10/v10.27.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![passing](https://developer.mend.io/api/mc/badges/compatibility/go/github.com%2fgo-playground%2fvalidator%2fv10/v10.26.0/v10.27.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fgo-playground%2fvalidator%2fv10/v10.26.0/v10.27.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>go-playground/validator
(github.com/go-playground/validator/v10)</summary>

###
[`v10.27.0`](https://redirect.github.com/go-playground/validator/releases/tag/v10.27.0):
Release 10.27.0

[Compare
Source](https://redirect.github.com/go-playground/validator/compare/v10.26.0...v10.27.0)

#### What's Changed

- Fix Release version badge on README page by
[@&#8203;nodivbyzero](https://redirect.github.com/nodivbyzero) in
[https://github.com/go-playground/validator/pull/1406](https://redirect.github.com/go-playground/validator/pull/1406)
- fix russian E.164 error message by
[@&#8203;prigornitskiy](https://redirect.github.com/prigornitskiy) in
[https://github.com/go-playground/validator/pull/1349](https://redirect.github.com/go-playground/validator/pull/1349)
- chore: remove unnecessary statement by
[@&#8203;qshuai](https://redirect.github.com/qshuai) in
[https://github.com/go-playground/validator/pull/1200](https://redirect.github.com/go-playground/validator/pull/1200)
- Re-enable several linters by
[@&#8203;nodivbyzero](https://redirect.github.com/nodivbyzero) in
[https://github.com/go-playground/validator/pull/1412](https://redirect.github.com/go-playground/validator/pull/1412)
- add support to tag validateFn by
[@&#8203;peczenyj](https://redirect.github.com/peczenyj) in
[https://github.com/go-playground/validator/pull/1363](https://redirect.github.com/go-playground/validator/pull/1363)
- Bump golang.org/x/crypto from 0.33.0 to 0.35.0 in
/\_examples/validate\_fn by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/go-playground/validator/pull/1418](https://redirect.github.com/go-playground/validator/pull/1418)
- Bump golang.org/x/net from 0.34.0 to 0.38.0 in
/\_examples/validate\_fn by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/go-playground/validator/pull/1419](https://redirect.github.com/go-playground/validator/pull/1419)
- Align required\_without with the contract stated in the documentation
by [@&#8203;jmfrees](https://redirect.github.com/jmfrees) in
[https://github.com/go-playground/validator/pull/1422](https://redirect.github.com/go-playground/validator/pull/1422)
- Add translation example by
[@&#8203;cxlblm](https://redirect.github.com/cxlblm) in
[https://github.com/go-playground/validator/pull/1394](https://redirect.github.com/go-playground/validator/pull/1394)
- doc(errors): mention RegisterTagNameFunc for FieldError.Field by
[@&#8203;khan-ajamal](https://redirect.github.com/khan-ajamal) in
[https://github.com/go-playground/validator/pull/1358](https://redirect.github.com/go-playground/validator/pull/1358)
- Bump golangci/golangci-lint-action from 7 to 8 by
[@&#8203;dependabot](https://redirect.github.com/dependabot) in
[https://github.com/go-playground/validator/pull/1425](https://redirect.github.com/go-playground/validator/pull/1425)
- feat(translation): add en translation for urn\_rfc2141 by
[@&#8203;ryanmalesic](https://redirect.github.com/ryanmalesic) in
[https://github.com/go-playground/validator/pull/1431](https://redirect.github.com/go-playground/validator/pull/1431)
- fix: panics when private field is validated by
[@&#8203;ykalchevskiy](https://redirect.github.com/ykalchevskiy) in
[https://github.com/go-playground/validator/pull/1423](https://redirect.github.com/go-playground/validator/pull/1423)
- Fix: support validation for map values with struct types by
[@&#8203;JunaidIslam2105](https://redirect.github.com/JunaidIslam2105)
in
[https://github.com/go-playground/validator/pull/1433](https://redirect.github.com/go-playground/validator/pull/1433)
- Omitzero does not work with slice and map bug by
[@&#8203;JunaidIslam2105](https://redirect.github.com/JunaidIslam2105)
in
[https://github.com/go-playground/validator/pull/1436](https://redirect.github.com/go-playground/validator/pull/1436)
- Fix: Validator panics when 'nil' is used along with required if for
slices and maps by
[@&#8203;JunaidIslam2105](https://redirect.github.com/JunaidIslam2105)
in
[https://github.com/go-playground/validator/pull/1442](https://redirect.github.com/go-playground/validator/pull/1442)
- docs: typos by [@&#8203;eqsdxr](https://redirect.github.com/eqsdxr) in
[https://github.com/go-playground/validator/pull/1440](https://redirect.github.com/go-playground/validator/pull/1440)
- fix: make "file://" fail `url` validation by
[@&#8203;bfabio](https://redirect.github.com/bfabio) in
[https://github.com/go-playground/validator/pull/1444](https://redirect.github.com/go-playground/validator/pull/1444)
- disable way too aggressive and disagreeable linters by
[@&#8203;deankarn](https://redirect.github.com/deankarn) in
[https://github.com/go-playground/validator/pull/1445](https://redirect.github.com/go-playground/validator/pull/1445)
- use golangci lint file for disables by
[@&#8203;deankarn](https://redirect.github.com/deankarn) in
[https://github.com/go-playground/validator/pull/1447](https://redirect.github.com/go-playground/validator/pull/1447)

#### New Contributors

- [@&#8203;prigornitskiy](https://redirect.github.com/prigornitskiy)
made their first contribution in
[https://github.com/go-playground/validator/pull/1349](https://redirect.github.com/go-playground/validator/pull/1349)
- [@&#8203;qshuai](https://redirect.github.com/qshuai) made their first
contribution in
[https://github.com/go-playground/validator/pull/1200](https://redirect.github.com/go-playground/validator/pull/1200)
- [@&#8203;peczenyj](https://redirect.github.com/peczenyj) made their
first contribution in
[https://github.com/go-playground/validator/pull/1363](https://redirect.github.com/go-playground/validator/pull/1363)
- [@&#8203;jmfrees](https://redirect.github.com/jmfrees) made their
first contribution in
[https://github.com/go-playground/validator/pull/1422](https://redirect.github.com/go-playground/validator/pull/1422)
- [@&#8203;cxlblm](https://redirect.github.com/cxlblm) made their first
contribution in
[https://github.com/go-playground/validator/pull/1394](https://redirect.github.com/go-playground/validator/pull/1394)
- [@&#8203;khan-ajamal](https://redirect.github.com/khan-ajamal) made
their first contribution in
[https://github.com/go-playground/validator/pull/1358](https://redirect.github.com/go-playground/validator/pull/1358)
- [@&#8203;ryanmalesic](https://redirect.github.com/ryanmalesic) made
their first contribution in
[https://github.com/go-playground/validator/pull/1431](https://redirect.github.com/go-playground/validator/pull/1431)
- [@&#8203;ykalchevskiy](https://redirect.github.com/ykalchevskiy) made
their first contribution in
[https://github.com/go-playground/validator/pull/1423](https://redirect.github.com/go-playground/validator/pull/1423)
- [@&#8203;JunaidIslam2105](https://redirect.github.com/JunaidIslam2105)
made their first contribution in
[https://github.com/go-playground/validator/pull/1433](https://redirect.github.com/go-playground/validator/pull/1433)
- [@&#8203;eqsdxr](https://redirect.github.com/eqsdxr) made their first
contribution in
[https://github.com/go-playground/validator/pull/1440](https://redirect.github.com/go-playground/validator/pull/1440)
- [@&#8203;bfabio](https://redirect.github.com/bfabio) made their first
contribution in
[https://github.com/go-playground/validator/pull/1444](https://redirect.github.com/go-playground/validator/pull/1444)

**Full Changelog**:
https://github.com/go-playground/validator/compare/v10.26.0...v10.27.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-02 05:32:35 +00:00
Mend Renovate
822708afaa chore(deps): update module cloud.google.com/go/bigtable to v1.38.0 (#773)
This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
|
[cloud.google.com/go/bigtable](https://redirect.github.com/googleapis/google-cloud-go)
| `v1.37.0` -> `v1.38.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/cloud.google.com%2fgo%2fbigtable/v1.38.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![adoption](https://developer.mend.io/api/mc/badges/adoption/go/cloud.google.com%2fgo%2fbigtable/v1.38.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![passing](https://developer.mend.io/api/mc/badges/compatibility/go/cloud.google.com%2fgo%2fbigtable/v1.37.0/v1.38.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/cloud.google.com%2fgo%2fbigtable/v1.37.0/v1.38.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-02 05:14:59 +00:00
Yuan
010c278cbf chore: release 0.8.0 (#769)
Release-As: 0.8.0
2025-06-30 18:32:48 +00:00
Mohd Mujtaba
40679d700e feat: add support for multiple YAML configuration files (#760)
# Add Multiple YAML Configuration File Support

## 🎯 Overview

This PR introduces support for loading and merging multiple YAML
configuration files in Toolbox, addressing the need for modular
configuration management in complex deployments.

## New Features

### 1. Multiple Files Support (`--tools-files`)
- **Usage**: `--tools-files=file1.yaml,file2.yaml,file3.yaml`
- Load and intelligently merge multiple YAML configuration files
- Comma-separated file paths for maximum flexibility

### 2. Directory Support (`--tools-folder`)
- **Usage**: `--tools-folder=config-directory`
- Automatically discover and load all `.yaml` and `.yml` files from a
directory
- Simplifies configuration management for organized deployments

### 3. Smart Merging Logic
- **Sources/AuthServices/Tools**: Later files override earlier files
with same names
- **Toolsets**: Tools from same-named toolsets are combined without
duplicates
- Preserves all existing functionality while enabling composition

## 🔒 Safety & Validation

- **Mutual Exclusivity**: Prevents simultaneous use of `--tools-file`,
`--tools-files`, `--tools-folder`, and `--prebuilt`
- **Clear Error Messages**: Descriptive validation errors guide users to
correct usage
- **Comprehensive Error Handling**: Proper handling of missing files,
directories, and parsing errors
- **Full Backward Compatibility**: Existing configurations continue to
work unchanged

## 🏗️ Implementation Details

### Core Functions Added
- `mergeToolsFiles()` - Smart merging with configurable override rules
- `loadAndMergeToolsFiles()` - Multi-file loading and processing
- `loadAndMergeToolsFolder()` - Directory scanning and batch loading

### Command Structure Updates
- New `tools_files []string` field for multiple file paths
- New `tools_folder string` field for directory path
- Enhanced validation logic in `run()` function
- Updated flag definitions with proper descriptions

## 📋 Use Cases

### Organizational Benefits
- **Modular Configuration**: Separate database, API, and auth
configurations
- **Team Collaboration**: Multiple developers can work on different
config files
- **Environment Management**: Easy configuration swapping for different
environments
- **Scalability**: Large configurations can be broken into manageable
chunks

### Example Usage Patterns
```bash
# Multiple specific files
./toolbox --tools-files=database.yaml,apis.yaml,auth.yaml

# Directory-based loading
./toolbox --tools-folder=./production-configs

# Error case (properly handled)
./toolbox --tools-file=single.yaml --tools-folder=configs
# ERROR: --tools-file, --tools-files, and --tools-folder flags cannot be used simultaneously
```
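A rough sketch of the override-and-union semantics described under "Smart Merging Logic" above, with assumed map-based config shapes rather than the actual `mergeToolsFiles()` implementation:

```go
package mergesketch

// Config is a simplified stand-in for a parsed tools file; auth
// services would be merged the same way as sources and tools.
type Config struct {
	Sources  map[string]any
	Tools    map[string]any
	Toolsets map[string][]string
}

// merge folds src into dst: sources and tools from later files override
// earlier ones by name, while same-named toolsets are unioned without
// duplicates.
func merge(dst, src Config) Config {
	if dst.Sources == nil {
		dst.Sources = map[string]any{}
	}
	if dst.Tools == nil {
		dst.Tools = map[string]any{}
	}
	if dst.Toolsets == nil {
		dst.Toolsets = map[string][]string{}
	}
	for k, v := range src.Sources {
		dst.Sources[k] = v
	}
	for k, v := range src.Tools {
		dst.Tools[k] = v
	}
	for name, tools := range src.Toolsets {
		seen := map[string]bool{}
		for _, t := range dst.Toolsets[name] {
			seen[t] = true
		}
		for _, t := range tools {
			if !seen[t] {
				dst.Toolsets[name] = append(dst.Toolsets[name], t)
				seen[t] = true
			}
		}
	}
	return dst
}
```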

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-06-30 14:00:49 -04:00
Mend Renovate
5fb056ee43 chore(deps): update module github.com/microsoft/go-mssqldb to v1.9.2 (#767)
This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
|
[github.com/microsoft/go-mssqldb](https://redirect.github.com/microsoft/go-mssqldb)
| `v1.9.1` -> `v1.9.2` |
[![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fmicrosoft%2fgo-mssqldb/v1.9.2?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![adoption](https://developer.mend.io/api/mc/badges/adoption/go/github.com%2fmicrosoft%2fgo-mssqldb/v1.9.2?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![passing](https://developer.mend.io/api/mc/badges/compatibility/go/github.com%2fmicrosoft%2fgo-mssqldb/v1.9.1/v1.9.2?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fmicrosoft%2fgo-mssqldb/v1.9.1/v1.9.2?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>microsoft/go-mssqldb
(github.com/microsoft/go-mssqldb)</summary>

###
[`v1.9.2`](https://redirect.github.com/microsoft/go-mssqldb/compare/v1.9.1...v1.9.2)

[Compare
Source](https://redirect.github.com/microsoft/go-mssqldb/compare/v1.9.1...v1.9.2)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-06-30 17:06:28 +00:00
Wenxin Du
a1b60100c2 chore: Group tools by type (#743)
Group tools of the same type into the same folder so that they are more
discoverable and our tools are more organized as the number grows.
2025-06-30 11:37:48 -04:00
Mend Renovate
cb92883330 chore(deps): update module cloud.google.com/go/spanner to v1.83.0 (#763)
This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
|
[cloud.google.com/go/spanner](https://redirect.github.com/googleapis/google-cloud-go)
| `v1.82.0` -> `v1.83.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/cloud.google.com%2fgo%2fspanner/v1.83.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![adoption](https://developer.mend.io/api/mc/badges/adoption/go/cloud.google.com%2fgo%2fspanner/v1.83.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![passing](https://developer.mend.io/api/mc/badges/compatibility/go/cloud.google.com%2fgo%2fspanner/v1.82.0/v1.83.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/cloud.google.com%2fgo%2fspanner/v1.82.0/v1.83.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-06-27 19:06:39 +00:00
Mend Renovate
bd2f1956bd chore(deps): update module github.com/valkey-io/valkey-go to v1.0.62 (#762)
This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
|
[github.com/valkey-io/valkey-go](https://redirect.github.com/valkey-io/valkey-go)
| `v1.0.61` -> `v1.0.62` |
[![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fvalkey-io%2fvalkey-go/v1.0.62?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![adoption](https://developer.mend.io/api/mc/badges/adoption/go/github.com%2fvalkey-io%2fvalkey-go/v1.0.62?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![passing](https://developer.mend.io/api/mc/badges/compatibility/go/github.com%2fvalkey-io%2fvalkey-go/v1.0.61/v1.0.62?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fvalkey-io%2fvalkey-go/v1.0.61/v1.0.62?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>valkey-io/valkey-go (github.com/valkey-io/valkey-go)</summary>

###
[`v1.0.62`](https://redirect.github.com/valkey-io/valkey-go/releases/tag/v1.0.62):
1.0.62

[Compare
Source](https://redirect.github.com/valkey-io/valkey-go/compare/v1.0.61...v1.0.62)

### Changes

- feat: support the SendToReplicas option in the Sentinel client.
- feat: deterministic SendToReplicas routing in the Cluster client.
- perf: changed atomic.Value to atomic.Pointer in the pipe.
- docs: fix typos and spellings.

#### Contributors

We'd like to thank all the contributors who worked on this release!

[@&#8203;PingXie](https://redirect.github.com/PingXie),
[@&#8203;jsoref](https://redirect.github.com/jsoref),
[@&#8203;nithinputhenveettil](https://redirect.github.com/nithinputhenveettil),
[@&#8203;proost](https://redirect.github.com/proost) and
[@&#8203;rueian](https://redirect.github.com/rueian)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-06-27 12:32:36 -04:00
Yuan
cbb4a33351 fix(tools/mysqlsql): Handle nil panic and connection leak in invoke (#758)
Copy fix from #757. 

The Invoke function had two bugs:

A panic would occur when scanning a row containing a NULL value in a
TEXT or VARCHAR column. The code did not check for nil before attempting
a type assertion on the scanned value.

The *sql.Rows result was not being closed on all code paths, leading to
connection leaks that could exhaust the database connection pool.

This change corrects both issues. A guard clause now checks for nil
values before processing, and rows.Close() is deferred to guarantee the
connection is released.
2025-06-26 20:54:59 +00:00
megatron0000
7badba42ee fix(tools/mysqlexecutesql): Handle nil panic and connection leak in Invoke (#757)
The Invoke function had two bugs:

1. A panic would occur when scanning a row containing a NULL value in a
TEXT or VARCHAR column. The code did not check for nil before attempting
a type assertion on the scanned value.

2. The *sql.Rows result was not being closed on all code paths, leading
to connection leaks that could exhaust the database connection pool.

This change corrects both issues. A guard clause now checks for nil
values before processing, and rows.Close() is deferred to guarantee the
connection is released.
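To make the fix concrete, a minimal sketch of the pattern described (a hypothetical helper, not the tool's actual Invoke):

```go
package invokesketch

import "database/sql"

// scanRows reads every row into []map[string]any, guarding against
// NULL values and always releasing the connection via defer.
func scanRows(rows *sql.Rows) ([]map[string]any, error) {
	defer rows.Close() // release the connection on every code path

	cols, err := rows.Columns()
	if err != nil {
		return nil, err
	}

	var out []map[string]any
	for rows.Next() {
		vals := make([]any, len(cols))
		ptrs := make([]any, len(cols))
		for i := range vals {
			ptrs[i] = &vals[i]
		}
		if err := rows.Scan(ptrs...); err != nil {
			return nil, err
		}
		row := map[string]any{}
		for i, c := range cols {
			v := vals[i]
			if v == nil { // guard: NULL column, skip the type assertion
				row[c] = nil
				continue
			}
			if b, ok := v.([]byte); ok { // TEXT/VARCHAR arrive as []byte
				row[c] = string(b)
				continue
			}
			row[c] = v
		}
		out = append(out, row)
	}
	return out, rows.Err()
}
```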

Co-authored-by: Yuan <45984206+Yuan325@users.noreply.github.com>
2025-06-26 13:12:45 -07:00
Twisha Bansal
f72e426314 docs: fix grammar (#751) 2025-06-26 10:40:15 +05:30
Wenxin Du
7a6644cf0c fix(bigquery,mssql): fix panic on tools with array param (#722)
Fix: https://github.com/googleapis/genai-toolbox/issues/701

Things done:
1. Replace the `AsReversedMap()` helper with `AsMap()`
2. BigQuery's QueryParameter only accepts typed slices as input, but our
arrays are passed in as []any. Therefore, add logic to convert []any
to a typed array based on the item type (see the sketch below).
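A short sketch of the shape BigQuery's Go client expects, with an already-typed slice bound as a query parameter (the table and column names are made up for illustration):

```go
package bqsketch

import (
	"context"

	"cloud.google.com/go/bigquery"
)

// runArrayQuery binds a typed slice ([]string) as an array parameter;
// passing a raw []any here is what triggered the panic being fixed.
func runArrayQuery(ctx context.Context, client *bigquery.Client, airlines []string) (*bigquery.RowIterator, error) {
	q := client.Query("SELECT * FROM `dataset.flights` WHERE airline IN UNNEST(@airlines)")
	q.Parameters = []bigquery.QueryParameter{{Name: "airlines", Value: airlines}}
	return q.Read(ctx)
}
```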

Tested on MCP inspector:
<img width="409" alt="Screenshot 2025-06-16 at 5 15 55 PM"
src="https://github.com/user-attachments/assets/8053cad5-270e-4d82-b97c-856238c42154"
/>

---------

Co-authored-by: Kurtis Van Gent <31518063+kurtisvg@users.noreply.github.com>
2025-06-25 22:54:26 -04:00
Mend Renovate
184c681797 chore(deps): update module google.golang.org/api to v0.239.0 (#754)
This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
|
[google.golang.org/api](https://redirect.github.com/googleapis/google-api-go-client)
| `v0.238.0` -> `v0.239.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fapi/v0.239.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![adoption](https://developer.mend.io/api/mc/badges/adoption/go/google.golang.org%2fapi/v0.239.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![passing](https://developer.mend.io/api/mc/badges/compatibility/go/google.golang.org%2fapi/v0.238.0/v0.239.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fapi/v0.238.0/v0.239.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>googleapis/google-api-go-client
(google.golang.org/api)</summary>

###
[`v0.239.0`](https://redirect.github.com/googleapis/google-api-go-client/releases/tag/v0.239.0)

[Compare
Source](https://redirect.github.com/googleapis/google-api-go-client/compare/v0.238.0...v0.239.0)

##### Features

- **all:** Auto-regenerate discovery clients
([#&#8203;3199](https://redirect.github.com/googleapis/google-api-go-client/issues/3199))
([2bdd042](2bdd042ac9))
- **all:** Auto-regenerate discovery clients
([#&#8203;3201](https://redirect.github.com/googleapis/google-api-go-client/issues/3201))
([8eff56f](8eff56f43f))
- **all:** Auto-regenerate discovery clients
([#&#8203;3202](https://redirect.github.com/googleapis/google-api-go-client/issues/3202))
([f7c299e](f7c299e9c0))
- **all:** Auto-regenerate discovery clients
([#&#8203;3203](https://redirect.github.com/googleapis/google-api-go-client/issues/3203))
([459c5a8](459c5a8db5))
- **all:** Auto-regenerate discovery clients
([#&#8203;3205](https://redirect.github.com/googleapis/google-api-go-client/issues/3205))
([ca610d5](ca610d5390))
- **all:** Auto-regenerate discovery clients
([#&#8203;3206](https://redirect.github.com/googleapis/google-api-go-client/issues/3206))
([98b7398](98b739881e))
- **all:** Auto-regenerate discovery clients
([#&#8203;3207](https://redirect.github.com/googleapis/google-api-go-client/issues/3207))
([71fe287](71fe287d9c))
- **all:** Auto-regenerate discovery clients
([#&#8203;3209](https://redirect.github.com/googleapis/google-api-go-client/issues/3209))
([27d1aa4](27d1aa43d1))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan <45984206+Yuan325@users.noreply.github.com>
2025-06-26 01:26:08 +00:00
Yuan
474df57d62 feat: support MCP version 2025-03-26 (#755)
This feature includes the following:
* Implement the initialize lifecycle (including version negotiation)
* Add the v20250326 schema
* Support the `DELETE` and `GET` endpoints for MCP.
* Support streamable HTTP (without SSE).
* Terminate sessions after a timeout (default = 10 minutes from last
activity).
* Toolbox does not support batch requests; it responds with `Invalid
requests` if a batch request is received.
2025-06-26 00:34:37 +00:00
Wenxin Du
fc1a3813ea ci: Add integration test coverage by source (#742)
Add a script that checks coverage for each source package with its
compatible tools.
The build fails if coverage is under 50%.
2025-06-25 15:07:00 -04:00
Yuan
c7fe3c7f38 docs: fix linting in docs (#749)
Fix long lines and table column width lints in docs.
2025-06-25 17:03:42 +00:00
Anubhav Dhawan
dc2690bd39 docs: Document correct syntax for array parameters in SQL queries (#750)
## Problem

Users attempting to filter results in a SQL query based on an array
parameter from a tool may intuitively write a `statement` using the `IN`
clause, like so:
```sql
SELECT * FROM flights WHERE preferred_airlines IN ($1);
```
When this query is executed with an array argument (e.g., `["Delta",
"United"]`), it fails with a cryptic error message from the database
driver:
```
Exception: error while invoking tool: unable to execute query: failed to encode args[0]: unable to encode []interface {}{"Delta", "United"} into text format for text (OID 25): cannot find encode plan
```
This error occurs because the driver does not automatically expand the
single `$1` placeholder into a list of values `('Delta', 'United')`.
Instead, it tries to encode the entire Go slice `[]interface{}` as a
single text value, which fails. This creates a point of friction, as the
correct syntax is not immediately obvious and can lead to user
frustration and debugging time.

## Solution
This PR updates our documentation and example usage to demonstrate the
correct SQL syntax for handling array parameters. The proper way to
check for a value's existence in an array parameter is by using
PostgreSQL's `ANY()` operator:

```sql
SELECT * FROM flights WHERE preferred_airlines = ANY($1);
```
When this syntax is used, the database driver correctly interprets the
Go slice passed as `$1` as a PostgreSQL array, and the query executes as
intended.

## Impact
Saves developers time they would otherwise spend troubleshooting a
non-obvious database driver behavior.
2025-06-25 20:32:30 +05:30
Mend Renovate
b78f7480cf chore(deps): update module github.com/microsoft/go-mssqldb to v1.9.1 (#746)
This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
|
[github.com/microsoft/go-mssqldb](https://redirect.github.com/microsoft/go-mssqldb)
| `v1.8.2` -> `v1.9.1` |
[![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fmicrosoft%2fgo-mssqldb/v1.9.1?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![adoption](https://developer.mend.io/api/mc/badges/adoption/go/github.com%2fmicrosoft%2fgo-mssqldb/v1.9.1?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![passing](https://developer.mend.io/api/mc/badges/compatibility/go/github.com%2fmicrosoft%2fgo-mssqldb/v1.8.2/v1.9.1?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fmicrosoft%2fgo-mssqldb/v1.8.2/v1.9.1?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>microsoft/go-mssqldb
(github.com/microsoft/go-mssqldb)</summary>

###
[`v1.9.1`](https://redirect.github.com/microsoft/go-mssqldb/compare/v1.9.0...v1.9.1)

[Compare
Source](https://redirect.github.com/microsoft/go-mssqldb/compare/v1.9.0...v1.9.1)

###
[`v1.9.0`](https://redirect.github.com/microsoft/go-mssqldb/compare/v1.8.2...v1.9.0)

[Compare
Source](https://redirect.github.com/microsoft/go-mssqldb/compare/v1.8.2...v1.9.0)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Yuan <45984206+Yuan325@users.noreply.github.com>
2025-06-24 23:38:49 +00:00
Mend Renovate
ffe9b74211 chore(deps): update module github.com/redis/go-redis/v9 to v9.11.0 (#745)
This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
|---|---|---|---|---|---|
|
[github.com/redis/go-redis/v9](https://redirect.github.com/redis/go-redis)
| `v9.10.0` -> `v9.11.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fredis%2fgo-redis%2fv9/v9.11.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![adoption](https://developer.mend.io/api/mc/badges/adoption/go/github.com%2fredis%2fgo-redis%2fv9/v9.11.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![passing](https://developer.mend.io/api/mc/badges/compatibility/go/github.com%2fredis%2fgo-redis%2fv9/v9.10.0/v9.11.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fredis%2fgo-redis%2fv9/v9.10.0/v9.11.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>redis/go-redis (github.com/redis/go-redis/v9)</summary>

###
[`v9.11.0`](https://redirect.github.com/redis/go-redis/releases/tag/v9.11.0):
9.11.0

[Compare
Source](https://redirect.github.com/redis/go-redis/compare/v9.10.0...v9.11.0)

#### 🚀 Highlights

Fixes `TxPipeline` to work correctly in cluster scenarios, allowing
execution of commands
only in the same slot for a given transaction.

### Changes

#### 🚀 New Features

- Set cluster slot for `scan` commands, rather than random
([#&#8203;2623](https://redirect.github.com/redis/go-redis/pull/2623))
- Add CredentialsProvider field to UniversalOptions
([#&#8203;2927](https://redirect.github.com/redis/go-redis/pull/2927))
- feat(redisotel): add WithCallerEnabled option
([#&#8203;3415](https://redirect.github.com/redis/go-redis/pull/3415))

#### 🐛 Bug Fixes

- fix(txpipeline): keyless commands should take the slot of the keyed
([#&#8203;3411](https://redirect.github.com/redis/go-redis/pull/3411))
- fix(loading): cache the loaded flag for slave nodes
([#&#8203;3410](https://redirect.github.com/redis/go-redis/pull/3410))
- fix(txpipeline): should return error on multi/exec on multiple slots
([#&#8203;3408](https://redirect.github.com/redis/go-redis/pull/3408))
- fix: check if the shard exists to avoid returning nil
([#&#8203;3396](https://redirect.github.com/redis/go-redis/pull/3396))

#### 🧰 Maintenance

- feat: optimize connection pool waitTurn
([#&#8203;3412](https://redirect.github.com/redis/go-redis/pull/3412))
- chore(ci): update CI redis builds
([#&#8203;3407](https://redirect.github.com/redis/go-redis/pull/3407))
- chore: remove a redundant method from `Ring`, `Client` and
`ClusterClient`
([#&#8203;3401](https://redirect.github.com/redis/go-redis/pull/3401))
- test: refactor TestBasicCredentials using table-driven tests
([#&#8203;3406](https://redirect.github.com/redis/go-redis/pull/3406))
- perf: reduce unnecessary memory allocation operations
([#&#8203;3399](https://redirect.github.com/redis/go-redis/pull/3399))
- fix: insert entry during iterating over a map
([#&#8203;3398](https://redirect.github.com/redis/go-redis/pull/3398))
- DOC-5229 probabilistic data type examples
([#&#8203;3413](https://redirect.github.com/redis/go-redis/pull/3413))
- chore(deps): bump rojopolis/spellcheck-github-actions from 0.49.0 to
0.51.0
([#&#8203;3414](https://redirect.github.com/redis/go-redis/pull/3414))

#### Contributors

We'd like to thank all the contributors who worked on this release!


[@&#8203;andy-stark-redis](https://redirect.github.com/andy-stark-redis),
[@&#8203;boekkooi-impossiblecloud](https://redirect.github.com/boekkooi-impossiblecloud),
[@&#8203;cxljs](https://redirect.github.com/cxljs),
[@&#8203;dcherubini](https://redirect.github.com/dcherubini),
[@&#8203;iamamirsalehi](https://redirect.github.com/iamamirsalehi),
[@&#8203;ndyakov](https://redirect.github.com/ndyakov),
[@&#8203;pete-woods](https://redirect.github.com/pete-woods),
[@&#8203;twz915](https://redirect.github.com/twz915)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0MC42Mi4xIiwidXBkYXRlZEluVmVyIjoiNDAuNjIuMSIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOltdfQ==-->

Co-authored-by: Yuan <45984206+Yuan325@users.noreply.github.com>
2025-06-24 23:27:34 +00:00
Averi Kitsch
e1355660d4 chore: Update Developer and Contributing docs (#738)
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-06-24 13:58:47 -07:00
Twisha Bansal
d8e2abe2dd fix: fix adk quickstart (#741) 2025-06-24 12:04:11 +05:30
Twisha Bansal
7b3539e9ff chore: reorder quickstart (#740)
Reorder quickstart.
Order from
`GoogleGenAI -> ADK -> Langchain -> Llamaindex`

to
`ADK -> Langchain -> Llamaindex -> GoogleGenAI`
2025-06-24 11:55:01 +05:30
345 changed files with 33116 additions and 3149 deletions


@@ -33,7 +33,9 @@ steps:
- name: "go"
path: "/gopath"
script: |
go test -c -race ./tests/...
go test -c -race -cover \
-coverpkg=./internal/sources/...,./internal/tools/... ./tests/...
chmod +x .ci/test_with_coverage.sh
- id: "cloud-sql-pg"
name: golang:1
@@ -54,7 +56,12 @@ steps:
args:
- -c
- |
./cloudsqlpg.test -test.v
.ci/test_with_coverage.sh \
"Cloud SQL Postgres" \
cloudsqlpg \
postgressql \
postgresexecutesql
- id: "alloydb-pg"
name: golang:1
@@ -75,7 +82,11 @@ steps:
args:
- -c
- |
./alloydbpg.test -test.v
.ci/test_with_coverage.sh \
"AlloyDB Postgres" \
alloydbpg \
postgressql \
postgresexecutesql
- id: "alloydb-ai-nl"
name: golang:1
@@ -96,7 +107,10 @@ steps:
args:
- -c
- |
./alloydbainl.test -test.v
.ci/test_with_coverage.sh \
"AlloyDB AI NL" \
alloydbainl \
alloydbainl
- id: "bigtable"
name: golang:1
@@ -115,7 +129,10 @@ steps:
args:
- -c
- |
./bigtable.test -test.v
.ci/test_with_coverage.sh \
"Bigtable" \
bigtable \
bigtable
- id: "bigquery"
name: golang:1
@@ -132,7 +149,30 @@ steps:
args:
- -c
- |
./bigquery.test -test.v
.ci/test_with_coverage.sh \
"BigQuery" \
bigquery \
bigquery
- id: "dataplex"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "DATAPLEX_PROJECT=$PROJECT_ID"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Dataplex" \
dataplex \
dataplex
- id: "postgres"
name: golang:1
@@ -151,7 +191,11 @@ steps:
args:
- -c
- |
./postgres.test -test.v
.ci/test_with_coverage.sh \
"Postgres" \
postgres \
postgressql \
postgresexecutesql
- id: "spanner"
name: golang:1
@@ -170,7 +214,10 @@ steps:
args:
- -c
- |
./spanner.test -test.v
.ci/test_with_coverage.sh \
"Spanner" \
spanner \
spanner
- id: "neo4j"
name: golang:1
@@ -187,7 +234,10 @@ steps:
args:
- -c
- |
./neo4j.test -test.v
.ci/test_with_coverage.sh \
"Neo4j" \
neo4j \
neo4j
- id: "cloud-sql-mssql"
name: golang:1
@@ -208,7 +258,10 @@ steps:
args:
- -c
- |
./cloudsqlmssql.test -test.v
.ci/test_with_coverage.sh \
"Cloud SQL MSSQL" \
cloudsqlmssql \
mssql
- id: "cloud-sql-mysql"
name: golang:1
@@ -229,7 +282,10 @@ steps:
args:
- -c
- |
./cloudsqlmysql.test -test.v
.ci/test_with_coverage.sh \
"Cloud SQL MySQL" \
cloudsqlmysql \
mysql
- id: "mysql"
name: golang:1
@@ -248,7 +304,10 @@ steps:
args:
- -c
- |
./mysql.test -test.v
.ci/test_with_coverage.sh \
"MySQL" \
mysql \
mysql
- id: "mssql"
name: golang:1
@@ -267,7 +326,10 @@ steps:
args:
- -c
- |
./mssql.test -test.v
.ci/test_with_coverage.sh \
"MSSQL" \
mssql \
mssql
- id: "dgraph"
name: golang:1
@@ -282,7 +344,10 @@ steps:
args:
- -c
- |
./dgraph.test -test.v
.ci/test_with_coverage.sh \
"Dgraph" \
dgraph \
dgraph
- id: "http"
name: golang:1
@@ -297,7 +362,10 @@ steps:
args:
- -c
- |
./http.test -test.v
.ci/test_with_coverage.sh \
"HTTP" \
http \
http
- id: "sqlite"
name: golang:1
@@ -313,7 +381,10 @@ steps:
args:
- -c
- |
./sqlite.test -test.v
.ci/test_with_coverage.sh \
"SQLite" \
sqlite \
sqlite
- id: "couchbase"
name : golang:1
@@ -331,7 +402,10 @@ steps:
args:
- -c
- |
./couchbase.test -test.v
.ci/test_with_coverage.sh \
"Couchbase" \
couchbase \
couchbase
- id: "redis"
name : golang:1
@@ -347,7 +421,10 @@ steps:
args:
- -c
- |
./redis.test -test.v
.ci/test_with_coverage.sh \
"Redis" \
redis \
redis
- id: "valkey"
name : golang:1
@@ -364,9 +441,91 @@ steps:
args:
- -c
- |
./valkey.test -test.v
.ci/test_with_coverage.sh \
"Valkey" \
valkey \
valkey
- id: "firestore"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "FIRESTORE_PROJECT=$PROJECT_ID"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Firestore" \
firestore \
firestore
- id: "looker"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "FIRESTORE_PROJECT=$PROJECT_ID"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
- "LOOKER_VERIFY_SSL=$_LOOKER_VERIFY_SSL"
secretEnv: ["CLIENT_ID", "LOOKER_BASE_URL", "LOOKER_CLIENT_ID", "LOOKER_CLIENT_SECRET"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Looker" \
looker \
looker
- id: "duckdb"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
volumes:
- name: "go"
path: "/gopath"
secretEnv: ["CLIENT_ID"]
args:
- -c
- |
.ci/test_with_coverage.sh \
"DuckDB" \
duckdb \
duckdb
- id: "alloydbwaitforoperation"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "API_KEY=$(gcloud auth print-access-token)"
secretEnv: ["CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Alloydb Wait for Operation" \
utility \
utility/alloydbwaitforoperation
availableSecrets:
secretManager:
- versionName: projects/$PROJECT_ID/secrets/cloud_sql_pg_user/versions/latest
@@ -404,7 +563,7 @@ availableSecrets:
- versionName: projects/$PROJECT_ID/secrets/mysql_pass/versions/latest
env: MYSQL_PASS
- versionName: projects/$PROJECT_ID/secrets/mssql_user/versions/latest
env: MSSQL_USER
env: MSSQL_USER
- versionName: projects/$PROJECT_ID/secrets/mssql_pass/versions/latest
env: MSSQL_PASS
- versionName: projects/$PROJECT_ID/secrets/couchbase_connection/versions/latest
@@ -419,6 +578,12 @@ availableSecrets:
env: REDIS_PASS
- versionName: projects/$PROJECT_ID/secrets/memorystore_valkey_address/versions/latest
env: VALKEY_ADDRESS
- versionName: projects/107716898620/secrets/looker_base_url/versions/latest
env: LOOKER_BASE_URL
- versionName: projects/107716898620/secrets/looker_client_id/versions/latest
env: LOOKER_CLIENT_ID
- versionName: projects/107716898620/secrets/looker_client_secret/versions/latest
env: LOOKER_CLIENT_SECRET
options:
@@ -450,4 +615,5 @@ substitutions:
_MSSQL_PORT: "1433"
_DGRAPHURL: "https://play.dgraph.io"
_COUCHBASE_BUCKET: "couchbase-bucket"
_COUCHBASE_SCOPE: "couchbase-scope"
_COUCHBASE_SCOPE: "couchbase-scope"
_LOOKER_VERIFY_SSL: "true"

.ci/test_with_coverage.sh Executable file

@@ -0,0 +1,62 @@
#!/bin/bash
# Arguments:
# $1: Display name for logs (e.g., "Cloud SQL Postgres")
# $2: Integration test's package name (e.g., cloudsqlpg)
# $3, $4, ...: Tool package names for grep (e.g., postgressql); if the
# integration test specifically checks a separate package inside a folder,
# specify the full path instead (e.g., postgressql/postgresexecutesql)
DISPLAY_NAME="$1"
SOURCE_PACKAGE_NAME="$2"
# Construct the test binary name
TEST_BINARY="${SOURCE_PACKAGE_NAME}.test"
# Construct the full source path
SOURCE_PATH="sources/${SOURCE_PACKAGE_NAME}/"
# Shift arguments so that $3 and onwards become the list of tool package names
shift 2
TOOL_PACKAGE_NAMES=("$@")
COVERAGE_FILE="${TEST_BINARY%.test}_coverage.out"
FILTERED_COVERAGE_FILE="${TEST_BINARY%.test}_filtered_coverage.out"
export path="github.com/googleapis/genai-toolbox/internal/"
GREP_PATTERN="^mode:|${path}${SOURCE_PATH}"
# Add each tool package path to the grep pattern
for tool_name in "${TOOL_PACKAGE_NAMES[@]}"; do
if [ -n "$tool_name" ]; then
full_tool_path="tools/${tool_name}/"
GREP_PATTERN="${GREP_PATTERN}|${path}${full_tool_path}"
fi
done
# Run integration test
if ! ./"${TEST_BINARY}" -test.v -test.coverprofile="${COVERAGE_FILE}"; then
echo "Error: Tests for ${DISPLAY_NAME} failed. Exiting."
exit 1
fi
# Filter source/tool packages
if ! grep -E "${GREP_PATTERN}" "${COVERAGE_FILE}" > "${FILTERED_COVERAGE_FILE}"; then
echo "Warning: Could not filter coverage for ${DISPLAY_NAME}. Filtered file might be empty or invalid."
fi
# Calculate coverage
echo "Calculating coverage for ${DISPLAY_NAME}..."
total_coverage=$(go tool cover -func="${FILTERED_COVERAGE_FILE}" 2>/dev/null | grep "total:" | awk '{print $3}')
echo "${DISPLAY_NAME} total coverage: $total_coverage"
coverage_numeric=$(echo "$total_coverage" | sed 's/%//')
# Check coverage threshold
if awk -v coverage="$coverage_numeric" 'BEGIN {exit !(coverage < 50)}'; then
echo "Coverage failure: ${DISPLAY_NAME} total coverage($total_coverage) is below 50%."
exit 1
else
echo "Coverage for ${DISPLAY_NAME} is sufficient."
fi


@@ -1,7 +1,7 @@
assign_issues:
- kurtisvg
- Yuan325
- duwenxin99
- averikitsch
assign_issues_by:
- labels:
- 'product: bigquery'
@@ -10,6 +10,6 @@ assign_issues_by:
- shobsi
- jiaxunwu
assign_prs:
- kurtisvg
- Yuan325
- duwenxin99
- averikitsch

.github/labels.yaml

@@ -50,7 +50,7 @@
color: ffffc7
description: Desirable enhancement or fix. May not be included in next release.
- name: do not merge
- name: 'do not merge'
color: d93f0b
description: Indicates a pull request not ready for merge, due to either quality
or timing.
@@ -65,28 +65,26 @@
color: ededed
description: Release please has completed a release for this.
- name: 'blunderbuss: assign'
color: 3DED97
description: Have blunderbuss assign this to someone new.
- name: 'tests: run'
color: 3DED97
description: Label to trigger Github Action tests.
- name: 'docs: deploy-preview'
color: BFDADC
description: Label to trigger Github Action docs preview.
- name: 'status: contribution welcome'
- name: 'status: help wanted'
color: 8befd7
description: Status - Contributions welcome.
- name: 'status: awaiting response'
description: 'Status: Unplanned work open to contributions from the community.'
- name: 'status: feedback wanted'
color: 8befd7
description: Status - Awaiting response from author.
- name: 'status: awaiting codeowners'
color: 8befd7
description: Status - Awaiting response from code owners.
description: 'Status: waiting for feedback from community or issue author.'
# Product Labels
- name: 'product: bigquery'
color: 5065c7
description: Product - Assigned to the BigQuery team.
description: 'Product: Assigned to the BigQuery team.'


@@ -18,18 +18,25 @@ releaseType: simple
versionFile: "cmd/version.txt"
extraFiles: [
"README.md",
"docs/en/getting-started/colab_quickstart.ipynb",
"docs/en/getting-started/introduction/_index.md",
"docs/en/getting-started/local_quickstart.md",
"docs/en/getting-started/local_quickstart_js.md",
"docs/en/getting-started/local_quickstart_go.md",
"docs/en/getting-started/mcp_quickstart/_index.md",
"docs/en/samples/bigquery/local_quickstart.md",
"docs/en/samples/bigquery/mcp_quickstart/_index.md",
"docs/en/getting-started/colab_quickstart.ipynb",
"docs/en/samples/bigquery/colab_quickstart_bigquery.ipynb",
"docs/en/how-to/connect-ide/bigquery_mcp.md",
"docs/en/how-to/connect-ide/spanner_mcp.md",
"docs/en/samples/looker/looker_gemini.md",
"docs/en/samples/looker/looker_mcp_inspector.md",
"docs/en/how-to/connect-ide/alloydb_pg_mcp.md",
"docs/en/how-to/connect-ide/cloud_sql_mysql_mcp.md",
"docs/en/how-to/connect-ide/alloydb_pg_admin_mcp.md",
"docs/en/how-to/connect-ide/bigquery_mcp.md",
"docs/en/how-to/connect-ide/cloud_sql_pg_mcp.md",
"docs/en/how-to/connect-ide/postgres_mcp.md",
"docs/en/how-to/connect-ide/cloud_sql_mssql_mcp.md",
"docs/en/how-to/connect-ide/cloud_sql_mysql_mcp.md",
"docs/en/how-to/connect-ide/firestore_mcp.md",
"docs/en/how-to/connect-ide/looker_mcp.md",
"docs/en/how-to/connect-ide/postgres_mcp.md",
"docs/en/how-to/connect-ide/spanner_mcp.md",
]


@@ -51,6 +51,8 @@ jobs:
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
# Checkout the PR's HEAD commit (supports forks).
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0 # Fetch all history for .GitInfo and .Lastmod
- name: Setup Hugo

.gitignore

@@ -4,6 +4,9 @@
# vscode
.vscode/
# idea
.idea/
# npm
node_modules
@@ -14,3 +17,7 @@ node_modules
# coverage
.coverage
# executable
genai-toolbox
toolbox


@@ -0,0 +1 @@
@import 'td/code-dark';


@@ -48,7 +48,11 @@ enableRobotsTXT = true
pre = "<i class='fa-brands fa-github'></i>"
[markup.goldmark.renderer]
unsafe= true
unsafe= true
[markup.highlight]
noClasses = false
style = "tango"
[outputFormats]
[outputFormats.LLMS]


@@ -0,0 +1,2 @@
{{ $file := .Get 0 }}
{{ (printf "%s%s" .Page.File.Dir $file) | readFile | replaceRE "^---[\\s\\S]+?---" "" | safeHTML }}


@@ -1,5 +1,92 @@
# Changelog
## [0.10.0](https://github.com/googleapis/genai-toolbox/compare/v0.9.0...v0.10.0) (2025-07-25)
### Features
* Add `Map` parameters support ([#928](https://github.com/googleapis/genai-toolbox/issues/928)) ([4468bc9](https://github.com/googleapis/genai-toolbox/commit/4468bc920bbf27dce4ab160197587b7c12fcd20f))
* Add Dataplex source and tool ([#847](https://github.com/googleapis/genai-toolbox/issues/847)) ([30c16a5](https://github.com/googleapis/genai-toolbox/commit/30c16a559e8d49a9a717935269e69b97ec25519a))
* Add Looker source and tool ([#923](https://github.com/googleapis/genai-toolbox/issues/923)) ([c67e01b](https://github.com/googleapis/genai-toolbox/commit/c67e01bcf998e7b884be30ebb1fd277c89ed6ffc))
* Add support for null optional parameter ([#802](https://github.com/googleapis/genai-toolbox/issues/802)) ([a817b12](https://github.com/googleapis/genai-toolbox/commit/a817b120ca5e09ce80eb8d7544ebbe81fc28b082)), closes [#736](https://github.com/googleapis/genai-toolbox/issues/736)
* **prebuilt/alloydb-admin-config:** Add alloydb control plane as a prebuilt config ([#937](https://github.com/googleapis/genai-toolbox/issues/937)) ([0b28b72](https://github.com/googleapis/genai-toolbox/commit/0b28b72aa0ca2cdc87afbddbeb7f4dbb9688593d))
* **prebuilt/mysql,prebuilt/mssql:** Add generic mysql and mssql prebuilt tools ([#983](https://github.com/googleapis/genai-toolbox/issues/983)) ([c600c30](https://github.com/googleapis/genai-toolbox/commit/c600c30374443b6106c1f10b60cd334fd202789b))
* **server/mcp:** Support MCP version 2025-06-18 ([#898](https://github.com/googleapis/genai-toolbox/issues/898)) ([313d3ca](https://github.com/googleapis/genai-toolbox/commit/313d3ca0d084a3a6e7ac9a21a862aa31bf3edadd))
* **sources/mssql:** Add support for encrypt connection parameter ([#874](https://github.com/googleapis/genai-toolbox/issues/874)) ([14a868f](https://github.com/googleapis/genai-toolbox/commit/14a868f2a0780b94c2ca104419b2ff098778303b))
* **sources/firestore:** Add Firestore as Source ([#786](https://github.com/googleapis/genai-toolbox/issues/786)) ([2bb790e](https://github.com/googleapis/genai-toolbox/commit/2bb790e4f8194b677fe0ba40122d409d0e3e687e))
* **sources/mongodb:** Add MongoDB Source ([#969](https://github.com/googleapis/genai-toolbox/issues/969)) ([74dbd61](https://github.com/googleapis/genai-toolbox/commit/74dbd6124daab6192dd880dbd1d15f36861abf74))
* **tools/alloydb-wait-for-operation:** Add wait for operation tool with exponential backoff ([#920](https://github.com/googleapis/genai-toolbox/issues/920)) ([3f6ec29](https://github.com/googleapis/genai-toolbox/commit/3f6ec2944ede18ee02b10157cc048145bdaec87a))
* **tools/mongodb-aggregate:** Add MongoDB `aggregate` Tools ([#977](https://github.com/googleapis/genai-toolbox/issues/977)) ([bd399bb](https://github.com/googleapis/genai-toolbox/commit/bd399bb0fb7134469345ed9a1111ea4209440867))
* **tools/mongodb-delete:** Add MongoDB `delete` Tools ([#974](https://github.com/googleapis/genai-toolbox/issues/974)) ([78e9752](https://github.com/googleapis/genai-toolbox/commit/78e9752f620e065246f3e7b9d37062e492247c8a))
* **tools/mongodb-find:** Add MongoDB `find` Tools ([#970](https://github.com/googleapis/genai-toolbox/issues/970)) ([a747475](https://github.com/googleapis/genai-toolbox/commit/a7474752d8d7ea7af1e80a3c4533d2fd4154d897))
* **tools/mongodb-insert:** Add MongoDB `insert` Tools ([#975](https://github.com/googleapis/genai-toolbox/issues/975)) ([4c63f0c](https://github.com/googleapis/genai-toolbox/commit/4c63f0c1e402817a0c8fec611635e99290308d0e))
* **tools/mongodb-update:** Add MongoDB `update` Tools ([#972](https://github.com/googleapis/genai-toolbox/issues/972)) ([dfde52c](https://github.com/googleapis/genai-toolbox/commit/dfde52ca9a8e25e2f3944f52b4c2e307072b6c37))
* **tools/neo4j-execute-cypher:** Add neo4j-execute-cypher for Neo4j sources ([#946](https://github.com/googleapis/genai-toolbox/issues/946)) ([81d0505](https://github.com/googleapis/genai-toolbox/commit/81d05053b2e08338fd6eabe4849c309064f76b6b))
* **tools/neo4j-schema:** Add neo4j-schema tool ([#978](https://github.com/googleapis/genai-toolbox/issues/978)) ([be7db3d](https://github.com/googleapis/genai-toolbox/commit/be7db3dff263625ce64fdb726e81164996b7a708))
* **tools/wait:** Create wait for tool ([#885](https://github.com/googleapis/genai-toolbox/issues/885)) ([ed5ef4c](https://github.com/googleapis/genai-toolbox/commit/ed5ef4caea10ba1dbc49c0fc0a0d2b91cf341d3b))
### Bug Fixes
* Fix document preview pipeline for forked PRs ([#950](https://github.com/googleapis/genai-toolbox/issues/950)) ([481cc60](https://github.com/googleapis/genai-toolbox/commit/481cc608bae807d9e92497bc8863066916f7ef21))
* **prebuilt/firestore:** Mark database field as required in the firestore prebuilt tools ([#959](https://github.com/googleapis/genai-toolbox/issues/959)) ([15417d4](https://github.com/googleapis/genai-toolbox/commit/15417d4e0c7b173e81edbbeb672e53884d186104))
* **prebuilt/cloud-sql-mssql:** Correct source reference for execute_sql tool in cloud-sql-mssql.yaml prebuilt config ([#938](https://github.com/googleapis/genai-toolbox/issues/938)) ([d16728e](https://github.com/googleapis/genai-toolbox/commit/d16728e5c603eab37700876a6ddacbf709fd5823))
* **prebuilt/cloud-sql-mysql:** Update list_table tool ([#924](https://github.com/googleapis/genai-toolbox/issues/924)) ([2083ba5](https://github.com/googleapis/genai-toolbox/commit/2083ba50483951e9ee6101bb832aa68823cd96a5))
* Replace 'float' with 'number' in McpManifest ([#985](https://github.com/googleapis/genai-toolbox/issues/985)) ([59e23e1](https://github.com/googleapis/genai-toolbox/commit/59e23e17250a516e3931996114f32ac6526a4f8e))
* **server/api:** Add logger to context in tool invoke handler ([#891](https://github.com/googleapis/genai-toolbox/issues/891)) ([8ce311f](https://github.com/googleapis/genai-toolbox/commit/8ce311f256481e8f11ecb4aa505b95a562f394ef))
* **sources/looker:** Add agent tag to Looker API calls. ([#966](https://github.com/googleapis/genai-toolbox/issues/966)) ([f55dd6f](https://github.com/googleapis/genai-toolbox/commit/f55dd6fcd099f23bd89df62b268c4a53d16f3bac))
* **tools/bigquery-execute-sql:** Ensure invoke always returns a non-null value ([#925](https://github.com/googleapis/genai-toolbox/issues/925)) ([9a55b80](https://github.com/googleapis/genai-toolbox/commit/9a55b804821a6ccfcd157bcfaee7e599c4a5cb63))
* **tools/mysqlsql:** Unmarshal json data from database during invoke ([#979](https://github.com/googleapis/genai-toolbox/issues/979)) ([ccc3498](https://github.com/googleapis/genai-toolbox/commit/ccc3498cf0a4c43eb909e3850b9e6f582cd48f2a)), closes [#840](https://github.com/googleapis/genai-toolbox/issues/840)
## [0.9.0](https://github.com/googleapis/genai-toolbox/compare/v0.8.0...v0.9.0) (2025-07-11)
### Features
* Dynamic reloading for toolbox config ([#800](https://github.com/googleapis/genai-toolbox/issues/800)) ([4c240ac](https://github.com/googleapis/genai-toolbox/commit/4c240ac3c961cd14738c998ba2d10d5235ef523e))
* **sources/mysql:** Add queryTimeout support to MySQL source ([#830](https://github.com/googleapis/genai-toolbox/issues/830)) ([391cb5b](https://github.com/googleapis/genai-toolbox/commit/391cb5bfe845e554411240a1d9838df5331b25fa))
* **tools/bigquery:** Add optional projectID parameter to bigquery tools ([#799](https://github.com/googleapis/genai-toolbox/issues/799)) ([c6ab74c](https://github.com/googleapis/genai-toolbox/commit/c6ab74c5dad53a0e7885a18438ab3be36b9b7cb3))
### Bug Fixes
* Cleanup unassigned err log ([#857](https://github.com/googleapis/genai-toolbox/issues/857)) ([c081ace](https://github.com/googleapis/genai-toolbox/commit/c081ace46bb24cb3fd2adb21d519489be0d3f3c3))
* Fix docs preview deployment pipeline ([#787](https://github.com/googleapis/genai-toolbox/issues/787)) ([0a93b04](https://github.com/googleapis/genai-toolbox/commit/0a93b0482c8d3c64b324e67408d408f5576ecaf3))
* **tools:** Nil parameter error when arrays are used ([#801](https://github.com/googleapis/genai-toolbox/issues/801)) ([2bdcc08](https://github.com/googleapis/genai-toolbox/commit/2bdcc0841ab37d18e2f0d6fe63fb6f10da3e302b))
* Trigger reload on additional fsnotify operations ([#854](https://github.com/googleapis/genai-toolbox/issues/854)) ([aa8dbec](https://github.com/googleapis/genai-toolbox/commit/aa8dbec97095cf0d7ac771c8084a84e2d3d8ce4e))
## [0.8.0](https://github.com/googleapis/genai-toolbox/compare/v0.7.0...v0.8.0) (2025-07-02)
### ⚠ BREAKING CHANGES
* **postgres,mssql,cloudsqlmssql:** encode source connection url for sources ([#727](https://github.com/googleapis/genai-toolbox/issues/727))
### Features
* Add support for multiple YAML configuration files ([#760](https://github.com/googleapis/genai-toolbox/issues/760)) ([40679d7](https://github.com/googleapis/genai-toolbox/commit/40679d700eded50d19569923e2a71c51e907a8bf))
* Add support for optional parameters ([#617](https://github.com/googleapis/genai-toolbox/issues/617)) ([4827771](https://github.com/googleapis/genai-toolbox/commit/4827771b78dee9a1284a898b749509b472061527)), closes [#475](https://github.com/googleapis/genai-toolbox/issues/475)
* **mcp:** Support MCP version 2025-03-26 ([#755](https://github.com/googleapis/genai-toolbox/issues/755)) ([474df57](https://github.com/googleapis/genai-toolbox/commit/474df57d62de683079f8d12c31db53396a545fd1))
* **sources/http:** Support disable SSL verification for HTTP Source ([#674](https://github.com/googleapis/genai-toolbox/issues/674)) ([4055b0c](https://github.com/googleapis/genai-toolbox/commit/4055b0c3569c527560d7ad34262963b3dd4e282d))
* **tools/bigquery:** Add templateParameters field for bigquery ([#699](https://github.com/googleapis/genai-toolbox/issues/699)) ([f5f771b](https://github.com/googleapis/genai-toolbox/commit/f5f771b0f3d159630ff602ff55c6c66b61981446))
* **tools/bigtable:** Add templateParameters field for bigtable ([#692](https://github.com/googleapis/genai-toolbox/issues/692)) ([1c06771](https://github.com/googleapis/genai-toolbox/commit/1c067715fac06479eb0060d7067b73dba099ed92))
* **tools/couchbase:** Add templateParameters field for couchbase ([#723](https://github.com/googleapis/genai-toolbox/issues/723)) ([9197186](https://github.com/googleapis/genai-toolbox/commit/9197186b8bea1ac4ec1b39c9c5c110807c8b2ba9))
* **tools/http:** Add support for HTTP Tool pathParams ([#726](https://github.com/googleapis/genai-toolbox/issues/726)) ([fd300dc](https://github.com/googleapis/genai-toolbox/commit/fd300dc606d88bf9f7bba689e2cee4e3565537dd))
* **tools/redis:** Add Redis Source and Tool ([#519](https://github.com/googleapis/genai-toolbox/issues/519)) ([f0aef29](https://github.com/googleapis/genai-toolbox/commit/f0aef29b0c2563e2a00277fbe2784f39f16d2835))
* **tools/spanner:** Add templateParameters field for spanner ([#691](https://github.com/googleapis/genai-toolbox/issues/691)) ([075dfa4](https://github.com/googleapis/genai-toolbox/commit/075dfa47e1fd92be4847bd0aec63296146b66455))
* **tools/sqlitesql:** Add templateParameters field for sqlitesql ([#687](https://github.com/googleapis/genai-toolbox/issues/687)) ([75e254c](https://github.com/googleapis/genai-toolbox/commit/75e254c0a4ce690ca5fa4d1741550ce54734b226))
* **tools/valkey:** Add Valkey Source and Tool ([#532](https://github.com/googleapis/genai-toolbox/issues/532)) ([054ec19](https://github.com/googleapis/genai-toolbox/commit/054ec198b97ba9f36f67dd12b2eff0cc6bc4d080))
### Bug Fixes
* **bigquery,mssql:** Fix panic on tools with array param ([#722](https://github.com/googleapis/genai-toolbox/issues/722)) ([7a6644c](https://github.com/googleapis/genai-toolbox/commit/7a6644cf0c5413e5c803955c88a2cfd0a2233ed3))
* **postgres,mssql,cloudsqlmssql:** Encode source connection url for sources ([#727](https://github.com/googleapis/genai-toolbox/issues/727)) ([67964d9](https://github.com/googleapis/genai-toolbox/commit/67964d939f27320b63b5759f4b3f3fdaa0c76fbf)), closes [#717](https://github.com/googleapis/genai-toolbox/issues/717)
* Set default value to field's type during unmarshalling ([#774](https://github.com/googleapis/genai-toolbox/issues/774)) ([fafed24](https://github.com/googleapis/genai-toolbox/commit/fafed2485839cf1acc1350e8a24103d2e6356ee0)), closes [#771](https://github.com/googleapis/genai-toolbox/issues/771)
* **server/mcp:** Do not listen from port for stdio ([#719](https://github.com/googleapis/genai-toolbox/issues/719)) ([d51dbc7](https://github.com/googleapis/genai-toolbox/commit/d51dbc759ba493021d3ec6f5417fc04c21f7044f)), closes [#711](https://github.com/googleapis/genai-toolbox/issues/711)
* **tools/mysqlexecutesql:** Handle nil panic and connection leak in Invoke ([#757](https://github.com/googleapis/genai-toolbox/issues/757)) ([7badba4](https://github.com/googleapis/genai-toolbox/commit/7badba42eefb34252be77b852a57d6bd78dd267d))
* **tools/mysqlsql:** Handle nil panic and connection leak in invoke ([#758](https://github.com/googleapis/genai-toolbox/issues/758)) ([cbb4a33](https://github.com/googleapis/genai-toolbox/commit/cbb4a333517313744800d148840312e56340f3fd))
## [0.7.0](https://github.com/googleapis/genai-toolbox/compare/v0.6.0...v0.7.0) (2025-06-10)


@@ -14,21 +14,21 @@ race, religion, or sexual identity and orientation.
Examples of behavior that contributes to creating a positive environment
include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or
* The use of sexualized language or imagery and unwelcome sexual attention or
advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Our Responsibilities
@@ -75,7 +75,7 @@ receive and address reported violations of the code of conduct. They will then
work with a committee consisting of representatives from the Open Source
Programs Office and the Google Open Source Strategy team. If for any reason you
are uncomfortable reaching out to the Project Steward, please email
opensource@google.com.
<opensource@google.com>.
We will investigate every complaint, but you may not receive a direct response.
We will use our discretion in determining when and how to follow up on reported
@@ -90,4 +90,4 @@ harassment or threats to anyone's safety, we may take action without notice.
This Code of Conduct is adapted from the Contributor Covenant, version 1.4,
available at
https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
<https://www.contributor-covenant.org/version/1/4/code-of-conduct.html>


@@ -30,4 +30,153 @@ This project follows
All submissions, including submissions by project members, require review. We
use GitHub pull requests for this purpose. Consult
[GitHub Help](https://help.github.com/articles/about-pull-requests/) for more
information on using pull requests.
information on using pull requests.
Within 2-5 days, a reviewer will review your PR. They may approve it, or request
changes. When requesting changes, reviewers should self-assign the PR to ensure
they are aware of any updates.
If additional changes are needed, push additional commits to your PR branch -
this helps the reviewer know which parts of the PR have changed. Commits will be
squashed when merged.
Please follow up with changes promptly. If a PR is awaiting changes by the
author for more than 10 days, maintainers may mark that PR as Draft. PRs that
are inactive for more than 30 days may be closed.
### Adding a New Database Source and Tool
We recommend creating an
[issue](https://github.com/googleapis/genai-toolbox/issues) before
implementation to ensure we can accept the contribution and avoid duplicated work.
If you have any questions, reach out on our
[Discord](https://discord.gg/Dmm69peqjh) to chat directly with the team. New
contributions should be added with both unit tests and integration tests.
#### 1. Implement the New Data Source
We recommend looking at an [example source
implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/sources/postgres/postgres.go);
a simplified sketch of the required pieces follows this list.
* **Create a new directory** under `internal/sources` for your database type
(e.g., `internal/sources/newdb`).
* **Define a configuration struct** for your data source in a file named
`newdb.go`. Create a `Config` struct to include all the necessary parameters
for connecting to the database (e.g., host, port, username, password, database
name) and a `Source` struct to store necessary parameters for tools (e.g.,
Name, Kind, connection object, additional config).
* **Implement the
[`SourceConfig`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/sources/sources.go#L57)
interface**. This interface requires two methods:
* `SourceConfigKind() string`: Returns a unique string identifier for your
data source (e.g., `"newdb"`).
* `Initialize(ctx context.Context, tracer trace.Tracer) (Source, error)`:
Creates a new instance of your data source and establishes a connection to
the database.
* **Implement the
[`Source`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/sources/sources.go#L63)
interface**. This interface requires one method:
* `SourceKind() string`: Returns the same string identifier as `SourceConfigKind()`.
* **Implement `init()`** to register the new Source.
* **Implement Unit Tests** in a file named `newdb_test.go`.
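The sketch below pulls these pieces together for a hypothetical `newdb` source.
It is a minimal, self-contained illustration based only on the method names
quoted above: the local `Source` and `SourceConfig` interfaces and the `Config`
fields are simplified stand-ins, not the repo's actual definitions (the real
`Initialize` also receives a `trace.Tracer`, and registration happens in
`init()`).
```go
package main

import (
	"context"
	"fmt"
)

// Simplified stand-ins for the repo's SourceConfig and Source interfaces; the
// real ones also receive a trace.Tracer in Initialize and are registered in init().
type Source interface {
	SourceKind() string
}

type SourceConfig interface {
	SourceConfigKind() string
	Initialize(ctx context.Context) (Source, error)
}

// Config holds the connection parameters a tools.yaml entry would provide
// (field names here are illustrative).
type Config struct {
	Name     string
	Host     string
	Port     string
	User     string
	Password string
	Database string
}

// SourceConfigKind returns the unique identifier for this source type.
func (c Config) SourceConfigKind() string { return "newdb" }

// Initialize opens the connection and returns the Source that tools will use.
func (c Config) Initialize(ctx context.Context) (Source, error) {
	if c.Host == "" {
		return nil, fmt.Errorf("newdb source %q: host is required", c.Name)
	}
	// A real implementation would open and ping a driver/pool here.
	return &NewDBSource{Name: c.Name}, nil
}

// NewDBSource stores whatever tools need at invocation time (name, pool, options).
type NewDBSource struct {
	Name string
}

// SourceKind returns the same identifier as SourceConfigKind.
func (s *NewDBSource) SourceKind() string { return "newdb" }

func main() {
	var cfg SourceConfig = Config{Name: "my-newdb", Host: "localhost"}
	src, err := cfg.Initialize(context.Background())
	if err != nil {
		panic(err)
	}
	fmt.Println(src.SourceKind()) // prints "newdb"
}
```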
#### 2. Implement the New Tool
We recommend looking at an [example tool
implementation](https://github.com/googleapis/genai-toolbox/tree/main/internal/tools/postgres/postgressql);
a simplified sketch follows this list.
* **Create a new directory** under `internal/tools` for your tool type (e.g.,
`internal/tools/newdb` or `internal/tools/newdb<tool_name>`).
* **Define a configuration struct** for your tool in a file named `newdbtool.go`.
Create a `Config` struct and a `Tool` struct to store necessary parameters for
tools.
* **Implement the
[`ToolConfig`](https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/internal/tools/tools.go#L61)
interface**. This interface requires one method:
* `ToolConfigKind() string`: Returns a unique string identifier for your tool
(e.g., `"newdb"`).
* `Initialize(sources map[string]Source) (Tool, error)`: Creates a new
instance of your tool and validates that it can connect to the specified
data source.
* **Implement the `Tool` interface**. This interface requires the following
methods:
* `Invoke(ctx context.Context, params map[string]any) ([]any, error)`:
Executes the operation on the database using the provided parameters.
* `ParseParams(data map[string]any, claims map[string]map[string]any)
(ParamValues, error)`: Parses and validates the input parameters.
* `Manifest() Manifest`: Returns a manifest describing the tool's capabilities
and parameters.
* `McpManifest() McpManifest`: Returns an MCP manifest describing the tool for
use with the Model Context Protocol.
* `Authorized(services []string) bool`: Checks if the tool is authorized to
run based on the provided authentication services.
* **Implement `init()`** to register the new Tool.
* **Implement Unit Tests** in a file named `newdb_test.go`.
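As with the source, the following is a minimal sketch of the tool side using
simplified local stand-ins rather than the repo's actual interfaces. The
`newdb-sql` kind, the `Config` fields, and the echoing `Invoke` body are
hypothetical; a real tool must also implement `ParseParams`, `Manifest`, and
`McpManifest`, and register itself in `init()`.
```go
package main

import (
	"context"
	"fmt"
)

// Simplified stand-ins for the repo's Tool and ToolConfig interfaces; the real
// Tool also requires ParseParams, Manifest, and McpManifest, omitted here.
type Tool interface {
	Invoke(ctx context.Context, params map[string]any) ([]any, error)
	Authorized(verifiedAuthServices []string) bool
}

type ToolConfig interface {
	ToolConfigKind() string
	Initialize(srcs map[string]any) (Tool, error)
}

// Config mirrors an illustrative tool entry in tools.yaml.
type Config struct {
	Name         string
	Source       string
	Statement    string
	AuthRequired []string
}

// ToolConfigKind returns the unique identifier for this tool type.
func (c Config) ToolConfigKind() string { return "newdb-sql" }

// Initialize validates that the referenced source exists and builds the Tool.
func (c Config) Initialize(srcs map[string]any) (Tool, error) {
	src, ok := srcs[c.Source]
	if !ok {
		return nil, fmt.Errorf("tool %q: source %q not found", c.Name, c.Source)
	}
	return &NewDBTool{cfg: c, source: src}, nil
}

type NewDBTool struct {
	cfg    Config
	source any // in a real tool, the concrete source type with its connection
}

// Invoke executes the configured statement with the given parameters and
// returns one result value per row; here it just echoes its inputs.
func (t *NewDBTool) Invoke(ctx context.Context, params map[string]any) ([]any, error) {
	return []any{map[string]any{"statement": t.cfg.Statement, "params": params}}, nil
}

// Authorized reports whether any verified auth service satisfies authRequired
// (an empty authRequired list means the tool needs no auth).
func (t *NewDBTool) Authorized(verified []string) bool {
	if len(t.cfg.AuthRequired) == 0 {
		return true
	}
	for _, need := range t.cfg.AuthRequired {
		for _, have := range verified {
			if need == have {
				return true
			}
		}
	}
	return false
}

func main() {
	var cfg ToolConfig = Config{Name: "newdb-query", Source: "my-newdb", Statement: "SELECT 1"}
	tool, err := cfg.Initialize(map[string]any{"my-newdb": struct{}{}})
	if err != nil {
		panic(err)
	}
	out, _ := tool.Invoke(context.Background(), map[string]any{"id": 1})
	fmt.Println(out, tool.Authorized(nil))
}
```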
#### 3. Add Integration Tests
* **Add a test file** under a new directory `tests/newdb`.
* **Add pre-defined integration test suites** in the
`/tests/newdb/newdb_test.go` that are **required** to be run as long as your
code contains related features:
1. [RunToolGetTest][tool-get]: tests for the `GET` endpoint that returns the
tool's manifest.
2. [RunToolInvokeTest][tool-call]: tests for tool calling through the native
Toolbox endpoints.
3. [RunMCPToolCallMethod][mcp-call]: tests tool calling through the MCP
endpoints.
4. (Optional) [RunExecuteSqlToolInvokeTest][execute-sql]: tests an
`execute-sql` tool for any source. Only run this test if you are adding an
`execute-sql` tool.
5. (Optional) [RunToolInvokeWithTemplateParameters][temp-param]: tests for [template
parameters][temp-param-doc]. Only run this test if template
parameters apply to your tool.
* **Add the new database to the test config** in
[integration.cloudbuild.yaml](.ci/integration.cloudbuild.yaml).
[tool-get]:
https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L31
[tool-call]:
<https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L79>
[mcp-call]:
https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L554
[execute-sql]:
<https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L431>
[temp-param]:
<https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L297>
[temp-param-doc]:
https://googleapis.github.io/genai-toolbox/resources/tools/#template-parameters
#### 4. Add Documentation
* **Update the documentation** to include information about your new data source
and tool. This includes:
* Adding a new page to the `docs/en/resources/sources` directory for your data
source.
* Adding a new page to the `docs/en/resources/tools` directory for your tool.
* **(Optional) Add samples** to the `docs/en/samples/<newdb>` directory.
#### (Optional) 5. Add Prebuilt Tools
You can provide developers with a set of "build-time" tools to aid common
software development user journeys like viewing and creating tables/collections
and data.
* **Create a set of prebuilt tools** by defining a new `tools.yaml` and adding
it to `internal/tools`. Make sure the file name matches the source (e.g., for
source "alloydb-postgres" create a file named "alloydb-postgres.yaml").
* **Update `cmd/root.go`** to add new source to the `prebuilt` flag.
* **Add tests** in
[internal/prebuiltconfigs/prebuiltconfigs_test.go](internal/prebuiltconfigs/prebuiltconfigs_test.go)
and [cmd/root_test.go](cmd/root_test.go).
#### 6. Submit a Pull Request
* **Submit a pull request** to the repository with your changes. Be sure to
include a detailed description of your changes and any requests for long term
testing resources.


@@ -1,12 +1,16 @@
# DEVELOPER.md
## Before you begin
This document provides instructions for setting up your development environment
and contributing to the Toolbox project.
1. Make sure you've setup your databases.
## Prerequisites
1. Install the latest version of [Go](https://go.dev/doc/install).
Before you begin, ensure you have the following:
1. Locate and download dependencies:
1. **Databases:** Set up the necessary databases for your development
environment.
1. **Go:** Install the latest version of [Go](https://go.dev/doc/install).
1. **Dependencies:** Download and manage project dependencies:
```bash
go get
@@ -15,196 +19,224 @@
## Developing Toolbox
### Run Toolbox from local source
### Running from Local Source
1. Create a `tools.yaml` file with your [sources and tools configurations](./README.md#Configuration).
1. You can specify flags for the Toolbox server. Execute the following to list the possible CLI flags:
1. **Configuration:** Create a `tools.yaml` file to configure your sources and
tools. See the [Configuration section in the
README](./README.md#Configuration) for details.
1. **CLI Flags:** List available command-line flags for the Toolbox server:
```bash
go run . --help
```
1. To run the server, execute the following (with any flags, if applicable):
1. **Running the Server:** Start the Toolbox server with optional flags. The
server listens on port 5000 by default.
```bash
go run .
```
The server will listen on port 5000 (by default).
1. Test endpoint using the following:
1. **Testing the Endpoint:** Verify the server is running by sending a request
to the endpoint:
```bash
curl http://127.0.0.1:5000
```
### Testing
## Testing
#### Writing Tests
### Infrastructure
New contributions should be added with both unit tests and integration tests.
Toolbox uses both GitHub Actions and Cloud Build to run test workflows. Cloud
Build is used when Google credentials are required. Cloud Build uses test
project "toolbox-testing-438616".
- Unit tests are in the same package as the source code.
### Linting
- Integration tests are in the `/tests` directory, under its dedicated package
named after its source. We have several pre-defined integration test suites
in the `/tests/tool.go` that are **required** to be run as long as your code contains related
features:
Run the lint check to ensure code quality:
1. [RunToolGetTest][tool-get]: tests for the `GET` endpoint that returns the
tool's manifest.
2. [RunToolInvokeTest][tool-call]: tests for tool calling through the native
Toolbox endpoints.
```bash
golangci-lint run --fix
```
3. [RunMCPToolCallMethod][mcp-call]: tests tool calling through the MCP
endpoints.
4. (Optional) [RunExecuteSqlToolInvokeTest][execute-sql]: tests an
`execute-sql` tool for any source. Only run this test if you are adding an
`execute-sql` tool.
### Unit Tests
5. (Optional) [RunToolInvokeWithTemplateParameters][temp-param]: tests for [template
parameters][temp-param-doc]. Only run this test if template parameters apply to your tool.
[tool-get]:
https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L31
[tool-call]:
<https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L79>
[mcp-call]:
https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L554
[execute-sql]:
<https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L431>
[temp-param]:
<https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L297>
[temp-param-doc]:
https://googleapis.github.io/genai-toolbox/resources/tools/#template-parameters
Execute unit tests locally:
#### Running Tests
```bash
go test -race -v ./...
```
- Run the lint check:
### Integration Tests
```bash
golangci-lint run --fix
#### Running Locally
1. **Environment Variables:** Set the required environment variables. Refer to
the [Cloud Build testing configuration](./.ci/integration.cloudbuild.yaml)
for a complete list of variables for each source.
* `SERVICE_ACCOUNT_EMAIL`: Use your own GCP email.
* `CLIENT_ID`: Use the Google Cloud SDK application Client ID. Contact
Toolbox maintainers if you don't have it.
1. **Running Tests:** Run the integration test for your target source. Specify
the required Go build tags at the top of each integration test file.
```shell
go test -race -v ./tests/<YOUR_TEST_DIR>
```
- Run unit tests locally:
For example, to run the AlloyDB integration test:
```bash
go test -race -v ./...
```shell
go test -race -v ./tests/alloydbpg
```
- Run integration tests locally:
1. Set required environment variables. For a complete list of required
variables for each source, check out the [Cloud Build testing
configuration](./.ci/integration.cloudbuild.yaml).
- Use your own GCP email as the `SERVICE_ACCOUNT_EMAIL`.
- Use the Google Cloud SDK application Client ID as the `CLIENT_ID`. Ask the
Toolbox maintainers if you don't know it already.
#### Running on Pull Requests
2. Run the integration test for your target source with the required Go
build tags specified at the top of each integration test file:
* **Internal Contributors:** Testing workflows should trigger automatically.
* **External Contributors:** Request Toolbox maintainers to trigger the testing
workflows on your PR.
```shell
go test -race -v ./tests/<YOUR_TEST_DIR>
```
#### Test Resources
For example, to run the AlloyDB integration test, run:
The following databases have been added as test resources. To add a new database
to test against, please contact the Toolbox maintainer team via an issue or PR.
Refer to the [Cloud Build testing
configuration](./.ci/integration.cloudbuild.yaml) for a complete list of
variables for each source.
```shell
go test -race -v ./tests/alloydbpg
```
* AlloyDB - setup in the test project
* AI Natural Language ([setup
instructions](https://cloud.google.com/alloydb/docs/ai/use-natural-language-generate-sql-queries))
has been configured for `alloydb-ai-nl` tool tests
* The Cloud Build service account is a user
* Bigtable - setup in the test project
* The Cloud Build service account is a user
* BigQuery - setup in the test project
* The Cloud Build service account is a user
* Cloud SQL Postgres - setup in the test project
* The Cloud Build service account is a user
* Cloud SQL MySQL - setup in the test project
* The Cloud Build service account is a user
* Cloud SQL SQL Server - setup in the test project
* The Cloud Build service account is a user
* Couchbase - setup in the test project via the Marketplace
* DGraph - using the public dgraph interface <https://play.dgraph.io> for
testing
* Memorystore Redis - setup in the test project using a Memorystore for Redis
standalone instance
* Memorystore Redis Cluster, Memorystore Valkey standalone, and Memorystore
Valkey Cluster instances all require PSC connections, which requires extra
security setup to connect from Cloud Build. Memorystore Redis standalone is
the only one allowing PSA connection.
* The Cloud Build service account is a user
* Memorystore Valkey - setup in the test project using a Memorystore for Redis
standalone instance
* The Cloud Build service account is a user
* MySQL - setup in the test project using a Cloud SQL instance
* Neo4j - setup in the test project on a GCE VM
* Postgres - setup in the test project using an AlloyDB instance
* Spanner - setup in the test project
* The Cloud Build service account is a user
* SQL Server - setup in the test project using a Cloud SQL instance
* SQLite - setup in the integration test, where we create a temporary database
file
- Run integration tests on your PR:
### Other GitHub Checks
For internal contributors, the testing workflows should trigger
automatically. For external contributors, ask the Toolbox
maintainers to trigger the testing workflows on your PR.
* License header check (`.github/header-checker-lint.yml`) - Ensures files have
the appropriate license
* CLA/google - Ensures the developer has signed the CLA:
<https://cla.developers.google.com/>
* conventionalcommits.org - Ensures the commit messages are in the correct
format. This repository uses tool [Release
Please](https://github.com/googleapis/release-please) to create GitHub
releases. It does so by parsing your git history, looking for [Conventional
Commit messages](https://www.conventionalcommits.org/), and creating release
PRs. Learn more by reading [How should I write my
commits?](https://github.com/googleapis/release-please?tab=readme-ov-file#how-should-i-write-my-commits)
## Developing Documentation
Follow these steps to run a Hugo server for local preview:
### Running a Local Hugo Server
1. [Install Hugo](https://gohugo.io/installation/macos/) version 0.146.0+.
1. Move into the `.hugo` directory
Follow these steps to preview documentation changes locally using a Hugo server:
1. **Install Hugo:** Ensure you have
[Hugo](https://gohugo.io/installation/macos/) extended edition version
0.146.0 or later installed.
1. **Navigate to the Hugo Directory:**
```bash
cd .hugo
```
1. Install dependencies
1. **Install Dependencies:**
```bash
npm ci
```
1. Run the server
1. **Start the Server:**
```bash
hugo server
```
### PR documentation preview
### Previewing Documentation on Pull Requests
- For contributors:
#### Contributors
- Ask a repo owner to run the preview deployment workflow on your PR. The preview link
will be commented under your PR automatically.
Request a repo owner to run the preview deployment workflow on your PR. A
preview link will be automatically added as a comment to your PR.
- For maintainers:
#### Maintainers
- Inspect the proposed changes in the PR and ensure that it does not contain
malicious code changes. You should be especially alert to any proposed changes in the
`.github/workflows/` directory that affect workflow files.
- After you make sure the changes are safe, apply the `docs: deploy-preview`
label to the PR to deploy documentation preview.
1. **Inspect Changes:** Review the proposed changes in the PR to ensure they are
safe and do not contain malicious code. Pay close attention to changes in the
`.github/workflows/` directory.
1. **Deploy Preview:** Apply the `docs: deploy-preview` label to the PR to
deploy a documentation preview.
## Compile the app locally
## Building Toolbox
### Compile Toolbox binary
### Building the Binary
1. Run build to compile binary:
1. **Build Command:** Compile the Toolbox binary:
```bash
go build -o toolbox
```
1. You can specify flags for the Toolbox server. Execute the following to list the possible CLI flags:
```bash
./toolbox --help
```
1. To run the binary, execute the following (with any flags, if applicable):
1. **Running the Binary:** Execute the compiled binary with optional flags. The
server listens on port 5000 by default:
```bash
./toolbox
```
The server will listen on port 5000 (by default).
1. Test endpoint using the following:
1. **Testing the Endpoint:** Verify the server is running by sending a request
to the endpoint:
```bash
curl http://127.0.0.1:5000
```
### Compile Toolbox container images
### Building Container Images
1. Run build to compile container image:
1. **Build Command:** Build the Toolbox container image:
```bash
docker build -t toolbox:dev .
```
1. Execute the following to view image:
1. **View Image:** List available Docker images to confirm the build:
```bash
docker images
```
1. Run container image with Docker:
1. **Run Container:** Run the Toolbox container image using Docker:
```bash
docker run -d toolbox:dev
@@ -212,56 +244,118 @@ Follow these steps to run a Hugo server for local preview:
## Developing Toolbox SDKs
Please refer to the [SDK developer guide](https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/DEVELOPER.md)
Refer to the [SDK developer
guide](https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/DEVELOPER.md)
for instructions on developing Toolbox SDKs.
## (Optional) Maintainer Information
## Maintainer Information
### Team
Team, `@googleapis/senseai-eco`, has been set as
[CODEOWNERS](.github/CODEOWNERS). The GitHub TeamSync tool is used to create
this team from MDB Group, `senseai-eco`.
### Releasing
There are two types of release for Toolbox, including a versioned release and continuous release.
Toolbox has two types of releases: versioned and continuous. It uses Google
Cloud project, `database-toolbox`.
- Versioned release: Official supported distributions with the `latest` tag. The release process for versioned release is in [versioned.release.cloudbuild.yaml](https://github.com/googleapis/genai-toolbox/blob/main/versioned.release.cloudbuild.yaml).
- Continuous release: Used for early testing features between official supported releases and end-to-end testings.
* **Versioned Release:** Official, supported distributions tagged as `latest`.
The release process is defined in
[versioned.release.cloudbuild.yaml](.ci/versioned.release.cloudbuild.yaml).
* **Continuous Release:** Used for early testing of features between official
releases and for end-to-end testing. The release process is defined in
[continuous.release.cloudbuild.yaml](.ci/continuous.release.cloudbuild.yaml).
* **GitHub Release:** `.github/release-please.yml` automatically creates GitHub
Releases and release PRs.
#### Supported OS and Architecture binaries
### How-to Release a new Version
The following OS and computer architecture is supported within the binary releases.
1. [Optional] If you want to override the version number, send a
[PR](https://github.com/googleapis/genai-toolbox/pull/31) to trigger
[release-please](https://github.com/googleapis/release-please?tab=readme-ov-file#how-do-i-change-the-version-number).
You can generate a commit with the following line: `git commit -m "chore:
release 0.1.0" -m "Release-As: 0.1.0" --allow-empty`
1. [Optional] If you want to edit the changelog, send commits to the release PR
1. Approve and merge the PR with the title “[chore(main): release
x.x.x](https://github.com/googleapis/genai-toolbox/pull/16)”
1. The
[trigger](https://pantheon.corp.google.com/cloud-build/triggers;region=us-central1/edit/27bd0d21-264a-4446-b2d7-0df4e9915fb3?e=13802955&inv=1&invt=AbhU8A&mods=logs_tg_staging&project=database-toolbox)
should automatically run when a new tag is pushed. You can view [triggered
builds here to check the
status](https://pantheon.corp.google.com/cloud-build/builds;region=us-central1?query=trigger_id%3D%2227bd0d21-264a-4446-b2d7-0df4e9915fb3%22&e=13802955&inv=1&invt=AbhU8A&mods=logs_tg_staging&project=database-toolbox)
1. Update the Github release notes to include the following table:
1. Run the following command (from the root directory):
- linux/amd64
- darwin/arm64
- darwin/amd64
- windows/amd64
```
export VERSION="v0.0.0"
.ci/generate_release_table.sh
```
#### Supported container images
1. Copy the table output
1. In the GitHub UI, navigate to Releases and click the `edit` button.
1. Paste the table at the bottom of release note and click `Update release`.
1. Post release in internal chat and on Discord.
The following base container images is supported within the container image releases.
#### Supported Binaries
- distroless
The following operating systems and architectures are supported for binary
releases:
### Automated tests
* linux/amd64
* darwin/arm64
* darwin/amd64
* windows/amd64
Integration and unit tests are automatically triggered via CloudBuild during each PR creation.
#### Supported Container Images
The following base container images are supported for container image releases:
* distroless
### Automated Tests
Integration and unit tests are automatically triggered via Cloud Build on each
pull request. Integration tests run on merge and nightly.
#### Failure notifications
On-merge and nightly tests that fail have notification setup via Cloud Build
Failure Reporter [GitHub Actions
Workflow](.github/workflows/schedule_reporter.yml).
#### Trigger Setup
Create a Cloud Build trigger via the UI or `gcloud` with the following specs:
Configure a Cloud Build trigger using the UI or `gcloud` with the following
settings:
- Event: Pull request
- Region:
- global - for default worker pools
- Source:
- Generation: 1st gen
- Repo: googleapis/genai-toolbox (GitHub App)
- Base branch: `^main$`
- Comment control: Required except for owners and collaborators
- Filters: add directory filter
- Config: Cloud Build configuration file
- Location: Repository (add path to file)
- Service account: set for demo service to enable ID token creation to use to authenticated services
* **Event:** Pull request
* **Region:** global (for default worker pools)
* **Source:**
* Generation: 1st gen
* Repo: googleapis/genai-toolbox (GitHub App)
* Base branch: `^main$`
* **Comment control:** Required except for owners and collaborators
* **Filters:** Add directory filter
* **Config:** Cloud Build configuration file
* Location: Repository (add path to file)
* **Service account:** Set for demo service to enable ID token creation for
authenticated services
### Trigger
### Triggering Tests
Trigger the PR tests on PRs from external contributors:
Trigger pull request tests for external contributors by:
- Cloud Build tests: comment `/gcbrun`
- Unit tests: add `tests:run` label
* **Cloud Build tests:** Comment `/gcbrun`
* **Unit tests:** Add the `tests:run` label
## Repo Setup & Automation
* .github/blunderbuss.yml - Auto-assign issues and PRs from GitHub teams
* .github/renovate.json5 - Tooling for dependency updates. Dependabot is built
into the GitHub repo for GitHub security warnings
* go/github-issue-mirror - GitHub issues are automatically mirrored into buganizer
* (Suspended) .github/sync-repo-settings.yaml - configure repo settings
* .github/release-please.yml - Creates GitHub releases
* .github/ISSUE_TEMPLATE - templates for GitHub issues

README.md

@@ -2,7 +2,9 @@
# MCP Toolbox for Databases
[![Discord](https://img.shields.io/badge/Discord-%235865F2.svg?style=for-the-badge&logo=discord&logoColor=white)](https://discord.gg/Dmm69peqjh)
[![Docs](https://img.shields.io/badge/docs-MCP_Toolbox-blue)](https://googleapis.github.io/genai-toolbox/)
[![Discord](https://img.shields.io/badge/Discord-%235865F2.svg?style=flat&logo=discord&logoColor=white)](https://discord.gg/Dmm69peqjh)
[![Medium](https://img.shields.io/badge/Medium-12100E?style=flat&logo=medium&logoColor=white)](https://medium.com/@mcp_toolbox)
[![Go Report Card](https://goreportcard.com/badge/github.com/googleapis/genai-toolbox)](https://goreportcard.com/report/github.com/googleapis/genai-toolbox)
> [!NOTE]
@@ -16,7 +18,6 @@ such as connection pooling, authentication, and more.
This README provides a brief overview. For comprehensive details, see the [full
documentation](https://googleapis.github.io/genai-toolbox/).
> [!NOTE]
> This solution was originally named “Gen AI Toolbox for Databases” as
> its initial development predated MCP, but was renamed to align with recently
@@ -30,23 +31,24 @@ documentation](https://googleapis.github.io/genai-toolbox/).
- [Why Toolbox?](#why-toolbox)
- [General Architecture](#general-architecture)
- [Getting Started](#getting-started)
- [Installing the server](#installing-the-server)
- [Running the server](#running-the-server)
- [Integrating your application](#integrating-your-application)
- [Installing the server](#installing-the-server)
- [Running the server](#running-the-server)
- [Integrating your application](#integrating-your-application)
- [Configuration](#configuration)
- [Sources](#sources)
- [Tools](#tools)
- [Toolsets](#toolsets)
- [Sources](#sources)
- [Tools](#tools)
- [Toolsets](#toolsets)
- [Versioning](#versioning)
- [Contributing](#contributing)
- [Community](#community)
<!-- /TOC -->
## Why Toolbox?
Toolbox helps you build Gen AI tools that let your agents access data in your
database. Toolbox provides:
- **Simplified development**: Integrate tools to your agent in less than 10
lines of code, reuse tools between multiple agents or frameworks, and deploy
new versions of tools more easily.
@@ -56,17 +58,29 @@ database. Toolbox provides:
- **End-to-end observability**: Out of the box metrics and tracing with built-in
support for OpenTelemetry.
**⚡ Supercharge Your Workflow with an AI Database Assistant ⚡**
Stop context-switching and let your AI assistant become a true co-developer. By [connecting your IDE to your databases with MCP Toolbox][connect-ide], you can delegate complex and time-consuming database tasks, allowing you to build faster and focus on what matters. This isn't just about code completion; it's about giving your AI the context it needs to handle the entire development lifecycle.
Stop context-switching and let your AI assistant become a true co-developer. By
[connecting your IDE to your databases with MCP Toolbox][connect-ide], you can
delegate complex and time-consuming database tasks, allowing you to build faster
and focus on what matters. This isn't just about code completion; it's about
giving your AI the context it needs to handle the entire development lifecycle.
Here's how it will save you time:
* **Query in Plain English**: Interact with your data using natural language right from your IDE. Ask complex questions like, *"How many orders were delivered in 2024, and what items were in them?"* without writing any SQL.
* **Automate Database Management**: Simply describe your data needs, and let the AI assistant manage your database for you. It can handle generating queries, creating tables, adding indexes, and more.
* **Generate Context-Aware Code**: Empower your AI assistant to generate application code and tests with a deep understanding of your real-time database schema. This accelerates the development cycle by ensuring the generated code is directly usable.
* **Slash Development Overhead**: Radically reduce the time spent on manual setup and boilerplate. MCP Toolbox helps streamline lengthy database configurations, repetitive code, and error-prone schema migrations.
- **Query in Plain English**: Interact with your data using natural language
right from your IDE. Ask complex questions like, *"How many orders were
delivered in 2024, and what items were in them?"* without writing any SQL.
- **Automate Database Management**: Simply describe your data needs, and let the
AI assistant manage your database for you. It can handle generating queries,
creating tables, adding indexes, and more.
- **Generate Context-Aware Code**: Empower your AI assistant to generate
application code and tests with a deep understanding of your real-time
database schema. This accelerates the development cycle by ensuring the
generated code is directly usable.
- **Slash Development Overhead**: Radically reduce the time spent on manual
setup and boilerplate. MCP Toolbox helps streamline lengthy database
configurations, repetitive code, and error-prone schema migrations.
Learn [how to connect your AI tools (IDEs) to Toolbox using MCP][connect-ide].
@@ -86,6 +100,7 @@ redeploying your application.
## Getting Started
### Installing the server
For the latest version, check the [releases page][releases] and use the
following instructions for your OS and CPU architecture.
@@ -99,7 +114,7 @@ To install Toolbox as a binary:
<!-- {x-release-please-start-version} -->
```sh
# see releases page for other versions
export VERSION=0.7.0
export VERSION=0.10.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```
@@ -112,12 +127,23 @@ You can also install Toolbox as a container:
```sh
# see releases page for other versions
export VERSION=0.7.0
export VERSION=0.10.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```
</details>
<details>
<summary>Homebrew</summary>
To install Toolbox using Homebrew on macOS or Linux:
```sh
brew install mcp-toolbox
```
</details>
<details>
<summary>Compile from source</summary>
@@ -125,7 +151,7 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:
```sh
go install github.com/googleapis/genai-toolbox@v0.7.0
go install github.com/googleapis/genai-toolbox@v0.10.0
```
<!-- {x-release-please-end} -->
@@ -140,6 +166,18 @@ execute `toolbox` to start the server:
./toolbox --tools-file "tools.yaml"
```
> [!NOTE]
> Toolbox enables dynamic reloading by default. To disable, use the
> `--disable-reload` flag.
#### Homebrew Users
If you installed Toolbox using Homebrew, the `toolbox` binary is available in your system path. You can start the server with the same command:
```sh
toolbox --tools-file "tools.yaml"
```
You can use `toolbox help` for a full list of flags! To stop the server, send a
terminate signal (`ctrl+c` on most platforms).
@@ -153,13 +191,21 @@ Once your server is up and running, you can load the tools into your
application. See the list of Client SDKs below for using various frameworks:
<details open>
<summary>Core</summary>
<summary>Python (<a href="https://github.com/googleapis/mcp-toolbox-sdk-python">Github</a>)</summary>
<br>
<blockquote>
<details open>
<summary>Core</summary>
1. Install [Toolbox Core SDK][toolbox-core]:
```bash
pip install toolbox-core
```
1. Load tools:
```python
from toolbox_core import ToolboxClient
@@ -176,15 +222,18 @@ For more detailed instructions on using the Toolbox Core SDK, see the
[toolbox-core]: https://pypi.org/project/toolbox-core/
[toolbox-core-readme]: https://github.com/googleapis/mcp-toolbox-sdk-python/tree/main/packages/toolbox-core/README.md
</details>
<details>
<summary>LangChain / LangGraph</summary>
</details>
<details>
<summary>LangChain / LangGraph</summary>
1. Install [Toolbox LangChain SDK][toolbox-langchain]:
```bash
pip install toolbox-langchain
```
1. Load tools:
```python
from toolbox_langchain import ToolboxClient
@@ -195,22 +244,24 @@ For more detailed instructions on using the Toolbox Core SDK, see the
tools = client.load_toolset()
```
For more detailed instructions on using the Toolbox LangChain SDK, see the
[project's README][toolbox-langchain-readme].
[toolbox-langchain]: https://pypi.org/project/toolbox-langchain/
[toolbox-langchain-readme]: https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-langchain/README.md
</details>
<details>
<summary>LlamaIndex</summary>
</details>
<details>
<summary>LlamaIndex</summary>
1. Install [Toolbox Llamaindex SDK][toolbox-llamaindex]:
```bash
pip install toolbox-llamaindex
```
1. Load tools:
```python
from toolbox_llamaindex import ToolboxClient
@@ -221,12 +272,378 @@ For more detailed instructions on using the Toolbox LangChain SDK, see the
tools = client.load_toolset()
```
For more detailed instructions on using the Toolbox Llamaindex SDK, see the
[project's README][toolbox-llamaindex-readme].
[toolbox-llamaindex]: https://pypi.org/project/toolbox-llamaindex/
[toolbox-llamaindex-readme]: https://github.com/googleapis/genai-toolbox-llamaindex-python/blob/main/README.md
</details>
</details>
</blockquote>
<details>
<summary>Javascript/Typescript (<a href="https://github.com/googleapis/mcp-toolbox-sdk-js">Github</a>)</summary>
<br>
<blockquote>
<details open>
<summary>Core</summary>
1. Install [Toolbox Core SDK][toolbox-core-js]:
```bash
npm install @toolbox-sdk/core
```
1. Load tools:
```javascript
import { ToolboxClient } from '@toolbox-sdk/core';
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const tools = await client.loadToolset('toolsetName');
```
For more detailed instructions on using the Toolbox Core SDK, see the
[project's README][toolbox-core-js-readme].
[toolbox-core-js]: https://www.npmjs.com/package/@toolbox-sdk/core
[toolbox-core-js-readme]: https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md
</details>
<details>
<summary>LangChain / LangGraph</summary>
1. Install [Toolbox Core SDK][toolbox-core-js]:
```bash
npm install @toolbox-sdk/core
```
2. Load tools:
```javascript
import { tool } from '@langchain/core/tools';
import { ToolboxClient } from '@toolbox-sdk/core';
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool(toolboxTool, {
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
schema: toolboxTool.getParamSchema()
});
// Use these tools in your LangChain/LangGraph applications
const tools = toolboxTools.map(getTool);
```
</details>
<details>
<summary>Genkit</summary>
1. Install [Toolbox Core SDK][toolbox-core-js]:
```bash
npm install @toolbox-sdk/core
```
2. Load tools:
```javascript
import { ToolboxClient } from '@toolbox-sdk/core';
import { genkit } from 'genkit';
import { googleAI } from '@genkit-ai/googleai';
// Initialise genkit
const ai = genkit({
plugins: [
googleAI({
apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
})
],
model: googleAI.model('gemini-2.0-flash'),
});
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => ai.defineTool({
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
schema: toolboxTool.getParamSchema()
}, toolboxTool)
// Use these tools in your Genkit applications
const tools = toolboxTools.map(getTool);
```
</details>
</details>
</blockquote>
<details>
<summary>Go (<a href="https://github.com/googleapis/mcp-toolbox-sdk-go">Github</a>)</summary>
<br>
<blockquote>
<details open>
<summary>Core</summary>
1. Install [Toolbox Go SDK][toolbox-go]:
```bash
go get github.com/googleapis/mcp-toolbox-sdk-go
```
1. Load tools:
```go
package main
import (
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"context"
)
func main() {
// Make sure to add the error checks
// update the url to point to your server
URL := "http://127.0.0.1:5000";
ctx := context.Background()
client, err := core.NewToolboxClient(URL)
// Framework agnostic tools
tools, err := client.LoadToolset("toolsetName", ctx)
}
```
For more detailed instructions on using the Toolbox Go SDK, see the
[project's README][toolbox-core-go-readme].
[toolbox-go]: https://pkg.go.dev/github.com/googleapis/mcp-toolbox-sdk-go/core
[toolbox-core-go-readme]: https://github.com/googleapis/mcp-toolbox-sdk-go/blob/main/core/README.md
</details>
<details>
<summary>LangChain Go</summary>
1. Install [Toolbox Go SDK][toolbox-go]:
```bash
go get github.com/googleapis/mcp-toolbox-sdk-go
```
2. Load tools:
```go
package main
import (
"context"
"encoding/json"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/tmc/langchaingo/llms"
)
func main() {
// Make sure to add the error checks
// update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
client, err := core.NewToolboxClient(URL)
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
// Fetch the tool's input schema
inputschema, err := tool.InputSchema()
var paramsSchema map[string]any
_ = json.Unmarshal(inputschema, &paramsSchema)
// Use this tool with LangChainGo
langChainTool := llms.Tool{
Type: "function",
Function: &llms.FunctionDefinition{
Name: tool.Name(),
Description: tool.Description(),
Parameters: paramsSchema,
},
}
}
```
</details>
<details>
<summary>Genkit</summary>
1. Install [Toolbox Go SDK][toolbox-go]:
```bash
go get github.com/googleapis/mcp-toolbox-sdk-go
```
2. Load tools:
```go
package main
import (
"context"
"encoding/json"
"github.com/firebase/genkit/go/ai"
"github.com/firebase/genkit/go/genkit"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
"github.com/invopop/jsonschema"
)
func main() {
// Make sure to add the error checks
// Update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
g, err := genkit.Init(ctx)
client, err := core.NewToolboxClient(URL)
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
// Convert the tool using the tbgenkit package
// Use this tool with Genkit Go
genkitTool, err := tbgenkit.ToGenkitTool(tool, g)
if err != nil {
log.Fatalf("Failed to convert tool: %v\n", err)
}
}
```
</details>
<details>
<summary>Go GenAI</summary>
1. Install [Toolbox Go SDK][toolbox-go]:
```bash
go get github.com/googleapis/mcp-toolbox-sdk-go
```
2. Load tools:
```go
package main
import (
"context"
"encoding/json"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"google.golang.org/genai"
)
func main() {
// Make sure to add the error checks
// Update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
client, err := core.NewToolboxClient(URL)
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
// Fetch the tool's input schema
inputschema, err := tool.InputSchema()
var schema *genai.Schema
_ = json.Unmarshal(inputschema, &schema)
funcDeclaration := &genai.FunctionDeclaration{
Name: tool.Name(),
Description: tool.Description(),
Parameters: schema,
}
// Use this tool with Go GenAI
genAITool := &genai.Tool{
FunctionDeclarations: []*genai.FunctionDeclaration{funcDeclaration},
}
}
```
</details>
<details>
<summary>OpenAI Go</summary>
1. Install [Toolbox Go SDK][toolbox-go]:
```bash
go get github.com/googleapis/mcp-toolbox-sdk-go
```
2. Load tools:
```go
package main
import (
"context"
"encoding/json"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
openai "github.com/openai/openai-go"
)
func main() {
// Make sure to add the error checks
// Update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
client, err := core.NewToolboxClient(URL)
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
// Fetch the tool's input schema
inputschema, err := tool.InputSchema()
var paramsSchema openai.FunctionParameters
_ = json.Unmarshal(inputschema, &paramsSchema)
// Use this tool with OpenAI Go
openAITool := openai.ChatCompletionToolParam{
Function: openai.FunctionDefinitionParam{
Name: tool.Name(),
Description: openai.String(tool.Description()),
Parameters: paramsSchema,
},
}
}
```
</details>
</details>
</blockquote>
</details>
## Configuration
@@ -237,6 +654,7 @@ tools.yaml` flag.
You can find more detailed reference documentation to all resource types in the
[Resources](https://googleapis.github.io/genai-toolbox/resources/).
### Sources
The `sources` section of your `tools.yaml` defines what data sources your
@@ -278,7 +696,6 @@ tools:
For more details on configuring different types of tools, see the
[Tools](https://googleapis.github.io/genai-toolbox/resources/tools).
### Toolsets
The `toolsets` section of your `tools.yaml` allows you to define groups of tools
@@ -325,3 +742,7 @@ to get started.
Please note that this project is released with a Contributor Code of Conduct.
By participating in this project you agree to abide by its terms. See
[Contributor Code of Conduct](CODE_OF_CONDUCT.md) for more information.
## Community
Join our [discord community](https://discord.gg/GQrFB3Ec3W) to connect with our developers!

cmd/BUILD (new file)

@@ -0,0 +1,20 @@
load("//tools/build_defs/go:go_library.bzl", "go_library")
load("//tools/build_defs/go:go_test.bzl", "go_test")
go_library(
name = "cmd",
srcs = [
"options.go",
"root.go",
],
embedsrcs = ["version.txt"],
)
go_test(
name = "cmd_test",
srcs = [
"options_test.go",
"root_test.go",
],
library = ":cmd",
)

View File

@@ -19,44 +19,83 @@ import (
_ "embed"
"fmt"
"io"
"maps"
"os"
"os/signal"
"path/filepath"
"regexp"
"runtime"
"slices"
"strings"
"syscall"
"time"
"github.com/fsnotify/fsnotify"
yaml "github.com/goccy/go-yaml"
"github.com/googleapis/genai-toolbox/internal/auth"
"github.com/googleapis/genai-toolbox/internal/log"
"github.com/googleapis/genai-toolbox/internal/prebuiltconfigs"
"github.com/googleapis/genai-toolbox/internal/server"
"github.com/googleapis/genai-toolbox/internal/sources"
"github.com/googleapis/genai-toolbox/internal/telemetry"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/util"
// Import tool packages for side effect of registration
_ "github.com/googleapis/genai-toolbox/internal/tools/alloydbainl"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigqueryexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquerygetdatasetinfo"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquerygettableinfo"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquerylistdatasetids"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquerylisttableids"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigqueryexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerygetdatasetinfo"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerygettableinfo"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerylistdatasetids"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerylisttableids"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerysql"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigtable"
_ "github.com/googleapis/genai-toolbox/internal/tools/couchbase"
_ "github.com/googleapis/genai-toolbox/internal/tools/dataplex/dataplexsearchentries"
_ "github.com/googleapis/genai-toolbox/internal/tools/dgraph"
_ "github.com/googleapis/genai-toolbox/internal/tools/duckdbsql"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoredeletedocuments"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoregetdocuments"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoregetrules"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorelistcollections"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorequerycollection"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorevalidaterules"
_ "github.com/googleapis/genai-toolbox/internal/tools/http"
_ "github.com/googleapis/genai-toolbox/internal/tools/mssqlexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/mssqlsql"
_ "github.com/googleapis/genai-toolbox/internal/tools/mysqlexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/mysqlsql"
_ "github.com/googleapis/genai-toolbox/internal/tools/neo4j"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgresexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgressql"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetdimensions"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetexplores"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetfilters"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetlooks"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetmeasures"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetmodels"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetparameters"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerquery"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerquerysql"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerqueryurl"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerrunlook"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbaggregate"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbdeletemany"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbdeleteone"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbfind"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbfindone"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbinsertmany"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbinsertone"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbupdatemany"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbupdateone"
_ "github.com/googleapis/genai-toolbox/internal/tools/mssql/mssqlexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/mssql/mssqlsql"
_ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlsql"
_ "github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jcypher"
_ "github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jexecutecypher"
_ "github.com/googleapis/genai-toolbox/internal/tools/neo4j/neo4jschema"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgresexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgressql"
_ "github.com/googleapis/genai-toolbox/internal/tools/redis"
_ "github.com/googleapis/genai-toolbox/internal/tools/spanner"
_ "github.com/googleapis/genai-toolbox/internal/tools/spannerexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannerexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannersql"
_ "github.com/googleapis/genai-toolbox/internal/tools/sqlitesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/utility/alloydbwaitforoperation"
_ "github.com/googleapis/genai-toolbox/internal/tools/utility/wait"
_ "github.com/googleapis/genai-toolbox/internal/tools/valkey"
"github.com/spf13/cobra"
@@ -68,8 +107,13 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlmysql"
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlpg"
_ "github.com/googleapis/genai-toolbox/internal/sources/couchbase"
_ "github.com/googleapis/genai-toolbox/internal/sources/dataplex"
_ "github.com/googleapis/genai-toolbox/internal/sources/dgraph"
_ "github.com/googleapis/genai-toolbox/internal/sources/duckdb"
_ "github.com/googleapis/genai-toolbox/internal/sources/firestore"
_ "github.com/googleapis/genai-toolbox/internal/sources/http"
_ "github.com/googleapis/genai-toolbox/internal/sources/looker"
_ "github.com/googleapis/genai-toolbox/internal/sources/mongodb"
_ "github.com/googleapis/genai-toolbox/internal/sources/mssql"
_ "github.com/googleapis/genai-toolbox/internal/sources/mysql"
_ "github.com/googleapis/genai-toolbox/internal/sources/neo4j"
@@ -122,6 +166,8 @@ type Command struct {
cfg server.ServerConfig
logger log.Logger
tools_file string
tools_files []string
tools_folder string
prebuiltConfig string
inStream io.Reader
outStream io.Writer
@@ -165,14 +211,17 @@ func NewCommand(opts ...Option) *Command {
flags.StringVar(&cmd.tools_file, "tools_file", "", "File path specifying the tool configuration. Cannot be used with --prebuilt.")
// deprecate tools_file
_ = flags.MarkDeprecated("tools_file", "please use --tools-file instead")
flags.StringVar(&cmd.tools_file, "tools-file", "", "File path specifying the tool configuration. Cannot be used with --prebuilt.")
flags.StringVar(&cmd.tools_file, "tools-file", "", "File path specifying the tool configuration. Cannot be used with --prebuilt, --tools-files, or --tools-folder.")
flags.StringSliceVar(&cmd.tools_files, "tools-files", []string{}, "Multiple file paths specifying tool configurations. Files will be merged. Cannot be used with --prebuilt, --tools-file, or --tools-folder.")
flags.StringVar(&cmd.tools_folder, "tools-folder", "", "Directory path containing YAML tool configuration files. All .yaml and .yml files in the directory will be loaded and merged. Cannot be used with --prebuilt, --tools-file, or --tools-files.")
flags.Var(&cmd.cfg.LogLevel, "log-level", "Specify the minimum level logged. Allowed: 'DEBUG', 'INFO', 'WARN', 'ERROR'.")
flags.Var(&cmd.cfg.LoggingFormat, "logging-format", "Specify logging format to use. Allowed: 'standard' or 'JSON'.")
flags.BoolVar(&cmd.cfg.TelemetryGCP, "telemetry-gcp", false, "Enable exporting directly to Google Cloud Monitoring.")
flags.StringVar(&cmd.cfg.TelemetryOTLP, "telemetry-otlp", "", "Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. 'http://127.0.0.1:4318')")
flags.StringVar(&cmd.cfg.TelemetryServiceName, "telemetry-service-name", "toolbox", "Sets the value of the service.name resource attribute for telemetry data.")
flags.StringVar(&cmd.prebuiltConfig, "prebuilt", "", "Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. Allowed: 'alloydb-postgres', 'bigquery', 'cloud-sql-mysql', 'cloud-sql-postgres', 'cloud-sql-mssql', 'postgres', 'spanner', 'spanner-postgres'.")
flags.StringVar(&cmd.prebuiltConfig, "prebuilt", "", "Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. Allowed: 'alloydb-postgres-admin', 'alloydb-postgres', 'bigquery', 'cloud-sql-mysql', 'cloud-sql-postgres', 'cloud-sql-mssql', 'dataplex', 'firestore', 'looker', 'mssql', 'mysql', 'postgres', 'spanner', 'spanner-postgres'.")
flags.BoolVar(&cmd.cfg.Stdio, "stdio", false, "Listens via MCP STDIO instead of acting as a remote HTTP server.")
flags.BoolVar(&cmd.cfg.DisableReload, "disable-reload", false, "Disables dynamic reloading of tools file.")
// wrap RunE command so that we have access to original Command object
cmd.RunE = func(*cobra.Command, []string) error { return run(cmd) }
@@ -221,6 +270,308 @@ func parseToolsFile(ctx context.Context, raw []byte) (ToolsFile, error) {
return toolsFile, nil
}
// mergeToolsFiles merges multiple ToolsFile structs into one.
// Detects and raises errors for resource conflicts in sources, authServices, tools, and toolsets.
// All resource names (sources, authServices, tools, toolsets) must be unique across all files.
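//
// For example (illustrative tool name), merging two files that both define a
// tool named "search-hotels" fails with an error like:
//
//	resource conflicts detected:
//	 - tool 'search-hotels' (file #2)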
func mergeToolsFiles(files ...ToolsFile) (ToolsFile, error) {
merged := ToolsFile{
Sources: make(server.SourceConfigs),
AuthServices: make(server.AuthServiceConfigs),
Tools: make(server.ToolConfigs),
Toolsets: make(server.ToolsetConfigs),
}
var conflicts []string
for fileIndex, file := range files {
// Check for conflicts and merge sources
for name, source := range file.Sources {
if _, exists := merged.Sources[name]; exists {
conflicts = append(conflicts, fmt.Sprintf("source '%s' (file #%d)", name, fileIndex+1))
} else {
merged.Sources[name] = source
}
}
// Check for conflicts and merge authSources (deprecated, but still supported)
for name, authSource := range file.AuthSources {
if _, exists := merged.AuthSources[name]; exists {
conflicts = append(conflicts, fmt.Sprintf("authSource '%s' (file #%d)", name, fileIndex+1))
} else {
merged.AuthSources[name] = authSource
}
}
// Check for conflicts and merge authServices
for name, authService := range file.AuthServices {
if _, exists := merged.AuthServices[name]; exists {
conflicts = append(conflicts, fmt.Sprintf("authService '%s' (file #%d)", name, fileIndex+1))
} else {
merged.AuthServices[name] = authService
}
}
// Check for conflicts and merge tools
for name, tool := range file.Tools {
if _, exists := merged.Tools[name]; exists {
conflicts = append(conflicts, fmt.Sprintf("tool '%s' (file #%d)", name, fileIndex+1))
} else {
merged.Tools[name] = tool
}
}
// Check for conflicts and merge toolsets
for name, toolset := range file.Toolsets {
if _, exists := merged.Toolsets[name]; exists {
conflicts = append(conflicts, fmt.Sprintf("toolset '%s' (file #%d)", name, fileIndex+1))
} else {
merged.Toolsets[name] = toolset
}
}
}
// If conflicts were detected, return an error
if len(conflicts) > 0 {
return ToolsFile{}, fmt.Errorf("resource conflicts detected:\n - %s\n\nPlease ensure each source, authService, tool, and toolset has a unique name across all files", strings.Join(conflicts, "\n - "))
}
return merged, nil
}
// loadAndMergeToolsFiles loads multiple YAML files and merges them
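// Files are parsed in the order given; any read or parse error, or any naming
// conflict across files, aborts the whole load.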
func loadAndMergeToolsFiles(ctx context.Context, filePaths []string) (ToolsFile, error) {
var toolsFiles []ToolsFile
for _, filePath := range filePaths {
buf, err := os.ReadFile(filePath)
if err != nil {
return ToolsFile{}, fmt.Errorf("unable to read tool file at %q: %w", filePath, err)
}
toolsFile, err := parseToolsFile(ctx, buf)
if err != nil {
return ToolsFile{}, fmt.Errorf("unable to parse tool file at %q: %w", filePath, err)
}
toolsFiles = append(toolsFiles, toolsFile)
}
mergedFile, err := mergeToolsFiles(toolsFiles...)
if err != nil {
return ToolsFile{}, fmt.Errorf("unable to merge tools files: %w", err)
}
return mergedFile, nil
}
// loadAndMergeToolsFolder loads all YAML files from a directory and merges them
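// Only .yaml and .yml files directly inside the folder are considered;
// subdirectories are not scanned.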
func loadAndMergeToolsFolder(ctx context.Context, folderPath string) (ToolsFile, error) {
// Check if directory exists
info, err := os.Stat(folderPath)
if err != nil {
return ToolsFile{}, fmt.Errorf("unable to access tools folder at %q: %w", folderPath, err)
}
if !info.IsDir() {
return ToolsFile{}, fmt.Errorf("path %q is not a directory", folderPath)
}
// Find all YAML files in the directory
pattern := filepath.Join(folderPath, "*.yaml")
yamlFiles, err := filepath.Glob(pattern)
if err != nil {
return ToolsFile{}, fmt.Errorf("error finding YAML files in %q: %w", folderPath, err)
}
// Also find .yml files
ymlPattern := filepath.Join(folderPath, "*.yml")
ymlFiles, err := filepath.Glob(ymlPattern)
if err != nil {
return ToolsFile{}, fmt.Errorf("error finding YML files in %q: %w", folderPath, err)
}
// Combine both file lists
allFiles := append(yamlFiles, ymlFiles...)
if len(allFiles) == 0 {
return ToolsFile{}, fmt.Errorf("no YAML files found in directory %q", folderPath)
}
// Use existing loadAndMergeToolsFiles function
return loadAndMergeToolsFiles(ctx, allFiles)
}
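// handleDynamicReload validates a reloaded tools configuration and, if it
// initializes cleanly, swaps the server's resources in place.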
func handleDynamicReload(ctx context.Context, toolsFile ToolsFile, s *server.Server) error {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
panic(err)
}
sourcesMap, authServicesMap, toolsMap, toolsetsMap, err := validateReloadEdits(ctx, toolsFile)
if err != nil {
errMsg := fmt.Errorf("unable to validate reloaded edits: %w", err)
logger.WarnContext(ctx, errMsg.Error())
return err
}
s.ResourceMgr.SetResources(sourcesMap, authServicesMap, toolsMap, toolsetsMap)
return nil
}
// validateReloadEdits checks that the reloaded tools file configs can be initialized without failing
func validateReloadEdits(
ctx context.Context, toolsFile ToolsFile,
) (map[string]sources.Source, map[string]auth.AuthService, map[string]tools.Tool, map[string]tools.Toolset, error,
) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
panic(err)
}
instrumentation, err := util.InstrumentationFromContext(ctx)
if err != nil {
panic(err)
}
logger.DebugContext(ctx, "Attempting to parse and validate reloaded tools file.")
ctx, span := instrumentation.Tracer.Start(ctx, "toolbox/server/reload")
defer span.End()
reloadedConfig := server.ServerConfig{
Version: versionString,
SourceConfigs: toolsFile.Sources,
AuthServiceConfigs: toolsFile.AuthServices,
ToolConfigs: toolsFile.Tools,
ToolsetConfigs: toolsFile.Toolsets,
}
sourcesMap, authServicesMap, toolsMap, toolsetsMap, err := server.InitializeConfigs(ctx, reloadedConfig)
if err != nil {
errMsg := fmt.Errorf("unable to initialize reloaded configs: %w", err)
logger.WarnContext(ctx, errMsg.Error())
return nil, nil, nil, nil, err
}
return sourcesMap, authServicesMap, toolsMap, toolsetsMap, nil
}
// watchChanges checks for changes in the provided yaml tools file(s) or folder.
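// When an entire folder is watched, any .yaml or .yml change inside it
// triggers a reload; otherwise only the explicitly watched files do.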
func watchChanges(ctx context.Context, watchDirs map[string]bool, watchedFiles map[string]bool, s *server.Server) {
logger, err := util.LoggerFromContext(ctx)
if err != nil {
panic(err)
}
w, err := fsnotify.NewWatcher()
if err != nil {
logger.WarnContext(ctx, "error setting up new watcher %s", err)
return
}
defer w.Close()
watchingFolder := false
var folderToWatch string
// an empty watchedFiles map indicates that the user passed an entire folder instead
if len(watchedFiles) == 0 {
watchingFolder = true
// validate that watchDirs only has single element
if len(watchDirs) > 1 {
logger.WarnContext(ctx, "error setting watcher, expected single tools folder if no file(s) are defined.")
return
}
for onlyKey := range watchDirs {
folderToWatch = onlyKey
break
}
}
for dir := range watchDirs {
err := w.Add(dir)
if err != nil {
logger.WarnContext(ctx, fmt.Sprintf("Error adding path %s to watcher: %s", dir, err))
break
}
logger.DebugContext(ctx, fmt.Sprintf("Added directory %s to watcher.", dir))
}
// debounce timer is used to prevent multiple writes triggering multiple reloads
debounceDelay := 100 * time.Millisecond
debounce := time.NewTimer(1 * time.Minute)
debounce.Stop()
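// The timer starts stopped and is re-armed with debounceDelay only when a
// relevant file event arrives.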
for {
select {
case <-ctx.Done():
logger.DebugContext(ctx, "file watcher context cancelled")
return
case err, ok := <-w.Errors:
if !ok {
logger.WarnContext(ctx, "file watcher was closed unexpectedly")
return
}
if err != nil {
logger.WarnContext(ctx, "file watcher error %s", err)
return
}
case e, ok := <-w.Events:
if !ok {
logger.WarnContext(ctx, "file watcher already closed")
return
}
// only check for events which indicate user saved a new tools file
// multiple operations checked due to various file update methods across editors
if !e.Has(fsnotify.Write | fsnotify.Create | fsnotify.Rename) {
continue
}
cleanedFilename := filepath.Clean(e.Name)
logger.DebugContext(ctx, fmt.Sprintf("%s event detected in %s", e.Op, cleanedFilename))
folderChanged := watchingFolder &&
(strings.HasSuffix(cleanedFilename, ".yaml") || strings.HasSuffix(cleanedFilename, ".yml"))
if folderChanged || watchedFiles[cleanedFilename] {
// indicates the write event is on a relevant file
debounce.Reset(debounceDelay)
}
case <-debounce.C:
debounce.Stop()
var reloadedToolsFile ToolsFile
if watchingFolder {
logger.DebugContext(ctx, "Reloading tools folder.")
reloadedToolsFile, err = loadAndMergeToolsFolder(ctx, folderToWatch)
if err != nil {
logger.WarnContext(ctx, "error loading tools folder %s", err)
continue
}
} else {
logger.DebugContext(ctx, "Reloading tools file(s).")
reloadedToolsFile, err = loadAndMergeToolsFiles(ctx, slices.Collect(maps.Keys(watchedFiles)))
if err != nil {
logger.WarnContext(ctx, "error loading tools files %s", err)
continue
}
}
err = handleDynamicReload(ctx, reloadedToolsFile, s)
if err != nil {
errMsg := fmt.Errorf("unable to parse reloaded tools file at %q: %w", reloadedToolsFile, err)
logger.WarnContext(ctx, errMsg.Error())
continue
}
}
}
}
// updateLogLevel checks if Toolbox has to update the existing log level set by users.
// stdio doesn't support "debug" and "info" logs.
func updateLogLevel(stdio bool, logLevel string) bool {
@@ -235,6 +586,33 @@ func updateLogLevel(stdio bool, logLevel string) bool {
return false
}
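// resolveWatcherInputs determines which directories the file watcher must
// observe and which specific files are relevant for reloads. For example
// (illustrative paths), `--tools-files dirA/a.yaml,dirB/b.yaml` watches the
// directories dirA and dirB and marks only those two files as relevant, while
// `--tools-folder conf` watches the conf directory with no per-file filter.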
func resolveWatcherInputs(toolsFile string, toolsFiles []string, toolsFolder string) (map[string]bool, map[string]bool) {
var relevantFiles []string
// map for efficiently checking if a file is relevant
watchedFiles := make(map[string]bool)
// dirs that will be added to watcher (fsnotify prefers watching directory then filtering for file)
watchDirs := make(map[string]bool)
if len(toolsFiles) > 0 {
relevantFiles = toolsFiles
} else if toolsFolder != "" {
watchDirs[filepath.Clean(toolsFolder)] = true
} else {
relevantFiles = []string{toolsFile}
}
// extract parent dir for relevant files and dedup
for _, f := range relevantFiles {
cleanFile := filepath.Clean(f)
watchedFiles[cleanFile] = true
watchDirs[filepath.Dir(cleanFile)] = true
}
return watchDirs, watchedFiles
}
func run(cmd *Command) error {
if updateLogLevel(cmd.cfg.Stdio, cmd.cfg.LogLevel.String()) {
cmd.cfg.LogLevel = server.StringLevel(log.Warn)
@@ -298,17 +676,17 @@ func run(cmd *Command) error {
}
}()
var buf []byte
var toolsFile ToolsFile
if cmd.prebuiltConfig != "" {
// Make sure --prebuilt and --tools-file flags are mutually exclusive
if cmd.tools_file != "" {
errMsg := fmt.Errorf("--prebuilt and --tools-file flags cannot be used simultaneously")
// Make sure --prebuilt and --tools-file/--tools-files/--tools-folder flags are mutually exclusive
if cmd.tools_file != "" || len(cmd.tools_files) > 0 || cmd.tools_folder != "" {
errMsg := fmt.Errorf("--prebuilt and --tools-file/--tools-files/--tools-folder flags cannot be used simultaneously")
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
// Use prebuilt tools
buf, err = prebuiltconfigs.Get(cmd.prebuiltConfig)
buf, err := prebuiltconfigs.Get(cmd.prebuiltConfig)
if err != nil {
cmd.logger.ErrorContext(ctx, err.Error())
return err
@@ -317,35 +695,85 @@ func run(cmd *Command) error {
cmd.logger.InfoContext(ctx, logMsg)
// Append prebuilt.source to Version string for the User Agent
cmd.cfg.Version += "+prebuilt." + cmd.prebuiltConfig
toolsFile, err = parseToolsFile(ctx, buf)
if err != nil {
errMsg := fmt.Errorf("unable to parse prebuilt tool configuration: %w", err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
} else if len(cmd.tools_files) > 0 {
// Make sure --tools-file, --tools-files, and --tools-folder flags are mutually exclusive
if cmd.tools_file != "" || cmd.tools_folder != "" {
errMsg := fmt.Errorf("--tools-file, --tools-files, and --tools-folder flags cannot be used simultaneously")
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
// Use multiple tools files
cmd.logger.InfoContext(ctx, fmt.Sprintf("Loading and merging %d tool configuration files", len(cmd.tools_files)))
var err error
toolsFile, err = loadAndMergeToolsFiles(ctx, cmd.tools_files)
if err != nil {
cmd.logger.ErrorContext(ctx, err.Error())
return err
}
} else if cmd.tools_folder != "" {
// Make sure --tools-folder and other flags are mutually exclusive
if cmd.tools_file != "" || len(cmd.tools_files) > 0 {
errMsg := fmt.Errorf("--tools-file, --tools-files, and --tools-folder flags cannot be used simultaneously")
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
// Use tools folder
cmd.logger.InfoContext(ctx, fmt.Sprintf("Loading and merging all YAML files from directory: %s", cmd.tools_folder))
var err error
toolsFile, err = loadAndMergeToolsFolder(ctx, cmd.tools_folder)
if err != nil {
cmd.logger.ErrorContext(ctx, err.Error())
return err
}
} else {
// Set default value of tools-file flag to tools.yaml
if cmd.tools_file == "" {
cmd.tools_file = "tools.yaml"
}
// Read tool file contents
buf, err = os.ReadFile(cmd.tools_file)
// Read single tool file contents
buf, err := os.ReadFile(cmd.tools_file)
if err != nil {
errMsg := fmt.Errorf("unable to read tool file at %q: %w", cmd.tools_file, err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
toolsFile, err = parseToolsFile(ctx, buf)
if err != nil {
errMsg := fmt.Errorf("unable to parse tool file at %q: %w", cmd.tools_file, err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
}
toolsFile, err := parseToolsFile(ctx, buf)
cmd.cfg.SourceConfigs, cmd.cfg.AuthServiceConfigs, cmd.cfg.ToolConfigs, cmd.cfg.ToolsetConfigs = toolsFile.Sources, toolsFile.AuthServices, toolsFile.Tools, toolsFile.Toolsets
authSourceConfigs := toolsFile.AuthSources
if authSourceConfigs != nil {
cmd.logger.WarnContext(ctx, "`authSources` is deprecated, use `authServices` instead")
cmd.cfg.AuthServiceConfigs = authSourceConfigs
}
instrumentation, err := telemetry.CreateTelemetryInstrumentation(versionString)
if err != nil {
errMsg := fmt.Errorf("unable to parse tool file at %q: %w", cmd.tools_file, err)
errMsg := fmt.Errorf("unable to create telemetry instrumentation: %w", err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
ctx = util.WithInstrumentation(ctx, instrumentation)
// start server
s, err := server.NewServer(ctx, cmd.cfg, cmd.logger)
s, err := server.NewServer(ctx, cmd.cfg)
if err != nil {
errMsg := fmt.Errorf("toolbox failed to initialize: %w", err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
@@ -380,6 +808,13 @@ func run(cmd *Command) error {
}()
}
watchDirs, watchedFiles := resolveWatcherInputs(cmd.tools_file, cmd.tools_files, cmd.tools_folder)
if !cmd.cfg.DisableReload {
// start watching the file(s) or folder for changes to trigger dynamic reloading
go watchChanges(ctx, watchDirs, watchedFiles, s)
}
// wait for either the server to error out or the command's context to be canceled
select {
case err := <-srvErr:

View File

@@ -16,23 +16,33 @@ package cmd
import (
"bytes"
"context"
_ "embed"
"fmt"
"io"
"os"
"path"
"path/filepath"
"regexp"
"runtime"
"strings"
"testing"
"time"
"github.com/google/go-cmp/cmp"
"github.com/googleapis/genai-toolbox/internal/auth/google"
"github.com/googleapis/genai-toolbox/internal/log"
"github.com/googleapis/genai-toolbox/internal/prebuiltconfigs"
"github.com/googleapis/genai-toolbox/internal/server"
cloudsqlpgsrc "github.com/googleapis/genai-toolbox/internal/sources/cloudsqlpg"
httpsrc "github.com/googleapis/genai-toolbox/internal/sources/http"
"github.com/googleapis/genai-toolbox/internal/telemetry"
"github.com/googleapis/genai-toolbox/internal/testutils"
"github.com/googleapis/genai-toolbox/internal/tools"
"github.com/googleapis/genai-toolbox/internal/tools/http"
"github.com/googleapis/genai-toolbox/internal/tools/postgressql"
"github.com/googleapis/genai-toolbox/internal/tools/postgres/postgressql"
"github.com/googleapis/genai-toolbox/internal/util"
"github.com/spf13/cobra"
)
@@ -174,6 +184,13 @@ func TestServerConfigFlags(t *testing.T) {
Stdio: true,
}),
},
{
desc: "disable reload",
args: []string{"--disable-reload"},
want: withDefaults(server.ServerConfig{
DisableReload: true,
}),
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
@@ -229,6 +246,71 @@ func TestToolFileFlag(t *testing.T) {
}
}
func TestToolsFilesFlag(t *testing.T) {
tcs := []struct {
desc string
args []string
want []string
}{
{
desc: "no value",
args: []string{},
want: []string{},
},
{
desc: "single file",
args: []string{"--tools-files", "foo.yaml"},
want: []string{"foo.yaml"},
},
{
desc: "multiple files",
args: []string{"--tools-files", "foo.yaml,bar.yaml"},
want: []string{"foo.yaml", "bar.yaml"},
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
c, _, err := invokeCommand(tc.args)
if err != nil {
t.Fatalf("unexpected error invoking command: %s", err)
}
if diff := cmp.Diff(c.tools_files, tc.want); diff != "" {
t.Fatalf("got %v, want %v", c.tools_files, tc.want)
}
})
}
}
func TestToolsFolderFlag(t *testing.T) {
tcs := []struct {
desc string
args []string
want string
}{
{
desc: "no value",
args: []string{},
want: "",
},
{
desc: "folder set",
args: []string{"--tools-folder", "test-folder"},
want: "test-folder",
},
}
for _, tc := range tcs {
t.Run(tc.desc, func(t *testing.T) {
c, _, err := invokeCommand(tc.args)
if err != nil {
t.Fatalf("unexpected error invoking command: %s", err)
}
if c.tools_folder != tc.want {
t.Fatalf("got %v, want %v", c.tools_folder, tc.want)
}
})
}
}
func TestPrebuiltFlag(t *testing.T) {
tcs := []struct {
desc string
@@ -900,12 +982,196 @@ func TestEnvVarReplacement(t *testing.T) {
}
// normalizeFilepaths is a helper that normalizes filepath formats across Mac and Windows.
// This prevents needing multiple "want" cases for TestResolveWatcherInputs.
func normalizeFilepaths(m map[string]bool) map[string]bool {
newMap := make(map[string]bool)
for k, v := range m {
newMap[filepath.ToSlash(k)] = v
}
return newMap
}
func TestResolveWatcherInputs(t *testing.T) {
tcs := []struct {
description string
toolsFile string
toolsFiles []string
toolsFolder string
wantWatchDirs map[string]bool
wantWatchedFiles map[string]bool
}{
{
description: "single tools file",
toolsFile: "tools_folder/example_tools.yaml",
toolsFiles: []string{},
toolsFolder: "",
wantWatchDirs: map[string]bool{"tools_folder": true},
wantWatchedFiles: map[string]bool{"tools_folder/example_tools.yaml": true},
},
{
description: "default tools file (root dir)",
toolsFile: "tools.yaml",
toolsFiles: []string{},
toolsFolder: "",
wantWatchDirs: map[string]bool{".": true},
wantWatchedFiles: map[string]bool{"tools.yaml": true},
},
{
description: "multiple files in different folders",
toolsFile: "",
toolsFiles: []string{"tools_folder/example_tools.yaml", "tools_folder2/example_tools.yaml"},
toolsFolder: "",
wantWatchDirs: map[string]bool{"tools_folder": true, "tools_folder2": true},
wantWatchedFiles: map[string]bool{
"tools_folder/example_tools.yaml": true,
"tools_folder2/example_tools.yaml": true,
},
},
{
description: "multiple files in same folder",
toolsFile: "",
toolsFiles: []string{"tools_folder/example_tools.yaml", "tools_folder/example_tools2.yaml"},
toolsFolder: "",
wantWatchDirs: map[string]bool{"tools_folder": true},
wantWatchedFiles: map[string]bool{
"tools_folder/example_tools.yaml": true,
"tools_folder/example_tools2.yaml": true,
},
},
{
description: "multiple files in different levels",
toolsFile: "",
toolsFiles: []string{
"tools_folder/example_tools.yaml",
"tools_folder/special_tools/example_tools2.yaml"},
toolsFolder: "",
wantWatchDirs: map[string]bool{"tools_folder": true, "tools_folder/special_tools": true},
wantWatchedFiles: map[string]bool{
"tools_folder/example_tools.yaml": true,
"tools_folder/special_tools/example_tools2.yaml": true,
},
},
{
description: "tools folder",
toolsFile: "",
toolsFiles: []string{},
toolsFolder: "tools_folder",
wantWatchDirs: map[string]bool{"tools_folder": true},
wantWatchedFiles: map[string]bool{},
},
}
for _, tc := range tcs {
t.Run(tc.description, func(t *testing.T) {
gotWatchDirs, gotWatchedFiles := resolveWatcherInputs(tc.toolsFile, tc.toolsFiles, tc.toolsFolder)
normalizedGotWatchDirs := normalizeFilepaths(gotWatchDirs)
normalizedGotWatchedFiles := normalizeFilepaths(gotWatchedFiles)
if diff := cmp.Diff(tc.wantWatchDirs, normalizedGotWatchDirs); diff != "" {
t.Errorf("incorrect watchDirs: diff %v", diff)
}
if diff := cmp.Diff(tc.wantWatchedFiles, normalizedGotWatchedFiles); diff != "" {
t.Errorf("incorrect watchedFiles: diff %v", diff)
}
})
}
}
// helper function for testing file detection in dynamic reloading
func tmpFileWithCleanup(content []byte) (string, func(), error) {
f, err := os.CreateTemp("", "*")
if err != nil {
return "", nil, err
}
cleanup := func() { os.Remove(f.Name()) }
if _, err := f.Write(content); err != nil {
cleanup()
return "", nil, err
}
if err := f.Close(); err != nil {
cleanup()
return "", nil, err
}
return f.Name(), cleanup, err
}
func TestSingleEdit(t *testing.T) {
ctx, cancelCtx := context.WithTimeout(context.Background(), time.Minute)
defer cancelCtx()
pr, pw := io.Pipe()
defer pw.Close()
defer pr.Close()
fileToWatch, cleanup, err := tmpFileWithCleanup([]byte("initial content"))
if err != nil {
t.Fatalf("error editing tools file %s", err)
}
defer cleanup()
logger, err := log.NewStdLogger(pw, pw, "DEBUG")
if err != nil {
t.Fatalf("failed to setup logger %s", err)
}
ctx = util.WithLogger(ctx, logger)
instrumentation, err := telemetry.CreateTelemetryInstrumentation(versionString)
if err != nil {
t.Fatalf("failed to setup instrumentation %s", err)
}
ctx = util.WithInstrumentation(ctx, instrumentation)
mockServer := &server.Server{}
cleanFileToWatch := filepath.Clean(fileToWatch)
watchDir := filepath.Dir(cleanFileToWatch)
watchedFiles := map[string]bool{cleanFileToWatch: true}
watchDirs := map[string]bool{watchDir: true}
go watchChanges(ctx, watchDirs, watchedFiles, mockServer)
// escape backslash so regex doesn't fail on windows filepaths
regexEscapedPathFile := strings.ReplaceAll(cleanFileToWatch, `\`, `\\\\*\\`)
regexEscapedPathFile = path.Clean(regexEscapedPathFile)
regexEscapedPathDir := strings.ReplaceAll(watchDir, `\`, `\\\\*\\`)
regexEscapedPathDir = path.Clean(regexEscapedPathDir)
begunWatchingDir := regexp.MustCompile(fmt.Sprintf(`DEBUG "Added directory %s to watcher."`, regexEscapedPathDir))
_, err = testutils.WaitForString(ctx, begunWatchingDir, pr)
if err != nil {
t.Fatalf("timeout or error waiting for watcher to start: %s", err)
}
err = os.WriteFile(fileToWatch, []byte("modification"), 0777)
if err != nil {
t.Fatalf("error writing to file: %v", err)
}
// only check substring of DEBUG message due to some OS/editors firing different operations
detectedFileChange := regexp.MustCompile(fmt.Sprintf(`event detected in %s"`, regexEscapedPathFile))
_, err = testutils.WaitForString(ctx, detectedFileChange, pr)
if err != nil {
t.Fatalf("timeout or error waiting for file to detect write: %s", err)
}
}
func TestPrebuiltTools(t *testing.T) {
alloydb_admin_config, _ := prebuiltconfigs.Get("alloydb-postgres-admin")
alloydb_config, _ := prebuiltconfigs.Get("alloydb-postgres")
bigquery_config, _ := prebuiltconfigs.Get("bigquery")
cloudsqlpg_config, _ := prebuiltconfigs.Get("cloud-sql-postgres")
cloudsqlmysql_config, _ := prebuiltconfigs.Get("cloud-sql-mysql")
cloudsqlmssql_config, _ := prebuiltconfigs.Get("cloud-sql-mssql")
dataplex_config, _ := prebuiltconfigs.Get("dataplex")
firestoreconfig, _ := prebuiltconfigs.Get("firestore")
mysql_config, _ := prebuiltconfigs.Get("mysql")
mssql_config, _ := prebuiltconfigs.Get("mssql")
looker_config, _ := prebuiltconfigs.Get("looker")
postgresconfig, _ := prebuiltconfigs.Get("postgres")
spanner_config, _ := prebuiltconfigs.Get("spanner")
spannerpg_config, _ := prebuiltconfigs.Get("spanner-postgres")
@@ -918,6 +1184,16 @@ func TestPrebuiltTools(t *testing.T) {
in []byte
wantToolset server.ToolsetConfigs
}{
{
name: "alloydb postgres admin prebuilt tools",
in: alloydb_admin_config,
wantToolset: server.ToolsetConfigs{
"alloydb-postgres-admin-tools": tools.ToolsetConfig{
Name: "alloydb-postgres-admin-tools",
ToolNames: []string{"alloydb-create-cluster", "alloydb-operations-get", "alloydb-create-instance"},
},
},
},
{
name: "alloydb prebuilt tools",
in: alloydb_config,
@@ -968,6 +1244,56 @@ func TestPrebuiltTools(t *testing.T) {
},
},
},
{
name: "dataplex prebuilt tools",
in: dataplex_config,
wantToolset: server.ToolsetConfigs{
"dataplex-tools": tools.ToolsetConfig{
Name: "dataplex-tools",
ToolNames: []string{"dataplex_search_entries"},
},
},
},
{
name: "firestore prebuilt tools",
in: firestoreconfig,
wantToolset: server.ToolsetConfigs{
"firestore-database-tools": tools.ToolsetConfig{
Name: "firestore-database-tools",
ToolNames: []string{"firestore-get-documents", "firestore-list-collections", "firestore-delete-documents", "firestore-query-collection", "firestore-get-rules", "firestore-validate-rules"},
},
},
},
{
name: "mysql prebuilt tools",
in: mysql_config,
wantToolset: server.ToolsetConfigs{
"mysql-database-tools": tools.ToolsetConfig{
Name: "mysql-database-tools",
ToolNames: []string{"execute_sql", "list_tables"},
},
},
},
{
name: "mssql prebuilt tools",
in: mssql_config,
wantToolset: server.ToolsetConfigs{
"mssql-database-tools": tools.ToolsetConfig{
Name: "mssql-database-tools",
ToolNames: []string{"execute_sql", "list_tables"},
},
},
},
{
name: "looker prebuilt tools",
in: looker_config,
wantToolset: server.ToolsetConfigs{
"looker-tools": tools.ToolsetConfig{
Name: "looker-tools",
ToolNames: []string{"get_models", "get_explores", "get_dimensions", "get_measures", "get_filters", "get_parameters", "query", "query_sql", "query_url", "get_looks", "run_look"},
},
},
},
{
name: "postgres prebuilt tools",
in: postgresconfig,

View File

@@ -1 +1 @@
0.7.0
0.10.0

View File

@@ -4,12 +4,12 @@ type: docs
notoc: false
weight: 1
description: >
All of Toolbox's documentation.
---
<html>
<head>
<link rel="canonical" href="getting-started/introduction/"/>
<meta http-equiv="refresh" content="0;url=getting-started/introduction"/>
<meta http-equiv="refresh" content="0;url=getting-started/introduction/"/>
</head>
</html>

View File

@@ -9,22 +9,22 @@ description: Frequently asked questions about Toolbox.
MCP Toolbox for Databases is open-source and can be run or deployed to a
multitude of environments. For convenience, we release [compiled binaries and
docker images][release-notes] (but you can always compile yourself as well!).
For detailed instructions, check out these resources:
- [Quickstart: How to Run Locally](../getting-started/local_quickstart.md)
- [Deploy to Cloud Run](../how-to/deploy_toolbox.md)
[release-notes]: https://github.com/googleapis/genai-toolbox/releases/
## Do I need a Google Cloud account/project to get started with Toolbox?
Nope! While some of the sources Toolbox connects to may require GCP credentials,
Toolbox doesn't require them and can connect to a bunch of different resources
that don't.
## Does Toolbox take contributions from external users?
Absolutely! Please check out our [DEVELOPER.md][] for instructions on how to get
started developing _on_ Toolbox instead of with it, and the [CONTRIBUTING.md][]
@@ -33,17 +33,16 @@ for instructions on completing the CLA and getting a PR accepted.
[DEVELOPER.md]: https://github.com/googleapis/genai-toolbox/blob/main/DEVELOPER.md
[CONTRIBUTING.MD]: https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md
## Can Toolbox support a feature to let me do _$FOO_?
Maybe? The best place to start is by [opening an issue][github-issue] for
discussion (or seeing if there is already one open), so we can better understand
your use case and the best way to solve it. Generally we aim to prioritize the
most popular issues, so make sure to +1 ones you are the most interested in.
[github-issue]: https://github.com/googleapis/genai-toolbox/issues
## Can Toolbox be used for non-database tools?
Currently, Toolbox is primarily focused on making it easier to create and
develop tools focused on interacting with Databases. We believe that there are a
@@ -55,21 +54,21 @@ GRPC tools might be helpful in assisting with migrating to Toolbox or in
accomplishing more complicated workflows. We're looking into what that might
best look like in Toolbox.
## Can I use _$BAR_ orchestration framework to use tools from Toolbox?
Currently, Toolbox only supports a limited number of client SDKs at our initial
launch. We are investigating support for more frameworks as well as more general
approaches for users without a framework -- look forward to seeing an update
soon.
## Why does Toolbox use a server-client architecture pattern?
Toolbox's server-client architecture allows us to more easily support a wide
variety of languages and frameworks with a centralized implementation. It also
allows us to tackle problems like connection pooling, auth, or caching more
completely than entirely client-side solutions.
## Why was Toolbox written in Go?
While a large part of the Gen AI Ecosystem is predominantly Python, we opted to
use Go. We chose Go because it's still easy and simple to use, but also easier
@@ -80,8 +79,9 @@ to be able to use Toolbox on the serving path of mission critical applications.
It's easier to build the needed robustness, performance and scalability in Go
than in Python.
## Is Toolbox compatible with Model Context Protocol (MCP)?
Yes! Toolbox is compatible with [Anthropic's Model Context Protocol
(MCP)](https://modelcontextprotocol.io/). Please check out [Connect via
MCP](../how-to/connect_via_mcp.md) to learn how to connect to Toolbox with an
MCP client.

View File

@@ -3,7 +3,7 @@ title: "Telemetry"
type: docs
weight: 2
description: >
An overview of telemetry and observability in Toolbox.
---
## About
@@ -17,7 +17,6 @@ through [OpenTelemetry](https://opentelemetry.io/). Additional flags can be
passed to Toolbox to enable different logging behavior, or to export metrics
through a specific [exporter](#exporter).
## Logging
The following flags can be used to customize Toolbox logging:
@@ -27,7 +26,8 @@ The following flags can be used to customize Toolbox logging:
| `--log-level` | Preferred log level, allowed values: `debug`, `info`, `warn`, `error`. Default: `info`. |
| `--logging-format` | Preferred logging format, allowed values: `standard`, `json`. Default: `standard`. |
**Example:**
```bash
./toolbox --tools-file "tools.yaml" --log-level warn --logging-format json
```
@@ -35,6 +35,7 @@ __Example:__
### Level
Toolbox supports the following log levels:
| **Log level** | **Description** |
|---------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Debug | Debug logs typically contain information that is only useful during the debugging phase and may be of little value during production. |
@@ -46,17 +47,18 @@ Toolbox will only output logs that are equal or more severe to the
level that is set. Below are the log levels that Toolbox supports, in order of
severity.
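For instance, a quick way to surface debug output while troubleshooting locally (flags as documented above; the tools file name is illustrative):

```bash
./toolbox --tools-file "tools.yaml" --log-level debug
```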
### Format
Toolbox supports both standard and structured logging format.
The standard logging format outputs logs as a string:
```
2024-11-12T15:08:11.451377-08:00 INFO "Initialized 0 sources.\n"
```
The structured logging format outputs logs as JSON:
```
{
"timestamp":"2024-11-04T16:45:11.987299-08:00",
@@ -66,9 +68,9 @@ The structured logging outputs log as JSON:
}
```
{{< notice tip >}}
`logging.googleapis.com/sourceLocation` shows the source code
location information associated with the log entry, if any.
{{< /notice >}}
## Telemetry
@@ -125,7 +127,6 @@ unified [resource][resource]. The list of resource attributes included are:
| `service.name` | OpenTelemetry service name. Defaults to `toolbox`. Users can set the service name via the flag mentioned above to distinguish between different Toolbox services. |
| `service.version` | The version of Toolbox used. |
[resource]: https://opentelemetry.io/docs/languages/go/resources/
### Exporter
@@ -151,12 +152,13 @@ Exporter][gcp-trace-exporter].
[gcp-trace-exporter]:
https://github.com/GoogleCloudPlatform/opentelemetry-operations-go/tree/main/exporter/trace
{{< notice note >}}
If you're using Google Cloud Monitoring, the following APIs will need to be
enabled:
- [Cloud Logging API](https://cloud.google.com/logging/docs/api/enable-api)
- [Cloud Monitoring API](https://cloud.google.com/monitoring/api/enable-api)
- [Cloud Trace API](https://console.cloud.google.com/apis/enableflow?apiid=cloudtrace.googleapis.com)
{{< /notice >}}
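If you manage your project with the gcloud CLI, a sketch of enabling all three APIs at once (assuming a project is already selected) is:

```bash
gcloud services enable \
    logging.googleapis.com \
    monitoring.googleapis.com \
    cloudtrace.googleapis.com
```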
#### OTLP Exporter
@@ -175,7 +177,7 @@ It receives telemetry data, transforms it, and then exports data to backends
that can store it permanently. Toolbox provides an option to export telemetry
data to the user's choice of backend(s) that are compatible with the OpenTelemetry
Protocol (OTLP). If you would like to use a collector, please refer to
[Export Telemetry using the OTel Collector](../../how-to/export_telemetry.md).
### Flags
@@ -184,7 +186,7 @@ The following flags are used to determine Toolbox's telemetry configuration:
| **flag** | **type** | **description** |
|----------------------------|----------|----------------------------------------------------------------------------------------------------------------|
| `--telemetry-gcp` | bool | Enable exporting directly to Google Cloud Monitoring. Default is `false`. |
| `--telemetry-otlp` | string | Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. "<http://127.0.0.1:4318>"). |
| `--telemetry-service-name` | string | Sets the value of the `service.name` resource attribute. Default is `toolbox`. |
In addition to the flags noted above, you can also make additional configuration
@@ -194,14 +196,16 @@ environmental variables.
[sdk-configuration]:
https://opentelemetry.io/docs/languages/sdk-configuration/general/
**Examples:**
To enable Google Cloud Exporter:
```bash
./toolbox --telemetry-gcp
```
To enable OTLP Exporter, provide Collector endpoint:
```bash
./toolbox --telemetry-otlp="http://127.0.0.1:4553"
```
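The flags can also be combined. For example, a sketch of exporting over OTLP while overriding the default `service.name` (endpoint and name are placeholders):

```bash
./toolbox --telemetry-otlp="http://127.0.0.1:4553" --telemetry-service-name="my-toolbox"
```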

View File

@@ -190,6 +190,18 @@
"!sudo lsof -i :5432"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Optional: Enable Vertex AI API for Google Cloud\n",
"\n",
"If you're using a model hosted on **Vertex AI**, run the following command to enable the API:\n",
"\n",
"```bash\n",
"!gcloud services enable aiplatform.googleapis.com\n"
]
},
{
"cell_type": "markdown",
"metadata": {
@@ -222,7 +234,7 @@
},
"outputs": [],
"source": [
"version = \"0.7.0\" # x-release-please-version\n",
"version = \"0.10.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",
@@ -474,165 +486,12 @@
"source": [
"> You can either use LangGraph or LlamaIndex to develop a Toolbox based\n",
"> application. Run one of the sections below\n",
"> - [Connect using Google GenAI](#scrollTo=Rwgv1LDdNKSn)\n",
"> - [Connect using Google GenAI](#scrollTo=Fv2-uT4mvYtp)\n",
"> - [Connect using ADK](#scrollTo=QqRlWqvYNKSo)\n",
"> - [Connect Using LangGraph](#scrollTo=pbapNMhhL33S)\n",
"> - [Connect using LlamaIndex](#scrollTo=04iysrm_L_7v)\n"
]
},
{
"cell_type": "markdown",
"metadata": {
@@ -695,7 +554,7 @@
"\n",
"session_service = InMemorySessionService()\n",
"artifacts_service = InMemoryArtifactService()\n",
"session = session_service.create_session(\n",
"session = await session_service.create_session(\n",
" state={}, app_name='hotel_agent', user_id='123'\n",
")\n",
"runner = Runner(\n",
@@ -931,6 +790,159 @@
"await run_application()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "Fv2-uT4mvYtp"
},
"source": [
"### Connect Using Google GenAI"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "mHSvk5_AvYtu"
},
"outputs": [],
"source": [
"# Install the Toolbox Core package\n",
"!pip install toolbox-core --quiet\n",
"\n",
"# Install the Google GenAI package\n",
"!pip install google-genai --quiet"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "sO_7FGSYvYtu"
},
"source": [
"Create a Google GenAI Application which can Search, Book and Cancel hotels."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "-NVVBiLnvYtu"
},
"outputs": [],
"source": [
"import asyncio\n",
"\n",
"from google import genai\n",
"from google.genai.types import (\n",
" Content,\n",
" FunctionDeclaration,\n",
" GenerateContentConfig,\n",
" Part,\n",
" Tool,\n",
")\n",
"\n",
"from toolbox_core import ToolboxClient\n",
"\n",
"prompt = \"\"\"\n",
" You're a helpful hotel assistant. You handle hotel searching, booking and\n",
" cancellations. When the user searches for a hotel, mention it's name, id,\n",
" location and price tier. Always mention hotel id while performing any\n",
" searches. This is very important for any operations. For any bookings or\n",
" cancellations, please provide the appropriate confirmation. Be sure to\n",
" update checkin or checkout dates if mentioned by the user.\n",
" Don't ask for confirmations from the user.\n",
"\"\"\"\n",
"\n",
"queries = [\n",
" \"Find hotels in Basel with Basel in it's name.\",\n",
" \"Please book the hotel Hilton Basel for me.\",\n",
" \"This is too expensive. Please cancel it.\",\n",
" \"Please book Hyatt Regency for me\",\n",
" \"My check in dates for my booking would be from April 10, 2024 to April 19, 2024.\",\n",
"]\n",
"\n",
"\n",
"async def run_application():\n",
" toolbox_client = ToolboxClient(\"http://127.0.0.1:5000\")\n",
"\n",
" # The toolbox_tools list contains Python callables (functions/methods) designed for LLM tool-use\n",
" # integration. While this example uses Google's genai client, these callables can be adapted for\n",
" # various function-calling or agent frameworks. For easier integration with supported frameworks\n",
" # (https://github.com/googleapis/mcp-toolbox-python-sdk/tree/main/packages), use the\n",
" # provided wrapper packages, which handle framework-specific boilerplate.\n",
" toolbox_tools = await toolbox_client.load_toolset(\"my-toolset\")\n",
" genai_client = genai.Client(\n",
" vertexai=True, project=project_id, location=\"us-central1\"\n",
" )\n",
"\n",
" genai_tools = [\n",
" Tool(\n",
" function_declarations=[\n",
" FunctionDeclaration.from_callable_with_api_option(callable=tool)\n",
" ]\n",
" )\n",
" for tool in toolbox_tools\n",
" ]\n",
" history = []\n",
" for query in queries:\n",
" user_prompt_content = Content(\n",
" role=\"user\",\n",
" parts=[Part.from_text(text=query)],\n",
" )\n",
" history.append(user_prompt_content)\n",
"\n",
" response = genai_client.models.generate_content(\n",
" model=\"gemini-2.0-flash-001\",\n",
" contents=history,\n",
" config=GenerateContentConfig(\n",
" system_instruction=prompt,\n",
" tools=genai_tools,\n",
" ),\n",
" )\n",
" history.append(response.candidates[0].content)\n",
" function_response_parts = []\n",
" for function_call in response.function_calls:\n",
" fn_name = function_call.name\n",
" # The tools are sorted alphabetically\n",
" if fn_name == \"search-hotels-by-name\":\n",
" function_result = await toolbox_tools[3](**function_call.args)\n",
" elif fn_name == \"search-hotels-by-location\":\n",
" function_result = await toolbox_tools[2](**function_call.args)\n",
" elif fn_name == \"book-hotel\":\n",
" function_result = await toolbox_tools[0](**function_call.args)\n",
" elif fn_name == \"update-hotel\":\n",
" function_result = await toolbox_tools[4](**function_call.args)\n",
" elif fn_name == \"cancel-hotel\":\n",
" function_result = await toolbox_tools[1](**function_call.args)\n",
" else:\n",
" raise ValueError(\"Function name not present.\")\n",
" function_response = {\"result\": function_result}\n",
" function_response_part = Part.from_function_response(\n",
" name=function_call.name,\n",
" response=function_response,\n",
" )\n",
" function_response_parts.append(function_response_part)\n",
"\n",
" if function_response_parts:\n",
" tool_response_content = Content(role=\"tool\", parts=function_response_parts)\n",
" history.append(tool_response_content)\n",
"\n",
" response2 = genai_client.models.generate_content(\n",
" model=\"gemini-2.0-flash-001\",\n",
" contents=history,\n",
" config=GenerateContentConfig(\n",
" tools=genai_tools,\n",
" ),\n",
" )\n",
" final_model_response_content = response2.candidates[0].content\n",
" history.append(final_model_response_content)\n",
" print(response2.text)\n",
"\n",
"\n",
"asyncio.run(run_application())"
]
},
{
"cell_type": "markdown",
"metadata": {

View File

@@ -1,7 +1,7 @@
---
title: "Configuration"
type: docs
weight: 6
description: >
How to configure Toolbox's tools.yaml file.
---

View File

@@ -10,17 +10,17 @@ MCP Toolbox for Databases is an open source MCP server for databases. It enables
you to develop tools more easily, faster, and more securely by handling the complexities
such as connection pooling, authentication, and more.
{{< notice note >}}
This solution was originally named “Gen AI Toolbox for
Databases” as its initial development predated MCP, but was renamed to align
with recently added MCP compatibility.
{{< /notice >}}
## Why Toolbox?
Toolbox helps you build Gen AI tools that let your agents access data in your
database. Toolbox provides:
- **Simplified development**: Integrate tools into your agent in less than 10
lines of code, reuse tools between multiple agents or frameworks, and deploy
new versions of tools more easily.
@@ -30,17 +30,29 @@ database. Toolbox provides:
- **End-to-end observability**: Out of the box metrics and tracing with built-in
support for OpenTelemetry.
**⚡ Supercharge Your Workflow with an AI Database Assistant ⚡**
Stop context-switching and let your AI assistant become a true co-developer. By
[connecting your IDE to your databases with MCP Toolbox][connect-ide], you can
delegate complex and time-consuming database tasks, allowing you to build faster
and focus on what matters. This isn't just about code completion; it's about
giving your AI the context it needs to handle the entire development lifecycle.
Here's how it will save you time:
- **Query in Plain English**: Interact with your data using natural language
right from your IDE. Ask complex questions like, *"How many orders were
delivered in 2024, and what items were in them?"* without writing any SQL.
- **Automate Database Management**: Simply describe your data needs, and let the
AI assistant manage your database for you. It can handle generating queries,
creating tables, adding indexes, and more.
- **Generate Context-Aware Code**: Empower your AI assistant to generate
application code and tests with a deep understanding of your real-time
database schema. This accelerates the development cycle by ensuring the
generated code is directly usable.
- **Slash Development Overhead**: Radically reduce the time spent on manual
setup and boilerplate. MCP Toolbox helps streamline lengthy database
configurations, repetitive code, and error-prone schema migrations.
Learn [how to connect your AI tools (IDEs) to Toolbox using MCP][connect-ide].
@@ -60,6 +72,7 @@ redeploying your application.
## Getting Started
### Installing the server
For the latest version, check the [releases page][releases] and use the
following instructions for your OS and CPU architecture.
@@ -73,7 +86,7 @@ To install Toolbox as a binary:
```sh
# see releases page for other versions
export VERSION=0.10.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```
@@ -84,10 +97,17 @@ You can also install Toolbox as a container:
```sh
# see releases page for other versions
export VERSION=0.10.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```
{{% /tab %}}
{{% tab header="Homebrew" lang="en" %}}
To install Toolbox using Homebrew on macOS or Linux:
```sh
brew install mcp-toolbox
```
{{% /tab %}}
{{% tab header="Compile from source" lang="en" %}}
@@ -95,7 +115,7 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:
```sh
go install github.com/googleapis/genai-toolbox@v0.10.0
```
{{% /tab %}}
@@ -111,17 +131,32 @@ execute `toolbox` to start the server:
./toolbox --tools-file "tools.yaml"
```
{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the
`--disable-reload` flag.
{{< /notice >}}
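For example, a sketch of starting the server with dynamic reloading turned off:

```sh
./toolbox --tools-file "tools.yaml" --disable-reload
```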
#### Homebrew Users
If you installed Toolbox using Homebrew, the `toolbox` binary is available in your system path. You can start the server with the same command:
```sh
toolbox --tools-file "tools.yaml"
```
You can use `toolbox help` for a full list of flags! To stop the server, send a
terminate signal (`ctrl+c` on most platforms).
For more detailed documentation on deploying to different environments, check
out the resources in the [How-to section](../../how-to/).
### Integrating your application
Once your server is up and running, you can load the tools into your
application. See the list of client SDKs below for using various frameworks:
#### Python
{{< tabpane text=true persist=header >}}
{{% tab header="Core" lang="en" %}}
@@ -133,6 +168,7 @@ tools:
from toolbox_core import ToolboxClient
# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:
# these tools can be passed to your application!
@@ -153,6 +189,7 @@ tools:
from toolbox_langchain import ToolboxClient
# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:
# these tools can be passed to your application!
@@ -173,9 +210,11 @@ tools:
from toolbox_llamaindex import ToolboxClient
# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:
# these tools can be passed to your application
tools = client.load_toolset()
{{< /highlight >}}
@@ -184,3 +223,366 @@ For more detailed instructions on using the Toolbox Llamaindex SDK, see the
{{% /tab %}}
{{< /tabpane >}}
#### Javascript/Typescript
Once you've installed the [Toolbox Core
SDK](https://www.npmjs.com/package/@toolbox-sdk/core), you can load
tools:
{{< tabpane text=true persist=header >}}
{{% tab header="Core" lang="en" %}}
{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/core';
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');
{{< /highlight >}}
{{% /tab %}}
{{% tab header="LangChain/Langraph" lang="en" %}}
{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/core';
// LangChain's tool() helper wraps a callable with a name, description and schema
import { tool } from '@langchain/core/tools';
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool(toolboxTool, {
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
schema: toolboxTool.getParamSchema()
});
// Use these tools in your LangChain/LangGraph applications
const tools = toolboxTools.map(getTool);
{{< /highlight >}}
{{% /tab %}}
{{% tab header="Genkit" lang="en" %}}
{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/core';
import { genkit } from 'genkit';
import { googleAI } from '@genkit-ai/googleai';
// Initialise genkit
const ai = genkit({
plugins: [
googleAI({
apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
})
],
model: googleAI.model('gemini-2.0-flash'),
});
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => ai.defineTool({
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
schema: toolboxTool.getParamSchema()
}, toolboxTool)
// Use these tools in your Genkit applications
const tools = toolboxTools.map(getTool);
{{< /highlight >}}
{{% /tab %}}
{{% tab header="LlamaIndex" lang="en" %}}
{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/core';
import { tool } from "llamaindex";
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool({
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
parameters: toolboxTool.getParamSchema(),
execute: toolboxTool
});
// Use these tools in your LlamaIndex applications
const tools = toolboxTools.map(getTool);
{{< /highlight >}}
{{% /tab %}}
{{< /tabpane >}}
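The snippets above assume the Core SDK is already installed in your project; a typical installation, assuming you use npm, looks like:

```sh
npm install @toolbox-sdk/core
```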
For more detailed instructions on using the Toolbox Core SDK, see the
[project's README](https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md).
#### Go
Once you've installed the [Toolbox Go
SDK](https://pkg.go.dev/github.com/googleapis/mcp-toolbox-sdk-go/core), you can load
tools:
{{< tabpane text=true persist=header >}}
{{% tab header="Core" lang="en" %}}
{{< highlight go >}}
package main
import (
"context"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
)
func main() {
// update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
client, err := core.NewToolboxClient(URL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Framework agnostic tools
tools, err := client.LoadToolset("toolsetName", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v", err)
}
// The framework-agnostic tools can now be passed to your agent framework.
_ = tools
}
{{< /highlight >}}
{{% /tab %}}
{{% tab header="LangChain Go" lang="en" %}}
{{< highlight go >}}
package main
import (
"context"
"encoding/json"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/tmc/langchaingo/llms"
)
func main() {
// Make sure to add the error checks
// update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
client, err := core.NewToolboxClient(URL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v", err)
}
// Fetch the tool's input schema
inputschema, err := tool.InputSchema()
if err != nil {
log.Fatalf("Failed to fetch inputSchema: %v", err)
}
var paramsSchema map[string]any
_ = json.Unmarshal(inputschema, &paramsSchema)
// Use this tool with LangChainGo
langChainTool := llms.Tool{
Type: "function",
Function: &llms.FunctionDefinition{
Name: tool.Name(),
Description: tool.Description(),
Parameters: paramsSchema,
},
}
// langChainTool can now be passed to your LangChain Go agent.
_ = langChainTool
}
{{< /highlight >}}
{{% /tab %}}
{{% tab header="Genkit Go" lang="en" %}}
{{< highlight go >}}
package main
import (
"context"
"encoding/json"
"log"
"github.com/firebase/genkit/go/ai"
"github.com/firebase/genkit/go/genkit"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
"github.com/invopop/jsonschema"
)
func main() {
// Make sure to add the error checks
// Update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
g, err := genkit.Init(ctx)
client, err := core.NewToolboxClient(URL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v", err)
}
// Convert the tool using the tbgenkit package
// Use this tool with Genkit Go
genkitTool, err := tbgenkit.ToGenkitTool(tool, g)
if err != nil {
log.Fatalf("Failed to convert tool: %v\n", err)
}
// genkitTool can now be registered with your Genkit flows.
_ = genkitTool
}
{{< /highlight >}}
{{% /tab %}}
{{% tab header="Go GenAI" lang="en" %}}
{{< highlight go >}}
package main
import (
"context"
"encoding/json"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"google.golang.org/genai"
)
func main() {
// Make sure to add the error checks
// Update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
client, err := core.NewToolboxClient(URL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v", err)
}
// Fetch the tool's input schema
inputschema, err := tool.InputSchema()
if err != nil {
log.Fatalf("Failed to fetch inputSchema: %v", err)
}
var schema *genai.Schema
_ = json.Unmarshal(inputschema, &schema)
funcDeclaration := &genai.FunctionDeclaration{
Name: tool.Name(),
Description: tool.Description(),
Parameters: schema,
}
// Use this tool with Go GenAI
genAITool := &genai.Tool{
FunctionDeclarations: []*genai.FunctionDeclaration{funcDeclaration},
}
// genAITool can now be passed in your GenerateContent config.
_ = genAITool
}
{{< /highlight >}}
{{% /tab %}}
{{% tab header="OpenAI Go" lang="en" %}}
{{< highlight go >}}
package main
import (
"context"
"encoding/json"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
openai "github.com/openai/openai-go"
)
func main() {
// Make sure to add the error checks
// Update the url to point to your server
URL := "http://127.0.0.1:5000"
ctx := context.Background()
client, err := core.NewToolboxClient(URL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Framework agnostic tool
tool, err := client.LoadTool("toolName", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v", err)
}
// Fetch the tool's input schema
inputschema, err := tool.InputSchema()
if err != nil {
log.Fatalf("Failed to fetch inputSchema: %v", err)
}
var paramsSchema openai.FunctionParameters
_ = json.Unmarshal(inputschema, &paramsSchema)
// Use this tool with OpenAI Go
openAITool := openai.ChatCompletionToolParam{
Function: openai.FunctionDefinitionParam{
Name: tool.Name(),
Description: openai.String(tool.Description()),
Parameters: paramsSchema,
},
}
// openAITool can now be included in your chat completion request.
_ = openAITool
}
{{< /highlight >}}
{{% /tab %}}
{{< /tabpane >}}
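The Go snippets above assume the SDK is already a dependency of your module; a typical way to add it, assuming Go modules, is:

```sh
go get github.com/googleapis/mcp-toolbox-sdk-go/core
```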
For more detailed instructions on using the Toolbox Go SDK, see the
[project's README](https://github.com/googleapis/mcp-toolbox-sdk-go/blob/main/core/README.md).
For end-to-end samples on using the Toolbox Go SDK with orchestration
frameworks, see the [project's
samples](https://github.com/googleapis/mcp-toolbox-sdk-go/tree/main/core/samples)

View File

@@ -1,11 +1,10 @@
---
title: "Quickstart (Local)"
title: "Python Quickstart (Local)"
type: docs
weight: 2
description: >
How to get started running Toolbox locally with [Python](https://github.com/googleapis/mcp-toolbox-sdk-python), PostgreSQL, and [Agent Development Kit](https://google.github.io/adk-docs/),
[LangGraph](https://www.langchain.com/langgraph), [LlamaIndex](https://www.llamaindex.ai/) or [GoogleGenAI](https://pypi.org/project/google-genai/).
---
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/googleapis/genai-toolbox/blob/main/docs/en/getting-started/colab_quickstart.ipynb)
@@ -15,8 +14,23 @@ description: >
This guide assumes you have already done the following:
1. Installed [Python 3.9+][install-python] (including [pip][install-pip] and
your preferred virtual environment tool for managing dependencies e.g. [venv][install-venv]).
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres].
### Cloud Setup (Optional)
If you plan to use **Google Cloud's Vertex AI** with your agent (e.g., using
`vertexai=True` or a Google GenAI model), follow these one-time setup steps for
local development:
1. [Install the Google Cloud CLI](https://cloud.google.com/sdk/docs/install)
1. [Set up Application Default Credentials (ADC)](https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment)
1. Set your project and enable Vertex AI
```bash
gcloud config set project YOUR_PROJECT_ID
gcloud services enable aiplatform.googleapis.com
```
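If you still need to configure Application Default Credentials (step 2 above), one common way to do so for local development, per the linked guide, is:

```bash
gcloud auth application-default login
```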
[install-python]: https://wiki.python.org/moin/BeginnersGuide/Download
[install-pip]: https://pip.pypa.io/en/stable/installation/
@@ -37,24 +51,40 @@ accessed by our agent, and create a database user for Toolbox to connect with.
Here, `postgres` denotes the default postgres superuser.
{{< notice info >}}
#### **Having trouble connecting?**
* **Password Prompt:** If you are prompted for a password for the `postgres`
user and do not know it (or a blank password doesn't work), your PostgreSQL
installation might require a password or a different authentication method.
* **`FATAL: role "postgres" does not exist`:** This error means the default
`postgres` superuser role isn't available under that name on your system.
* **`Connection refused`:** Ensure your PostgreSQL server is actually running.
You can typically check with `sudo systemctl status postgresql` and start it
with `sudo systemctl start postgresql` on Linux systems.
<br/>
#### **Common Solution**
For password issues or if the `postgres` role seems inaccessible directly, try
switching to the `postgres` operating system user first. This user often has
permission to connect without a password for local connections (this is called
peer authentication).
```bash
sudo -i -u postgres
psql -h 127.0.0.1
```
Once you are in the `psql` shell using this method, you can proceed with the
database creation steps below. Afterwards, type `\q` to exit `psql`, and then
`exit` to return to your normal user shell.
If desired, once connected to `psql` as the `postgres` OS user, you can set a
password for the `postgres` *database* user using: `ALTER USER postgres WITH
PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
postgres` and a password next time.
{{< /notice >}}
1. Create a new database and a new user:
@@ -78,7 +108,10 @@ If desired, once connected to `psql` as the `postgres` OS user, you can set a pa
```bash
\q
```
(If you used `sudo -i -u postgres` and then `psql`, remember you might also
need to type `exit` after `\q` to leave the `postgres` user's shell
session.)
1. Connect to your database with your new user:
@@ -138,7 +171,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/$OS/toolbox
```
<!-- {x-release-please-end} -->
@@ -240,6 +273,11 @@ In this section, we will download Toolbox, configure our tools in a
./toolbox --tools-file "tools.yaml"
```
{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the
`--disable-reload` flag.
{{< /notice >}}
## Step 3: Connect your agent to Toolbox
In this section, we will write and run an agent that will load the Tools
@@ -253,10 +291,6 @@ you can connect to a
1. In a new terminal, install the SDK package.
{{< tabpane persist=header >}}
{{< tab header="Core" lang="bash" >}}
pip install toolbox-core
{{< /tab >}}
{{< tab header="ADK" lang="bash" >}}
pip install toolbox-core
@@ -269,15 +303,15 @@ pip install toolbox-langchain
pip install toolbox-llamaindex
{{< /tab >}}
{{< tab header="Core" lang="bash" >}}
pip install toolbox-core
{{< /tab >}}
{{< /tabpane >}}
1. Install other required dependencies:
{{< tabpane persist=header >}}
{{< tab header="Core" lang="bash" >}}
pip install google-genai
{{< /tab >}}
{{< tab header="ADK" lang="bash" >}}
pip install google-adk
@@ -301,123 +335,16 @@ pip install llama-index-llms-google-genai
# pip install llama-index-llms-anthropic
{{< /tab >}}
{{< tab header="Core" lang="bash" >}}
pip install google-genai
{{< /tab >}}
{{< /tabpane >}}
1. Create a new file named `hotel_agent.py` and copy the following
code to create an agent:
{{< tabpane persist=header >}}
{{< tab header="Core" lang="python" >}}
import asyncio
from google import genai
from google.genai.types import (
Content,
FunctionDeclaration,
GenerateContentConfig,
Part,
Tool,
)
from toolbox_core import ToolboxClient
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention it's name, id,
location and price tier. Always mention hotel id while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
queries = [
"Find hotels in Basel with Basel in it's name.",
"Please book the hotel Hilton Basel for me.",
"This is too expensive. Please cancel it.",
"Please book Hyatt Regency for me",
"My check in dates for my booking would be from April 10, 2024 to April 19, 2024.",
]
async def run_application():
async with ToolboxClient("http://127.0.0.1:5000") as toolbox_client:
# The toolbox_tools list contains Python callables (functions/methods) designed for LLM tool-use
# integration. While this example uses Google's genai client, these callables can be adapted for
# various function-calling or agent frameworks. For easier integration with supported frameworks
# (https://github.com/googleapis/mcp-toolbox-python-sdk/tree/main/packages), use the
# provided wrapper packages, which handle framework-specific boilerplate.
toolbox_tools = await toolbox_client.load_toolset("my-toolset")
genai_client = genai.Client(
vertexai=True, project="project-id", location="us-central1"
)
genai_tools = [
Tool(
function_declarations=[
FunctionDeclaration.from_callable_with_api_option(callable=tool)
]
)
for tool in toolbox_tools
]
history = []
for query in queries:
user_prompt_content = Content(
role="user",
parts=[Part.from_text(text=query)],
)
history.append(user_prompt_content)
response = genai_client.models.generate_content(
model="gemini-2.0-flash-001",
contents=history,
config=GenerateContentConfig(
system_instruction=prompt,
tools=genai_tools,
),
)
history.append(response.candidates[0].content)
function_response_parts = []
for function_call in response.function_calls:
fn_name = function_call.name
# The tools are sorted alphabetically
if fn_name == "search-hotels-by-name":
function_result = await toolbox_tools[3](**function_call.args)
elif fn_name == "search-hotels-by-location":
function_result = await toolbox_tools[2](**function_call.args)
elif fn_name == "book-hotel":
function_result = await toolbox_tools[0](**function_call.args)
elif fn_name == "update-hotel":
function_result = await toolbox_tools[4](**function_call.args)
elif fn_name == "cancel-hotel":
function_result = await toolbox_tools[1](**function_call.args)
else:
raise ValueError("Function name not present.")
function_response = {"result": function_result}
function_response_part = Part.from_function_response(
name=function_call.name,
response=function_response,
)
function_response_parts.append(function_response_part)
if function_response_parts:
tool_response_content = Content(role="tool", parts=function_response_parts)
history.append(tool_response_content)
response2 = genai_client.models.generate_content(
model="gemini-2.0-flash-001",
contents=history,
config=GenerateContentConfig(
tools=genai_tools,
),
)
final_model_response_content = response2.candidates[0].content
history.append(final_model_response_content)
print(response2.text)
asyncio.run(run_application())
{{< /tab >}}
{{< tab header="ADK" lang="python" >}}
from google.adk.agents import Agent
from google.adk.runners import Runner
@@ -426,65 +353,69 @@ from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactServ
from google.genai import types
from toolbox_core import ToolboxSyncClient
import asyncio
import os
# TODO(developer): replace this with your Google API key
os.environ['GOOGLE_API_KEY'] = 'your-api-key'
with ToolboxSyncClient("http://127.0.0.1:5000") as toolbox_client:
async def main():
with ToolboxSyncClient("<http://127.0.0.1:5000>") as toolbox_client:
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention it's name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention it's name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
root_agent = Agent(
model='gemini-2.0-flash-001',
name='hotel_agent',
description='A helpful AI assistant.',
instruction=prompt,
tools=toolbox_client.load_toolset("my-toolset"),
)
session_service = InMemorySessionService()
artifacts_service = InMemoryArtifactService()
session = await session_service.create_session(
state={}, app_name='hotel_agent', user_id='123'
)
runner = Runner(
app_name='hotel_agent',
agent=root_agent,
artifact_service=artifacts_service,
session_service=session_service,
)
queries = [
"Find hotels in Basel with Basel in it's name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
]
for query in queries:
content = types.Content(role='user', parts=[types.Part(text=query)])
events = runner.run(session_id=session.id,
user_id='123', new_message=content)
responses = (
part.text
for event in events
for part in event.content.parts
if part.text is not None
)
for text in responses:
print(text)
asyncio.run(main())
{{< /tab >}}
{{< tab header="LangChain" lang="python" >}}
import asyncio
@@ -604,23 +535,138 @@ async def run_application():
print(str(response))
asyncio.run(run_application())
{{< /tab >}}
{{< tab header="Core" lang="python" >}}
import asyncio
from google import genai
from google.genai.types import (
Content,
FunctionDeclaration,
GenerateContentConfig,
Part,
Tool,
)
from toolbox_core import ToolboxClient
prompt = """
You're a helpful hotel assistant. You handle hotel searching, booking and
cancellations. When the user searches for a hotel, mention it's name, id,
location and price tier. Always mention hotel id while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
"""
queries = [
"Find hotels in Basel with Basel in it's name.",
"Please book the hotel Hilton Basel for me.",
"This is too expensive. Please cancel it.",
"Please book Hyatt Regency for me",
"My check in dates for my booking would be from April 10, 2024 to April 19, 2024.",
]
async def run_application():
async with ToolboxClient("<http://127.0.0.1:5000>") as toolbox_client:
# The toolbox_tools list contains Python callables (functions/methods) designed for LLM tool-use
# integration. While this example uses Google's genai client, these callables can be adapted for
# various function-calling or agent frameworks. For easier integration with supported frameworks
# (https://github.com/googleapis/mcp-toolbox-python-sdk/tree/main/packages), use the
# provided wrapper packages, which handle framework-specific boilerplate.
toolbox_tools = await toolbox_client.load_toolset("my-toolset")
genai_client = genai.Client(
vertexai=True, project="project-id", location="us-central1"
)
genai_tools = [
Tool(
function_declarations=[
FunctionDeclaration.from_callable_with_api_option(callable=tool)
]
)
for tool in toolbox_tools
]
history = []
for query in queries:
user_prompt_content = Content(
role="user",
parts=[Part.from_text(text=query)],
)
history.append(user_prompt_content)
response = genai_client.models.generate_content(
model="gemini-2.0-flash-001",
contents=history,
config=GenerateContentConfig(
system_instruction=prompt,
tools=genai_tools,
),
)
history.append(response.candidates[0].content)
function_response_parts = []
for function_call in response.function_calls:
fn_name = function_call.name
# The tools are sorted alphabetically
if fn_name == "search-hotels-by-name":
function_result = await toolbox_tools[3](**function_call.args)
elif fn_name == "search-hotels-by-location":
function_result = await toolbox_tools[2](**function_call.args)
elif fn_name == "book-hotel":
function_result = await toolbox_tools[0](**function_call.args)
elif fn_name == "update-hotel":
function_result = await toolbox_tools[4](**function_call.args)
elif fn_name == "cancel-hotel":
function_result = await toolbox_tools[1](**function_call.args)
else:
raise ValueError("Function name not present.")
function_response = {"result": function_result}
function_response_part = Part.from_function_response(
name=function_call.name,
response=function_response,
)
function_response_parts.append(function_response_part)
if function_response_parts:
tool_response_content = Content(role="tool", parts=function_response_parts)
history.append(tool_response_content)
response2 = genai_client.models.generate_content(
model="gemini-2.0-flash-001",
contents=history,
config=GenerateContentConfig(
tools=genai_tools,
),
)
final_model_response_content = response2.candidates[0].content
history.append(final_model_response_content)
print(response2.text)
asyncio.run(run_application())
{{< /tab >}}
{{< /tabpane >}}
{{< tabpane text=true persist=header >}}
{{% tab header="Core" lang="en" %}}
To learn more about tool calling with Google GenAI, check out the
[Google GenAI Documentation](https://github.com/googleapis/python-genai?tab=readme-ov-file#manually-declare-and-invoke-a-function-for-function-calling).
{{% /tab %}}
{{% tab header="ADK" lang="en" %}}
To learn more about Agent Development Kit, check out the [ADK
documentation.](https://google.github.io/adk-docs/)
{{% /tab %}}
{{% tab header="Langchain" lang="en" %}}
To learn more about Agents in LangChain, check out the [LangGraph Agent
documentation.](https://langchain-ai.github.io/langgraph/reference/prebuilt/#langgraph.prebuilt.chat_agent_executor.create_react_agent)
{{% /tab %}}
{{% tab header="LlamaIndex" lang="en" %}}
To learn more about Agents in LlamaIndex, check out the [LlamaIndex
AgentWorkflow
documentation.](https://docs.llamaindex.ai/en/stable/examples/agent/agent_workflow_basic/)
{{% /tab %}}
{{% tab header="Core" lang="en" %}}
To learn more about tool calling with Google GenAI, check out the
[Google GenAI
Documentation](https://github.com/googleapis/python-genai?tab=readme-ov-file#manually-declare-and-invoke-a-function-for-function-calling).
{{% /tab %}}
{{< /tabpane >}}
@@ -629,3 +675,7 @@ To learn more about Agents in LlamaIndex, check out the
```sh
python hotel_agent.py
```
{{< notice info >}}
For more information, visit the [Python SDK repo](https://github.com/googleapis/mcp-toolbox-sdk-python).
{{< /notice >}}

View File

@@ -0,0 +1,927 @@
---
title: "Go Quickstart (Local)"
type: docs
weight: 4
description: >
How to get started running Toolbox locally with [Go](https://github.com/googleapis/mcp-toolbox-sdk-go), PostgreSQL, and orchestration frameworks such as [LangChain Go](https://tmc.github.io/langchaingo/docs/), [GenkitGo](https://genkit.dev/go/docs/get-started-go/), [Go GenAI](https://github.com/googleapis/go-genai) and [OpenAI Go](https://github.com/openai/openai-go).
---
## Before you begin
This guide assumes you have already done the following:
1. Installed [Go (v1.24.2 or higher)].
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres].
### Cloud Setup (Optional)
If you plan to use **Google Cloud's Vertex AI** with your agent (e.g., using
Gemini or PaLM models), follow these one-time setup steps:
1. [Install the Google Cloud CLI]
1. [Set up Application Default Credentials (ADC)]
1. Set your project and enable Vertex AI
```bash
gcloud config set project YOUR_PROJECT_ID
gcloud services enable aiplatform.googleapis.com
```
[Go (v1.24.2 or higher)]: https://go.dev/doc/install
[install-postgres]: https://www.postgresql.org/download/
[Install the Google Cloud CLI]: https://cloud.google.com/sdk/docs/install
[Set up Application Default Credentials (ADC)]:
https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment
## Step 1: Set up your database
In this section, we will create a database, insert some data that needs to be
accessed by our agent, and create a database user for Toolbox to connect with.
1. Connect to postgres using the `psql` command:
```bash
psql -h 127.0.0.1 -U postgres
```
Here, `postgres` denotes the default postgres superuser.
{{< notice info >}}
#### **Having trouble connecting?**
* **Password Prompt:** If you are prompted for a password for the `postgres`
user and do not know it (or a blank password doesn't work), your PostgreSQL
installation might require a password or a different authentication method.
* **`FATAL: role "postgres" does not exist`:** This error means the default
`postgres` superuser role isn't available under that name on your system.
* **`Connection refused`:** Ensure your PostgreSQL server is actually running.
You can typically check with `sudo systemctl status postgresql` and start it
with `sudo systemctl start postgresql` on Linux systems.
<br/>
#### **Common Solution**
For password issues or if the `postgres` role seems inaccessible directly, try
switching to the `postgres` operating system user first. This user often has
permission to connect without a password for local connections (this is called
peer authentication).
```bash
sudo -i -u postgres
psql -h 127.0.0.1
```
Once you are in the `psql` shell using this method, you can proceed with the
database creation steps below. Afterwards, type `\q` to exit `psql`, and then
`exit` to return to your normal user shell.
If desired, once connected to `psql` as the `postgres` OS user, you can set a
password for the `postgres` *database* user using: `ALTER USER postgres WITH
PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
postgres` and a password next time.
{{< /notice >}}
1. Create a new database and a new user:
{{< notice tip >}}
For a real application, it's best to follow the principle of least permission
and only grant the privileges your application needs.
{{< /notice >}}
```sql
CREATE USER toolbox_user WITH PASSWORD 'my-password';
CREATE DATABASE toolbox_db;
GRANT ALL PRIVILEGES ON DATABASE toolbox_db TO toolbox_user;
ALTER DATABASE toolbox_db OWNER TO toolbox_user;
```
1. End the database session:
```bash
\q
```
(If you used `sudo -i -u postgres` and then `psql`, remember you might also
need to type `exit` after `\q` to leave the `postgres` user's shell
session.)
1. Connect to your database with your new user:
```bash
psql -h 127.0.0.1 -U toolbox_user -d toolbox_db
```
1. Create a table using the following command:
```sql
CREATE TABLE hotels(
id INTEGER NOT NULL PRIMARY KEY,
name VARCHAR NOT NULL,
location VARCHAR NOT NULL,
price_tier VARCHAR NOT NULL,
checkin_date DATE NOT NULL,
checkout_date DATE NOT NULL,
booked BIT NOT NULL
);
```
1. Insert data into the table.
```sql
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
(4, 'Radisson Blu Lucerne', 'Lucerne', 'Midscale', '2024-04-24', '2024-04-05', B'0'),
(5, 'Best Western Bern', 'Bern', 'Upper Midscale', '2024-04-23', '2024-04-01', B'0'),
(6, 'InterContinental Geneva', 'Geneva', 'Luxury', '2024-04-23', '2024-04-28', B'0'),
(7, 'Sheraton Zurich', 'Zurich', 'Upper Upscale', '2024-04-27', '2024-04-02', B'0'),
(8, 'Holiday Inn Basel', 'Basel', 'Upper Midscale', '2024-04-24', '2024-04-09', B'0'),
(9, 'Courtyard Zurich', 'Zurich', 'Upscale', '2024-04-03', '2024-04-13', B'0'),
(10, 'Comfort Inn Bern', 'Bern', 'Midscale', '2024-04-04', '2024-04-16', B'0');
```
1. End the database session:
```bash
\q
```
## Step 2: Install and configure Toolbox
In this section, we will download Toolbox, configure our tools in a
`tools.yaml`, and then run the Toolbox server.
1. Download the latest version of Toolbox as a binary:
{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/$OS/toolbox
```
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Write the following into a `tools.yaml` file. Be sure to update any fields
such as `user`, `password`, or `database` that you may have customized in the
previous step.
{{< notice tip >}}
In practice, use environment variable replacement with the format `${ENV_NAME}`
instead of hardcoding your secrets into the configuration file. An example of
exporting these variables appears at the end of this step.
{{< /notice >}}
```yaml
sources:
my-pg-source:
kind: postgres
host: 127.0.0.1
port: 5432
database: toolbox_db
user: ${USER_NAME}
password: ${PASSWORD}
tools:
search-hotels-by-name:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on name.
parameters:
- name: name
type: string
description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
search-hotels-by-location:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on location.
parameters:
- name: location
type: string
description: The location of the hotel.
statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
book-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Book a hotel by its ID. If the hotel is successfully booked, returns a NULL, raises an error if not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to book.
statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
update-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Update a hotel's check-in and check-out dates by its ID. Returns a message
indicating whether the hotel was successfully updated or not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to update.
- name: checkin_date
type: string
description: The new check-in date of the hotel.
- name: checkout_date
type: string
description: The new check-out date of the hotel.
statement: >-
UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
as date) WHERE id = $1;
cancel-hotel:
kind: postgres-sql
source: my-pg-source
description: Cancel a hotel by its ID.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to cancel.
statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
toolsets:
my-toolset:
- search-hotels-by-name
- search-hotels-by-location
- book-hotel
- update-hotel
- cancel-hotel
```
For more info on tools, check out the `Resources` section of the docs.
1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
```bash
./toolbox --tools-file "tools.yaml"
```
{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the
`--disable-reload` flag.
{{< /notice >}}
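Because the `tools.yaml` above reads the database credentials from environment
variables, export `USER_NAME` and `PASSWORD` in the same shell before starting
the server. A minimal example using the user created in Step 1 (substitute your
own values if you changed them):
```bash
export USER_NAME=toolbox_user
export PASSWORD=my-password
./toolbox --tools-file "tools.yaml"
```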
## Step 3: Connect your agent to Toolbox
In this section, we will write and run an agent that will load the Tools
from Toolbox.
1. Initialize a go module:
```bash
go mod init main
```
1. In a new terminal, install the
[SDK](https://pkg.go.dev/github.com/googleapis/mcp-toolbox-sdk-go).
```bash
go get github.com/googleapis/mcp-toolbox-sdk-go
```
1. Create a new file named `hotelagent.go` and copy the following code to create
an agent:
{{< tabpane persist=header >}}
{{< tab header="LangChain Go" lang="go" >}}
package main
import (
"context"
"encoding/json"
"fmt"
"log"
"os"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/tmc/langchaingo/llms"
"github.com/tmc/langchaingo/llms/googleai"
)
// ConvertToLangchainTool converts a generic core.ToolboxTool into a LangChainGo llms.Tool.
func ConvertToLangchainTool(toolboxTool *core.ToolboxTool) llms.Tool {
// Fetch the tool's input schema
inputschema, err := toolboxTool.InputSchema()
if err != nil {
return llms.Tool{}
}
var paramsSchema map[string]any
_ = json.Unmarshal(inputschema, &paramsSchema)
// Convert into LangChain's llms.Tool
return llms.Tool{
Type: "function",
Function: &llms.FunctionDefinition{
Name: toolboxTool.Name(),
Description: toolboxTool.Description(),
Parameters: paramsSchema,
},
}
}
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it.",
"Please book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
genaiKey := os.Getenv("GOOGLE_API_KEY")
toolboxURL := "http://localhost:5000"
ctx := context.Background()
// Initialize the Google AI client (LLM).
llm, err := googleai.New(ctx, googleai.WithAPIKey(genaiKey), googleai.WithDefaultModel("gemini-1.5-flash"))
if err != nil {
log.Fatalf("Failed to create Google AI client: %v", err)
}
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tool using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
langchainTools := make([]llms.Tool, len(tools))
// Convert the loaded ToolboxTools into the format LangChainGo requires.
for i, tool := range tools {
langchainTools[i] = ConvertToLangchainTool(tool)
toolsMap[tool.Name()] = tool
}
// Start the conversation history.
messageHistory := []llms.MessageContent{
llms.TextParts(llms.ChatMessageTypeSystem, systemPrompt),
}
for _, query := range queries {
messageHistory = append(messageHistory, llms.TextParts(llms.ChatMessageTypeHuman, query))
// Make the first call to the LLM, making it aware of the tool.
resp, err := llm.GenerateContent(ctx, messageHistory, llms.WithTools(langchainTools))
if err != nil {
log.Fatalf("LLM call failed: %v", err)
}
respChoice := resp.Choices[0]
assistantResponse := llms.TextParts(llms.ChatMessageTypeAI, respChoice.Content)
for _, tc := range respChoice.ToolCalls {
assistantResponse.Parts = append(assistantResponse.Parts, tc)
}
messageHistory = append(messageHistory, assistantResponse)
// Process each tool call requested by the model.
for _, tc := range respChoice.ToolCalls {
toolName := tc.FunctionCall.Name
tool := toolsMap[toolName]
var args map[string]any
if err := json.Unmarshal([]byte(tc.FunctionCall.Arguments), &args); err != nil {
log.Fatalf("Failed to unmarshal arguments for tool '%s': %v", toolName, err)
}
toolResult, err := tool.Invoke(ctx, args)
if err != nil {
log.Fatalf("Failed to execute tool '%s': %v", toolName, err)
}
if toolResult == "" || toolResult == nil {
toolResult = "Operation completed successfully with no specific return value."
}
// Create the tool call response message and add it to the history.
toolResponse := llms.MessageContent{
Role: llms.ChatMessageTypeTool,
Parts: []llms.ContentPart{
llms.ToolCallResponse{
Name: toolName,
Content: fmt.Sprintf("%v", toolResult),
},
},
}
messageHistory = append(messageHistory, toolResponse)
}
finalResp, err := llm.GenerateContent(ctx, messageHistory)
if err != nil {
log.Fatalf("Final LLM call failed after tool execution: %v", err)
}
// Add the final textual response from the LLM to the history
messageHistory = append(messageHistory, llms.TextParts(llms.ChatMessageTypeAI, finalResp.Choices[0].Content))
fmt.Println(finalResp.Choices[0].Content)
}
}
{{< /tab >}}
{{< tab header="Genkit Go" lang="go" >}}
package main
import (
"context"
"fmt"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
"github.com/firebase/genkit/go/ai"
"github.com/firebase/genkit/go/genkit"
"github.com/firebase/genkit/go/plugins/googlegenai"
)
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
ctx := context.Background()
// Create Toolbox Client
toolboxClient, err := core.NewToolboxClient("http://127.0.0.1:5000")
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tools using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
// Initialize Genkit
g, err := genkit.Init(ctx,
genkit.WithPlugins(&googlegenai.GoogleAI{}),
genkit.WithDefaultModel("googleai/gemini-1.5-flash"),
)
if err != nil {
log.Fatalf("Failed to init genkit: %v\n", err)
}
// Create a conversation history
conversationHistory := []*ai.Message{
ai.NewSystemTextMessage(systemPrompt),
}
// Convert your tool to a Genkit tool.
genkitTools := make([]ai.Tool, len(tools))
for i, tool := range tools {
newTool, err := tbgenkit.ToGenkitTool(tool, g)
if err != nil {
log.Fatalf("Failed to convert tool: %v\n", err)
}
genkitTools[i] = newTool
}
toolRefs := make([]ai.ToolRef, len(genkitTools))
for i, tool := range genkitTools {
toolRefs[i] = tool
}
for _, query := range queries {
conversationHistory = append(conversationHistory, ai.NewUserTextMessage(query))
response, err := genkit.Generate(ctx, g,
ai.WithMessages(conversationHistory...),
ai.WithTools(toolRefs...),
ai.WithReturnToolRequests(true),
)
if err != nil {
log.Fatalf("%v\n", err)
}
conversationHistory = append(conversationHistory, response.Message)
parts := []*ai.Part{}
for _, req := range response.ToolRequests() {
tool := genkit.LookupTool(g, req.Name)
if tool == nil {
log.Fatalf("tool %q not found", req.Name)
}
output, err := tool.RunRaw(ctx, req.Input)
if err != nil {
log.Fatalf("tool %q execution failed: %v", tool.Name(), err)
}
parts = append(parts,
ai.NewToolResponsePart(&ai.ToolResponse{
Name: req.Name,
Ref: req.Ref,
Output: output,
}))
}
if len(parts) > 0 {
resp, err := genkit.Generate(ctx, g,
ai.WithMessages(append(response.History(), ai.NewMessage(ai.RoleTool, nil, parts...))...),
ai.WithTools(toolRefs...),
)
if err != nil {
log.Fatal(err)
}
fmt.Println("\n", resp.Text())
conversationHistory = append(conversationHistory, resp.Message)
} else {
fmt.Println("\n", response.Text())
}
}
}
{{< /tab >}}
{{< tab header="Go GenAI" lang="go" >}}
package main
import (
"context"
"encoding/json"
"fmt"
"log"
"os"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
"google.golang.org/genai"
)
// ConvertToGenaiTool translates a ToolboxTool into the genai.FunctionDeclaration format.
func ConvertToGenaiTool(toolboxTool *core.ToolboxTool) *genai.Tool {
inputschema, err := toolboxTool.InputSchema()
if err != nil {
return &genai.Tool{}
}
var paramsSchema *genai.Schema
_ = json.Unmarshal(inputschema, &paramsSchema)
// First, create the function declaration.
funcDeclaration := &genai.FunctionDeclaration{
Name: toolboxTool.Name(),
Description: toolboxTool.Description(),
Parameters: paramsSchema,
}
// Then, wrap the function declaration in a genai.Tool struct.
return &genai.Tool{
FunctionDeclarations: []*genai.FunctionDeclaration{funcDeclaration},
}
}
func printResponse(resp *genai.GenerateContentResponse) {
for _, cand := range resp.Candidates {
if cand.Content != nil {
for _, part := range cand.Content.Parts {
fmt.Println(part.Text)
}
}
}
}
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it.",
"Please book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
// Setup
ctx := context.Background()
apiKey := os.Getenv("GOOGLE_API_KEY")
toolboxURL := "http://localhost:5000"
// Initialize the Google GenAI client using the explicit ClientConfig.
client, err := genai.NewClient(ctx, &genai.ClientConfig{
APIKey: apiKey,
})
if err != nil {
log.Fatalf("Failed to create Google GenAI client: %v", err)
}
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tool using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tools: %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
genAITools := make([]*genai.Tool, len(tools))
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
for i, tool := range tools {
genAITools[i] = ConvertToGenaiTool(tool)
toolsMap[tool.Name()] = tool
}
// Set up the generative model with the available tool.
modelName := "gemini-2.0-flash"
// Create the initial content prompt for the model.
messageHistory := []*genai.Content{
genai.NewContentFromText(systemPrompt, genai.RoleUser),
}
config := &genai.GenerateContentConfig{
Tools: genAITools,
ToolConfig: &genai.ToolConfig{
FunctionCallingConfig: &genai.FunctionCallingConfig{
Mode: genai.FunctionCallingConfigModeAny,
},
},
}
for _, query := range queries {
messageHistory = append(messageHistory, genai.NewContentFromText(query, genai.RoleUser))
genContentResp, err := client.Models.GenerateContent(ctx, modelName, messageHistory, config)
if err != nil {
log.Fatalf("LLM call failed for query '%s': %v", query, err)
}
if len(genContentResp.Candidates) > 0 && genContentResp.Candidates[0].Content != nil {
messageHistory = append(messageHistory, genContentResp.Candidates[0].Content)
}
functionCalls := genContentResp.FunctionCalls()
toolResponseParts := []*genai.Part{}
for _, fc := range functionCalls {
toolToInvoke, found := toolsMap[fc.Name]
if !found {
log.Fatalf("Tool '%s' not found in loaded tools map. Check toolset configuration.", fc.Name)
}
toolResult, invokeErr := toolToInvoke.Invoke(ctx, fc.Args)
if invokeErr != nil {
log.Fatalf("Failed to execute tool '%s': %v", fc.Name, invokeErr)
}
// Enhanced Tool Result Handling (retained to prevent nil issues)
toolResultString := ""
if toolResult != nil {
jsonBytes, marshalErr := json.Marshal(toolResult)
if marshalErr == nil {
toolResultString = string(jsonBytes)
} else {
toolResultString = fmt.Sprintf("%v", toolResult)
}
}
responseMap := map[string]any{"result": toolResultString}
toolResponseParts = append(toolResponseParts, genai.NewPartFromFunctionResponse(fc.Name, responseMap))
}
// Add all accumulated tool responses for this turn to the message history.
toolResponseContent := genai.NewContentFromParts(toolResponseParts, "function")
messageHistory = append(messageHistory, toolResponseContent)
finalResponse, err := client.Models.GenerateContent(ctx, modelName, messageHistory, &genai.GenerateContentConfig{})
if err != nil {
log.Fatalf("Error calling GenerateContent (with function result): %v", err)
}
printResponse(finalResponse)
// Add the final textual response from the LLM to the history
if len(finalResponse.Candidates) > 0 && finalResponse.Candidates[0].Content != nil {
messageHistory = append(messageHistory, finalResponse.Candidates[0].Content)
}
}
}
{{< /tab >}}
{{< tab header="OpenAI Go" lang="go" >}}
package main
import (
"context"
"encoding/json"
"log"
"github.com/googleapis/mcp-toolbox-sdk-go/core"
openai "github.com/openai/openai-go"
)
// ConvertToOpenAITool converts a ToolboxTool into the go-openai library's Tool format.
func ConvertToOpenAITool(toolboxTool *core.ToolboxTool) openai.ChatCompletionToolParam {
// Get the input schema
jsonSchemaBytes, err := toolboxTool.InputSchema()
if err != nil {
return openai.ChatCompletionToolParam{}
}
// Unmarshal the JSON bytes into FunctionParameters
var paramsSchema openai.FunctionParameters
if err := json.Unmarshal(jsonSchemaBytes, &paramsSchema); err != nil {
return openai.ChatCompletionToolParam{}
}
// Create and return the final tool parameter struct.
return openai.ChatCompletionToolParam{
Function: openai.FunctionDefinitionParam{
Name: toolboxTool.Name(),
Description: openai.String(toolboxTool.Description()),
Parameters: paramsSchema,
},
}
}
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`
var queries = []string{
"Find hotels in Basel with Basel in its name.",
"Can you book the hotel Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
}
func main() {
// Setup
ctx := context.Background()
toolboxURL := "http://localhost:5000"
openAIClient := openai.NewClient()
// Initialize the MCP Toolbox client.
toolboxClient, err := core.NewToolboxClient(toolboxURL)
if err != nil {
log.Fatalf("Failed to create Toolbox client: %v", err)
}
// Load the tools using the MCP Toolbox SDK.
tools, err := toolboxClient.LoadToolset("my-toolset", ctx)
if err != nil {
log.Fatalf("Failed to load tool : %v\nMake sure your Toolbox server is running and the tool is configured.", err)
}
openAITools := make([]openai.ChatCompletionToolParam, len(tools))
toolsMap := make(map[string]*core.ToolboxTool, len(tools))
for i, tool := range tools {
// Convert the Toolbox tool into the openAI FunctionDeclaration format.
openAITools[i] = ConvertToOpenAITool(tool)
// Add tool to a map for lookup later
toolsMap[tool.Name()] = tool
}
params := openai.ChatCompletionNewParams{
Messages: []openai.ChatCompletionMessageParamUnion{
openai.SystemMessage(systemPrompt),
},
Tools: openAITools,
Seed: openai.Int(0),
Model: openai.ChatModelGPT4o,
}
for _, query := range queries {
params.Messages = append(params.Messages, openai.UserMessage(query))
// Make initial chat completion request
completion, err := openAIClient.Chat.Completions.New(ctx, params)
if err != nil {
panic(err)
}
toolCalls := completion.Choices[0].Message.ToolCalls
// If there are no tool calls, the model answered directly.
if len(toolCalls) == 0 {
log.Println("No function call")
}
// If there was a function call, continue the conversation
params.Messages = append(params.Messages, completion.Choices[0].Message.ToParam())
for _, toolCall := range toolCalls {
toolName := toolCall.Function.Name
toolToInvoke := toolsMap[toolName]
var args map[string]any
err := json.Unmarshal([]byte(toolCall.Function.Arguments), &args)
if err != nil {
panic(err)
}
result, err := toolToInvoke.Invoke(ctx, args)
if err != nil {
log.Fatal("Could not invoke tool", err)
}
params.Messages = append(params.Messages, openai.ToolMessage(result.(string), toolCall.ID))
}
completion, err = openAIClient.Chat.Completions.New(ctx, params)
if err != nil {
panic(err)
}
// Append the assistant's final reply so the next turn has full context.
params.Messages = append(params.Messages, openai.AssistantMessage(completion.Choices[0].Message.Content))
println("\n", completion.Choices[0].Message.Content)
}
}
{{< /tab >}}
{{< /tabpane >}}
1. Ensure all dependencies are installed:
```sh
go mod tidy
```
1. Run your agent, and observe the results:
```sh
go run hotelagent.go
```
{{< notice info >}}
For more information, visit the [Go SDK
repo](https://github.com/googleapis/mcp-toolbox-sdk-go).
{{</ notice >}}
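If the agent misbehaves, a quick way to isolate problems is to call a tool
directly through the SDK, with no LLM in the loop. Below is a minimal sketch
built only from the SDK calls already used above (`core.NewToolboxClient`,
`LoadToolset`, `Name`, `Description`, and `Invoke`); the file name and the
`"Basel"` argument are illustrative values only:
```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/googleapis/mcp-toolbox-sdk-go/core"
)

func main() {
	ctx := context.Background()

	// Connect to the Toolbox server started in Step 2.
	client, err := core.NewToolboxClient("http://localhost:5000")
	if err != nil {
		log.Fatalf("Failed to create Toolbox client: %v", err)
	}

	// Load every tool in the toolset defined in tools.yaml.
	tools, err := client.LoadToolset("my-toolset", ctx)
	if err != nil {
		log.Fatalf("Failed to load toolset: %v", err)
	}

	// List the tools and keep a handle to the name-search tool.
	var search *core.ToolboxTool
	for _, tool := range tools {
		fmt.Printf("loaded tool: %s - %s\n", tool.Name(), tool.Description())
		if tool.Name() == "search-hotels-by-name" {
			search = tool
		}
	}
	if search == nil {
		log.Fatal("search-hotels-by-name not found in toolset")
	}

	// Invoke the tool directly; the argument name matches the parameter
	// declared for search-hotels-by-name in tools.yaml.
	result, err := search.Invoke(ctx, map[string]any{"name": "Basel"})
	if err != nil {
		log.Fatalf("Tool invocation failed: %v", err)
	}
	fmt.Println(result)
}
```
Running it with `go run` should print the loaded tool names followed by the
three hotels whose names contain "Basel" from the data inserted in Step 1.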

@@ -0,0 +1,578 @@
---
title: "JS Quickstart (Local)"
type: docs
weight: 3
description: >
How to get started running Toolbox locally with [JavaScript](https://github.com/googleapis/mcp-toolbox-sdk-js), PostgreSQL, and orchestration frameworks such as [LangChain](https://js.langchain.com/docs/introduction/), [GenkitJS](https://genkit.dev/docs/get-started/), and [LlamaIndex](https://ts.llamaindex.ai/).
---
## Before you begin
This guide assumes you have already done the following:
1. Installed [Node.js (v18 or higher)].
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres].
### Cloud Setup (Optional)
If you plan to use **Google Cloud's Vertex AI** with your agent (e.g., using
Gemini or PaLM models), follow these one-time setup steps:
1. [Install the Google Cloud CLI]
1. [Set up Application Default Credentials (ADC)]
1. Set your project and enable Vertex AI
```bash
gcloud config set project YOUR_PROJECT_ID
gcloud services enable aiplatform.googleapis.com
```
[Node.js (v18 or higher)]: https://nodejs.org/
[install-postgres]: https://www.postgresql.org/download/
[Install the Google Cloud CLI]: https://cloud.google.com/sdk/docs/install
[Set up Application Default Credentials (ADC)]:
https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment
## Step 1: Set up your database
In this section, we will create a database, insert some data that needs to be
accessed by our agent, and create a database user for Toolbox to connect with.
1. Connect to postgres using the `psql` command:
```bash
psql -h 127.0.0.1 -U postgres
```
Here, `postgres` denotes the default postgres superuser.
{{< notice info >}}
#### **Having trouble connecting?**
* **Password Prompt:** If you are prompted for a password for the `postgres`
user and do not know it (or a blank password doesn't work), your PostgreSQL
installation might require a password or a different authentication method.
* **`FATAL: role "postgres" does not exist`:** This error means the default
`postgres` superuser role isn't available under that name on your system.
* **`Connection refused`:** Ensure your PostgreSQL server is actually running.
You can typically check with `sudo systemctl status postgresql` and start it
with `sudo systemctl start postgresql` on Linux systems.
<br/>
#### **Common Solution**
For password issues or if the `postgres` role seems inaccessible directly, try
switching to the `postgres` operating system user first. This user often has
permission to connect without a password for local connections (this is called
peer authentication).
```bash
sudo -i -u postgres
psql -h 127.0.0.1
```
Once you are in the `psql` shell using this method, you can proceed with the
database creation steps below. Afterwards, type `\q` to exit `psql`, and then
`exit` to return to your normal user shell.
If desired, once connected to `psql` as the `postgres` OS user, you can set a
password for the `postgres` *database* user using: `ALTER USER postgres WITH
PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
postgres` and a password next time.
{{< /notice >}}
1. Create a new database and a new user:
{{< notice tip >}}
For a real application, it's best to follow the principle of least privilege
and only grant the privileges your application needs.
{{< /notice >}}
```sql
CREATE USER toolbox_user WITH PASSWORD 'my-password';
CREATE DATABASE toolbox_db;
GRANT ALL PRIVILEGES ON DATABASE toolbox_db TO toolbox_user;
ALTER DATABASE toolbox_db OWNER TO toolbox_user;
```
1. End the database session:
```bash
\q
```
(If you used `sudo -i -u postgres` and then `psql`, remember you might also
need to type `exit` after `\q` to leave the `postgres` user's shell
session.)
1. Connect to your database with your new user:
```bash
psql -h 127.0.0.1 -U toolbox_user -d toolbox_db
```
1. Create a table using the following command:
```sql
CREATE TABLE hotels(
id INTEGER NOT NULL PRIMARY KEY,
name VARCHAR NOT NULL,
location VARCHAR NOT NULL,
price_tier VARCHAR NOT NULL,
checkin_date DATE NOT NULL,
checkout_date DATE NOT NULL,
booked BIT NOT NULL
);
```
1. Insert data into the table.
```sql
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
(4, 'Radisson Blu Lucerne', 'Lucerne', 'Midscale', '2024-04-24', '2024-04-05', B'0'),
(5, 'Best Western Bern', 'Bern', 'Upper Midscale', '2024-04-23', '2024-04-01', B'0'),
(6, 'InterContinental Geneva', 'Geneva', 'Luxury', '2024-04-23', '2024-04-28', B'0'),
(7, 'Sheraton Zurich', 'Zurich', 'Upper Upscale', '2024-04-27', '2024-04-02', B'0'),
(8, 'Holiday Inn Basel', 'Basel', 'Upper Midscale', '2024-04-24', '2024-04-09', B'0'),
(9, 'Courtyard Zurich', 'Zurich', 'Upscale', '2024-04-03', '2024-04-13', B'0'),
(10, 'Comfort Inn Bern', 'Bern', 'Midscale', '2024-04-04', '2024-04-16', B'0');
```
1. End the database session:
```bash
\q
```
## Step 2: Install and configure Toolbox
In this section, we will download Toolbox, configure our tools in a
`tools.yaml`, and then run the Toolbox server.
1. Download the latest version of Toolbox as a binary:
{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/$OS/toolbox
```
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Write the following into a `tools.yaml` file. Be sure to update any fields
such as `user`, `password`, or `database` that you may have customized in the
previous step.
{{< notice tip >}}
In practice, use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
```yaml
sources:
my-pg-source:
kind: postgres
host: 127.0.0.1
port: 5432
database: toolbox_db
user: ${USER_NAME}
password: ${PASSWORD}
tools:
search-hotels-by-name:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on name.
parameters:
- name: name
type: string
description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
search-hotels-by-location:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on location.
parameters:
- name: location
type: string
description: The location of the hotel.
statement: SELECT * FROM hotels WHERE location ILIKE '%' || $1 || '%';
book-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Book a hotel by its ID. If the hotel is successfully booked, returns a NULL, raises an error if not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to book.
statement: UPDATE hotels SET booked = B'1' WHERE id = $1;
update-hotel:
kind: postgres-sql
source: my-pg-source
description: >-
Update a hotel's check-in and check-out dates by its ID. Returns a message
indicating whether the hotel was successfully updated or not.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to update.
- name: checkin_date
type: string
description: The new check-in date of the hotel.
- name: checkout_date
type: string
description: The new check-out date of the hotel.
statement: >-
UPDATE hotels SET checkin_date = CAST($2 as date), checkout_date = CAST($3
as date) WHERE id = $1;
cancel-hotel:
kind: postgres-sql
source: my-pg-source
description: Cancel a hotel by its ID.
parameters:
- name: hotel_id
type: string
description: The ID of the hotel to cancel.
statement: UPDATE hotels SET booked = B'0' WHERE id = $1;
toolsets:
my-toolset:
- search-hotels-by-name
- search-hotels-by-location
- book-hotel
- update-hotel
- cancel-hotel
```
For more info on tools, check out the `Resources` section of the docs.
1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
```bash
./toolbox --tools-file "tools.yaml"
```
{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the `--disable-reload` flag.
{{< /notice >}}
## Step 3: Connect your agent to Toolbox
In this section, we will write and run an agent that will load the Tools
from Toolbox.
1. (Optional) Initialize a Node.js project:
```bash
npm init -y
```
1. In a new terminal, install the [SDK](https://www.npmjs.com/package/@toolbox-sdk/core).
```bash
npm install @toolbox-sdk/core
```
1. Install other required dependencies
{{< tabpane persist=header >}}
{{< tab header="LangChain" lang="bash" >}}
npm install langchain @langchain/google-vertexai
{{< /tab >}}
{{< tab header="GenkitJS" lang="bash" >}}
npm install genkit @genkit-ai/vertexai
{{< /tab >}}
{{< tab header="LlamaIndex" lang="bash" >}}
npm install llamaindex @llamaindex/google @llamaindex/workflow
{{< /tab >}}
{{< /tabpane >}}
1. Create a new file named `hotelAgent.js` and copy the following code to create an agent:
{{< tabpane persist=header >}}
{{< tab header="LangChain" lang="js" >}}
import { ChatVertexAI } from "@langchain/google-vertexai";
import { ToolboxClient } from "@toolbox-sdk/core";
import { tool } from "@langchain/core/tools";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { MemorySaver } from "@langchain/langgraph";
// Replace it with your API key
process.env.GOOGLE_API_KEY = 'your-api-key';
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function runApplication() {
const model = new ChatVertexAI({
model: "gemini-2.0-flash",
});
const client = new ToolboxClient("http://127.0.0.1:5000");
const toolboxTools = await client.loadToolset("my-toolset");
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool(toolboxTool, {
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
schema: toolboxTool.getParamSchema()
});
const tools = toolboxTools.map(getTool);
const agent = createReactAgent({
llm: model,
tools: tools,
checkpointer: new MemorySaver(),
systemPrompt: prompt,
});
const langGraphConfig = {
configurable: {
thread_id: "test-thread",
},
};
for (const query of queries) {
const agentOutput = await agent.invoke(
{
messages: [
{
role: "user",
content: query,
},
],
verbose: true,
},
langGraphConfig
);
const response = agentOutput.messages[agentOutput.messages.length - 1].content;
console.log(response);
}
}
runApplication()
.catch(console.error)
.finally(() => console.log("\nApplication finished."));
{{< /tab >}}
{{< tab header="GenkitJS" lang="js" >}}
import { ToolboxClient } from "@toolbox-sdk/core";
import { genkit } from "genkit";
import { googleAI } from '@genkit-ai/googleai';
// Replace it with your API key
process.env.GOOGLE_API_KEY = 'your-api-key';
const systemPrompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function run() {
const toolboxClient = new ToolboxClient("http://127.0.0.1:5000");
const ai = genkit({
plugins: [
googleAI({
apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
})
],
model: googleAI.model('gemini-2.0-flash'),
});
const toolboxTools = await toolboxClient.loadToolset("my-toolset");
const toolMap = Object.fromEntries(
toolboxTools.map((tool) => {
const definedTool = ai.defineTool(
{
name: tool.getName(),
description: tool.getDescription(),
inputSchema: tool.getParamSchema(),
},
tool
);
return [tool.getName(), definedTool];
})
);
const tools = Object.values(toolMap);
let conversationHistory = [{ role: "system", content: [{ text: systemPrompt }] }];
for (const query of queries) {
conversationHistory.push({ role: "user", content: [{ text: query }] });
let response = await ai.generate({
messages: conversationHistory,
tools: tools,
});
conversationHistory.push(response.message);
const toolRequests = response.toolRequests;
if (toolRequests?.length > 0) {
// Execute tools concurrently and collect their responses.
const toolResponses = await Promise.all(
toolRequests.map(async (call) => {
try {
const toolOutput = await toolMap[call.name].invoke(call.input);
return { role: "tool", content: [{ toolResponse: { name: call.name, output: toolOutput } }] };
} catch (e) {
console.error(`Error executing tool ${call.name}:`, e);
return { role: "tool", content: [{ toolResponse: { name: call.name, output: { error: e.message } } }] };
}
})
);
conversationHistory.push(...toolResponses);
// Call the AI again with the tool results.
response = await ai.generate({ messages: conversationHistory, tools });
conversationHistory.push(response.message);
}
console.log(response.text);
}
}
run();
{{< /tab >}}
{{< tab header="LlamaIndex" lang="js" >}}
import { gemini, GEMINI_MODEL } from "@llamaindex/google";
import { agent } from "@llamaindex/workflow";
import { createMemory, staticBlock, tool } from "llamaindex";
import { ToolboxClient } from "@toolbox-sdk/core";
const TOOLBOX_URL = "http://127.0.0.1:5000"; // Update if needed
process.env.GOOGLE_API_KEY = 'your-api-key'; // Replace it with your API key
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking and cancellations.
When the user searches for a hotel, mention its name, id, location and price tier.
Always mention hotel ids while performing any searches — this is very important for operations.
For any bookings or cancellations, please provide the appropriate confirmation.
Update check-in or check-out dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels in Basel with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
async function main() {
// Connect to MCP Toolbox
const client = new ToolboxClient(TOOLBOX_URL);
const toolboxTools = await client.loadToolset("my-toolset");
const tools = toolboxTools.map((toolboxTool) => {
return tool({
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
parameters: toolboxTool.getParamSchema(),
execute: toolboxTool,
});
});
// Initialize LLM
const llm = gemini({
model: GEMINI_MODEL.GEMINI_2_0_FLASH,
apiKey: process.env.GOOGLE_API_KEY,
});
const memory = createMemory({
memoryBlocks: [
staticBlock({
content: prompt,
}),
],
});
// Create the Agent
const myAgent = agent({
tools: tools,
llm,
memory,
systemPrompt: prompt,
});
for (const query of queries) {
const result = await myAgent.run(query);
const output = result.data.result;
console.log(`\nUser: ${query}`);
if (typeof output === "string") {
console.log(output.trim());
} else if (typeof output === "object" && "text" in output) {
console.log(output.text.trim());
} else {
console.log(JSON.stringify(output));
}
}
// You may observe some extra logs during execution due to the run method provided by LlamaIndex.
console.log("Agent run finished.");
}
main();
{{< /tab >}}
{{< /tabpane >}}
1. Run your agent, and observe the results:
```sh
node hotelAgent.js
```
{{< notice info >}}
For more information, visit the [JS SDK repo](https://github.com/googleapis/mcp-toolbox-sdk-js).
{{</ notice >}}

@@ -1,9 +1,9 @@
---
title: "Quickstart (MCP)"
type: docs
weight: 3
weight: 5
description: >
How to get started running Toolbox locally with MCP Inspector.
How to get started running Toolbox locally with MCP Inspector.
---
## Overview
@@ -71,7 +71,7 @@ access by our agent, and create a database user for Toolbox to connect with.
```sql
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
VALUES
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
@@ -105,7 +105,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/$OS/toolbox
```
<!-- {x-release-please-end} -->
@@ -199,7 +199,8 @@ In this section, we will download Toolbox, configure our tools in a
- cancel-hotel
```
For more info on tools, check out the [Tools](../../resources/tools/_index.md) section.
For more info on tools, check out the
[Tools](../../resources/tools/) section.
1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
@@ -217,17 +218,27 @@ In this section, we will download Toolbox, configure our tools in a
1. Type `y` when it asks to install the inspector package.
1. It should show the following when the MCP Inspector is up and running:
1. It should show the following when the MCP Inspector is up and running (please
take note of `<YOUR_SESSION_TOKEN>`):
```bash
🔍 MCP Inspector is up and running at http://127.0.0.1:5173 🚀
Starting MCP inspector...
⚙️ Proxy server listening on localhost:6277
🔑 Session token: <YOUR_SESSION_TOKEN>
Use this token to authenticate requests or set DANGEROUSLY_OMIT_AUTH=true to disable auth
🚀 MCP Inspector is up and running at:
http://localhost:6274/?MCP_PROXY_AUTH_TOKEN=<YOUR_SESSION_TOKEN>
```
1. Open the above link in your browser.
1. For `Transport Type`, select `SSE`.
1. For `Transport Type`, select `Streamable HTTP`.
1. For `URL`, type in `http://127.0.0.1:5000/mcp/sse`.
1. For `URL`, type in `http://127.0.0.1:5000/mcp`.
1. For `Configuration` -> `Proxy Session Token`, make sure
`<YOUR_SESSION_TOKEN>` is present.
1. Click Connect.

Binary image file changed (22 KiB before, 32 KiB after); preview not shown.
@@ -0,0 +1,342 @@
---
title: "AlloyDB Admin API using MCP"
type: docs
weight: 2
description: >
Create your AlloyDB database with MCP Toolbox.
---
This guide covers how to use [MCP Toolbox for Databases][toolbox] to create
AlloyDB clusters and instances directly from your IDE, enabling an end-to-end workflow.
- [Cursor][cursor]
- [Windsurf][windsurf] (Codium)
- [Visual Studio Code][vscode] (Copilot)
- [Cline][cline] (VS Code extension)
- [Claude desktop][claudedesktop]
- [Claude code][claudecode]
- [Gemini CLI][geminicli]
- [Gemini Code Assist][geminicodeassist]
[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client
## Before you begin
1. In the Google Cloud console, on the [project selector
page](https://console.cloud.google.com/projectselector2/home/dashboard),
select or create a Google Cloud project.
1. [Make sure that billing is enabled for your Google Cloud
project](https://cloud.google.com/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).
## Install MCP Toolbox
1. Download the latest version of Toolbox as a binary. Select the [correct
binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
to your OS and CPU architecture. You are required to use Toolbox version
V0.10.0+:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Verify the installation:
```bash
./toolbox --version
```
## Configure your MCP Client
{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}
1. Install [Claude
Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Generate an access token to be used as the API_KEY using `gcloud auth
print-access-token`.
> **Note:** The token's lifetime is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
1. Restart Claude code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}
1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Generate an access token to be used as the API_KEY using `gcloud auth
print-access-token`.
> **Note:** The token's lifetime is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}
1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Generate an access token to be used as the API_KEY using `gcloud auth
print-access-token`.
> **Note:** The token's lifetime is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
1. You should see a green active status after the server is successfully
connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}
1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Generate an access token to be used as the API_KEY using `gcloud auth
print-access-token`.
> **Note:** The token's lifetime is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
Settings > MCP**. You should see a green active status after the server is
successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}
1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Generate an access token to be used as the API_KEY using `gcloud auth
print-access-token`.
> **Note:** The token's lifetime is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}
1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Generate an access token to be used as the API_KEY using `gcloud auth
print-access-token`.
> **Note:** The token's lifetime is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}
1. Install the [Gemini
CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1. In your working directory, create a folder named `.gemini`. Within it, create
a `settings.json` file.
1. Generate an access token to be used as the API_KEY using `gcloud auth print-access-token`.
> **Note:** The token's lifetime is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}
1. Install the [Gemini Code
Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist)
extension in Visual Studio Code.
1. Enable Agent Mode in Gemini Code Assist chat.
1. In your working directory, create a folder named `.gemini`. Within it, create
a `settings.json` file.
1. Generate an access token to be used as the API_KEY using `gcloud auth print-access-token`.
> **Note:** The token's lifetime is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
{{% /tab %}}
{{< /tabpane >}}
## Use Tools
Your AI tool is now connected to AlloyDB using MCP. Try asking your AI assistant
to create a database, cluster or instance.
The following tools are available to the LLM:
1. **alloydb-create-cluster**: creates an AlloyDB cluster
1. **alloydb-create-instance**: creates an AlloyDB instance (PRIMARY, READ_POOL, or SECONDARY)
1. **alloydb-get-operation**: polls the operations API until the operation is done.
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
## Connect to your Data
After setting up an AlloyDB cluster and instance, you can [connect your IDE to
the
database](https://cloud.google.com/alloydb/docs/pre-built-tools-with-mcp-toolbox).

@@ -0,0 +1,333 @@
---
title: "Firestore using MCP"
type: docs
weight: 2
description: >
Connect your IDE to Firestore using Toolbox.
---
[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like Firestore. This guide covers how to use [MCP Toolbox for Databases][toolbox]
to expose your developer assistant tools to a Firestore instance:
* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
* [Gemini CLI][geminicli]
* [Gemini Code Assist][geminicodeassist]
[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client
## Set up Firestore
1. Create or select a Google Cloud project.
* [Create a new
project](https://cloud.google.com/resource-manager/docs/creating-managing-projects)
* [Select an existing
project](https://cloud.google.com/resource-manager/docs/creating-managing-projects#identifying_projects)
1. [Enable the Firestore
API](https://console.cloud.google.com/apis/library/firestore.googleapis.com)
for your project.
1. [Create a Firestore
database](https://cloud.google.com/firestore/docs/create-database-web-mobile-client-library)
if you haven't already.
1. Set up authentication for your local environment.
* [Install gcloud CLI](https://cloud.google.com/sdk/docs/install)
* Run `gcloud auth application-default login` to authenticate
## Install MCP Toolbox
1. Download the latest version of Toolbox as a binary. Select the [correct
binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
to your OS and CPU architecture. You are required to use Toolbox version
V0.10.0+:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/windows/amd64/toolbox
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Verify the installation:
```bash
./toolbox --version
```
## Configure your MCP Client
{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}
1. Install [Claude
Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. Restart Claude code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}
1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}
1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. You should see a green active status after the server is successfully
connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}
1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
Settings > MCP**. You should see a green active status after the server is
successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}
1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}
1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}
1. Install the [Gemini
CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1. In your working directory, create a folder named `.gemini`. Within it, create
a `settings.json` file.
1. Add the following configuration, replace the environment variables with your
values, and then save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}
1. Install the [Gemini Code
Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist)
extension in Visual Studio Code.
1. Enable Agent Mode in Gemini Code Assist chat.
1. In your working directory, create a folder named `.gemini`. Within it, create
a `settings.json` file.
1. Add the following configuration, replace the environment variables with your
values, and then save:
```json
{
"mcpServers": {
"firestore": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt","firestore","--stdio"],
"env": {
"FIRESTORE_PROJECT": "your-project-id",
"FIRESTORE_DATABASE": "(default)"
}
}
}
}
```
{{% /tab %}}
{{< /tabpane >}}
## Use Tools
Your AI tool is now connected to Firestore using MCP. Try asking your AI
assistant to list collections, get documents, query collections, or manage
security rules.
The following tools are available to the LLM:
1. **firestore-get-documents**: Gets multiple documents from Firestore by their
paths
1. **firestore-list-collections**: List Firestore collections for a given parent
path
1. **firestore-delete-documents**: Delete multiple documents from Firestore
1. **firestore-query-collection**: Query documents from a collection with
filtering, ordering, and limit options
1. **firestore-get-rules**: Retrieves the active Firestore security rules for
the current project
1. **firestore-validate-rules**: Checks Firestore security rules for syntax and
validation errors
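If you want to exercise these tools outside of your IDE, one option is to run the prebuilt server under the [MCP Inspector](https://github.com/modelcontextprotocol/inspector). This is a minimal sketch, assuming Node.js is installed and that you export the same values used in the client configurations above:
```bash
# Export the same values used in the client configuration above.
export FIRESTORE_PROJECT="your-project-id"
export FIRESTORE_DATABASE="(default)"
# Launch the Inspector with Toolbox as a stdio subprocess.
npx @modelcontextprotocol/inspector ./toolbox --prebuilt firestore --stdio
```
The Inspector UI should list the tools above and let you invoke them interactively.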
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}


@@ -0,0 +1,275 @@
---
title: "Looker using MCP"
type: docs
weight: 2
description: >
Connect your IDE to Looker using Toolbox.
---
[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like Looker. This guide covers how to use [MCP Toolbox for Databases][toolbox]
to expose your developer assistant tools to a Looker instance:
* [Cursor][cursor]
* [Windsurf][windsurf] (Codeium)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
## Set up Looker
1. Get a Looker Client ID and Client Secret. Follow the directions
[here](https://cloud.google.com/looker/docs/api-auth#authentication_with_an_sdk).
1. Have the base URL of your Looker instance available. It is likely
something like `https://looker.example.com`. In some cases the API is
listening at a different port, and you will need to use
`https://looker.example.com:19999` instead.
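Before wiring these values into an IDE, you can optionally sanity-check them with a direct API call. This is a rough sketch, assuming the standard Looker API 4.0 login endpoint and placeholder credentials:
```bash
# Replace the URL and credentials with your own values; add :19999 if your
# instance serves the API on a separate port.
curl -s -X POST "https://looker.example.com/api/4.0/login" \
  -d "client_id=YOUR_CLIENT_ID" \
  -d "client_secret=YOUR_CLIENT_SECRET"
```
A JSON response containing an `access_token` indicates the base URL and credentials are usable.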
## Install MCP Toolbox
1. Download the latest version of Toolbox as a binary. Select the [correct
binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
to your OS and CPU architecture. You are required to use Toolbox version
v0.10.0+:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Verify the installation:
```bash
./toolbox --version
```
## Configure your MCP Client
{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}
1. Install [Claude
Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "./PATH/TO/toolbox",
"args": ["--stdio", "--prebuilt", "looker"],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
1. Restart Claude Code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}
1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "./PATH/TO/toolbox",
"args": ["--stdio", "--prebuilt", "looker"],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}
1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "./PATH/TO/toolbox",
"args": ["--stdio", "--prebuilt", "looker"],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
1. You should see a green active status after the server is successfully
connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}
1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "./PATH/TO/toolbox",
"args": ["--stdio", "--prebuilt", "looker"],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
Settings > MCP**. You should see a green active status after the server is
successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}
1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "./PATH/TO/toolbox",
"args": ["--stdio", "--prebuilt", "looker"],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}
1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"looker-toolbox": {
"command": "./PATH/TO/toolbox",
"args": ["--stdio", "--prebuilt", "looker"],
"env": {
"LOOKER_BASE_URL": "https://looker.example.com",
"LOOKER_CLIENT_ID": "",
"LOOKER_CLIENT_SECRET": "",
"LOOKER_VERIFY_SSL": "true"
}
}
}
}
```
{{% /tab %}}
{{< /tabpane >}}
## Use Tools
Your AI tool is now connected to Looker using MCP. Try asking your AI
assistant to list models, explores, dimensions, and measures, run a query,
retrieve the SQL for a query, or run a saved Look.
The following tools are available to the LLM:
1. **get_models**: list the LookML models in Looker
1. **get_explores**: list the explores in a given model
1. **get_dimensions**: list the dimensions in a given explore
1. **get_measures**: list the measures in a given explore
1. **get_filters**: list the filters in a given explore
1. **get_parameters**: list the parameters in a given explore
1. **query**: Run a query
1. **query_sql**: Return the SQL generated by Looker for a query
1. **query_url**: Return a link to the query in Looker for further exploration
1. **get_looks**: Return the saved Looks that match a title or description
1. **run_look**: Run a saved Look and return the data
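Under the hood, your MCP client drives these tools over stdio with JSON-RPC. If you want to see the advertised tool list yourself, a rough sketch (assuming the standard MCP initialize handshake and that the `LOOKER_*` variables from the configurations above are exported in your shell) is:
```bash
# Send an initialize handshake followed by a tools/list request over stdio.
{
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"manual-test","version":"0.0.1"}}}'
  echo '{"jsonrpc":"2.0","method":"notifications/initialized"}'
  echo '{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}'
} | ./toolbox --stdio --prebuilt looker
```
The final response should enumerate the tools listed above along with their input schemas.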
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}


@@ -6,11 +6,14 @@ description: >
Connect your IDE to PostgreSQL using Toolbox.
---
[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is an open protocol for connecting Large Language Models (LLMs) to data sources like Postgres. This guide covers how to use [MCP Toolbox for Databases][toolbox] to expose your developer assistant tools to a Postgres instance:
[Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is
an open protocol for connecting Large Language Models (LLMs) to data sources
like Postgres. This guide covers how to use [MCP Toolbox for Databases][toolbox]
to expose your developer assistant tools to a Postgres instance:
* [Cursor][cursor]
* [Windsurf][windsurf] (Codium)
* [Visual Studio Code ][vscode] (Copilot)
* [Visual Studio Code][vscode] (Copilot)
* [Cline][cline] (VS Code extension)
* [Claude desktop][claudedesktop]
* [Claude code][claudecode]
@@ -24,7 +27,8 @@ description: >
[claudecode]: #configure-your-mcp-client
{{< notice tip >}}
This guide can be used with [AlloyDB Omni](https://cloud.google.com/alloydb/omni/current/docs/overview).
This guide can be used with [AlloyDB
Omni](https://cloud.google.com/alloydb/omni/current/docs/overview).
{{< /notice >}}
## Set up the database
@@ -34,34 +38,37 @@ This guide can be used with [AlloyDB Omni](https://cloud.google.com/alloydb/omni
* [Install PostgreSQL locally](https://www.postgresql.org/download/)
* [Install AlloyDB Omni](https://cloud.google.com/alloydb/omni/current/docs/quickstart)
1. Create or reuse [a database user](https://cloud.google.com/alloydb/omni/current/docs/database-users/manage-users) and have the username and password ready.
1. Create or reuse [a database
user](https://cloud.google.com/alloydb/omni/current/docs/database-users/manage-users)
and have the username and password ready.
## Install MCP Toolbox
1. Download the latest version of Toolbox as a binary. Select the [correct binary](https://github.com/googleapis/genai-toolbox/releases) corresponding to your OS and CPU architecture. You are required to use Toolbox version V0.6.0+:
1. Download the latest version of Toolbox as a binary. Select the [correct
binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
to your OS and CPU architecture. You are required to use Toolbox version
V0.6.0+:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/windows/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
@@ -79,9 +86,11 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/windows/amd64/toolbo
{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}
1. Install [Claude Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Install [Claude
Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Add the following configuration, replace the environment variables with your values, and save:
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
@@ -108,7 +117,8 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/windows/amd64/toolbo
1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Add the following configuration, replace the environment variables with your values, and save:
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
@@ -129,14 +139,17 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/windows/amd64/toolbo
```
1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the new MCP server available.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}
1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap the **MCP Servers** icon.
1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Add the following configuration, replace the environment variables with your values, and save:
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
@@ -156,14 +169,16 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/windows/amd64/toolbo
}
```
1. You should see a green active status after the server is successfully connected.
1. You should see a green active status after the server is successfully
connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}
1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your values, and save:
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
@@ -183,14 +198,18 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/windows/amd64/toolbo
}
```
1. [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor Settings > MCP**. You should see a green active status after the server is successfully connected.
1. Open [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
Settings > MCP**. You should see a green active status after the server is
successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}
1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and create a `.vscode` directory in your project root if it doesn't exist.
1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Add the following configuration, replace the environment variables with your values, and save:
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
@@ -209,13 +228,16 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/windows/amd64/toolbo
}
}
```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}
1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the Cascade assistant.
1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Add the following configuration, replace the environment variables with your values, and save:
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
@@ -235,12 +257,15 @@ curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/windows/amd64/toolbo
}
```
{{% /tab %}}
{{< /tabpane >}}
## Use Tools
Your AI tool is now connected to Postgres using MCP. Try asking your AI assistant to list tables, create a table, or define and execute other SQL statements.
Your AI tool is now connected to Postgres using MCP. Try asking your AI
assistant to list tables, create a table, or define and execute other SQL
statements.
The following tools are available to the LLM:
@@ -248,5 +273,6 @@ The following tools are available to the LLM:
1. **execute_sql**: execute any SQL statement
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}


@@ -7,36 +7,48 @@ description: >
---
## Toolbox SDKs vs Model Context Protocol (MCP)
Toolbox now supports connections via both the native Toolbox SDKs and via [Model Context Protocol (MCP)](https://modelcontextprotocol.io/). However, Toolbox has several features which are not supported in the MCP specification (such as Authenticated Parameters and Authorized invocation).
We recommend using the native SDKs over MCP clients to leverage these features. The native SDKs can be combined with MCP clients in many cases.
Toolbox now supports connections via both the native Toolbox SDKs and via [Model
Context Protocol (MCP)](https://modelcontextprotocol.io/). However, Toolbox has
several features which are not supported in the MCP specification (such as
Authenticated Parameters and Authorized invocation).
We recommend using the native SDKs over MCP clients to leverage these features.
The native SDKs can be combined with MCP clients in many cases.
### Protocol Versions
Toolbox currently supports the following versions of MCP specification:
* [2024-11-05](https://spec.modelcontextprotocol.io/specification/2024-11-05/)
### Features Not Supported by MCP
Toolbox has several features that are not yet supported in the MCP specification:
* **AuthZ/AuthN:** There are no auth implementation in the `2024-11-05` specification. This includes:
* [Authenticated Parameters](../resources/tools/_index.md#authenticated-parameters)
* [Authorized Invocations](../resources/tools/_index.md#authorized-invocations)
* **Notifications:** Currently, editing Toolbox Tools requires a server restart. Clients should reload tools on disconnect to get the latest version.
* [2025-06-18](https://modelcontextprotocol.io/specification/2025-06-18)
* [2025-03-26](https://modelcontextprotocol.io/specification/2025-03-26)
* [2024-11-05](https://modelcontextprotocol.io/specification/2024-11-05)
### Toolbox AuthZ/AuthN Not Supported by MCP
The auth implementation in Toolbox is not supported in MCP's auth specification.
This includes:
* [Authenticated Parameters](../resources/tools/#authenticated-parameters)
* [Authorized Invocations](../resources/tools/#authorized-invocations)
## Connecting to Toolbox with an MCP client
### Before you begin
{{< notice note >}}
{{< notice note >}}
MCP is only compatible with Toolbox version 0.3.0 and above.
{{< /notice >}}
1. [Install](../getting-started/introduction/_index.md#installing-the-server) Toolbox version 0.3.0+.
1. [Install](../getting-started/introduction/#installing-the-server)
Toolbox version 0.3.0+.
1. Make sure you've set up and initialized your database.
1. [Set up](../getting-started/configure.md) your `tools.yaml` file.
### Connecting via Standard Input/Output (stdio)
Toolbox supports the
[stdio](https://modelcontextprotocol.io/docs/concepts/transports#standard-input%2Foutput-stdio)
transport protocol. Users that wish to use stdio will have to include the
@@ -47,14 +59,21 @@ transport protocol. Users that wish to use stdio will have to include the
```
When running with stdio, Toolbox will listen via stdio instead of acting as a
remote HTTP server. Logs will be set to the `warn` level by default. `debug` and `info` logs are not
supported with stdio.
remote HTTP server. Logs will be set to the `warn` level by default. `debug` and
`info` logs are not supported with stdio.
{{< notice note >}}
Toolbox enables dynamic reloading by default. To disable, use the
`--disable-reload` flag.
{{< /notice >}}
### Connecting via HTTP
Toolbox supports the HTTP transport protocol with and without SSE.
{{< tabpane text=true >}} {{% tab header="HTTP with SSE" lang="en" %}}
{{< tabpane text=true >}} {{% tab header="HTTP with SSE (deprecated)" lang="en" %}}
Add the following configuration to your MCP client configuration:
```bash
{
"mcpServers": {
@@ -66,11 +85,27 @@ Add the following configuration to your MCP client configuration:
}
```
If you would like to connect to a specific toolset, replace `url` with `"http://127.0.0.1:5000/mcp/{toolset_name}/sse"`.
{{% /tab %}} {{% tab header="HTTP POST" lang="en" %}}
Connect to Toolbox HTTP POST via `http://127.0.0.1:5000/mcp`.
If you would like to connect to a specific toolset, replace `url` with
`"http://127.0.0.1:5000/mcp/{toolset_name}/sse"`.
If you would like to connect to a specific toolset, connect via `http://127.0.0.1:5000/mcp/{toolset_name}`.
HTTP with SSE is only supported in version `2024-11-05` and is currently
deprecated.
{{% /tab %}} {{% tab header="Streamable HTTP" lang="en" %}}
Add the following configuration to your MCP client configuration:
```bash
{
"mcpServers": {
"toolbox": {
"type": "http",
"url": "http://127.0.0.1:5000/mcp",
}
}
}
```
If you would like to connect to a specific toolset, replace `url` with
`"http://127.0.0.1:5000/mcp/{toolset_name}"`.
{{% /tab %}} {{< /tabpane >}}
### Using the MCP Inspector with Toolbox
@@ -80,6 +115,7 @@ testing and debugging Toolbox server.
{{< tabpane text=true >}}
{{% tab header="STDIO" lang="en" %}}
1. Run Inspector with Toolbox as a subprocess:
```bash
@@ -88,15 +124,16 @@ testing and debugging Toolbox server.
1. For `Transport Type` dropdown menu, select `STDIO`.
1. In `Command`, make sure that it is set to :`./toolbox` (or the correct path to where the Toolbox binary is installed).
1. In `Command`, make sure that it is set to `./toolbox` (or the correct path
to where the Toolbox binary is installed).
1. In `Arguments`, make sure that it's filled with `--stdio`.
1. Click the `Connect` button. It might take a while to spin up Toolbox. Voila!
You should be able to inspect your toolbox tools!
{{% /tab %}}
{{% tab header="HTTP with SSE" lang="en" %}}
1. [Run Toolbox](../getting-started/introduction/_index.md#running-the-server).
{{% tab header="HTTP with SSE (deprecated)" lang="en" %}}
1. [Run Toolbox](../getting-started/introduction/#running-the-server).
1. In a separate terminal, run Inspector directly through `npx`:
@@ -109,6 +146,23 @@ testing and debugging Toolbox server.
1. For `URL`, type in `http://127.0.0.1:5000/mcp/sse` to use all tools or
`http://127.0.0.1:5000/mcp/{toolset_name}/sse` to use a specific toolset.
1. Click the `Connect` button. Voila! You should be able to inspect your toolbox
tools!
{{% /tab %}}
{{% tab header="Streamable HTTP" lang="en" %}}
1. [Run Toolbox](../getting-started/introduction/#running-the-server).
1. In a separate terminal, run Inspector directly through `npx`:
```bash
npx @modelcontextprotocol/inspector
```
1. For `Transport Type` dropdown menu, select `Streamable HTTP`.
1. For `URL`, type in `http://127.0.0.1:5000/mcp` to use all tools or
`http://127.0.0.1:5000/mcp/{toolset_name}` to use a specific toolset.
1. Click the `Connect` button. Voila! You should be able to inspect your toolbox
tools!
{{% /tab %}} {{< /tabpane >}}
@@ -117,8 +171,8 @@ testing and debugging Toolbox server.
| Client | SSE Works | MCP Config Docs |
|--------|--------|--------|
| Claude Desktop | ✅ | https://modelcontextprotocol.io/quickstart/user#1-download-claude-for-desktop |
| MCP Inspector | ✅ | https://github.com/modelcontextprotocol/inspector |
| Cursor | ✅ | https://docs.cursor.com/context/model-context-protocol |
| Windsurf | ✅ | https://docs.windsurf.com/windsurf/mcp |
| VS Code (Insiders) | ✅ | https://code.visualstudio.com/docs/copilot/chat/mcp-servers |
| Claude Desktop | ✅ | <https://modelcontextprotocol.io/quickstart/user#1-download-claude-for-desktop> |
| MCP Inspector | ✅ | <https://github.com/modelcontextprotocol/inspector> |
| Cursor | ✅ | <https://docs.cursor.com/context/model-context-protocol> |
| Windsurf | ✅ | <https://docs.windsurf.com/windsurf/mcp> |
| VS Code (Insiders) | ✅ | <https://code.visualstudio.com/docs/copilot/chat/mcp-servers> |


@@ -8,7 +8,6 @@ description: >
<!-- Contributor: Sujith R Pillai <sujithrpillai@gmail.com> -->
## Before you begin
1. [Install Docker Compose.](https://docs.docker.com/compose/install/)
@@ -74,7 +73,6 @@ networks:
docker-compose up -d
```
{{< notice tip >}}
You can use this setup quickly set up Toolbox + Postgres to follow along in our
@@ -82,8 +80,6 @@ You can use this setup quickly set up Toolbox + Postgres to follow along in our
{{< /notice >}}
## Connecting with Toolbox Client SDK
Next, we will use Toolbox with the Client SDKs:
@@ -101,14 +97,14 @@ Next, we will use Toolbox with the Client SDKs:
from toolbox_langchain import ToolboxClient
# Replace with the cloud run service URL generated above
async with ToolboxClient("http://$YOUR_URL") as toolbox:
{{< /tab >}}
{{< tab header="Llamaindex" lang="Python" >}}
from toolbox_llamaindex import ToolboxClient
# Replace with the cloud run service URL generated above
async with ToolboxClient("http://$YOUR_URL") as toolbox:
{{< /tab >}}
{{< /tabpane >}}

View File

@@ -9,7 +9,6 @@ description: >
## Before you begin
1. Set the PROJECT_ID environment variable:
```bash
@@ -41,7 +40,6 @@ description: >
kubectl version --client
```
1. If needed, install `kubectl` component using the Google Cloud CLI:
```bash
@@ -62,7 +60,7 @@ description: >
gcloud iam service-accounts create $SA_NAME
```
1. Grant any IAM roles necessary to the IAM service account. Each source have a
1. Grant any IAM roles necessary to the IAM service account. Each source has a
list of necessary IAM permissions listed on its page. The example below is
for the Cloud SQL Postgres source:
@@ -254,6 +252,7 @@ description: >
```
## Clean up resources
1. Delete secret.
```bash


@@ -54,7 +54,6 @@ AlloyDB or Cloud SQL over private IP), make sure your Cloud Run service and the
database are in the same VPC network.
{{< /notice >}}
## Create a service account
1. Create a backend service account if you don't already have one:
@@ -63,7 +62,7 @@ database are in the same VPC network.
gcloud iam service-accounts create toolbox-identity
```
1. Grant permissions to use secret manager:
1. Grant permissions to use secret manager:
```bash
gcloud projects add-iam-policy-binding $PROJECT_ID \
@@ -71,7 +70,8 @@ database are in the same VPC network.
--role roles/secretmanager.secretAccessor
```
1. Grant additional permissions to the service account that are specific to the source, e.g.:
1. Grant additional permissions to the service account that are specific to the
source, e.g.:
- [AlloyDB for PostgreSQL](../resources/sources/alloydb-pg.md#iam-permissions)
- [Cloud SQL for PostgreSQL](../resources/sources/cloud-sql-pg.md#iam-permissions)
@@ -79,7 +79,7 @@ database are in the same VPC network.
Create a `tools.yaml` file that contains your configuration for Toolbox. For
details, see the
[configuration](https://github.com/googleapis/genai-toolbox/blob/main/README.md#configuration)
[configuration](../resources/sources/)
section.
## Deploy to Cloud Run
@@ -97,7 +97,8 @@ section.
gcloud secrets versions add tools --data-file=tools.yaml
```
1. Set an environment variable to the container image that you want to use for cloud run:
1. Set an environment variable to the container image that you want to use for
cloud run:
```bash
export IMAGE=us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:latest
@@ -124,7 +125,7 @@ section.
--region us-central1 \
--set-secrets "/app/tools.yaml=tools:latest" \
--args="--tools-file=/app/tools.yaml","--address=0.0.0.0","--port=8080" \
# TODO(dev): update the following to match your VPC if necessary
# TODO(dev): update the following to match your VPC if necessary
--network default \
--subnet default
# --allow-unauthenticated # https://cloud.google.com/run/docs/authenticating/public#gcloud
@@ -132,19 +133,15 @@ section.
## Connecting with Toolbox Client SDK
You can connect to Toolbox Cloud Run instances directly through the SDK
You can connect to Toolbox Cloud Run instances directly through the SDK.
1. [Set up `Cloud Run Invoker` role access](https://cloud.google.com/run/docs/securing/managing-access#service-add-principals) to your Cloud Run service.
1. [Set up `Cloud Run Invoker` role
access](https://cloud.google.com/run/docs/securing/managing-access#service-add-principals)
to your Cloud Run service.
1. Set up [Application Default
1. (Only for local runs) Set up [Application Default
Credentials](https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment)
for the principal you set up the `Cloud Run Invoker` role access to.
{{< notice tip >}}
If you're working in some other environment than local, set up [environment
specific Default
Credentials](https://cloud.google.com/docs/authentication/provide-credentials-adc).
{{< /notice >}}
1. Run the following to retrieve a non-deterministic URL for the cloud run service:
@@ -157,9 +154,11 @@ You can connect to Toolbox Cloud Run instances directly through the SDK
```python
from toolbox_core import ToolboxClient, auth_methods
auth_token_provider = auth_methods.aget_google_id_token # can also use sync method
# Replace with the Cloud Run service URL generated in the previous step.
URL = "https://cloud-run-url.app"
auth_token_provider = auth_methods.aget_google_id_token(URL) # can also use sync method
async with ToolboxClient(
URL,
client_headers={"Authorization": auth_token_provider},


@@ -20,6 +20,7 @@ the need to run, operate, and maintain multiple agents/collectors.
To configure the collector, you will have to provide a configuration file. The
configuration file consists of four classes of pipeline component that access
telemetry data.
- `Receivers`
- `Processors`
- `Exporters`


@@ -3,11 +3,11 @@ title: "AuthServices"
type: docs
weight: 1
description: >
AuthServices represent services that handle authentication and authorization.
AuthServices represent services that handle authentication and authorization.
---
AuthServices represent services that handle authentication and authorization. They
can primarily be used by [Tools](../tools) in two different ways:
can primarily be used by [Tools](../tools/) in two different ways:
- [**Authorized Invocation**][auth-invoke] is when a tool
is validated by the auth service before the call can be invoked. Toolbox
@@ -62,7 +62,9 @@ token you will provide a function (that returns an id). This function is called
when the tool is invoked. This allows you to cache and refresh the ID token as
needed.
The primary method for providing these getters is via the `auth_token_getters` parameter when loading tools, or the `add_auth_token_getter`() / `add_auth_token_getters()` methods on a loaded tool object.
The primary method for providing these getters is via the `auth_token_getters`
parameter when loading tools, or the `add_auth_token_getter()` /
`add_auth_token_getters()` methods on a loaded tool object.
### Specifying tokens during load
@@ -77,7 +79,7 @@ async def get_auth_token():
return "YOUR_ID_TOKEN" # Placeholder
async def main():
async with ToolboxClient("http://127.0.0.1:5000") as toolbox:
async with ToolboxClient("<http://127.0.0.1:5000>") as toolbox:
auth_tool = await toolbox.load_tool(
"get_sensitive_data",
auth_token_getters={"my_auth_app_1": get_auth_token}
@@ -85,7 +87,7 @@ async def main():
result = await auth_tool(param="value")
print(result)
if __name__ == "__main__":
if __name__ == "__main__":
asyncio.run(main())
{{< /tab >}}
{{< tab header="LangChain" lang="Python" >}}
@@ -98,7 +100,7 @@ async def get_auth_token():
return "YOUR_ID_TOKEN" # Placeholder
async def main():
toolbox = ToolboxClient("http://127.0.0.1:5000")
toolbox = ToolboxClient("<http://127.0.0.1:5000>")
auth_tool = await toolbox.aload_tool(
"get_sensitive_data",
@@ -107,7 +109,7 @@ async def main():
result = await auth_tool.ainvoke({"param": "value"})
print(result)
if __name__ == "__main__":
if __name__ == "__main__":
asyncio.run(main())
{{< /tab >}}
{{< tab header="Llamaindex" lang="Python" >}}
@@ -120,7 +122,7 @@ async def get_auth_token():
return "YOUR_ID_TOKEN" # Placeholder
async def main():
toolbox = ToolboxClient("http://127.0.0.1:5000")
toolbox = ToolboxClient("<http://127.0.0.1:5000>")
auth_tool = await toolbox.aload_tool(
"get_sensitive_data",
@@ -129,7 +131,7 @@ async def main():
# result = await auth_tool.acall(param="value")
# print(result.content)
if __name__ == "__main__":
if __name__ == "__main__":
asyncio.run(main()){{< /tab >}}
{{< /tabpane >}}


@@ -3,7 +3,7 @@ title: "Google Sign-In"
type: docs
weight: 1
description: >
Use Google Sign-In for Oauth 2.0 flow and token lifecycle.
Use Google Sign-In for Oauth 2.0 flow and token lifecycle.
---
## Getting Started


@@ -22,6 +22,17 @@ cluster][alloydb-free-trial].
[alloydb-docs]: https://cloud.google.com/alloydb/docs
[alloydb-free-trial]: https://cloud.google.com/alloydb/docs/create-free-trial-cluster
## Available Tools
- [`alloydb-ai-nl`](../tools/alloydbainl/alloydb-ai-nl.md)
Use natural language queries on AlloyDB, powered by AlloyDB AI.
- [`postgres-sql`](../tools/postgres/postgres-sql.md)
Execute SQL queries as prepared statements in AlloyDB Postgres.
- [`postgres-execute-sql`](../tools/postgres/postgres-execute-sql.md)
Run parameterized SQL statements in AlloyDB Postgres.
## Requirements
### IAM Permissions
@@ -81,8 +92,8 @@ To connect using IAM authentication:
1. Prepare your database instance and user following this [guide][iam-guide].
2. You could choose one of the two ways to log in:
- Specify your IAM email as the `user`.
- Leave your `user` field blank. Toolbox
will fetch the [ADC][adc] automatically and log in using the email associated with it.
- Leave your `user` field blank. Toolbox will fetch the [ADC][adc]
automatically and log in using the email associated with it.
3. Leave the `password` field blank.
[iam-guide]: https://cloud.google.com/alloydb/docs/database-users/manage-iam-auth


@@ -34,6 +34,26 @@ avoiding full table scans or complex filters.
[bigquery-quickstart-cli]: https://cloud.google.com/bigquery/docs/quickstarts/quickstart-command-line
[bigquery-googlesql]: https://cloud.google.com/bigquery/docs/reference/standard-sql/
## Available Tools
- [`bigquery-sql`](../tools/bigquery/bigquery-sql.md)
Run SQL queries directly against BigQuery datasets.
- [`bigquery-execute-sql`](../tools/bigquery/bigquery-execute-sql.md)
Execute structured queries using parameters.
- [`bigquery-get-dataset-info`](../tools/bigquery/bigquery-get-dataset-info.md)
Retrieve metadata for a specific dataset.
- [`bigquery-get-table-info`](../tools/bigquery/bigquery-get-table-info.md)
Retrieve metadata for a specific table.
- [`bigquery-list-dataset-ids`](../tools/bigquery/bigquery-list-dataset-ids.md)
List available dataset IDs.
- [`bigquery-list-table-ids`](../tools/bigquery/bigquery-list-table-ids.md)
List tables in a given dataset.
## Requirements
### IAM Permissions


@@ -32,6 +32,11 @@ such as avoiding full table scans or complex filters.
[bigtable-googlesql]:
https://cloud.google.com/bigtable/docs/googlesql-overview
## Available Tools
- [`bigtable-sql`](../tools/bigtable/bigtable-sql.md)
Run SQL-like queries over Bigtable rows.
## Requirements
### IAM Permissions


@@ -19,6 +19,14 @@ to a database by following these instructions][csql-mssql-connect].
[csql-mssql-docs]: https://cloud.google.com/sql/docs/sqlserver
[csql-mssql-connect]: https://cloud.google.com/sql/docs/sqlserver/connect-overview
## Available Tools
- [`mssql-sql`](../tools/mssql/mssql-sql.md)
Execute pre-defined SQL Server queries with placeholder parameters.
- [`mssql-execute-sql`](../tools/mssql/mssql-execute-sql.md)
Run parameterized SQL Server queries in Cloud SQL for SQL Server.
## Requirements
### IAM Permissions
@@ -63,8 +71,8 @@ mTLS.
### Database User
Currently, this source only uses standard authentication. You will need to [create a
SQL Server user][cloud-sql-users] to login to the database with.
Currently, this source only uses standard authentication. You will need to
[create a SQL Server user][cloud-sql-users] to login to the database with.
[cloud-sql-users]: https://cloud.google.com/sql/docs/sqlserver/create-manage-users
@@ -96,7 +104,7 @@ instead of hardcoding your secrets into the configuration file.
| kind | string | true | Must be "cloud-sql-mssql". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |
| database | string | true | Name of the Cloud SQL database to connect to (e.g. "my_db"). |
| ipAddress | string | true | IP address of the Cloud SQL instance to connect to. |
| user | string | true | Name of the SQL Server user to connect as (e.g. "my-pg-user"). |


@@ -20,6 +20,14 @@ to a database by following these instructions][csql-mysql-quickstart].
[csql-mysql-docs]: https://cloud.google.com/sql/docs/mysql
[csql-mysql-quickstart]: https://cloud.google.com/sql/docs/mysql/connect-instance-local-computer
## Available Tools
- [`mysql-sql`](../tools/mysql/mysql-sql.md)
Execute pre-defined prepared SQL queries in MySQL.
- [`mysql-execute-sql`](../tools/mysql/mysql-execute-sql.md)
Run parameterized SQL queries in Cloud SQL for MySQL.
## Requirements
### IAM Permissions


@@ -20,6 +20,14 @@ to a database by following these instructions][csql-pg-quickstart].
[csql-pg-docs]: https://cloud.google.com/sql/docs/postgres
[csql-pg-quickstart]: https://cloud.google.com/sql/docs/postgres/connect-instance-local-computer
## Available Tools
- [`postgres-sql`](../tools/postgres/postgres-sql.md)
Execute SQL queries as prepared statements in PostgreSQL.
- [`postgres-execute-sql`](../tools/postgres/postgres-execute-sql.md)
Run parameterized SQL statements in PostgreSQL.
## Requirements
### IAM Permissions
@@ -85,8 +93,8 @@ To connect using IAM authentication:
1. Prepare your database instance and user following this [guide][iam-guide].
2. You could choose one of the two ways to log in:
- Specify your IAM email as the `user`.
- Leave your `user` field blank. Toolbox
will fetch the [ADC][adc] automatically and log in using the email associated with it.
- Leave your `user` field blank. Toolbox will fetch the [ADC][adc]
automatically and log in using the email associated with it.
3. Leave the `password` field blank.
@@ -115,13 +123,13 @@ instead of hardcoding your secrets into the configuration file.
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|---------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cloud-sql-postgres". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |
| database | string | true | Name of the Postgres database to connect to (e.g. "my_db"). |
| user | string | false | Name of the Postgres user to connect as (e.g. "my-pg-user"). Defaults to IAM auth using [ADC][adc] email if unspecified. |
| password | string | false | Password of the Postgres user (e.g. "my-password"). Defaults to attempting IAM authentication if unspecified. |
| ipType | string | false | IP Type of the Cloud SQL instance; must be one of `public` or `private`. Default: `public`. |
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "cloud-sql-postgres". |
| project | string | true | Id of the GCP project that the cluster was created in (e.g. "my-project-id"). |
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |
| database | string | true | Name of the Postgres database to connect to (e.g. "my_db"). |
| user | string | false | Name of the Postgres user to connect as (e.g. "my-pg-user"). Defaults to IAM auth using [ADC][adc] email if unspecified. |
| password | string | false | Password of the Postgres user (e.g. "my-password"). Defaults to attempting IAM authentication if unspecified. |
| ipType | string | false | IP Type of the Cloud SQL instance; must be one of `public` or `private`. Default: `public`. |


@@ -8,7 +8,13 @@ description: >
## About
A `couchbase` source establishes a connection to a Couchbase database cluster, allowing tools to execute SQL queries against it.
A `couchbase` source establishes a connection to a Couchbase database cluster,
allowing tools to execute SQL queries against it.
## Available Tools
- [`couchbase-sql`](../tools/couchbase/couchbase-sql.md)
Run SQL++ statements on Couchbase with parameterized input.
## Example
@@ -25,19 +31,19 @@ sources:
## Reference
| **field** | **type** | **required** | **description** |
|---------------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "couchbase". |
| connectionString | string | true | Connection string for the Couchbase cluster. |
| bucket | string | true | Name of the bucket to connect to. |
| scope | string | true | Name of the scope within the bucket. |
| username | string | false | Username for authentication. |
| password | string | false | Password for authentication. |
| clientCert | string | false | Path to client certificate file for TLS authentication. |
| clientCertPassword| string | false | Password for the client certificate. |
| clientKey | string | false | Path to client key file for TLS authentication. |
| clientKeyPassword | string | false | Password for the client key. |
| caCert | string | false | Path to CA certificate file. |
| noSslVerify | boolean | false | If true, skip server certificate verification. **Warning:** This option should only be used in development or testing environments. Disabling SSL verification poses significant security risks in production as it makes your connection vulnerable to man-in-the-middle attacks. |
| profile | string | false | Name of the connection profile to apply. |
| **field** | **type** | **required** | **description** |
|----------------------|:--------:|:------------:|---------------------------------------------------------|
| kind | string | true | Must be "couchbase". |
| connectionString | string | true | Connection string for the Couchbase cluster. |
| bucket | string | true | Name of the bucket to connect to. |
| scope | string | true | Name of the scope within the bucket. |
| username | string | false | Username for authentication. |
| password | string | false | Password for authentication. |
| clientCert | string | false | Path to client certificate file for TLS authentication. |
| clientCertPassword | string | false | Password for the client certificate. |
| clientKey | string | false | Path to client key file for TLS authentication. |
| clientKeyPassword | string | false | Password for the client key. |
| caCert | string | false | Path to CA certificate file. |
| noSslVerify | boolean | false | If true, skip server certificate verification. **Warning:** This option should only be used in development or testing environments. Disabling SSL verification poses significant security risks in production as it makes your connection vulnerable to man-in-the-middle attacks. |
| profile | string | false | Name of the connection profile to apply. |
| queryScanConsistency | integer | false | Query scan consistency. Controls the consistency guarantee for index scanning. Values: 1 for "not_bounded" (fastest option, but results may not include the most recent operations), 2 for "request_plus" (highest consistency level, includes all operations up until the query started, but incurs a performance penalty). If not specified, defaults to the Couchbase Go SDK default. |


@@ -0,0 +1,93 @@
---
title: "Dataplex"
type: docs
weight: 1
description: >
Dataplex Universal Catalog is a unified, intelligent governance solution for data and AI assets in Google Cloud. Dataplex Universal Catalog powers AI, analytics, and business intelligence at scale.
---
# Dataplex Source
[Dataplex][dataplex-docs] Universal Catalog is a unified, intelligent governance
solution for data and AI assets in Google Cloud. Dataplex Universal Catalog
powers AI, analytics, and business intelligence at scale.
At the heart of these governance capabilities is a catalog that contains a
centralized inventory of the data assets in your organization. Dataplex
Universal Catalog holds business, technical, and runtime metadata for all of
your data. It helps you discover relationships and semantics in the metadata by
applying artificial intelligence and machine learning.
[dataplex-docs]: https://cloud.google.com/dataplex/docs
## Example
```yaml
sources:
my-dataplex-source:
kind: "dataplex"
project: "my-project-id"
```
## Sample System Prompt
You can use the following system prompt as "Custom Instructions" in your client
application.
```
Whenever you receive a response from the dataplex_search_entries tool, decide what to do by following these steps:
1. If there are multiple search results found
1.1. Present the list of search results
1.2. Format the output in nested ordered list, for example:
Given
```
{
results: [
{
name: "projects/test-project/locations/us/entryGroups/@bigquery-aws-us-east-1/entries/users"
entrySource: {
displayName: "Users"
description: "Table contains list of users."
location: "aws-us-east-1"
system: "BigQuery"
}
},
{
name: "projects/another_project/locations/us-central1/entryGroups/@bigquery/entries/top_customers"
entrySource: {
displayName: "Top customers",
description: "Table contains list of best customers."
location: "us-central1"
system: "BigQuery"
}
},
]
}
```
Return output formatted as markdown nested list:
```
* Users:
- projectId: test_project
- location: aws-us-east-1
- description: Table contains list of users.
* Top customers:
- projectId: another_project
- location: us-central1
- description: Table contains list of best customers.
```
1.3. Ask to select one of the presented search results
2. If there is only one search result found
2.1. Present the search result immediately.
3. If there are no search results found
3.1. Explain that no search results were found
3.2. Suggest providing a more specific search query.
Do not try to search within search results on your own.
```
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|----------------------------------------------------------------------------------|
| kind | string | true | Must be "dataplex". |
| project | string | true | Id of the GCP project used for quota and billing purposes (e.g. "my-project-id").|


@@ -9,7 +9,10 @@ description: >
## About
[Dgraph][dgraph-docs] is an open-source graph database. It is designed for real-time workloads, horizontal scalability, and data flexibility. Implemented as a distributed system, Dgraph processes queries in parallel to deliver the fastest result.
[Dgraph][dgraph-docs] is an open-source graph database. It is designed for
real-time workloads, horizontal scalability, and data flexibility. Implemented
as a distributed system, Dgraph processes queries in parallel to deliver the
fastest result.
This source can connect to either a self-managed Dgraph cluster or one hosted on
Dgraph Cloud. If you're new to Dgraph, the fastest way to get started is to
@@ -18,6 +21,11 @@ Dgraph Cloud. If you're new to Dgraph, the fastest way to get started is to
[dgraph-docs]: https://dgraph.io/docs
[dgraph-login]: https://cloud.dgraph.io/login
## Available Tools
- [`dgraph-dql`](../tools/dgraph/dgraph-dql.md)
Run DQL (Dgraph Query Language) queries.
## Requirements
### Database User
@@ -52,7 +60,7 @@ instead of hardcoding your secrets into the configuration file.
| **Field** | **Type** | **Required** | **Description** |
|-------------|:--------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "dgraph". |
| dgraphUrl | string | true | Connection URI (e.g. "<https://xxx.cloud.dgraph.io>", "<https://localhost:8080>"). |
| dgraphUrl | string | true | Connection URI (e.g. "<https://xxx.cloud.dgraph.io>", "<https://localhost:8080>"). |
| user | string | false | Name of the Dgraph user to connect as (e.g., "groot"). |
| password | string | false | Password of the Dgraph user (e.g., "password"). |
| apiKey | string | false | API key to connect to a Dgraph Cloud instance. |


@@ -0,0 +1,73 @@
---
title: DuckDB
linkTitle: DuckDB
type: docs
weight: 1
description: >
DuckDB is an in-process SQL OLAP database management system designed for analytical query processing.
---
## About
[DuckDB](https://duckdb.org/) is an embedded analytical database management system that runs in-process with the client application. It is optimized for analytical workloads, providing high performance for complex queries with minimal setup.
DuckDB has the following notable characteristics:
- In-process, serverless database engine
- Supports complex SQL queries for analytical processing
- Can operate on in-memory or persistent storage
- Zero-configuration - no external dependencies or server setup required
- Highly optimized for columnar data storage and query execution
For more details, refer to the [DuckDB Documentation](https://duckdb.org/).
## Available Tools
- [`duckdb-sql`](../tools/duckdb/duckdb-sql.md)
Execute pre-defined prepared SQL queries in DuckDB.
## Requirements
### Database File
To use DuckDB, you can either:
- Specify a file path for a persistent database stored on the filesystem
- Omit the file path to use an in-memory database
## Example
For a persistent DuckDB database:
```yaml
sources:
my-duckdb:
kind: "duckdb"
dbFilePath: "/path/to/database.db"
configuration:
memory_limit: "2GB"
threads: "4"
```
For an in-memory DuckDB database:
```yaml
sources:
my-duckdb-memory:
name: "my-duckdb-memory"
kind: "duckdb"
```
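Because DuckDB is embedded, the database file referenced above is just a local file you can create and seed ahead of time. A small sketch, assuming the DuckDB CLI is installed and using an illustrative `orders` table:
```bash
# Create the file (if needed) and seed a sample table by piping SQL to the DuckDB CLI.
echo "CREATE TABLE IF NOT EXISTS orders (id INTEGER, customer TEXT, total DOUBLE);
INSERT INTO orders VALUES (1, 'alice', 9.99), (2, 'bob', 24.50);" | duckdb /path/to/database.db
```
Toolbox then opens the same `dbFilePath` whenever a tool backed by this source runs.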
## Reference
### Configuration Fields
| **field** | **type** | **required** | **description** |
|-------------------|:-----------------:|:------------:|---------------------------------------------------------------------------------|
| kind | string | true | Must be "duckdb". |
| dbFilePath | string | false | Path to the DuckDB database file. Omit for an in-memory database. |
| configuration | map[string]string | false | Additional DuckDB configuration options (e.g., `memory_limit`, `threads`). |
For a complete list of available configuration options, refer to the [DuckDB Configuration Documentation](https://duckdb.org/docs/stable/configuration/overview.html#local-configuration-options).
For more details on the Go implementation, see the [go-duckdb package documentation](https://pkg.go.dev/github.com/scottlepp/go-duckdb#section-readme).


@@ -0,0 +1,77 @@
---
title: "Firestore"
type: docs
weight: 1
description: >
Firestore is a NoSQL document database built for automatic scaling, high performance, and ease of application development. It's a fully managed, serverless database that supports mobile, web, and server development.
---
# Firestore Source
[Firestore][firestore-docs] is a NoSQL document database built for automatic
scaling, high performance, and ease of application development. While the
Firestore interface has many of the same features as traditional databases,
as a NoSQL database it differs from them in the way it describes relationships
between data objects.
If you are new to Firestore, you can [create a database and learn the
basics][firestore-quickstart].
[firestore-docs]: https://cloud.google.com/firestore/docs
[firestore-quickstart]: https://cloud.google.com/firestore/docs/quickstart-servers
## Requirements
### IAM Permissions
Firestore uses [Identity and Access Management (IAM)][iam-overview] to control
user and group access to Firestore resources. Toolbox will use your [Application
Default Credentials (ADC)][adc] to authorize and authenticate when interacting
with [Firestore][firestore-docs].
In addition to [setting the ADC for your server][set-adc], you need to ensure
the IAM identity has been given the correct IAM permissions for accessing
Firestore. Common roles include:
- `roles/datastore.user` - Read and write access to Firestore
- `roles/datastore.viewer` - Read-only access to Firestore
- `roles/firebaserules.admin` - Full management of Firebase Security Rules for
Firestore. This role is required for operations that involve creating,
updating, or managing Firestore security rules (see [Firebase Security Rules
roles][firebaserules-roles])
See [Firestore access control][firestore-iam] for more information on
applying IAM permissions and roles to an identity.
[iam-overview]: https://cloud.google.com/firestore/docs/security/iam
[adc]: https://cloud.google.com/docs/authentication#adc
[set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
[firestore-iam]: https://cloud.google.com/firestore/docs/security/iam
[firebaserules-roles]:
https://cloud.google.com/iam/docs/roles-permissions/firebaserules
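For example, granting `roles/datastore.user` to the identity that Toolbox runs as might look like the following (the project ID and principal below are placeholders):
```bash
# Grant read/write Firestore access to the identity Toolbox authenticates as.
gcloud projects add-iam-policy-binding my-project-id \
  --member="serviceAccount:toolbox-identity@my-project-id.iam.gserviceaccount.com" \
  --role="roles/datastore.user"
```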
### Database Selection
Firestore allows you to create multiple databases within a single project. Each
database is isolated from the others and has its own set of documents and
collections. If you don't specify a database in your configuration, the default
database named `(default)` will be used.
## Example
```yaml
sources:
my-firestore-source:
kind: "firestore"
project: "my-project-id"
# database: "my-database" # Optional, defaults to "(default)"
```
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|----------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "firestore". |
| project   | string   |     true     | ID of the GCP project that contains the Firestore database (e.g. "my-project-id"). |
| database | string | false | Name of the Firestore database to connect to. Defaults to "(default)" if not specified. |

View File

@@ -13,6 +13,11 @@ The HTTP Source allows Toolbox to retrieve data from arbitrary HTTP
endpoints. This enables Generative AI applications to access data from web APIs
and other HTTP-accessible resources.
## Available Tools
- [`http`](../tools/http/http.md)
Make HTTP requests to REST APIs or other web services.
## Example
```yaml

View File

@@ -0,0 +1,66 @@
---
title: "Looker"
type: docs
weight: 1
description: >
Looker is a business intelligence tool that also provides a semantic layer.
---
## About
[Looker][looker-docs] is a web based business intelligence and data management
tool that provides a semantic layer to facilitate querying. It can be deployed
in the cloud, on GCP, or on premises.
[looker-docs]: https://cloud.google.com/looker/docs
## Requirements
### Database User
This source only uses API authentication. You will need to
[create an API user][looker-user] to log in to Looker.
[looker-user]:
https://cloud.google.com/looker/docs/api-auth#authentication_with_an_sdk
## Example
```yaml
sources:
my-looker-source:
kind: looker
base_url: http://looker.example.com
client_id: ${LOOKER_CLIENT_ID}
client_secret: ${LOOKER_CLIENT_SECRET}
verify_ssl: true
timeout: 600s
```
The Looker base URL will look like "https://looker.example.com"; don't include
a trailing "/". In some cases, especially if your Looker instance is deployed
on-premises, you may need to add the API port number, like
"https://looker.example.com:19999".
`verify_ssl` should almost always be "true" (all lower case) unless you are using
a self-signed SSL certificate for the Looker server. Anything other than "true"
will be interpreted as false.
The client ID and client secret are seemingly random character sequences
assigned by the Looker server.
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |
| ------------- | :------: | :----------: | ----------------------------------------------------------------------------------------- |
| kind | string | true | Must be "looker". |
| base_url      | string   | true         | The URL of your Looker server (with no trailing "/").                                      |
| client_id | string | true | The client id assigned by Looker. |
| client_secret | string | true | The client secret assigned by Looker. |
| verify_ssl    | string   | true         | Whether to check the SSL certificate of the server.                                        |
| timeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, 120s is applied. |

View File

@@ -0,0 +1,34 @@
---
title: "MongoDB"
type: docs
weight: 1
description: >
MongoDB is a NoSQL data platform that not only serves general-purpose data requirements but can also perform vector search, where both the operational data and the embeddings used for search reside in the same document.
---
## About
[MongoDB][mongodb-docs] is a popular NoSQL database that stores data in
flexible, JSON-like documents, making it easy to develop and scale applications.
[mongodb-docs]: https://www.mongodb.com/docs/atlas/getting-started/
## Example
```yaml
sources:
my-mongodb:
kind: mongodb
uri: "mongodb+srv://username:password@host.mongodb.net"
database: sample_mflix
```
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-------------------------------------------------------------------|
| kind | string | true | Must be "mongodb". |
| uri       | string   |     true     | Connection string used to connect to MongoDB.                       |
| database  | string   |     true     | Name of the MongoDB database to connect to (e.g. "sample_mflix").   |

View File

@@ -15,6 +15,14 @@ amount of data through a structured format.
[mssql-docs]: https://www.microsoft.com/en-us/sql-server
## Available Tools
- [`mssql-sql`](../tools/mssql/mssql-sql.md)
Execute pre-defined SQL Server queries with placeholder parameters.
- [`mssql-execute-sql`](../tools/mssql/mssql-execute-sql.md)
Run parameterized SQL queries in SQL Server.
## Requirements
### Database User
@@ -35,6 +43,7 @@ sources:
database: my_db
user: ${USER_NAME}
password: ${PASSWORD}
# encrypt: strict
```
{{< notice tip >}}
@@ -44,11 +53,12 @@ instead of hardcoding your secrets into the configuration file.
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|------------------------------------------------------------------------|
| kind | string | true | Must be "mssql". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "1433"). |
| database | string | true | Name of the SQL Server database to connect to (e.g. "my_db"). |
| user | string | true | Name of the SQL Server user to connect as (e.g. "my-user"). |
| password | string | true | Password of the SQL Server user (e.g. "my-password"). |
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "mssql". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "1433"). |
| database | string | true | Name of the SQL Server database to connect to (e.g. "my_db"). |
| user | string | true | Name of the SQL Server user to connect as (e.g. "my-user"). |
| password | string | true | Password of the SQL Server user (e.g. "my-password"). |
| encrypt | string | false | Encryption level for data transmitted between the client and server (e.g., "strict"). If not specified, defaults to the [github.com/microsoft/go-mssqldb](https://github.com/microsoft/go-mssqldb?tab=readme-ov-file#common-parameters) package's default encrypt value. |

View File

@@ -4,7 +4,6 @@ type: docs
weight: 1
description: >
MySQL is a relational database management system that stores and manages data.
---
## About
@@ -15,6 +14,14 @@ reliability, performance, and ease of use.
[mysql-docs]: https://www.mysql.com/
## Available Tools
- [`mysql-sql`](../tools/mysql/mysql-sql.md)
Execute pre-defined prepared SQL queries in MySQL.
- [`mysql-execute-sql`](../tools/mysql/mysql-execute-sql.md)
Run parameterized SQL queries in MySQL.
## Requirements
### Database User
@@ -35,6 +42,7 @@ sources:
database: my_db
user: ${USER_NAME}
password: ${PASSWORD}
queryTimeout: 30s # Optional: query timeout duration
```
{{< notice tip >}}
@@ -44,11 +52,12 @@ instead of hardcoding your secrets into the configuration file.
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|---------------------------------------------------------------------------------------------|
| kind | string | true | Must be "mysql". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "3306"). |
| database | string | true | Name of the MySQL database to connect to (e.g. "my_db"). |
| user | string | true | Name of the MySQL user to connect as (e.g. "my-mysql-user"). |
| password | string | true | Password of the MySQL user (e.g. "my-password"). |
| **field** | **type** | **required** | **description** |
| ------------ | :------: | :----------: | ----------------------------------------------------------------------------------------------- |
| kind | string | true | Must be "mysql". |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "3306"). |
| database | string | true | Name of the MySQL database to connect to (e.g. "my_db"). |
| user | string | true | Name of the MySQL user to connect as (e.g. "my-mysql-user"). |
| password | string | true | Password of the MySQL user (e.g. "my-password"). |
| queryTimeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, no timeout is applied. |

View File

@@ -7,12 +7,19 @@ description: >
---
## About
[Neo4j][neo4j-docs] is a powerful, open source graph database system with over
15 years of active development that has earned it a strong reputation for
reliability, feature robustness, and performance.
[neo4j-docs]: https://neo4j.com/docs
## Available Tools
- [`neo4j-cypher`](../tools/neo4j/neo4j-cypher.md)
Run Cypher queries against your Neo4j graph database.
## Requirements
### Database User

View File

@@ -15,6 +15,14 @@ reputation for reliability, feature robustness, and performance.
[pg-docs]: https://www.postgresql.org/
## Available Tools
- [`postgres-sql`](../tools/postgres/postgres-sql.md)
Execute SQL queries as prepared statements in PostgreSQL.
- [`postgres-execute-sql`](../tools/postgres/postgres-execute-sql.md)
Run parameterized SQL statements in PostgreSQL.
## Requirements
### Database User

View File

@@ -10,15 +10,26 @@ description: >
## About
Redis is an open-source, in-memory data structure store, used as a database,
cache, and message broker. It supports data structures such as strings, hashes,
lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, and
geospatial indexes with radius queries.
If you are new to Redis, you can find installation and getting started guides on
the [official Redis website](https://redis.io/docs/getting-started/).
## Available Tools
- [`redis`](../tools/redis/redis.md)
Run Redis commands and interact with key-value pairs.
## Requirements
### Redis
[AUTH string][auth] is a password for connection to Redis. If you have the
`requirepass` directive set in your Redis configuration, incoming client
connections must authenticate in order to connect.
Specify your AUTH string in the password field:
@@ -27,7 +38,7 @@ sources:
my-redis-instance:
kind: redis
address:
- 127.0.0.1
- 127.0.0.1:6379
username: ${MY_USER_NAME}
password: ${MY_AUTH_STRING} # Omit this field if you don't have a password.
# database: 0
@@ -52,14 +63,14 @@ sources:
my-redis-cluster-instance:
kind: memorystore-redis
address:
- 127.0.0.1
- 127.0.0.1:6379
password: ${MY_AUTH_STRING}
# useGCPIAM: false
# clusterEnabled: false
```
Memorystore Redis Cluster supports IAM authentication instead. Grant your
account the required [IAM role][iam] and make sure to set `useGCPIAM` to `true`.
Here is an example tools.yaml config for Memorystore Redis Cluster instances
using IAM authentication:
@@ -68,7 +79,8 @@ using IAM authentication:
sources:
my-redis-cluster-instance:
kind: memorystore-redis
address: 127.0.0.1
address:
- 127.0.0.1:6379
useGCPIAM: true
clusterEnabled: true
```

View File

@@ -23,6 +23,14 @@ the Google Cloud console][spanner-quickstart].
[spanner-quickstart]:
https://cloud.google.com/spanner/docs/create-query-database-console
## Available Tools
- [`spanner-sql`](../tools/spanner/spanner-sql.md)
Execute SQL on Google Cloud Spanner.
- [`spanner-execute-sql`](../tools/spanner/spanner-execute-sql.md)
Run structured and parameterized queries on Spanner.
## Requirements
### IAM Permissions

View File

@@ -15,17 +15,24 @@ database management system. The lite in SQLite means lightweight in terms of
setup, database administration, and required resources.
SQLite has the following notable characteristics:
- Self-contained with no external dependencies
- Serverless - the SQLite library accesses its storage files directly
- Single database file that can be easily copied or moved
- Zero-configuration - no setup or administration needed
- Transactional with ACID properties
## Available Tools
- [`sqlite-sql`](../tools/sqlite/sqlite-sql.md)
Run SQL queries against a local SQLite database.
## Requirements
### Database File
You need a SQLite database file. This can be:
- An existing database file
- A path where a new database file should be created
- `:memory:` for an in-memory database
@@ -40,6 +47,7 @@ sources:
```
For an in-memory database:
```yaml
sources:
my-sqlite-memory-db:
@@ -51,13 +59,14 @@ sources:
### Configuration Fields
| Field | Type | Required | Description |
|-------|------|----------|-------------|
| kind | string | Yes | Must be "sqlite" |
| database | string | Yes | Path to SQLite database file, or ":memory:" for an in-memory database |
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|---------------------------------------------------------------------------------------------------------------------|
| kind      | string   |     true     | Must be "sqlite".                                                                                                     |
| database | string | true | Path to SQLite database file, or ":memory:" for an in-memory database. |
### Connection Properties
SQLite connections are configured with these defaults for optimal performance:
- `MaxOpenConns`: 1 (SQLite only supports one writer at a time)
- `MaxIdleConns`: 1

View File

@@ -10,9 +10,19 @@ description: >
## About
Valkey is an open-source, in-memory data structure store that originated as a
fork of Redis. It's designed to be used as a database, cache, and message
broker, supporting a wide range of data structures like strings, hashes, lists,
sets, sorted sets with range queries, bitmaps, hyperloglogs, and geospatial
indexes with radius queries.
If you're new to Valkey, you can find installation and getting started guides on the [official Valkey website](https://valkey.io/docs/getting-started/).
If you're new to Valkey, you can find installation and getting started guides on
the [official Valkey website](https://valkey.io/topics/quickstart/).
## Available Tools
- [`valkey`](../tools/valkey/valkey.md)
Issue Valkey (Redis-compatible) commands.
## Example
@@ -21,7 +31,7 @@ sources:
my-valkey-instance:
kind: valkey
address:
- 127.0.0.1
- 127.0.0.1:6379
username: ${YOUR_USERNAME}
password: ${YOUR_PASSWORD}
# database: 0
@@ -45,7 +55,7 @@ sources:
my-valkey-instance:
kind: valkey
address:
- 127.0.0.1
- 127.0.0.1:6379
useGCPIAM: true
```

View File

@@ -2,8 +2,8 @@
title: "Tools"
type: docs
weight: 2
description: >
Tools define actions an agent can take -- such as reading and writing to a
source.
---
@@ -11,7 +11,6 @@ A tool represents an action your agent can take, such as running a SQL
statement. You can define Tools as a map in the `tools` section of your
`tools.yaml` file. Typically, a tool will require a source to act on:
```yaml
tools:
search_flights_by_number:
@@ -50,7 +49,6 @@ tools:
description: 1 to 4 digit number
```
## Specifying Parameters
Parameters for each Tool will define what inputs the agent will need to provide
@@ -83,46 +81,87 @@ the parameter.
|-------------|:---------------:|:------------:|-----------------------------------------------------------------------------|
| name | string | true | Name of the parameter. |
| type        | string          | true         | Must be one of "string", "integer", "float", "boolean", "array".             |
| default | parameter type | false | Default value of the parameter. If provided, the parameter is not required. |
| description | string | true | Natural language description of the parameter to describe it to the agent. |
| default | parameter type | false | Default value of the parameter. If provided, `required` will be `false`. |
| required    | bool            | false        | Indicates if the parameter is required. Defaults to `true`.                  |
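To illustrate the `default` and `required` fields, here is a minimal sketch; the parameter names and values are hypothetical, not taken from any existing tool:
```yaml
parameters:
  - name: flight_class        # hypothetical parameter, for illustration only
    type: string
    description: Cabin class to filter on.
    default: "economy"        # supplying a default makes the parameter optional
  - name: passenger_count     # hypothetical optional parameter without a default
    type: integer
    description: Number of passengers.
    required: false
```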
### Array Parameters
The `array` type is a list of items passed in as a single parameter.
To use the `array` type, you must also specify what kind of items are
in the list using the items field:
```yaml
parameters:
- name: preferred_airlines
type: array
description: A list of airlines, ordered by preference.
items:
name: name
type: string
description: Name of the airline.
statement: |
SELECT * FROM airlines WHERE preferred_airlines = ANY($1);
```
| **field** | **type** | **required** | **description** |
|-------------|:----------------:|:------------:|-----------------------------------------------------------------------------|
| name | string | true | Name of the parameter. |
| type | string | true | Must be "array" |
| default | parameter type | false | Default value of the parameter. If provided, the parameter is not required. |
| description | string | true | Natural language description of the parameter to describe it to the agent. |
| default | parameter type | false | Default value of the parameter. If provided, `required` will be `false`. |
| required    | bool             | false        | Indicates if the parameter is required. Defaults to `true`.                  |
| items | parameter object | true | Specify a Parameter object for the type of the values in the array. |
{{< notice note >}}
Items in array should not have a default value. If provided, it will be ignored.
Items in array should not have a `default` or `required` value. If provided, it
will be ignored.
{{< /notice >}}
### Map Parameters
The map type is a collection of key-value pairs. It can be configured in two
ways:
- Generic Map: By default, it accepts values of any primitive type (string,
integer, float, boolean), allowing for mixed data.
- Typed Map: By setting the valueType field, you can enforce that all values
within the map must be of the same specified type.
#### Generic Map (Mixed Value Types)
This is the default behavior when valueType is omitted. It's useful for passing
a flexible group of settings.
```yaml
parameters:
- name: execution_context
type: map
description: A flexible set of key-value pairs for the execution environment.
```
#### Typed Map
Specify valueType to ensure all values in the map are of the same type. An error
will be thrown in case of value type mismatch.
```yaml
parameters:
- name: user_scores
type: map
description: A map of user IDs to their scores. All scores must be integers.
valueType: integer # This enforces the value type for all entries.
```
### Authenticated Parameters
Authenticated parameters are automatically populated with user
information decoded from [ID tokens](../authsources/#specifying-id-tokens-from-clients) that
are passed in request headers. They do not take input values in request bodies
like other parameters. To use authenticated parameters, you must configure
the tool to map the required [authServices](../authservices) to
specific claims within the user's ID token.
information decoded from [ID
tokens](../authServices/#specifying-id-tokens-from-clients) that are passed in
request headers. They do not take input values in request bodies like other
parameters. To use authenticated parameters, you must configure the tool to map
the required [authServices](../authServices/) to specific claims within the
user's ID token.
```yaml
tools:
@@ -144,25 +183,27 @@ specific claims within the user's ID token.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-----------------------------------------------------------------------------------------|
| name | string | true | Name of the [authServices](../authservices) used to verify the OIDC auth token. |
| name | string | true | Name of the [authServices](../authServices/) used to verify the OIDC auth token. |
| field | string | true | Claim field decoded from the OIDC token used to auto-populate this parameter. |
### Template Parameters
Template parameter types include the `string`, `integer`, `float`, and `boolean` types.
In most cases, the description will be provided to the LLM as context on
specifying the parameter. Template parameters will be inserted into the SQL
statement before executing the prepared statement. They will be inserted without
quotes, so to insert a string using template parameters, quotes must be
explicitly added within the string.
Template parameter arrays can also be used similarly to basic parameters, and array
items must be strings. Once inserted into the SQL statement, the outer layer of
quotes will be removed. Therefore to insert strings into the SQL statement, a
set of quotes must be explicitly added within the string.
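As a rough sketch of this quoting behaviour (the tool and table names here are illustrative, and any SQL tool kind that supports template parameters could be used), note the explicit single quotes placed around the string template parameter inside the statement:
```yaml
tools:
  list_active_rows:              # illustrative tool name
    kind: duckdb-sql
    source: my-duckdb
    statement: |
      SELECT * FROM {{.tableName}} WHERE status = '{{.status}}';
    description: List rows from a table, filtered by a status value.
    templateParameters:
      - name: tableName
        type: string
        description: Table to select from.
      - name: status
        type: string
        description: Status value; note the explicit quotes around it in the statement.
```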
{{< notice warning >}}
Because template parameters can directly replace identifiers, column names, and
table names, they are prone to SQL injections. Basic parameters are preferred
for performance and safety reasons.
{{< /notice >}}
```yaml
@@ -203,7 +244,7 @@ tools:
You can require an authorization check for any Tool invocation request by
specifying an `authRequired` field. Specify a list of
[authServices](../authservices) defined in the previous section.
[authServices](../authServices/) defined in the previous section.
```yaml
tools:

View File

@@ -0,0 +1,7 @@
---
title: "AlloyDB AI NL"
type: docs
weight: 1
description: >
AlloyDB AI NL Tool.
---

View File

@@ -2,11 +2,13 @@
title: "alloydb-ai-nl"
type: docs
weight: 1
description: >
The "alloydb-ai-nl" tool leverages
[AlloyDB AI](https://cloud.google.com/alloydb/ai) next-generation Natural
Language support to provide the ability to query the database directly using
natural language.
aliases:
- /resources/tools/alloydb-ai-nl
---
## About
@@ -16,18 +18,20 @@ Language][alloydb-ai-nl-overview] support to allow an Agent the ability to query
the database directly using natural language. Natural language streamlines the
development of generative AI applications by transferring the complexity of
converting natural language to SQL from the application layer to the database
layer.
This tool is compatible with the following sources:
- [alloydb-postgres](../sources/alloydb-pg.md)
- [alloydb-postgres](../../sources/alloydb-pg.md)
AlloyDB AI Natural Language delivers secure and accurate responses for
application end user natural language questions. Natural language streamlines
the development of generative AI applications by transferring the complexity
of converting natural language to SQL from the application layer to the
database layer.
## Requirements
{{< notice tip >}} AlloyDB AI natural language is currently in gated public
preview. For more information on availability and limitations, please see
[AlloyDB AI natural language overview](https://cloud.google.com/alloydb/docs/ai/natural-language-overview)
@@ -41,16 +45,16 @@ context for your application.
[alloydb-ai-nl-overview]: https://cloud.google.com/alloydb/docs/ai/natural-language-overview
[alloydb-ai-gen-nl]: https://cloud.google.com/alloydb/docs/ai/generate-sql-queries-natural-language
## Configuration
### Specifying an `nl_config`
A `nl_config` is a configuration that associates an application to schema
objects, examples and other contexts that can be used. A large application can
also use different configurations for different parts of the app, as long as the
correct configuration can be specified when a question is sent from that part of
the application.
Once you've followed the steps for configuring context, you can use the
`context` field when configuring a `alloydb-ai-nl` tool. When this tool is
invoked, the SQL will be generated and executed using this context.
@@ -58,14 +62,14 @@ invoked, the SQL will be generated and executed using this context.
### Specifying Parameters to PSV's
[Parameterized Secure Views (PSVs)][alloydb-psv] are a feature unique to AlloyDB
that allows you allow you to require one or more named parameter values passed
that allows you to require one or more named parameter values passed
to the view when querying it, somewhat like bind variables with ordinary
database queries.
You can use the `nlConfigParameters` to list the parameters required for your
`nl_config`. You **must** supply all parameters required for all PSVs in the
context. It's strongly recommended to use features like [Authenticated
Parameters](../tools/#array-parameters) or Bound Parameters to provide secure
Parameters](../#array-parameters) or Bound Parameters to provide secure
access to queries generated using natural language, as these parameters are not
visible to the LLM.
@@ -84,8 +88,8 @@ tools:
- name: user_email
type: string
description: User ID of the logged in user.
# note: we strongly recommend using features like Authenticated or
# Bound parameters to prevent the LLM from seeing these params and
# specifying values it shouldn't in the tool input
authServices:
- name: my_google_service
@@ -100,4 +104,4 @@ tools:
| source | string | true | Name of the AlloyDB source the natural language query should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| nlConfig | string | true | The name of the `nl_config` in AlloyDB |
| nlConfigParameters | [parameters](_index#specifying-parameters) | true | List of PSV parameters defined in the `nl_config` |
| nlConfigParameters | [parameters](../#specifying-parameters) | true | List of PSV parameters defined in the `nl_config` |

View File

@@ -0,0 +1,7 @@
---
title: "BigQuery"
type: docs
weight: 1
description: >
Tools that work with BigQuery Sources.
---

View File

@@ -2,8 +2,10 @@
title: "bigquery-execute-sql"
type: docs
weight: 1
description: >
A "bigquery-execute-sql" tool executes a SQL statement against BigQuery.
aliases:
- /resources/tools/bigquery-execute-sql
---
## About
@@ -11,7 +13,7 @@ description: >
A `bigquery-execute-sql` tool executes a SQL statement against BigQuery.
It's compatible with the following sources:
- [bigquery](../sources/bigquery.md)
- [bigquery](../../sources/bigquery.md)
`bigquery-execute-sql` takes one input parameter `sql` and runs the sql
statement against the `source`.
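For reference, a minimal configuration sketch; the source name `my-bigquery-source` is assumed, not defined on this page:
```yaml
tools:
  execute_sql_tool:
    kind: bigquery-execute-sql
    source: my-bigquery-source
    description: Use this tool to execute a SQL statement against BigQuery.
```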
@@ -30,6 +32,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-execute-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -2,8 +2,10 @@
title: "bigquery-get-dataset-info"
type: docs
weight: 1
description: >
A "bigquery-get-dataset-info" tool retrieves metadata for a BigQuery dataset.
aliases:
- /resources/tools/bigquery-get-dataset-info
---
## About
@@ -11,10 +13,12 @@ description: >
A `bigquery-get-dataset-info` tool retrieves metadata for a BigQuery dataset.
It's compatible with the following sources:
- [bigquery](../sources/bigquery.md)
- [bigquery](../../sources/bigquery.md)
bigquery-get-dataset-info takes a dataset parameter to specify the dataset
on the given source.
`bigquery-get-dataset-info` takes a `dataset` parameter to specify the dataset
on the given source. It also optionally accepts a `project` parameter to
define the Google Cloud project ID. If the `project` parameter is not provided,
the tool defaults to using the project defined in the source configuration.
## Example
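A minimal configuration sketch, assuming a source named `my-bigquery-source`:
```yaml
tools:
  get_dataset_info:
    kind: bigquery-get-dataset-info
    source: my-bigquery-source
    description: Use this tool to get metadata for a BigQuery dataset.
```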
@@ -30,6 +34,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-get-dataset-info". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -2,8 +2,10 @@
title: "bigquery-get-table-info"
type: docs
weight: 1
description: >
A "bigquery-get-table-info" tool retrieves metadata for a BigQuery table.
aliases:
- /resources/tools/bigquery-get-table-info
---
## About
@@ -11,10 +13,12 @@ description: >
A `bigquery-get-table-info` tool retrieves metadata for a BigQuery table.
It's compatible with the following sources:
- [bigquery](../sources/bigquery.md)
- [bigquery](../../sources/bigquery.md)
bigquery-get-table-info takes dataset and table parameters to specify
the target table.
`bigquery-get-table-info` takes `dataset` and `table` parameters to specify
the target table. It also optionally accepts a `project` parameter to define
the Google Cloud project ID. If the `project` parameter is not provided, the
tool defaults to using the project defined in the source configuration.
## Example
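A minimal configuration sketch, again assuming a source named `my-bigquery-source`:
```yaml
tools:
  get_table_info:
    kind: bigquery-get-table-info
    source: my-bigquery-source
    description: Use this tool to get metadata for a BigQuery table.
```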
@@ -30,6 +34,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-get-table-info". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -2,8 +2,10 @@
title: "bigquery-list-dataset-ids"
type: docs
weight: 1
description: >
A "bigquery-list-dataset-ids" tool returns all dataset IDs from the source.
aliases:
- /resources/tools/bigquery-list-dataset-ids
---
## About
@@ -11,10 +13,11 @@ description: >
A `bigquery-list-dataset-ids` tool returns all dataset IDs from the source.
It's compatible with the following sources:
- [bigquery](../sources/bigquery.md)
- [bigquery](../../sources/bigquery.md)
bigquery-list-dataset-ids requires no input parameters beyond the configured
source.
`bigquery-list-dataset-ids` optionally accepts a `project` parameter to define
the Google Cloud project ID. If the `project` parameter is not provided, the
tool defaults to using the project defined in the source configuration.
## Example
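A minimal configuration sketch with an assumed source name:
```yaml
tools:
  list_dataset_ids:
    kind: bigquery-list-dataset-ids
    source: my-bigquery-source
    description: Use this tool to list dataset IDs from the source.
```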
@@ -30,6 +33,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-list-dataset-ids". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -2,8 +2,10 @@
title: "bigquery-list-table-ids"
type: docs
weight: 1
description: >
A "bigquery-list-table-ids" tool returns table IDs in a given BigQuery dataset.
aliases:
- /resources/tools/bigquery-list-table-ids
---
## About
@@ -11,10 +13,12 @@ description: >
A `bigquery-list-table-ids` tool returns table IDs in a given BigQuery dataset.
It's compatible with the following sources:
- [bigquery](../sources/bigquery.md)
- [bigquery](../../sources/bigquery.md)
bigquery-get-dataset-info takes a dataset parameter to specify the dataset
from which to list table IDs.
`bigquery-list-table-ids` takes a required `dataset` parameter to specify the dataset
from which to list table IDs. It also optionally accepts a `project` parameter to
define the Google Cloud project ID. If the `project` parameter is not provided, the
tool defaults to using the project defined in the source configuration.
## Example
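A minimal configuration sketch (source name assumed):
```yaml
tools:
  list_table_ids:
    kind: bigquery-list-table-ids
    source: my-bigquery-source
    description: Use this tool to list table IDs in a BigQuery dataset.
```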
@@ -30,6 +34,6 @@ tools:
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "bigquery-list-table-ids". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -4,13 +4,16 @@ type: docs
weight: 1
description: >
A "bigquery-sql" tool executes a pre-defined SQL statement.
aliases:
- /resources/tools/bigquery-sql
---
## About
A `bigquery-sql` tool executes a pre-defined SQL statement. It's compatible with
the following sources:
- [bigquery](../sources/bigquery.md)
- [bigquery](../../sources/bigquery.md)
### GoogleSQL
@@ -69,7 +72,7 @@ tools:
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
> [templateParameters](../#template-parameters).
```yaml
tools:
@@ -98,5 +101,5 @@ tools:
| source | string | true | Name of the source the GoogleSQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | The GoogleSQL statement to execute. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](../#template-parameters) | false | List of [templateParameters](../#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

View File

@@ -0,0 +1,7 @@
---
title: "Bigtable"
type: docs
weight: 1
description: >
Tools that work with Bigtable Sources.
---

View File

@@ -2,9 +2,11 @@
title: "bigtable-sql"
type: docs
weight: 1
description: >
A "bigtable-sql" tool executes a pre-defined SQL statement against a Google
Cloud Bigtable instance.
aliases:
- /resources/tools/bigtable-sql
---
## About
@@ -12,7 +14,7 @@ description: >
A `bigtable-sql` tool executes a pre-defined SQL statement against a Bigtable
instance. It's compatible with any of the following sources:
- [bigtable](../sources/bigtable.md)
- [bigtable](../../sources/bigtable.md)
### GoogleSQL
@@ -21,6 +23,13 @@ dialect, the specified SQL statement is executed as a [data manipulation
language (DML)][bigtable-googlesql] statement, and specified parameters will be
inserted according to their name: e.g. `@name`.
{{<notice note>}}
Bigtable's GoogleSQL support for DML statements might be limited to certain
query types. For detailed information on supported DML statements and use
cases, refer to the [Bigtable GoogleSQL use
cases](https://cloud.google.com/bigtable/docs/googlesql-overview#use-cases).
{{</notice>}}
[bigtable-googlesql]: https://cloud.google.com/bigtable/docs/googlesql-overview
## Example
@@ -36,13 +45,13 @@ tools:
kind: bigtable-sql
source: my-bigtable-instance
statement: |
SELECT
TO_INT64(cf[ 'id' ]) as id,
CAST(cf[ 'name' ] AS string) as name,
FROM
mytable
WHERE
TO_INT64(cf[ 'id' ]) = @id
OR CAST(cf[ 'name' ] AS string) = @name;
description: |
Use this tool to get information for a specific user.
@@ -68,7 +77,7 @@ tools:
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
> [templateParameters](#template-parameters).
```yaml
tools:
@@ -97,8 +106,8 @@ tools:
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](#template-parameters) | false | List of [templateParameters](#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
## Tips

View File

@@ -0,0 +1,7 @@
---
title: "Couchbase"
type: docs
weight: 1
description: >
Tools that work with Couchbase Sources.
---

View File

@@ -2,9 +2,11 @@
title: "couchbase-sql"
type: docs
weight: 1
description: >
A "couchbase-sql" tool executes a pre-defined SQL statement against a Couchbase
database.
aliases:
- /resources/tools/couchbase-sql
---
## About
@@ -12,7 +14,7 @@ description: >
A `couchbase-sql` tool executes a pre-defined SQL statement against a Couchbase
database. It's compatible with any of the following sources:
- [couchbase](../sources/couchbase.md)
- [couchbase](../../sources/couchbase.md)
The specified SQL statement is executed as a parameterized statement, and specified
parameters will be used according to their name: e.g. `$id`.
@@ -64,7 +66,7 @@ tools:
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
> [templateParameters](#template-parameters).
```yaml
tools:
@@ -93,6 +95,6 @@ tools:
| source | string | true | Name of the source the SQL query should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be used with the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be used with the SQL statement. |
| templateParameters | [templateParameters](#template-parameters) | false | List of [templateParameters](#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| authRequired | array[string] | false | List of auth services that are required to use this tool. |

View File

@@ -0,0 +1,7 @@
---
title: "Dataplex"
type: docs
weight: 1
description: >
Tools that work with Dataplex Sources.
---

View File

@@ -0,0 +1,75 @@
---
title: "dataplex-search-entries"
type: docs
weight: 1
description: >
A "dataplex-search-entries" tool allows to search for entries based on the provided query.
aliases:
- /resources/tools/dataplex-search-entries
---
## About
A `dataplex-search-entries` tool returns all entries in Dataplex Catalog (e.g.
tables, views, models) that match a given user query.
It's compatible with the following sources:
- [dataplex](../../sources/dataplex.md)
`dataplex-search-entries` takes a required `query` parameter, which is used to
filter the entries returned to the user, and a required `name` parameter in the
format `projects/{project}/locations/global`; if the user does not provide
`name` explicitly, it is constructed from the source's project. It also
optionally accepts the following parameters:
- `pageSize` - Number of results in the search page. Defaults to `5`.
- `pageToken` - Page token received from a previous locations.searchEntries
call.
- `orderBy` - Specifies the ordering of results. Supported values are: relevance
(default), last_modified_timestamp, last_modified_timestamp asc
- `semanticSearch` - Specifies whether the search should understand the meaning
and intent behind the query, rather than just matching keywords. Defaults to
`true`.
- `scope` - The scope under which the search should be operating. Since this
parameter is not exposed to the toolbox user, it defaults to the organization
where the project provided in name is located.
## Requirements
### IAM Permissions
Dataplex uses [Identity and Access Management (IAM)][iam-overview] to control
user and group access to Dataplex resources. Toolbox will use your
[Application Default Credentials (ADC)][adc] to authorize and authenticate when
interacting with [Dataplex][dataplex-docs].
In addition to [setting the ADC for your server][set-adc], you need to ensure
the IAM identity has been given the correct IAM permissions for the tasks you
intend to perform. See [Dataplex Universal Catalog IAM permissions][iam-permissions]
and [Dataplex Universal Catalog IAM roles][iam-roles] for more information on
applying IAM permissions and roles to an identity.
[iam-overview]: https://cloud.google.com/dataplex/docs/iam-and-access-control
[adc]: https://cloud.google.com/docs/authentication#adc
[set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
[iam-permissions]: https://cloud.google.com/dataplex/docs/iam-permissions
[iam-roles]: https://cloud.google.com/dataplex/docs/iam-roles
[dataplex-docs]: https://cloud.google.com/dataplex
## Example
```yaml
tools:
dataplex-search-entries:
kind: dataplex-search-entries
source: my-dataplex-source
description: Use this tool to get all the entries based on the provided query.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "dataplex-search-entries". |
| source | string | true | Name of the source the tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,7 @@
---
title: "Dgraph"
type: docs
weight: 1
description: >
Tools that work with Dgraph Sources.
---

View File

@@ -2,9 +2,11 @@
title: "dgraph-dql"
type: docs
weight: 1
description: >
A "dgraph-dql" tool executes a pre-defined DQL statement against a Dgraph
database.
aliases:
- /resources/tools/dgraph-dql
---
## About
@@ -12,7 +14,7 @@ description: >
A `dgraph-dql` tool executes a pre-defined DQL statement against a Dgraph
database. It's compatible with any of the following sources:
- [dgraph](../sources/dgraph.md)
- [dgraph](../../sources/dgraph.md)
To run a statement as a query, you need to set the config `isQuery=true`. For
upserts or mutations, set `isQuery=false`. You can also configure timeout for a
@@ -117,6 +119,6 @@ tools:
| source | string | true | Name of the source the dql query should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | dql statement to execute |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be used with the dql statement. |
| isQuery    | boolean  |    false     | Set to `true` to run the statement as a query; otherwise `false`.  |
| timeout    | string   |    false     | Timeout to apply to the query.  |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be used with the dql statement. |

View File

@@ -0,0 +1,7 @@
---
title: "DuckDB"
type: docs
weight: 1
description: >
Tools that work with DuckDB Sources.
---

View File

@@ -0,0 +1,80 @@
---
title: "duckdb-sql"
type: docs
weight: 1
description: >
Execute SQL statements against a DuckDB database using the DuckDB SQL tools configuration.
aliases:
- /resources/tools/duckdb-sql
---
## About
A `duckdb-sql` tool executes a pre-defined SQL statement against a [DuckDB](https://duckdb.org/) database. It is compatible with any DuckDB source configuration as defined in the [DuckDB source documentation](../../sources/duckdb.md).
The specified SQL statement is executed as a prepared statement, and parameters are inserted according to their position: e.g., `$1` is the first parameter, `$2` is the second, and so on. If template parameters are included, they are resolved before execution of the prepared statement.
DuckDB's SQL dialect closely follows the conventions of the PostgreSQL dialect, with a few exceptions listed in the [DuckDB PostgreSQL Compatibility documentation](https://duckdb.org/docs/stable/sql/dialect/postgresql_compatibility.html). For an introduction to DuckDB's SQL dialect, refer to the [DuckDB SQL Introduction](https://duckdb.org/docs/stable/sql/introduction).
### Concepts
DuckDB is a relational database management system (RDBMS). Data is stored in relations (tables), where each table is a named collection of rows. Each row in a table has the same set of named columns, each with a specific data type. Tables are stored within schemas, and a collection of schemas constitutes the entire database.
For more details, see the [DuckDB SQL Introduction](https://duckdb.org/docs/stable/sql/introduction).
## Example
> **Note:** This tool uses parameterized queries to prevent SQL injections. Query parameters can be used as substitutes for arbitrary expressions but cannot be used for identifiers, column names, table names, or other parts of the query.
```yaml
tools:
search-users:
kind: duckdb-sql
source: my-duckdb
description: Search users by name and age
statement: SELECT * FROM users WHERE name LIKE $1 AND age >= $2
parameters:
- name: name
type: string
description: The name to search for
- name: min_age
type: integer
description: Minimum age
```
## Example with Template Parameters
> **Note:** Template parameters allow direct modifications to the SQL statement, including identifiers, column names, and table names, which makes them more vulnerable to SQL injections. Using basic parameters (see above) is recommended for performance and safety. For more details, see the [templateParameters](../#template-parameters) section.
```yaml
tools:
list_table:
kind: duckdb-sql
source: my-duckdb
statement: |
SELECT * FROM {{.tableName}};
description: |
Use this tool to list all information from a specific table.
Example:
{{
"tableName": "flights",
}}
templateParameters:
- name: tableName
type: string
description: Table to select from
```
## Reference
### Configuration Fields
| **field** | **type** | **required** | **description** |
|--------------------|:-------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "duckdb-sql". |
| source | string | true | Name of the DuckDB source configuration (see [DuckDB source documentation](../../sources/duckdb.md)). |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | The SQL statement to execute. |
| authRequired | []string | false | List of authentication requirements for the tool (if any). |
| parameters | [parameters](../#specifying-parameters) | false | List of parameters that will be inserted into the SQL statement |
| templateParameters | [templateParameters](../#template-parameters) | false | List of template parameters that will be inserted into the SQL statement before executing the prepared statement. |

View File

@@ -0,0 +1,7 @@
---
title: "Firestore"
type: docs
weight: 1
description: >
Tools that work with Firestore Sources.
---

View File

@@ -0,0 +1,39 @@
---
title: "firestore-delete-documents"
type: docs
weight: 1
description: >
A "firestore-delete-documents" tool deletes multiple documents from Firestore by their paths.
aliases:
- /resources/tools/firestore-delete-documents
---
## About
A `firestore-delete-documents` tool deletes multiple documents from Firestore by
their paths.
It's compatible with the following sources:
- [firestore](../../sources/firestore.md)
`firestore-delete-documents` takes one input parameter `documentPaths` which is
an array of document paths to delete. The tool uses Firestore's BulkWriter for
efficient batch deletion and returns the success status for each document.
## Example
```yaml
tools:
delete_user_documents:
kind: firestore-delete-documents
source: my-firestore-source
description: Use this tool to delete multiple documents from Firestore.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------------:|:------------:|----------------------------------------------------------|
| kind | string | true | Must be "firestore-delete-documents". |
| source | string | true | Name of the Firestore source to delete documents from. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,39 @@
---
title: "firestore-get-documents"
type: docs
weight: 1
description: >
A "firestore-get-documents" tool retrieves multiple documents from Firestore by their paths.
aliases:
- /resources/tools/firestore-get-documents
---
## About
A `firestore-get-documents` tool retrieves multiple documents from Firestore by
their paths.
It's compatible with the following sources:
- [firestore](../../sources/firestore.md)
`firestore-get-documents` takes one input parameter `documentPaths` which is an
array of document paths, and returns the documents' data along with metadata
such as existence status, creation time, update time, and read time.
## Example
```yaml
tools:
get_user_documents:
kind: firestore-get-documents
source: my-firestore-source
description: Use this tool to retrieve multiple documents from Firestore.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------------:|:------------:|------------------------------------------------------------|
| kind | string | true | Must be "firestore-get-documents". |
| source | string | true | Name of the Firestore source to retrieve documents from. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,39 @@
---
title: "firestore-get-rules"
type: docs
weight: 1
description: >
A "firestore-get-rules" tool retrieves the active Firestore security rules for the current project.
aliases:
- /resources/tools/firestore-get-rules
---
## About
A `firestore-get-rules` tool retrieves the active [Firestore security
rules](https://firebase.google.com/docs/firestore/security/get-started) for the
current project.
It's compatible with the following sources:
- [firestore](../../sources/firestore.md)
`firestore-get-rules` takes no input parameters and returns the security rules
content along with metadata such as the ruleset name and timestamps.
## Example
```yaml
tools:
get_firestore_rules:
kind: firestore-get-rules
source: my-firestore-source
description: Use this tool to retrieve the active Firestore security rules.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:-------------:|:------------:|-------------------------------------------------------|
| kind | string | true | Must be "firestore-get-rules". |
| source | string | true | Name of the Firestore source to retrieve rules from. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,42 @@
---
title: "firestore-list-collections"
type: docs
weight: 1
description: >
A "firestore-list-collections" tool lists collections in Firestore, either at the root level or as subcollections of a document.
aliases:
- /resources/tools/firestore-list-collections
---
## About
A `firestore-list-collections` tool lists
[collections](https://firebase.google.com/docs/firestore/data-model#collections)
in Firestore, either at the root level or as
[subcollections](https://firebase.google.com/docs/firestore/data-model#subcollections)
of a specific document.
It's compatible with the following sources:
- [firestore](../../sources/firestore.md)
`firestore-list-collections` takes an optional `parentPath` parameter to specify a document
path. If provided, it lists all subcollections of that document. If not provided, it lists
all root-level collections in the database.
## Example
```yaml
tools:
list_firestore_collections:
kind: firestore-list-collections
source: my-firestore-source
description: Use this tool to list collections in Firestore.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:----------------:|:------------:|--------------------------------------------------------|
| kind | string | true | Must be "firestore-list-collections". |
| source | string | true | Name of the Firestore source to list collections from. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,215 @@
---
title: "firestore-query-collection"
type: docs
weight: 1
description: >
A "firestore-query-collection" tool allow to query collections in Firestore.
aliases:
- /resources/tools/firestore-query-collection
---
## About
The `firestore-query-collection` tool allows you to query Firestore collections
with filters, ordering, and limit capabilities.
## Configuration
To use this tool, you need to configure it in your YAML configuration file:
```yaml
sources:
my-firestore:
kind: firestore
project: my-gcp-project
database: "(default)"
tools:
query_collection:
kind: firestore-query-collection
source: my-firestore
description: Query Firestore collections with advanced filtering
```
## Parameters
| **parameters** | **type** | **required** | **default** | **description** |
|------------------|:------------:|:------------:|:-----------:|-----------------------------------------------------------------------|
| `collectionPath` | string | true | - | The path of the Firestore collection to query |
| `filters` | array | false | - | Array of filter objects (as JSON strings) to apply to the query |
| `orderBy` | string | false | - | JSON string specifying field and direction to order results |
| `limit` | integer | false | 100 | Maximum number of documents to return |
| `analyzeQuery` | boolean | false | false | If true, returns query explain metrics including execution statistics |
### Filter Format
Each filter in the `filters` array should be a JSON string with the following
structure:
```json
{
"field": "fieldName",
"op": "operator",
"value": "compareValue"
}
```
Supported operators:
- `<` - Less than
- `<=` - Less than or equal to
- `>` - Greater than
- `>=` - Greater than or equal to
- `==` - Equal to
- `!=` - Not equal to
- `array-contains` - Array contains a specific value
- `array-contains-any` - Array contains any of the specified values
- `in` - Field value is in the specified array
- `not-in` - Field value is not in the specified array
Value types supported:
- String: `"value": "text"`
- Number: `"value": 123` or `"value": 45.67`
- Boolean: `"value": true` or `"value": false`
- Array: `"value": ["item1", "item2"]` (for `in`, `not-in`, `array-contains-any`
operators)
### OrderBy Format
The `orderBy` parameter should be a JSON string with the following structure:
```json
{
"field": "fieldName",
"direction": "ASCENDING"
}
```
Direction values:
- `ASCENDING`
- `DESCENDING`
## Example Usage
### Query with filters
```json
{
"collectionPath": "users",
"filters": [
"{\"field\": \"age\", \"op\": \">\", \"value\": 18}",
"{\"field\": \"status\", \"op\": \"==\", \"value\": \"active\"}"
],
"orderBy": "{\"field\": \"createdAt\", \"direction\": \"DESCENDING\"}",
"limit": 50
}
```
### Query with array contains filter
```json
{
"collectionPath": "products",
"filters": [
"{\"field\": \"categories\", \"op\": \"array-contains\", \"value\": \"electronics\"}",
"{\"field\": \"price\", \"op\": \"<\", \"value\": 1000}"
],
"orderBy": "{\"field\": \"price\", \"direction\": \"ASCENDING\"}",
"limit": 20
}
```
### Query with IN operator
```json
{
"collectionPath": "orders",
"filters": [
"{\"field\": \"status\", \"op\": \"in\", \"value\": [\"pending\", \"processing\"]}"
],
"limit": 100
}
```
### Query with explain metrics
```json
{
"collectionPath": "users",
"filters": [
"{\"field\": \"age\", \"op\": \">=\", \"value\": 21}",
"{\"field\": \"active\", \"op\": \"==\", \"value\": true}"
],
"orderBy": "{\"field\": \"lastLogin\", \"direction\": \"DESCENDING\"}",
"limit": 25,
"analyzeQuery": true
}
```
## Response Format
### Standard Response (analyzeQuery = false)
The tool returns an array of documents, where each document includes:
```json
{
"id": "documentId",
"path": "collection/documentId",
"data": {
// Document fields
},
"createTime": "2025-01-07T12:00:00Z",
"updateTime": "2025-01-07T12:00:00Z",
"readTime": "2025-01-07T12:00:00Z"
}
```
### Response with Query Analysis (analyzeQuery = true)
When `analyzeQuery` is set to true, the tool returns a single object containing
documents and explain metrics:
```json
{
"documents": [
// Array of document objects as shown above
],
"explainMetrics": {
"planSummary": {
"indexesUsed": [
{
"query_scope": "Collection",
"properties": "(field ASC, __name__ ASC)"
}
]
},
"executionStats": {
"resultsReturned": 50,
"readOperations": 50,
"executionDuration": "120ms",
"debugStats": {
"indexes_entries_scanned": "1000",
"documents_scanned": "50",
"billing_details": {
"documents_billable": "50",
"index_entries_billable": "1000",
"min_query_cost": "0"
}
}
}
}
}
```
## Error Handling
The tool will return errors for:
- Invalid collection path
- Malformed filter JSON
- Unsupported operators
- Query execution failures
- Invalid orderBy format

View File

@@ -0,0 +1,124 @@
---
title: "firestore-validate-rules"
type: docs
weight: 1
description: >
A "firestore-validate-rules" tool validates Firestore security rules syntax and semantic correctness without deploying them. It provides detailed error reporting with source positions and code snippets.
aliases:
- /resources/tools/firestore-validate-rules
---
## Overview
The `firestore-validate-rules` tool validates Firestore security rules syntax
and semantic correctness without deploying them. It provides detailed error
reporting with source positions and code snippets.
## Configuration
```yaml
tools:
firestore-validate-rules:
kind: firestore-validate-rules
source: <firestore-source-name>
description: "Checks the provided Firestore Rules source for syntax and validation errors"
```
## Authentication
This tool requires authentication if the source requires authentication.
## Parameters
| **parameters** | **type** | **required** | **description** |
|-----------------|:------------:|:------------:|----------------------------------------------|
| source | string | true | The Firestore Rules source code to validate |
## Response
The tool returns a `ValidationResult` object containing:
```json
{
"valid": "boolean",
"issueCount": "number",
"formattedIssues": "string",
"rawIssues": [
{
"sourcePosition": {
"fileName": "string",
"line": "number",
"column": "number",
"currentOffset": "number",
"endOffset": "number"
},
"description": "string",
"severity": "string"
}
]
}
```
## Example Usage
### Validate simple rules
```json
{
"source": "rules_version = '2';\nservice cloud.firestore {\n match /databases/{database}/documents {\n match /{document=**} {\n allow read, write: if true;\n }\n }\n}"
}
```
### Example response for valid rules
```json
{
"valid": true,
"issueCount": 0,
"formattedIssues": "✓ No errors detected. Rules are valid."
}
```
### Example response with errors
```json
{
"valid": false,
"issueCount": 1,
"formattedIssues": "Found 1 issue(s) in rules source:\n\nERROR: Unexpected token ';' [Ln 4, Col 32]\n```\n allow read, write: if true;;\n ^\n```",
"rawIssues": [
{
"sourcePosition": {
"line": 4,
"column": 32,
"currentOffset": 105,
"endOffset": 106
},
"description": "Unexpected token ';'",
"severity": "ERROR"
}
]
}
```
## Error Handling
The tool will return errors for:
- Missing or empty `source` parameter
- API errors when calling the Firebase Rules service
- Network connectivity issues
## Use Cases
1. **Pre-deployment validation**: Validate rules before deploying to production
2. **CI/CD integration**: Integrate rules validation into your build pipeline
3. **Development workflow**: Quickly check rules syntax while developing
4. **Error debugging**: Get detailed error locations with code snippets
## Related Tools
- [firestore-get-rules]({{< ref "firestore-get-rules" >}}): Retrieve current
active rules
- [firestore-query-collection]({{< ref "firestore-query-collection" >}}): Test
rules by querying collections
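If you group tools into toolsets, the rules tools pair naturally; the sketch below assumes the standard top-level `toolsets` block and reuses the tool names from the configuration examples above.
```yaml
# Hypothetical grouping: expose the rules workflow as a single toolset.
# Assumes get_firestore_rules and firestore-validate-rules are configured
# as shown in the respective tool examples.
toolsets:
  firestore_rules_workflow:
    - get_firestore_rules
    - firestore-validate-rules
```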

View File

@@ -0,0 +1,7 @@
---
title: "HTTP"
type: docs
weight: 1
description: >
Tools that work with HTTP Sources.
---

View File

@@ -2,16 +2,20 @@
title: "http"
type: docs
weight: 1
description: >
A "http" tool sends out an HTTP request to an HTTP endpoint.
aliases:
- /resources/tools/http
---
## About
The `http` tool allows you to make HTTP requests to APIs to retrieve data.
An HTTP request is the method by which a client communicates with a server to
retrieve or manipulate resources.
Toolbox allows you to configure the request URL, method, headers, query
parameters, and the request body for an HTTP Tool.
### URL
@@ -20,9 +24,11 @@ Toolbox composes the request URL from 3 places:
1. The HTTP Source's `baseUrl`.
2. The HTTP Tool's `path` field.
3. The HTTP Tool's `pathParams` for a dynamic path composed during Tool
   invocation.
For example, the following config allows you to reach different paths of the
same server using multiple Tools:
```yaml
sources:
@@ -44,7 +50,7 @@ tools:
method: GET
path: /search
description: Tool to search information from the example API
my-dynamic-path-tool:
kind: http
source: my-http-source
@@ -60,11 +66,15 @@ tools:
### Headers
An HTTP request header is a key-value pair sent by a client to a server,
providing additional information about the request, such as the client's
preferences, the request body content type, and other metadata.
Headers specified by the HTTP Tool are combined with its HTTP Source headers for
the resulting HTTP request, and override the Source headers in case of conflict.
The HTTP Tool allows you to specify headers in two different ways:
- Static headers can be specified using the `headers` field, and will be the
same for every invocation:
```yaml
my-http-tool:
@@ -78,7 +88,9 @@ my-http-tool:
Content-Type: application/json
```
- Dynamic headers can be specified as parameters in the `headerParams` field.
The `name` of the `headerParams` will be used as the header key, and the value
is determined by the LLM input upon Tool invocation:
```yaml
my-http-tool:
@@ -95,9 +107,12 @@ my-http-tool:
### Query parameters
Query parameters are key-value pairs appended to a URL after a question mark (?)
to provide additional information to the server for processing the request, like
filtering or sorting data.
- Static request query parameters should be specified in the `path` as part of
the URL itself:
```yaml
my-http-tool:
@@ -108,7 +123,8 @@ my-http-tool:
description: Tool to search for item with ID 1 in English
```
- Dynamic request query parameters should be specified as parameters in the
`queryParams` section:
```yaml
my-http-tool:
@@ -125,8 +141,11 @@ my-http-tool:
### Request body
The request body payload is a string that supports parameter replacement
following [Go template][go-template-doc]'s annotations.
The parameter names in the `requestBody` should be preceded by "." and enclosed
by double curly brackets "{{}}". The values will be populated into the request
body payload upon Tool invocation.
Example:
@@ -153,14 +172,17 @@ my-http-tool:
#### Formatting Parameters
Some complex parameters (such as arrays) may require additional formatting to
match the expected output. For convenience, you can specify one of the following
pre-defined functions before the parameter name to format it:
##### JSON
The `json` keyword converts a parameter into a JSON format.
{{< notice note >}}
Using JSON may add quotes to the variable name for certain types (such as
strings).
{{< /notice >}}
Example:
@@ -234,8 +256,8 @@ my-http-tool:
| method | string | true | The HTTP method to use (e.g., GET, POST, PUT, DELETE). |
| headers | map[string]string | false | A map of headers to include in the HTTP request (overrides source headers). |
| requestBody | string | false | The request body payload. Use [go template][go-template-doc] with the parameter name as the placeholder (e.g., `{{.id}}` will be replaced with the value of the parameter that has name `id` in the `bodyParams` section). |
| queryParams | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the query string. |
| bodyParams | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the request body payload. |
| headerParams | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted as the request headers. |
[go-template-doc]: <https://pkg.go.dev/text/template#pkg-overview>
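Putting the pieces together, here is a consolidated sketch of a single tool that combines a dynamic path, a static header, a query parameter, and a templated request body. The endpoint and parameter names are hypothetical, and the Go-template placeholder in `path` is an assumption based on the `pathParams` description above.
```yaml
# Hedged, consolidated sketch -- endpoint and parameter names are hypothetical.
sources:
  my-http-source:
    kind: http
    baseUrl: https://api.example.com
tools:
  update-item:
    kind: http
    source: my-http-source
    method: PUT
    # Assumes the path accepts Go-template placeholders filled from pathParams.
    path: /items/{{.itemId}}
    description: Update the name of an item identified by its ID.
    headers:
      Content-Type: application/json
    pathParams:
      - name: itemId
        type: string
        description: ID of the item to update.
    queryParams:
      - name: lang
        type: string
        description: Language code for the response.
    requestBody: |
      {
        "name": "{{.name}}"
      }
    bodyParams:
      - name: name
        type: string
        description: New name for the item.
```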

View File

@@ -0,0 +1,7 @@
---
title: "Looker"
type: docs
weight: 1
description: >
Tools that work with Looker Sources.
---

View File

@@ -0,0 +1,44 @@
---
title: "looker-get-dimensions"
type: docs
weight: 1
description: >
A "looker-get-dimensions" tool returns all the dimensions from a given explore
in a given model in the source.
aliases:
- /resources/tools/looker-get-dimensions
---
## About
A `looker-get-dimensions` tool returns all the dimensions from a given explore
in a given model in the source.
It's compatible with the following sources:
- [looker](../../sources/looker.md)
`looker-get-dimensions` accepts two parameters, the `model` and the `explore`.
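As an illustration, an invocation supplies both values directly; the model and explore names below are hypothetical and would normally come from `looker-get-models` and `looker-get-explores`.
```yaml
# Hypothetical invocation payload for looker-get-dimensions.
model: thelook        # from looker-get-models
explore: order_items  # from looker-get-explores
```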
## Example
```yaml
tools:
get_dimensions:
kind: looker-get-dimensions
source: looker-source
description: |
The get_dimensions tool retrieves the list of dimensions defined in
an explore.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-dimensions". |
| source      | string                                     | true         | Name of the Looker source to use.                                                                  |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,44 @@
---
title: "looker-get-explores"
type: docs
weight: 1
description: >
A "looker-get-explores" tool returns all explores
for the given model from the source.
aliases:
- /resources/tools/looker-get-explores
---
## About
A `looker-get-explores` tool returns all explores
for a given model from the source.
It's compatible with the following sources:
- [looker](../../sources/looker.md)
`looker-get-explores` accepts one parameter, the
`model` id.
## Example
```yaml
tools:
get_explores:
kind: looker-get-explores
source: looker-source
description: |
The get_explores tool retrieves the list of explores defined in a LookML model
in the Looker system.
It takes one parameter, the model_name looked up from get_models.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-explores". |
| source      | string                                     | true         | Name of the Looker source to use.                                                                  |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,44 @@
---
title: "looker-get-filters"
type: docs
weight: 1
description: >
A "looker-get-filters" tool returns all the filters from a given explore
in a given model in the source.
aliases:
- /resources/tools/looker-get-filters
---
## About
A `looker-get-filters` tool returns all the filters from a given explore
in a given model in the source.
It's compatible with the following sources:
- [looker](../../sources/looker.md)
`looker-get-filters` accepts two parameters, the `model` and the `explore`.
## Example
```yaml
tools:
get_filters:
kind: looker-get-filters
source: looker-source
description: |
The get_filters tool retrieves the list of filters defined in
an explore.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-filters". |
| source      | string                                     | true         | Name of the Looker source to use.                                                                  |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,60 @@
---
title: "looker-get-looks"
type: docs
weight: 1
description: >
"looker-get-looks" searches for saved Looks in a Looker
source.
aliases:
- /resources/tools/looker-get-looks
---
## About
The `looker-get-looks` tool searches for a saved Look by
name or description.
It's compatible with the following sources:
- [looker](../../sources/looker.md)
`looker-get-looks` takes four parameters, the `title`, `desc`, `limit`
and `offset`.
Title and description use SQL style wildcards and are case insensitive.
Limit and offset are used to page through a larger set of matches and
default to 100 and 0.
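For illustration, a search that combines a wildcard title match with paging might look like this; the values are hypothetical.
```yaml
# Hypothetical invocation payload: case-insensitive title search using
# SQL-style wildcards, returning the first 25 matches.
title: "sales%"
desc: "%revenue%"
limit: 25
offset: 0
```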
## Example
```yaml
tools:
get_looks:
kind: looker-get-looks
source: looker-source
description: |
get_looks Tool
This tool is used to search for saved looks in a Looker instance.
String search params use case-insensitive matching. String search
params can contain % and '_' as SQL LIKE pattern match wildcard
expressions. example="dan%" will match "danger" and "Danzig" but
not "David" example="D_m%" will match "Damage" and "dump".
Most search params can accept "IS NULL" and "NOT NULL" as special
expressions to match or exclude (respectively) rows where the
column is null.
The limit and offset are used to paginate the results.
The result of the get_looks tool is a list of json objects.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind        | string                                     | true         | Must be "looker-get-looks".                                                                        |
| source      | string                                     | true         | Name of the Looker source to use.                                                                  |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,44 @@
---
title: "looker-get-measures"
type: docs
weight: 1
description: >
A "looker-get-measures" tool returns all the measures from a given explore
in a given model in the source.
aliases:
- /resources/tools/looker-get-measures
---
## About
A `looker-get-measures` tool returns all the measures from a given explore
in a given model in the source.
It's compatible with the following sources:
- [looker](../../sources/looker.md)
`looker-get-measures` accepts two parameters, the `model` and the `explore`.
## Example
```yaml
tools:
get_measures:
kind: looker-get-measures
source: looker-source
description: |
The get_measures tool retrieves the list of measures defined in
an explore.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-measures". |
| source      | string                                     | true         | Name of the Looker source to use.                                                                  |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,40 @@
---
title: "looker-get-models"
type: docs
weight: 1
description: >
A "looker-get-models" tool returns all the models in the source.
aliases:
- /resources/tools/looker-get-models
---
## About
A `looker-get-models` tool returns all the models in the source.
It's compatible with the following sources:
- [looker](../../sources/looker.md)
`looker-get-models` accepts no parameters.
## Example
```yaml
tools:
get_models:
kind: looker-get-models
source: looker-source
description: |
The get_models tool retrieves the list of LookML models in the Looker system.
It takes no parameters.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-models". |
| source      | string                                     | true         | Name of the Looker source to use.                                                                  |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,44 @@
---
title: "looker-get-parameters"
type: docs
weight: 1
description: >
A "looker-get-parameters" tool returns all the parameters from a given explore
in a given model in the source.
aliases:
- /resources/tools/looker-get-parameters
---
## About
A `looker-get-parameters` tool returns all the parameters from a given explore
in a given model in the source.
It's compatible with the following sources:
- [looker](../../sources/looker.md)
`looker-get-parameters` accepts two parameters, the `model` and the `explore`.
## Example
```yaml
tools:
get_parameters:
kind: looker-get-parameters
source: looker-source
description: |
The get_parameters tool retrieves the list of parameters defined in
an explore.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-get-parameters". |
| source      | string                                     | true         | Name of the Looker source to use.                                                                  |
| description | string | true | Description of the tool that is passed to the LLM. |

Some files were not shown because too many files have changed in this diff.