Compare commits

...

33 Commits

Author SHA1 Message Date
Averi Kitsch
b2cbb4098b debug 2025-08-05 11:05:39 -07:00
Yuan Teoh
4c177198c9 update Dockerfile 2025-08-05 09:57:04 -07:00
Averi Kitsch
618ae82176 debug 2025-08-05 09:54:07 -07:00
Yuan Teoh
9b11a59519 update continuous release 2025-08-05 09:42:38 -07:00
Averi Kitsch
f4219dc00a Merge branch 'main' into fix-ci 2025-08-05 09:12:48 -07:00
Averi Kitsch
af3d791dea chore: roll back version (#1077) 2025-08-05 16:12:31 +00:00
Averi Kitsch
10f79b7a97 ci: enable C++ compiler for duckDB 2025-08-05 08:53:37 -07:00
Wenxin Du
0527532bd7 feat(tools/bigquery,mssql,mysql,postgres,spanner,tidb): Add query logging to execute-sql tools (#1069)
fix: https://github.com/googleapis/genai-toolbox/issues/1052
2025-08-05 03:01:19 +00:00
release-please[bot]
8d0fa6783a chore(main): release 0.11.0 (#1000)
🤖 I have created a release *beep* *boop*
---


## [0.11.0](https://github.com/googleapis/genai-toolbox/compare/v0.10.0...v0.11.0) (2025-08-04)


### ⚠ BREAKING CHANGES

* **tools/bigquery-sql:** Ensure invoke always returns a non-null value
([#1020](https://github.com/googleapis/genai-toolbox/issues/1020))
([9af55b6](9af55b651d))
* **tools/bigquery-execute-sql:** Update the return messages
([#1034](https://github.com/googleapis/genai-toolbox/issues/1034))
([051e686](051e686476))

### Features

* Add DuckDB source and tool
([#879](https://github.com/googleapis/genai-toolbox/pull/879))
([fd14933](fd149337e9))
* Add TiDB source and tool
([#829](https://github.com/googleapis/genai-toolbox/issues/829))
([6eaf36a](6eaf36ac85))
* Interactive web UI for Toolbox
([#1065](https://github.com/googleapis/genai-toolbox/issues/1065))
([8749b03](8749b03003))
* **tools/looker-query-url:** Add support for `looker-query-url` tool
([#1015](https://github.com/googleapis/genai-toolbox/issues/1015))
([327ddf0](327ddf0439))
* **tools/dataplex-lookup-entry:** Add support for
`dataplex-lookup-entry` tool
([#1009](https://github.com/googleapis/genai-toolbox/issues/1009))
([5fa1660](5fa1660fc8))

---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-04 19:11:17 -07:00
Dr. Strangelove
8da5a8f68d refactor(tools/looker): dedup code into helper functions (#1053)
Refactoring code in the Looker tools as suggested by Gemini

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-08-04 16:51:31 -07:00
Averi Kitsch
ecf9d65e8a docs: update architecture diagram (#1038) 2025-08-04 22:20:25 +00:00
AlexTalreja
8749b03003 feat: interactive web UI for Toolbox (#1065)
Introduce Toolbox UI, which can be launched with the `--ui` flag. 

This initial version of Toolbox UI allows users to test Toolbox by
inspecting tools/toolsets, modifying parameters, managing headers, and
executing API calls.
2025-08-04 11:47:38 -07:00
prernakakkar-google
bfabcf826e docs: Redirect alloydb pages to cgc (#1064) 2025-08-04 10:53:45 -07:00
Yuan Teoh
e843f73079 chore(server/mcp): update to accept other json content type request (#1049)
Update the Toolbox MCP endpoint to accept requests with additional JSON content
types, per the JSON-RPC over HTTP spec
(https://www.jsonrpc.org/historical/json-rpc-over-http.html#http-header).

Toolbox endpoints previously accepted only `Content-Type: application/json`. Update
them to also accept `Content-Type: application/json-rpc` and
`Content-Type: application/jsonrequest`.

Fixes #1004
2025-08-02 08:04:46 +00:00
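As an editorial aside, here is a minimal Go sketch of the kind of content-type check this commit describes; the handler wiring, names, and rejection behavior are assumptions for illustration, not the actual Toolbox implementation:

```go
package main

import (
	"log"
	"mime"
	"net/http"
)

// acceptedContentTypes lists the JSON media types allowed by the
// JSON-RPC over HTTP spec referenced above.
var acceptedContentTypes = map[string]bool{
	"application/json":        true,
	"application/json-rpc":    true,
	"application/jsonrequest": true,
}

// requireJSON rejects requests whose Content-Type is not an accepted
// JSON media type before passing them to the wrapped handler.
func requireJSON(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		mediaType, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
		if err != nil || !acceptedContentTypes[mediaType] {
			http.Error(w, "unsupported Content-Type", http.StatusUnsupportedMediaType)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	// Placeholder MCP endpoint; the real server's routing differs.
	mux.Handle("/mcp", requireJSON(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		w.Write([]byte(`{"jsonrpc":"2.0","result":"ok","id":1}`))
	})))
	log.Fatal(http.ListenAndServe("127.0.0.1:5000", mux))
}
```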
Cheese
6eaf36ac85 feat: support tidb in data source, sql tool, and execute sql tool (#829)
This PR adds TiDB support in:

1. sources - `tidb`: a data source;
2. tools - `tidbsql`: a prepared SQL tool;
3. tools - `tidbexecutesql`: an arbitrary SQL tool (for development purposes).

It also adds the corresponding docs.

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
Co-authored-by: Yuan Teoh <yuanteoh@google.com>
2025-08-02 00:54:22 -07:00
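Since TiDB speaks the MySQL wire protocol, a connection to a TiDB source can be sketched with the standard Go MySQL driver. The DSN below (user, password, host, port 4000, database name) is a placeholder, and this is not the tool's actual source code:

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql" // TiDB is MySQL wire-protocol compatible
)

func main() {
	// Placeholder DSN: root user, no password, local TiDB on its default port 4000.
	dsn := "root:@tcp(127.0.0.1:4000)/test?parseTime=true"

	db, err := sql.Open("mysql", dsn)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	var version string
	if err := db.QueryRow("SELECT VERSION()").Scan(&version); err != nil {
		log.Fatal(err)
	}
	fmt.Println("connected, server version:", version)
}
```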
Huan Chen
051e686476 fix(tools/bigquery-execute-sql): update the return messages (#1034)
Updated return message to make sure all cases are covered.
2025-08-01 14:39:16 -07:00
Huan Chen
9af55b651d fix(tools/bigquery-sql): ensure invoke always returns a non-null value (#1020)
This is to make bigquery-sql consistent with bigquery-execute-sql. May
not be necessary to have.

- Added a dry run step to identify the query type (e.g., SELECT, DML),
which allows the tool to correctly handle the query's output.
- The recommended high-level client, cloud.google.com/go/bigquery, does
not expose the statement type from a dry run. To circumvent this
limitation, the low-level BigQuery REST API client
(google.golang.org/api/bigquery/v2) was added to gain access to these
necessary details.

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-08-01 14:16:57 -07:00
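A rough Go sketch of the dry-run approach described above, using the low-level google.golang.org/api/bigquery/v2 client to read the statement type that the high-level client does not expose. The project ID is a placeholder and the error handling is simplified; this is not the tool's actual code:

```go
package main

import (
	"context"
	"fmt"
	"log"

	bigqueryv2 "google.golang.org/api/bigquery/v2"
	"google.golang.org/api/googleapi"
)

// statementType dry-runs a query and returns BigQuery's reported statement
// type (e.g. "SELECT", "INSERT"), which tells the caller how to handle output.
func statementType(ctx context.Context, projectID, query string) (string, error) {
	svc, err := bigqueryv2.NewService(ctx)
	if err != nil {
		return "", err
	}
	job := &bigqueryv2.Job{
		Configuration: &bigqueryv2.JobConfiguration{
			DryRun: true,
			Query: &bigqueryv2.JobConfigurationQuery{
				Query:        query,
				UseLegacySql: googleapi.Bool(false),
			},
		},
	}
	resp, err := svc.Jobs.Insert(projectID, job).Context(ctx).Do()
	if err != nil {
		return "", err
	}
	if resp.Statistics == nil || resp.Statistics.Query == nil {
		return "", fmt.Errorf("dry run returned no query statistics")
	}
	return resp.Statistics.Query.StatementType, nil
}

func main() {
	// "my-project" is a placeholder; auth uses Application Default Credentials.
	st, err := statementType(context.Background(), "my-project", "SELECT 1")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("statement type:", st)
}
```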
Anuj Jhunjhunwala
5fa1660fc8 feat(tools/dataplex-lookup-entry): Add support for dataplex-lookup-entry tool (#1009)
Added support for lookup entry tool in Dataplex.
Fixes #997

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-08-01 15:18:56 -04:00
Yuan Teoh
d9ee17d2c7 chore(sources/duckdb): run lint (#1048) 2025-08-01 09:15:57 -07:00
Mend Renovate
f693f75f38 chore(deps): update module google.golang.org/api to v0.244.0 (#1039)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [google.golang.org/api](https://redirect.github.com/googleapis/google-api-go-client) | `v0.243.0` -> `v0.244.0` | [![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fapi/v0.244.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fapi/v0.243.0/v0.244.0?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>googleapis/google-api-go-client
(google.golang.org/api)</summary>

###
[`v0.244.0`](https://redirect.github.com/googleapis/google-api-go-client/releases/tag/v0.244.0)

[Compare
Source](https://redirect.github.com/googleapis/google-api-go-client/compare/v0.243.0...v0.244.0)

##### Features

- **all:** Auto-regenerate discovery clients ([#3241](https://redirect.github.com/googleapis/google-api-go-client/issues/3241)) ([2c20485](2c204857ee))
- **all:** Auto-regenerate discovery clients ([#3243](https://redirect.github.com/googleapis/google-api-go-client/issues/3243)) ([cac72a1](cac72a1458))
- **all:** Auto-regenerate discovery clients ([#3244](https://redirect.github.com/googleapis/google-api-go-client/issues/3244)) ([e6b1c87](e6b1c8715f))
- **all:** Auto-regenerate discovery clients ([#3245](https://redirect.github.com/googleapis/google-api-go-client/issues/3245)) ([2c1ff18](2c1ff18dfc))
- **all:** Auto-regenerate discovery clients ([#3247](https://redirect.github.com/googleapis/google-api-go-client/issues/3247)) ([09e5c07](09e5c0743d))
- **all:** Auto-regenerate discovery clients ([#3249](https://redirect.github.com/googleapis/google-api-go-client/issues/3249)) ([214eb4e](214eb4ea56))
- **all:** Auto-regenerate discovery clients ([#3250](https://redirect.github.com/googleapis/google-api-go-client/issues/3250)) ([ce50789](ce50789a30))
- **all:** Auto-regenerate discovery clients ([#3251](https://redirect.github.com/googleapis/google-api-go-client/issues/3251)) ([e5c3e18](e5c3e1801e))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-31 21:18:09 +00:00
Dr. Strangelove
327ddf0439 feat: new tool - looker-query-url (#1015) 2025-07-31 17:45:45 +00:00
Twisha Bansal
330b14e518 docs: fix method name (#1042) 2025-07-31 10:29:32 +05:30
Shobhit Singh
5bdab8c83b fix: template parameters link in bigquery-sql tool (#1032) 2025-07-31 02:57:09 +00:00
Mend Renovate
dbf355d31a chore(deps): update module modernc.org/sqlite to v1.38.2 (#1006)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [modernc.org/sqlite](https://gitlab.com/cznic/sqlite) | `v1.38.0` -> `v1.38.2` | [![age](https://developer.mend.io/api/mc/badges/age/go/modernc.org%2fsqlite/v1.38.2?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/modernc.org%2fsqlite/v1.38.0/v1.38.2?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>cznic/sqlite (modernc.org/sqlite)</summary>

###
[`v1.38.2`](https://gitlab.com/cznic/sqlite/compare/v1.38.1...v1.38.2)

[Compare
Source](https://gitlab.com/cznic/sqlite/compare/v1.38.1...v1.38.2)

###
[`v1.38.1`](https://gitlab.com/cznic/sqlite/compare/v1.38.0...v1.38.1)

[Compare
Source](https://gitlab.com/cznic/sqlite/compare/v1.38.0...v1.38.1)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-30 01:44:47 +00:00
Mend Renovate
3bdb12f7a7 chore(deps): update module github.com/marcboeker/go-duckdb/v2 to v2.3.4 (#1029)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
| [github.com/marcboeker/go-duckdb/v2](https://redirect.github.com/marcboeker/go-duckdb) | `v2.3.3` -> `v2.3.4` | [![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fmarcboeker%2fgo-duckdb%2fv2/v2.3.4?slim=true)](https://docs.renovatebot.com/merge-confidence/) | [![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fmarcboeker%2fgo-duckdb%2fv2/v2.3.3/v2.3.4?slim=true)](https://docs.renovatebot.com/merge-confidence/) |

---

### Release Notes

<details>
<summary>marcboeker/go-duckdb
(github.com/marcboeker/go-duckdb/v2)</summary>

###
[`v2.3.4`](https://redirect.github.com/marcboeker/go-duckdb/compare/v2.3.3...v2.3.4)

[Compare
Source](https://redirect.github.com/marcboeker/go-duckdb/compare/v2.3.3...v2.3.4)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-30 01:29:06 +00:00
Pranava B
fd149337e9 feat: add support for DuckDB (#879)
Fixes #861 
This PR adds support for DuckDB, a free, open-source, embedded, in-process
relational database management system (RDBMS) designed for analytical
processing (OLAP).

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-07-29 14:31:22 -07:00
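As context for this commit, a minimal Go sketch of using the go-duckdb driver follows. The file-path comment is a placeholder and this is not Toolbox's own implementation; note that the driver needs CGO, which is why the Cloud Build and Dockerfile diffs later in this compare switch to `CGO_ENABLED=1` and install clang:

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/marcboeker/go-duckdb/v2" // registers the "duckdb" database/sql driver (needs CGO)
)

func main() {
	// An empty DSN opens an in-memory database; a file path (e.g. "analytics.db",
	// a placeholder) would persist it to disk instead.
	db, err := sql.Open("duckdb", "")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	var answer int
	if err := db.QueryRow("SELECT 21 * 2").Scan(&answer); err != nil {
		log.Fatal(err)
	}
	fmt.Println("duckdb says:", answer) // prints: duckdb says: 42
}
```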
Averi Kitsch
c1305b5ab4 chore: remove link checker due to flakiness (#1027) 2025-07-29 13:19:52 -07:00
Averi Kitsch
3003b45256 chore: Update blunderbuss.yml (#1026) 2025-07-29 13:00:40 -07:00
Mend Renovate
1a5fda34b1 chore(deps): update actions/checkout action to v4 (#1018)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/checkout](https://redirect.github.com/actions/checkout) | action | major | `v2` -> `v4` |

---

### Release Notes

<details>
<summary>actions/checkout (actions/checkout)</summary>

###
[`v4`](https://redirect.github.com/actions/checkout/blob/HEAD/CHANGELOG.md#v422)

[Compare
Source](https://redirect.github.com/actions/checkout/compare/v3...v4)

- `url-helper.ts` now leverages well-known environment variables by [@jww3](https://redirect.github.com/jww3) in [https://github.com/actions/checkout/pull/1941](https://redirect.github.com/actions/checkout/pull/1941)
- Expand unit test coverage for `isGhes` by [@jww3](https://redirect.github.com/jww3) in [https://github.com/actions/checkout/pull/1946](https://redirect.github.com/actions/checkout/pull/1946)

###
[`v3`](https://redirect.github.com/actions/checkout/blob/HEAD/CHANGELOG.md#v360)

[Compare
Source](https://redirect.github.com/actions/checkout/compare/v2...v3)

- [Fix: Mark test scripts with Bash'isms to be run via
Bash](https://redirect.github.com/actions/checkout/pull/1377)
- [Add option to fetch tags even if fetch-depth >
0](https://redirect.github.com/actions/checkout/pull/579)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).


Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-07-28 16:19:15 -07:00
Mend Renovate
3353085265 chore(deps): pin dependencies (#1017)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [JustinBeckwith/linkinator-action](https://redirect.github.com/JustinBeckwith/linkinator-action) | action | pinDigest | -> `3d5ba09` |
| [actions/checkout](https://redirect.github.com/actions/checkout) | action | pinDigest | -> `ee0669b` |

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

2025-07-28 15:35:53 -07:00
Averi Kitsch
a279d32c57 docs: add link checker and fix broken links (#1014) 2025-07-28 14:51:18 -07:00
Dr. Strangelove
0568423e33 docs: Fix looker source links from looker tools. (#1013)
The relative links to the Looker source documentation page weren't
resolving correctly on the documentation website.

I updated the links in all of the Looker tool markdown files to use a
different relative path that should be correctly interpreted by the Hugo
static site generator. I changed the link from `../sources/looker.md` to
`../../sources/looker/`.

Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
2025-07-28 09:18:01 -07:00
Matt Cornillon
92845c943a docs(samples): Adding AlloyDB samples (#987)
Co-authored-by: Matt Cornillon <cornillon@google.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-07-25 11:38:19 -07:00
146 changed files with 7240 additions and 1193 deletions

View File

@@ -14,116 +14,117 @@
steps:
- id: "build-docker"
name: "gcr.io/cloud-builders/docker"
waitFor: ['-']
waitFor: ["-"]
script: |
#!/usr/bin/env bash
docker buildx create --name container-builder --driver docker-container --bootstrap --use
docker buildx build --platform linux/amd64,linux/arm64 --build-arg COMMIT_SHA=$(git rev-parse HEAD) -t ${_DOCKER_URI}:$REF_NAME --push .
#!/usr/bin/env bash
docker buildx create --name container-builder --driver docker-container --bootstrap --use
docker buildx build --platform linux/amd64,linux/arm64 --build-arg COMMIT_SHA=$(git rev-parse HEAD) -t ${_DOCKER_URI}:$REF_NAME --push .
- id: "install-dependencies"
name: golang:1
waitFor: ['-']
name: golang:1-bookworm
waitFor: ["-"]
env:
- 'GOPATH=/gopath'
- "GOPATH=/gopath"
volumes:
- name: 'go'
path: '/gopath'
- name: "go"
path: "/gopath"
script: |
go get -d ./...
apt-get install -y clang
go get -d ./...
- id: "build-linux-amd64"
name: golang:1
waitFor:
name: golang:1-bookworm
waitFor:
- "install-dependencies"
env:
- 'GOPATH=/gopath'
- "GOPATH=/gopath"
volumes:
- name: 'go'
path: '/gopath'
- name: "go"
path: "/gopath"
script: |
#!/usr/bin/env bash
CGO_ENABLED=0 GOOS=linux GOARCH=amd64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.linux.amd64
#!/usr/bin/env bash
CGO_ENABLED=1 GOOS=linux GOARCH=amd64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.linux.amd64
- id: "store-linux-amd64"
name: "gcr.io/cloud-builders/gcloud:latest"
waitFor:
- "build-linux-amd64"
script: |
#!/usr/bin/env bash
gcloud storage cp toolbox.linux.amd64 gs://$_BUCKET_NAME/$REF_NAME/linux/amd64/toolbox
#!/usr/bin/env bash
gcloud storage cp toolbox.linux.amd64 gs://$_BUCKET_NAME/$REF_NAME/linux/amd64/toolbox
- id: "build-darwin-arm64"
name: golang:1
waitFor:
name: golang:1-bookworm
waitFor:
- "install-dependencies"
env:
- 'GOPATH=/gopath'
- "GOPATH=/gopath"
volumes:
- name: 'go'
path: '/gopath'
- name: "go"
path: "/gopath"
script: |
#!/usr/bin/env bash
CGO_ENABLED=0 GOOS=darwin GOARCH=arm64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.arm64
#!/usr/bin/env bash
CGO_ENABLED=1 GOOS=darwin GOARCH=arm64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.arm64
- id: "store-darwin-arm64"
name: "gcr.io/cloud-builders/gcloud:latest"
waitFor:
- "build-darwin-arm64"
script: |
#!/usr/bin/env bash
gcloud storage cp toolbox.darwin.arm64 gs://$_BUCKET_NAME/$REF_NAME/darwin/arm64/toolbox
#!/usr/bin/env bash
gcloud storage cp toolbox.darwin.arm64 gs://$_BUCKET_NAME/$REF_NAME/darwin/arm64/toolbox
- id: "build-darwin-amd64"
name: golang:1
waitFor:
name: golang:1-bookworm
waitFor:
- "install-dependencies"
env:
- 'GOPATH=/gopath'
- "GOPATH=/gopath"
volumes:
- name: 'go'
path: '/gopath'
- name: "go"
path: "/gopath"
script: |
#!/usr/bin/env bash
CGO_ENABLED=0 GOOS=darwin GOARCH=amd64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.amd64
#!/usr/bin/env bash
CGO_ENABLED=1 GOOS=darwin GOARCH=amd64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.amd64
- id: "store-darwin-amd64"
name: "gcr.io/cloud-builders/gcloud:latest"
waitFor:
- "build-darwin-amd64"
script: |
#!/usr/bin/env bash
gcloud storage cp toolbox.darwin.amd64 gs://$_BUCKET_NAME/$REF_NAME/darwin/amd64/toolbox
#!/usr/bin/env bash
gcloud storage cp toolbox.darwin.amd64 gs://$_BUCKET_NAME/$REF_NAME/darwin/amd64/toolbox
- id: "build-windows-amd64"
name: golang:1
waitFor:
name: golang:1-bookworm
waitFor:
- "install-dependencies"
env:
- 'GOPATH=/gopath'
- "GOPATH=/gopath"
volumes:
- name: 'go'
path: '/gopath'
- name: "go"
path: "/gopath"
script: |
#!/usr/bin/env bash
CGO_ENABLED=0 GOOS=windows GOARCH=amd64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.windows.amd64
#!/usr/bin/env bash
CGO_ENABLED=1 GOOS=windows GOARCH=amd64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.windows.amd64
- id: "store-windows-amd64"
name: "gcr.io/cloud-builders/gcloud:latest"
waitFor:
- "build-windows-amd64"
script: |
#!/usr/bin/env bash
gcloud storage cp toolbox.windows.amd64 gs://$_BUCKET_NAME/$REF_NAME/windows/amd64/toolbox.exe
#!/usr/bin/env bash
gcloud storage cp toolbox.windows.amd64 gs://$_BUCKET_NAME/$REF_NAME/windows/amd64/toolbox.exe
options:
automapSubstitutions: true
dynamicSubstitutions: true
logging: CLOUD_LOGGING_ONLY # Necessary for custom service account
machineType: 'E2_HIGHCPU_32'
machineType: "E2_HIGHCPU_32"
substitutions:
_REGION: us-central1

View File

@@ -486,8 +486,25 @@ steps:
"Looker" \
looker \
looker
- id: "duckdb"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
volumes:
- name: "go"
path: "/gopath"
secretEnv: ["CLIENT_ID"]
args:
- -c
- |
.ci/test_with_coverage.sh \
"DuckDB" \
duckdb \
duckdb
- id: "alloydbwaitforoperation"
name: golang:1
@@ -507,6 +524,28 @@ steps:
"Alloydb Wait for Operation" \
utility \
utility/alloydbwaitforoperation
- id: "tidb"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "TIDB_DATABASE=$_DATABASE_NAME"
- "TIDB_HOST=$_TIDB_HOST"
- "TIDB_PORT=$_TIDB_PORT"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["CLIENT_ID", "TIDB_USER", "TIDB_PASS"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"TiDB" \
tidb \
tidbsql tidbexecutesql
availableSecrets:
secretManager:
@@ -566,7 +605,10 @@ availableSecrets:
env: LOOKER_CLIENT_ID
- versionName: projects/107716898620/secrets/looker_client_secret/versions/latest
env: LOOKER_CLIENT_SECRET
- versionName: projects/107716898620/secrets/tidb_user/versions/latest
env: TIDB_USER
- versionName: projects/107716898620/secrets/tidb_pass/versions/latest
env: TIDB_PASS
options:
logging: CLOUD_LOGGING_ONLY
@@ -598,4 +640,6 @@ substitutions:
_DGRAPHURL: "https://play.dgraph.io"
_COUCHBASE_BUCKET: "couchbase-bucket"
_COUCHBASE_SCOPE: "couchbase-scope"
_LOOKER_VERIFY_SSL: "true"
_LOOKER_VERIFY_SSL: "true"
_TIDB_HOST: 127.0.0.1
_TIDB_PORT: "4000"

View File

@@ -14,130 +14,130 @@
steps:
- id: "build-docker"
name: "gcr.io/cloud-builders/docker"
waitFor: ['-']
waitFor: ["-"]
script: |
#!/usr/bin/env bash
export VERSION=$(cat ./cmd/version.txt)
docker buildx create --name container-builder --driver docker-container --bootstrap --use
#!/usr/bin/env bash
export VERSION=$(cat ./cmd/version.txt)
docker buildx create --name container-builder --driver docker-container --bootstrap --use
export TAGS="-t ${_DOCKER_URI}:$VERSION"
if [[ $_PUSH_LATEST == 'true' ]]; then
export TAGS="$TAGS -t ${_DOCKER_URI}:latest"
fi
docker buildx build --platform linux/amd64,linux/arm64 --build-arg BUILD_TYPE=container.release --build-arg COMMIT_SHA=$(git rev-parse HEAD) $TAGS --push .
export TAGS="-t ${_DOCKER_URI}:$VERSION"
if [[ $_PUSH_LATEST == 'true' ]]; then
export TAGS="$TAGS -t ${_DOCKER_URI}:latest"
fi
docker buildx build --platform linux/amd64,linux/arm64 --build-arg BUILD_TYPE=container.release --build-arg COMMIT_SHA=$(git rev-parse HEAD) $TAGS --push .
- id: "install-dependencies"
name: golang:1
waitFor: ['-']
env:
- 'GOPATH=/gopath'
waitFor: ["-"]
env:
- "GOPATH=/gopath"
volumes:
- name: 'go'
path: '/gopath'
- name: "go"
path: "/gopath"
script: |
go get -d ./...
go get -d ./...
- id: "build-linux-amd64"
name: golang:1
waitFor:
waitFor:
- "install-dependencies"
env:
- 'GOPATH=/gopath'
env:
- "GOPATH=/gopath"
volumes:
- name: 'go'
path: '/gopath'
- name: "go"
path: "/gopath"
script: |
#!/usr/bin/env bash
export VERSION=$(cat ./cmd/version.txt)
CGO_ENABLED=0 GOOS=linux GOARCH=amd64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.linux.amd64
#!/usr/bin/env bash
export VERSION=$(cat ./cmd/version.txt)
CGO_ENABLED=1 GOOS=linux GOARCH=amd64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.linux.amd64
- id: "store-linux-amd64"
name: "gcr.io/cloud-builders/gcloud:latest"
waitFor:
- "build-linux-amd64"
script: |
#!/usr/bin/env bash
export VERSION=v$(cat ./cmd/version.txt)
gcloud storage cp toolbox.linux.amd64 gs://$_BUCKET_NAME/$VERSION/linux/amd64/toolbox
#!/usr/bin/env bash
export VERSION=v$(cat ./cmd/version.txt)
gcloud storage cp toolbox.linux.amd64 gs://$_BUCKET_NAME/$VERSION/linux/amd64/toolbox
- id: "build-darwin-arm64"
name: golang:1
waitFor:
waitFor:
- "install-dependencies"
env:
- 'GOPATH=/gopath'
env:
- "GOPATH=/gopath"
volumes:
- name: 'go'
path: '/gopath'
- name: "go"
path: "/gopath"
script: |
#!/usr/bin/env bash
export VERSION=$(cat ./cmd/version.txt)
CGO_ENABLED=0 GOOS=darwin GOARCH=arm64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.arm64
#!/usr/bin/env bash
export VERSION=$(cat ./cmd/version.txt)
CGO_ENABLED=1 GOOS=darwin GOARCH=arm64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.arm64
- id: "store-darwin-arm64"
name: "gcr.io/cloud-builders/gcloud:latest"
waitFor:
- "build-darwin-arm64"
script: |
#!/usr/bin/env bash
export VERSION=v$(cat ./cmd/version.txt)
gcloud storage cp toolbox.darwin.arm64 gs://$_BUCKET_NAME/$VERSION/darwin/arm64/toolbox
#!/usr/bin/env bash
export VERSION=v$(cat ./cmd/version.txt)
gcloud storage cp toolbox.darwin.arm64 gs://$_BUCKET_NAME/$VERSION/darwin/arm64/toolbox
- id: "build-darwin-amd64"
name: golang:1
waitFor:
waitFor:
- "install-dependencies"
env:
- 'GOPATH=/gopath'
env:
- "GOPATH=/gopath"
volumes:
- name: 'go'
path: '/gopath'
- name: "go"
path: "/gopath"
script: |
#!/usr/bin/env bash
export VERSION=$(cat ./cmd/version.txt)
CGO_ENABLED=0 GOOS=darwin GOARCH=amd64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.amd64
#!/usr/bin/env bash
export VERSION=$(cat ./cmd/version.txt)
CGO_ENABLED=1 GOOS=darwin GOARCH=amd64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.darwin.amd64
- id: "store-darwin-amd64"
name: "gcr.io/cloud-builders/gcloud:latest"
waitFor:
- "build-darwin-amd64"
script: |
#!/usr/bin/env bash
export VERSION=v$(cat ./cmd/version.txt)
gcloud storage cp toolbox.darwin.amd64 gs://$_BUCKET_NAME/$VERSION/darwin/amd64/toolbox
#!/usr/bin/env bash
export VERSION=v$(cat ./cmd/version.txt)
gcloud storage cp toolbox.darwin.amd64 gs://$_BUCKET_NAME/$VERSION/darwin/amd64/toolbox
- id: "build-windows-amd64"
name: golang:1
waitFor:
waitFor:
- "install-dependencies"
env:
- 'GOPATH=/gopath'
env:
- "GOPATH=/gopath"
volumes:
- name: 'go'
path: '/gopath'
- name: "go"
path: "/gopath"
script: |
#!/usr/bin/env bash
export VERSION=$(cat ./cmd/version.txt)
CGO_ENABLED=0 GOOS=windows GOARCH=amd64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.windows.amd64
#!/usr/bin/env bash
export VERSION=$(cat ./cmd/version.txt)
CGO_ENABLED=1 GOOS=windows GOARCH=amd64 \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=binary -X github.com/googleapis/genai-toolbox/cmd.commitSha=$(git rev-parse HEAD)" -o toolbox.windows.amd64
- id: "store-windows-amd64"
name: "gcr.io/cloud-builders/gcloud:latest"
waitFor:
- "build-windows-amd64"
script: |
#!/usr/bin/env bash
export VERSION=v$(cat ./cmd/version.txt)
gcloud storage cp toolbox.windows.amd64 gs://$_BUCKET_NAME/$VERSION/windows/amd64/toolbox.exe
#!/usr/bin/env bash
export VERSION=v$(cat ./cmd/version.txt)
gcloud storage cp toolbox.windows.amd64 gs://$_BUCKET_NAME/$VERSION/windows/amd64/toolbox.exe
options:
automapSubstitutions: true
dynamicSubstitutions: true
logging: CLOUD_LOGGING_ONLY # Necessary for custom service account
machineType: 'E2_HIGHCPU_32'
machineType: "E2_HIGHCPU_32"
substitutions:
_REGION: us-central1

View File

@@ -1,7 +1,7 @@
assign_issues:
- Yuan325
- duwenxin99
- akitsch
- averikitsch
assign_issues_by:
- labels:
- 'product: bigquery'
@@ -12,4 +12,4 @@ assign_issues_by:
assign_prs:
- Yuan325
- duwenxin99
- akitsch
- averikitsch

View File

@@ -1,5 +1,21 @@
# Changelog
## [0.11.0](https://github.com/googleapis/genai-toolbox/compare/v0.10.0...v0.11.0) (2025-08-04)
### ⚠ BREAKING CHANGES
* **tools/bigquery-sql:** Ensure invoke always returns a non-null value ([#1020](https://github.com/googleapis/genai-toolbox/issues/1020)) ([9af55b6](https://github.com/googleapis/genai-toolbox/commit/9af55b651d836f268eda342ea27380e7c9967c94))
* **tools/bigquery-execute-sql:** Update the return messages ([#1034](https://github.com/googleapis/genai-toolbox/issues/1034)) ([051e686](https://github.com/googleapis/genai-toolbox/commit/051e686476a781ca49f7617764d507916a1188b8))
### Features
* Add DuckDB source and tool ([#879](https://github.com/googleapis/genai-toolbox/pull/879)) ([fd14933](https://github.com/googleapis/genai-toolbox/commit/fd149337e9fa8e912e8699962a7104d51cdffc5d))
* Add TiDB source and tool ([#829](https://github.com/googleapis/genai-toolbox/issues/829)) ([6eaf36a](https://github.com/googleapis/genai-toolbox/commit/6eaf36ac8505d523fa4f5a4ac3c97209fd688cef))
* Interactive web UI for Toolbox ([#1065](https://github.com/googleapis/genai-toolbox/issues/1065)) ([8749b03](https://github.com/googleapis/genai-toolbox/commit/8749b030035e65361047c4ead13dfacb8e9a9b59))
* **tools/looker-query-url:** Add support for `looker-query-url` tool ([#1015](https://github.com/googleapis/genai-toolbox/issues/1015)) ([327ddf0](https://github.com/googleapis/genai-toolbox/commit/327ddf0439058aa5ecd2c7ae8251fcde6aeff18c))
* **tools/dataplex-lookup-entry:** Add support for `dataplex-lookup-entry` tool ([#1009](https://github.com/googleapis/genai-toolbox/issues/1009)) ([5fa1660](https://github.com/googleapis/genai-toolbox/commit/5fa1660fc8631989b4d13abea205b6426bb506a5))
## [0.10.0](https://github.com/googleapis/genai-toolbox/compare/v0.9.0...v0.10.0) (2025-07-25)

View File

@@ -81,7 +81,7 @@ implementation](https://github.com/googleapis/genai-toolbox/blob/main/internal/s
#### 2. Implement the New Tool
We recommend looking at an [example tool
implementation](https://github.com/googleapis/genai-toolbox/tree/main/internal/tools/postgressql).
implementation](https://github.com/googleapis/genai-toolbox/tree/main/internal/tools/postgres/postgressql).
* **Create a new directory** under `internal/tools` for your tool type (e.g.,
`internal/tools/newdb` or `internal/tools/newdb<tool_name>`).
@@ -134,7 +134,7 @@ tools.
5. (Optional) [RunToolInvokeWithTemplateParameters][temp-param]: tests for [template
parameters][temp-param-doc]. Only run this test if template
parameters apply to your tool.
* **Add the new database to the test config** in
[integration.cloudbuild.yaml](.ci/integration.cloudbuild.yaml).

View File

@@ -13,7 +13,7 @@
# limitations under the License.
# Use the latest stable golang 1.x to compile to a binary
FROM --platform=$BUILDPLATFORM golang:1 AS build
FROM --platform=$BUILDPLATFORM golang:1-bookworm AS build
WORKDIR /go/src/genai-toolbox
COPY . .
@@ -23,8 +23,11 @@ ARG TARGETARCH
ARG BUILD_TYPE="container.dev"
ARG COMMIT_SHA=""
RUN apt-get update && \
apt install -y clang && \
rm -rf /var/lib/apt/lists/*
RUN go get ./...
RUN CGO_ENABLED=0 GOOS=${TARGETOS} GOARCH=${TARGETARCH} \
RUN CGO_ENABLED=1 GOOS=${TARGETOS} GOARCH=${TARGETARCH} \
go build -ldflags "-X github.com/googleapis/genai-toolbox/cmd.buildType=container.${BUILD_TYPE} -X github.com/googleapis/genai-toolbox/cmd.commitSha=${COMMIT_SHA}"
# Final Stage
@@ -34,4 +37,4 @@ WORKDIR /app
COPY --from=build --chown=nonroot /go/src/genai-toolbox/genai-toolbox /toolbox
USER nonroot
ENTRYPOINT ["/toolbox"]
ENTRYPOINT ["/toolbox"]

View File

@@ -1,20 +0,0 @@
load("//tools/build_defs/go:go_library.bzl", "go_library")
load("//tools/build_defs/go:go_test.bzl", "go_test")
go_library(
name = "cmd",
srcs = [
"options.go",
"root.go",
],
embedsrcs = ["version.txt"],
)
go_test(
name = "cmd_test",
srcs = [
"options_test.go",
"root_test.go",
],
library = ":cmd",
)

View File

@@ -51,8 +51,10 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/bigquery/bigquerysql"
_ "github.com/googleapis/genai-toolbox/internal/tools/bigtable"
_ "github.com/googleapis/genai-toolbox/internal/tools/couchbase"
_ "github.com/googleapis/genai-toolbox/internal/tools/dataplex/dataplexlookupentry"
_ "github.com/googleapis/genai-toolbox/internal/tools/dataplex/dataplexsearchentries"
_ "github.com/googleapis/genai-toolbox/internal/tools/dgraph"
_ "github.com/googleapis/genai-toolbox/internal/tools/duckdbsql"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoredeletedocuments"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoregetdocuments"
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestoregetrules"
@@ -69,6 +71,7 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookergetparameters"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerquery"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerquerysql"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerqueryurl"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerrunlook"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbaggregate"
_ "github.com/googleapis/genai-toolbox/internal/tools/mongodb/mongodbdeletemany"
@@ -92,6 +95,8 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannerexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/spanner/spannersql"
_ "github.com/googleapis/genai-toolbox/internal/tools/sqlitesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/tidb/tidbexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/tidb/tidbsql"
_ "github.com/googleapis/genai-toolbox/internal/tools/utility/alloydbwaitforoperation"
_ "github.com/googleapis/genai-toolbox/internal/tools/utility/wait"
_ "github.com/googleapis/genai-toolbox/internal/tools/valkey"
@@ -107,6 +112,7 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/sources/couchbase"
_ "github.com/googleapis/genai-toolbox/internal/sources/dataplex"
_ "github.com/googleapis/genai-toolbox/internal/sources/dgraph"
_ "github.com/googleapis/genai-toolbox/internal/sources/duckdb"
_ "github.com/googleapis/genai-toolbox/internal/sources/firestore"
_ "github.com/googleapis/genai-toolbox/internal/sources/http"
_ "github.com/googleapis/genai-toolbox/internal/sources/looker"
@@ -118,6 +124,7 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/sources/redis"
_ "github.com/googleapis/genai-toolbox/internal/sources/spanner"
_ "github.com/googleapis/genai-toolbox/internal/sources/sqlite"
_ "github.com/googleapis/genai-toolbox/internal/sources/tidb"
_ "github.com/googleapis/genai-toolbox/internal/sources/valkey"
)
@@ -216,9 +223,10 @@ func NewCommand(opts ...Option) *Command {
flags.BoolVar(&cmd.cfg.TelemetryGCP, "telemetry-gcp", false, "Enable exporting directly to Google Cloud Monitoring.")
flags.StringVar(&cmd.cfg.TelemetryOTLP, "telemetry-otlp", "", "Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. 'http://127.0.0.1:4318')")
flags.StringVar(&cmd.cfg.TelemetryServiceName, "telemetry-service-name", "toolbox", "Sets the value of the service.name resource attribute for telemetry data.")
flags.StringVar(&cmd.prebuiltConfig, "prebuilt", "", "Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. Allowed: 'alloydb-postgres-admin', alloydb-postgres', 'bigquery', 'cloud-sql-mysql', 'cloud-sql-postgres', 'cloud-sql-mssql', 'dataplex', 'firestore', 'mssql', 'mysql', 'postgres', 'spanner', 'spanner-postgres'.")
flags.StringVar(&cmd.prebuiltConfig, "prebuilt", "", "Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. Allowed: 'alloydb-postgres-admin', alloydb-postgres', 'bigquery', 'cloud-sql-mysql', 'cloud-sql-postgres', 'cloud-sql-mssql', 'dataplex', 'firestore', 'looker', 'mssql', 'mysql', 'postgres', 'spanner', 'spanner-postgres'.")
flags.BoolVar(&cmd.cfg.Stdio, "stdio", false, "Listens via MCP STDIO instead of acting as a remote HTTP server.")
flags.BoolVar(&cmd.cfg.DisableReload, "disable-reload", false, "Disables dynamic reloading of tools file.")
flags.BoolVar(&cmd.cfg.UI, "ui", false, "Launches the Toolbox UI web server.")
// wrap RunE command so that we have access to original Command object
cmd.RunE = func(*cobra.Command, []string) error { return run(cmd) }
@@ -795,6 +803,9 @@ func run(cmd *Command) error {
return errMsg
}
cmd.logger.InfoContext(ctx, "Server ready to serve!")
if cmd.cfg.UI {
cmd.logger.InfoContext(ctx, "Toolbox UI is up and running at: http://localhost:5000/ui")
}
go func() {
defer close(srvErr)

View File

@@ -1250,7 +1250,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"dataplex-tools": tools.ToolsetConfig{
Name: "dataplex-tools",
ToolNames: []string{"dataplex_search_entries"},
ToolNames: []string{"dataplex_search_entries", "dataplex_lookup_entry"},
},
},
},
@@ -1290,7 +1290,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"looker-tools": tools.ToolsetConfig{
Name: "looker-tools",
ToolNames: []string{"get_models", "get_explores", "get_dimensions", "get_measures", "get_filters", "get_parameters", "query", "query_sql", "get_looks", "run_look"},
ToolNames: []string{"get_models", "get_explores", "get_dimensions", "get_measures", "get_filters", "get_parameters", "query", "query_sql", "query_url", "get_looks", "run_look"},
},
},
},

View File

@@ -1 +1 @@
0.10.0
0.11.0

View File

@@ -4,12 +4,12 @@ type: docs
notoc: false
weight: 1
description: >
All of Toolbox's documentation.
All of Toolbox's documentation.
---
<html>
<head>
<link rel="canonical" href="getting-started/introduction/"/>
<meta http-equiv="refresh" content="0;url=getting-started/introduction"/>
<meta http-equiv="refresh" content="0;url=getting-started/introduction/"/>
</head>
</html>

View File

@@ -3,7 +3,7 @@ title: "Telemetry"
type: docs
weight: 2
description: >
An overview of telemetry and observability in Toolbox.
An overview of telemetry and observability in Toolbox.
---
## About
@@ -158,7 +158,7 @@ enabled:
- [Cloud Logging API](https://cloud.google.com/logging/docs/api/enable-api)
- [Cloud Monitoring API](https://cloud.google.com/monitoring/api/enable-api)
- [Cloud Trace API](https://cloud.google.com/apis/enableflow?apiid=cloudtrace.googleapis.com)
- [Cloud Trace API](https://console.cloud.google.com/apis/enableflow?apiid=cloudtrace.googleapis.com)
{{< /notice >}}
#### OTLP Exporter
@@ -177,7 +177,7 @@ It receives telemetry data, transforms it, and then exports data to backends
that can store it permanently. Toolbox provide an option to export telemetry
data to user's choice of backend(s) that are compatible with the Open Telemetry
Protocol (OTLP). If you would like to use a collector, please refer to this
[Export Telemetry using the Otel Collector](../how-to/export_telemetry.md).
[Export Telemetry using the Otel Collector](../../how-to/export_telemetry.md).
### Flags

View File

@@ -234,7 +234,7 @@
},
"outputs": [],
"source": [
"version = \"0.10.0\" # x-release-please-version\n",
"version = \"0.11.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -136,6 +136,15 @@ Toolbox enables dynamic reloading by default. To disable, use the
`--disable-reload` flag.
{{< /notice >}}
#### Launching Toolbox UI
To launch Toolbox's interactive UI, use the `--ui` flag. This allows you to test tools and toolsets
with features such as authorized parameters. To learn more, visit [Toolbox UI](../../how-to/use-toolbox-ui/index.md).
```sh
./toolbox --ui
```
#### Homebrew Users
If you installed Toolbox using Homebrew, the `toolbox` binary is available in your system path. You can start the server with the same command:
@@ -148,7 +157,7 @@ You can use `toolbox help` for a full list of flags! To stop the server, send a
terminate signal (`ctrl+c` on most platforms).
For more detailed documentation on deploying to different environments, check
out the resources in the [How-to section](../../how-to/_index.md)
out the resources in the [How-to section](../../how-to/)
### Integrating your application
@@ -321,7 +330,7 @@ const toolboxTools = await client.loadToolset('toolsetName');
const getTool = (toolboxTool) => tool({
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
parameters: toolboxTool.getParams(),
parameters: toolboxTool.getParamSchema(),
execute: toolboxTool
});;

Binary image file changed (not shown): 154 KiB before, 76 KiB after.

View File

@@ -137,7 +137,7 @@ postgres` and a password next time.
```sql
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
VALUES
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),

View File

@@ -133,7 +133,7 @@ postgres` and a password next time.
```sql
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
VALUES
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
@@ -466,14 +466,14 @@ async function run() {
}
})
);
conversationHistory.push(...toolResponses);
// Call the AI again with the tool results.
response = await ai.generate({ messages: conversationHistory, tools });
conversationHistory.push(response.message);
}
console.log(response.text);
}
}

View File

@@ -71,7 +71,7 @@ access by our agent, and create a database user for Toolbox to connect with.
```sql
INSERT INTO hotels(id, name, location, price_tier, checkin_date, checkout_date, booked)
VALUES
VALUES
(1, 'Hilton Basel', 'Basel', 'Luxury', '2024-04-22', '2024-04-20', B'0'),
(2, 'Marriott Zurich', 'Zurich', 'Upscale', '2024-04-14', '2024-04-21', B'0'),
(3, 'Hyatt Regency Basel', 'Basel', 'Upper Upscale', '2024-04-02', '2024-04-20', B'0'),
@@ -200,7 +200,7 @@ In this section, we will download Toolbox, configure our tools in a
```
For more info on tools, check out the
[Tools](../../resources/tools/_index.md) section.
[Tools](../../resources/tools/) section.
1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:

View File

@@ -6,337 +6,9 @@ description: >
Create your AlloyDB database with MCP Toolbox.
---
This guide covers how to use [MCP Toolbox for Databases][toolbox] to create
AlloyDB clusters and instances from IDE enabling their E2E journey.
- [Cursor][cursor]
- [Windsurf][windsurf] (Codium)
- [Visual Studio Code][vscode] (Copilot)
- [Cline][cline] (VS Code extension)
- [Claude desktop][claudedesktop]
- [Claude code][claudecode]
- [Gemini CLI][geminicli]
- [Gemini Code Assist][geminicodeassist]
[toolbox]: https://github.com/googleapis/genai-toolbox
[cursor]: #configure-your-mcp-client
[windsurf]: #configure-your-mcp-client
[vscode]: #configure-your-mcp-client
[cline]: #configure-your-mcp-client
[claudedesktop]: #configure-your-mcp-client
[claudecode]: #configure-your-mcp-client
[geminicli]: #configure-your-mcp-client
[geminicodeassist]: #configure-your-mcp-client
## Before you begin
1. In the Google Cloud console, on the [project selector
page](https://console.cloud.google.com/projectselector2/home/dashboard),
select or create a Google Cloud project.
1. [Make sure that billing is enabled for your Google Cloud
project](https://cloud.google.com/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).
## Install MCP Toolbox
1. Download the latest version of Toolbox as a binary. Select the [correct
binary](https://github.com/googleapis/genai-toolbox/releases) corresponding
to your OS and CPU architecture. You are required to use Toolbox version
V0.10.0+:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
1. Verify the installation:
```bash
./toolbox --version
```
## Configure your MCP Client
{{< tabpane text=true >}}
{{% tab header="Claude code" lang="en" %}}
1. Install [Claude
Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview).
1. Create a `.mcp.json` file in your project root if it doesn't exist.
1. Generate Access token to be used as API_KEY using `gcloud auth
print-access-token`.
> **Note:** The lifetime of token is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
1. Restart Claude code to apply the new configuration.
{{% /tab %}}
{{% tab header="Claude desktop" lang="en" %}}
1. Open [Claude desktop](https://claude.ai/download) and navigate to Settings.
1. Under the Developer tab, tap Edit Config to open the configuration file.
1. Generate Access token to be used as API_KEY using `gcloud auth
print-access-token`.
> **Note:** The lifetime of token is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
1. Restart Claude desktop.
1. From the new chat screen, you should see a hammer (MCP) icon appear with the
new MCP server available.
{{% /tab %}}
{{% tab header="Cline" lang="en" %}}
1. Open the [Cline](https://github.com/cline/cline) extension in VS Code and tap
the **MCP Servers** icon.
1. Tap Configure MCP Servers to open the configuration file.
1. Generate Access token to be used as API_KEY using `gcloud auth
print-access-token`.
> **Note:** The lifetime of token is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
1. You should see a green active status after the server is successfully
connected.
{{% /tab %}}
{{% tab header="Cursor" lang="en" %}}
1. Create a `.cursor` directory in your project root if it doesn't exist.
1. Create a `.cursor/mcp.json` file if it doesn't exist and open it.
1. Generate Access token to be used as API_KEY using `gcloud auth
print-access-token`.
> **Note:** The lifetime of token is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
1. [Cursor](https://www.cursor.com/) and navigate to **Settings > Cursor
Settings > MCP**. You should see a green active status after the server is
successfully connected.
{{% /tab %}}
{{% tab header="Visual Studio Code (Copilot)" lang="en" %}}
1. Open [VS Code](https://code.visualstudio.com/docs/copilot/overview) and
create a `.vscode` directory in your project root if it doesn't exist.
1. Create a `.vscode/mcp.json` file if it doesn't exist and open it.
1. Generate Access token to be used as API_KEY using `gcloud auth
print-access-token`.
> **Note:** The lifetime of token is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Windsurf" lang="en" %}}
1. Open [Windsurf](https://docs.codeium.com/windsurf) and navigate to the
Cascade assistant.
1. Tap on the hammer (MCP) icon, then Configure to open the configuration file.
1. Generate Access token to be used as API_KEY using `gcloud auth
print-access-token`.
> **Note:** The lifetime of token is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini CLI" lang="en" %}}
1. Install the [Gemini
CLI](https://github.com/google-gemini/gemini-cli?tab=readme-ov-file#quickstart).
1. In your working directory, create a folder named `.gemini`. Within it, create
a `settings.json` file.
1. Generate Access token to be used as API_KEY using `gcloud auth print-access-token`.
> **Note:** The lifetime of token is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
{{% /tab %}}
{{% tab header="Gemini Code Assist" lang="en" %}}
1. Install the [Gemini Code
Assist](https://marketplace.visualstudio.com/items?itemName=Google.geminicodeassist)
extension in Visual Studio Code.
1. Enable Agent Mode in Gemini Code Assist chat.
1. In your working directory, create a folder named `.gemini`. Within it, create
a `settings.json` file.
1. Generate Access token to be used as API_KEY using `gcloud auth print-access-token`.
> **Note:** The lifetime of token is 1 hour.
1. Add the following configuration, replace the environment variables with your
values, and save:
```json
{
"mcpServers": {
"alloydb-admin": {
"command": "./PATH/TO/toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"],
"env": {
"API_KEY": "your-api-key"
}
}
}
}
```
{{% /tab %}}
{{< /tabpane >}}
## Use Tools
Your AI tool is now connected to AlloyDB using MCP. Try asking your AI assistant
to create a database, cluster or instance.
The following tools are available to the LLM:
1. **alloydb-create-cluster**: creates alloydb cluster
1. **alloydb-create-instance**: creates alloydb instance (PRIMARY, READ_POOL or SECONDARY)
1. **alloydb-get-operation**: polls on operations API until the operation is done.
{{< notice note >}}
Prebuilt tools are pre-1.0, so expect some tool changes between versions. LLMs
will adapt to the tools available, so this shouldn't affect most users.
{{< /notice >}}
## Connect to your Data
After setting up an AlloyDB cluster and instance, you can [connect your IDE to
the
database](https://cloud.google.com/alloydb/docs/pre-built-tools-with-mcp-toolbox).
<html>
<head>
<link rel="canonical" href="https://cloud.google.com/alloydb/docs/create-database-with-mcp-toolbox"/>
<meta http-equiv="refresh" content="0;url=https://cloud.google.com/alloydb/docs/create-database-with-mcp-toolbox"/>
</head>
</html>

View File

@@ -7,7 +7,7 @@ description: >
---
<html>
<head>
<link rel="canonical" href="https://cloud.google.com/alloydb/docs/pre-built-tools-with-mcp-toolbox"/>
<meta http-equiv="refresh" content="0;url=https://cloud.google.com/alloydb/docs/pre-built-tools-with-mcp-toolbox"/>
<link rel="canonical" href="https://cloud.google.com/alloydb/docs/connect-ide-using-mcp-toolbox"/>
<meta http-equiv="refresh" content="0;url=https://cloud.google.com/alloydb/docs/connect-ide-using-mcp-toolbox"/>
</head>
</html>

View File

@@ -44,7 +44,7 @@ to expose your developer assistant tools to a Looker instance:
v0.10.0+:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.10.0/linux/amd64/toolbox
{{< /tab >}}
@@ -265,6 +265,7 @@ The following tools are available to the LLM:
1. **get_parameters**: list the parameters in a given explore
1. **query**: Run a query
1. **query_sql**: Return the SQL generated by Looker for a query
1. **query_url**: Return a link to the query in Looker for further exploration
1. **get_looks**: Return the saved Looks that match a title or description
1. **run_look**: Run a saved Look and return the data

View File

@@ -29,8 +29,8 @@ Toolbox currently supports the following versions of MCP specification:
The auth implementation in Toolbox is not supported in MCP's auth specification.
This includes:
* [Authenticated Parameters](../resources/tools/_index.md#authenticated-parameters)
* [Authorized Invocations](../resources/tools/_index.md#authorized-invocations)
* [Authenticated Parameters](../resources/tools/#authenticated-parameters)
* [Authorized Invocations](../resources/tools/#authorized-invocations)
## Connecting to Toolbox with an MCP client
@@ -40,7 +40,7 @@ This includes:
MCP is only compatible with Toolbox version 0.3.0 and above.
{{< /notice >}}
1. [Install](../getting-started/introduction/_index.md#installing-the-server)
1. [Install](../getting-started/introduction/#installing-the-server)
Toolbox version 0.3.0+.
1. Make sure you've set up and initialized your database.
@@ -133,7 +133,7 @@ testing and debugging Toolbox server.
You should be able to inspect your toolbox tools!
{{% /tab %}}
{{% tab header="HTTP with SSE (deprecated)" lang="en" %}}
1. [Run Toolbox](../getting-started/introduction/_index.md#running-the-server).
1. [Run Toolbox](../getting-started/introduction/#running-the-server).
1. In a separate terminal, run Inspector directly through `npx`:
@@ -150,7 +150,7 @@ testing and debugging Toolbox server.
tools!
{{% /tab %}}
{{% tab header="Streamable HTTP" lang="en" %}}
1. [Run Toolbox](../getting-started/introduction/_index.md#running-the-server).
1. [Run Toolbox](../getting-started/introduction/#running-the-server).
1. In a separate terminal, run Inspector directly through `npx`:

View File

@@ -79,7 +79,7 @@ database are in the same VPC network.
Create a `tools.yaml` file that contains your configuration for Toolbox. For
details, see the
[configuration](https://googleapis.github.io/genai-toolbox/resources/sources/)
[configuration](../resources/sources/)
section.
## Deploy to Cloud Run
@@ -125,7 +125,7 @@ section.
--region us-central1 \
--set-secrets "/app/tools.yaml=tools:latest" \
--args="--tools-file=/app/tools.yaml","--address=0.0.0.0","--port=8080" \
# TODO(dev): update the following to match your VPC if necessary
# TODO(dev): update the following to match your VPC if necessary
--network default \
--subnet default
# --allow-unauthenticated # https://cloud.google.com/run/docs/authenticating/public#gcloud

Binary image files added (not shown): 36 MiB and 298 KiB.

View File

@@ -0,0 +1,106 @@
---
title: "Toolbox UI"
type: docs
weight: 1
description: >
How to effectively use Toolbox UI.
---
Toolbox UI is a built-in web interface that allows users to visually inspect and test out configured resources such as tools and toolsets.
## Launching Toolbox UI
To launch Toolbox's interactive UI, use the `--ui` flag.
```sh
./toolbox --ui
```
Toolbox UI will be served from the same host and port as the Toolbox Server, with the `/ui` suffix. Once Toolbox
is launched, the following INFO log with the Toolbox UI's URL will be shown:
```bash
INFO "Toolbox UI is up and running at: http://localhost:5000/ui"
```
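Because the UI shares the server's host and port, the standard server flags also determine where the UI is served. A minimal sketch, assuming the `--tools-file`, `--address`, and `--port` flags used elsewhere in these docs:
```sh
# Serve Toolbox and its UI on a custom address and port
./toolbox --tools-file tools.yaml --address 0.0.0.0 --port 8080 --ui
# The UI would then be expected at http://0.0.0.0:8080/ui
```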
## Navigating the Tools Page
The tools page shows all tools loaded from your configuration file. This corresponds to the default toolset (represented by an empty string). Each tool's name on this page will exactly match its name in the configuration
file.
To view details for a specific tool, click on the tool name. The main content area will be populated
with the tool name, description, and available parameters.
![Tools Page](./tools.png)
### Invoking a Tool
1. Click on a Tool
2. Enter appropriate parameters in each parameter field
3. Click "Run Tool"
4. Done! Your results will appear in the response field
5. (Optional) Uncheck "Prettify JSON" to format the response as plain text
![Run Tool Demo GIF](./run-tool.gif)
### Optional Parameters
Toolbox allows users to add [optional parameters](../../resources/tools/#basic-parameters) with or without a default value.
To exclude a parameter, uncheck the box to the right of that parameter, and it will not be
included in the request body. If the parameter is not sent, Toolbox will either treat it as a `nil` value or use the `default` value, if one is configured. If the parameter is required, Toolbox will throw an error.
When the box is checked, the parameter will be sent exactly as entered in its input field (e.g. an empty string).
![Optional Parameter checked example](./optional-param-checked.png)
![Optional Parameter unchecked example](./optional-param-unchecked.png)
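As a concrete sketch (the `name` parameter here is hypothetical): with the box checked and the field left empty, the request body still carries the key as an empty string:
```json
{
  "name": ""
}
```
With the box unchecked, the key is omitted from the request body entirely, and Toolbox falls back to the `default` value (if configured) or a `nil` value, as described above.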
### Editing Headers
To edit headers, press the "Edit Headers" button to display the header modal. Within this modal,
users can make direct edits by typing into the header's text area.
Toolbox UI validates that the headers are in correct JSON format. Other header-related errors (e.g.,
incorrect header names or values required by the tool) will be reported in the Response section
after running the tool.
![Edit Headers](./edit-headers.png)
#### Google OAuth
Currently, Toolbox supports Google OAuth 2.0 as an AuthService, which allows tools to use
authenticated parameters. When a tool uses an authenticated parameter, the parameter will be displayed
but not editable, as it will be populated from the authentication token.
To provide the token, add your Google OAuth ID Token to the request header using the "Edit Headers"
button and modal described above. The key should be the name of your AuthService as defined in
your tool configuration file, suffixed with `_token`. The value should be your ID token as a string.
1. Select a tool that requires [authenticated parameters](../../resources/tools/#authenticated-parameters)
2. The auth parameter's text field is greyed out. This is because it cannot be entered manually and will
be parsed from the resolved auth token
3. To update the request headers with the token, select "Edit Headers"
4. Check out the dropdown "How to extract Google OAuth ID Token manually" for guidance on retrieving your ID token
5. Paste your ID token into the request header
6. Click "Save"
7. Click "Run Tool"
```json
{
"Content-Type": "application/json",
"my-google-auth_token": "YOUR_ID_TOKEN_HERE"
}
```
![Using Authenticated Parameter GIF](./edit-headers.gif)
## Navigating the Toolsets Page
Through the toolsets page, users can search for a specific toolset to retrieve tools from. Simply
enter the toolset name in the search bar, and press "Enter" to retrieve the associated tools.
If the toolset name is not defined within the tools configuration file, an error message will be
displayed.
![Toolsets Page](./toolsets.png)
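The toolset names searched on this page are the ones declared under the top-level `toolsets` section of the tools configuration file. A minimal sketch, with hypothetical tool and toolset names:
```yaml
toolsets:
  my_toolset:
    - search-users
    - lookup_entry
```
Searching for `my_toolset` would then return those two tools.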

Binary image files added (not shown): 58 KiB, 59 KiB, 5.4 MiB, 269 KiB, and 136 KiB.

View File

@@ -3,11 +3,11 @@ title: "AuthServices"
type: docs
weight: 1
description: >
AuthServices represent services that handle authentication and authorization.
AuthServices represent services that handle authentication and authorization.
---
AuthServices represent services that handle authentication and authorization. It
can primarily be used by [Tools](../tools) in two different ways:
can primarily be used by [Tools](../tools/) in two different ways:
- [**Authorized Invocation**][auth-invoke] is when a tool
is validated by the auth service before the call can be invoked. Toolbox

View File

@@ -3,7 +3,7 @@ title: "Google Sign-In"
type: docs
weight: 1
description: >
Use Google Sign-In for Oauth 2.0 flow and token lifecycle.
Use Google Sign-In for Oauth 2.0 flow and token lifecycle.
---
## Getting Started

View File

@@ -35,10 +35,213 @@ You can use the following system prompt as "Custom Instructions" in your client
application.
```
Whenever you receive a response from the dataplex_search_entries tool, decide what to do by following these steps:
# Objective
Your primary objective is to help discover, organize and manage metadata related to data assets.
# Tone and Style
1. Adopt the persona of a senior subject matter expert
2. Your communication style must be:
1. Concise: Always favor brevity.
2. Direct: Avoid greetings (e.g., "Hi there!", "Certainly!"). Get straight to the point.
Example (Incorrect): Hi there! I see that you are looking for...
Example (Correct): This problem likely stems from...
3. Do not reiterate or summarize the question in the answer.
4. Crucially, always convey a tone of uncertainty and caution. Since you are interpreting metadata and have no way to externally verify your answers, never express complete confidence. Frame your responses as interpretations based solely on the provided metadata. Use a suggestive tone, not a prescriptive one:
Example (Correct): "The entry describes..."
Example (Correct): "According to catalog,..."
Example (Correct): "Based on the metadata,..."
Example (Correct): "Based on the search results,..."
5. Do not make assumptions
# Data Model
## Entries
Entry represents a specific data asset. Entry acts as a metadata record for something that is managed by Catalog, such as:
- A BigQuery table or dataset
- A Cloud Storage bucket or folder
- An on-premises SQL table
## Aspects
While the Entry itself is a container, the rich descriptive information about the asset (e.g., schema, data types, business descriptions, classifications) is stored in associated components called Aspects. Aspects are created based on pre-defined blueprints known as Aspect Types.
## Aspect Types
Aspect Type is a reusable template that defines the schema for a set of metadata fields. Think of an Aspect Type as a structure for the kind of metadata that is organized in the catalog within the Entry.
Examples:
- projects/dataplex-types/locations/global/aspectTypes/analytics-hub-exchange
- projects/dataplex-types/locations/global/aspectTypes/analytics-hub
- projects/dataplex-types/locations/global/aspectTypes/analytics-hub-listing
- projects/dataplex-types/locations/global/aspectTypes/bigquery-connection
- projects/dataplex-types/locations/global/aspectTypes/bigquery-data-policy
- projects/dataplex-types/locations/global/aspectTypes/bigquery-dataset
- projects/dataplex-types/locations/global/aspectTypes/bigquery-model
- projects/dataplex-types/locations/global/aspectTypes/bigquery-policy
- projects/dataplex-types/locations/global/aspectTypes/bigquery-routine
- projects/dataplex-types/locations/global/aspectTypes/bigquery-row-access-policy
- projects/dataplex-types/locations/global/aspectTypes/bigquery-table
- projects/dataplex-types/locations/global/aspectTypes/bigquery-view
- projects/dataplex-types/locations/global/aspectTypes/cloud-bigtable-instance
- projects/dataplex-types/locations/global/aspectTypes/cloud-bigtable-table
- projects/dataplex-types/locations/global/aspectTypes/cloud-spanner-database
- projects/dataplex-types/locations/global/aspectTypes/cloud-spanner-instance
- projects/dataplex-types/locations/global/aspectTypes/cloud-spanner-table
- projects/dataplex-types/locations/global/aspectTypes/cloud-spanner-view
- projects/dataplex-types/locations/global/aspectTypes/cloudsql-database
- projects/dataplex-types/locations/global/aspectTypes/cloudsql-instance
- projects/dataplex-types/locations/global/aspectTypes/cloudsql-schema
- projects/dataplex-types/locations/global/aspectTypes/cloudsql-table
- projects/dataplex-types/locations/global/aspectTypes/cloudsql-view
- projects/dataplex-types/locations/global/aspectTypes/contacts
- projects/dataplex-types/locations/global/aspectTypes/dataform-code-asset
- projects/dataplex-types/locations/global/aspectTypes/dataform-repository
- projects/dataplex-types/locations/global/aspectTypes/dataform-workspace
- projects/dataplex-types/locations/global/aspectTypes/dataproc-metastore-database
- projects/dataplex-types/locations/global/aspectTypes/dataproc-metastore-service
- projects/dataplex-types/locations/global/aspectTypes/dataproc-metastore-table
- projects/dataplex-types/locations/global/aspectTypes/data-product
- projects/dataplex-types/locations/global/aspectTypes/data-quality-scorecard
- projects/dataplex-types/locations/global/aspectTypes/external-connection
- projects/dataplex-types/locations/global/aspectTypes/overview
- projects/dataplex-types/locations/global/aspectTypes/pubsub-topic
- projects/dataplex-types/locations/global/aspectTypes/schema
- projects/dataplex-types/locations/global/aspectTypes/sensitive-data-protection-job-result
- projects/dataplex-types/locations/global/aspectTypes/sensitive-data-protection-profile
- projects/dataplex-types/locations/global/aspectTypes/sql-access
- projects/dataplex-types/locations/global/aspectTypes/storage-bucket
- projects/dataplex-types/locations/global/aspectTypes/storage-folder
- projects/dataplex-types/locations/global/aspectTypes/storage
- projects/dataplex-types/locations/global/aspectTypes/usage
## Entry Types
Every Entry must conform to an Entry Type. The Entry Type acts as a template, defining the structure, required aspects, and constraints for Entries of that type.
Examples:
- projects/dataplex-types/locations/global/entryTypes/analytics-hub-exchange
- projects/dataplex-types/locations/global/entryTypes/analytics-hub-listing
- projects/dataplex-types/locations/global/entryTypes/bigquery-connection
- projects/dataplex-types/locations/global/entryTypes/bigquery-data-policy
- projects/dataplex-types/locations/global/entryTypes/bigquery-dataset
- projects/dataplex-types/locations/global/entryTypes/bigquery-model
- projects/dataplex-types/locations/global/entryTypes/bigquery-routine
- projects/dataplex-types/locations/global/entryTypes/bigquery-row-access-policy
- projects/dataplex-types/locations/global/entryTypes/bigquery-table
- projects/dataplex-types/locations/global/entryTypes/bigquery-view
- projects/dataplex-types/locations/global/entryTypes/cloud-bigtable-instance
- projects/dataplex-types/locations/global/entryTypes/cloud-bigtable-table
- projects/dataplex-types/locations/global/entryTypes/cloud-spanner-database
- projects/dataplex-types/locations/global/entryTypes/cloud-spanner-instance
- projects/dataplex-types/locations/global/entryTypes/cloud-spanner-table
- projects/dataplex-types/locations/global/entryTypes/cloud-spanner-view
- projects/dataplex-types/locations/global/entryTypes/cloudsql-mysql-database
- projects/dataplex-types/locations/global/entryTypes/cloudsql-mysql-instance
- projects/dataplex-types/locations/global/entryTypes/cloudsql-mysql-table
- projects/dataplex-types/locations/global/entryTypes/cloudsql-mysql-view
- projects/dataplex-types/locations/global/entryTypes/cloudsql-postgresql-database
- projects/dataplex-types/locations/global/entryTypes/cloudsql-postgresql-instance
- projects/dataplex-types/locations/global/entryTypes/cloudsql-postgresql-schema
- projects/dataplex-types/locations/global/entryTypes/cloudsql-postgresql-table
- projects/dataplex-types/locations/global/entryTypes/cloudsql-postgresql-view
- projects/dataplex-types/locations/global/entryTypes/cloudsql-sqlserver-database
- projects/dataplex-types/locations/global/entryTypes/cloudsql-sqlserver-instance
- projects/dataplex-types/locations/global/entryTypes/cloudsql-sqlserver-schema
- projects/dataplex-types/locations/global/entryTypes/cloudsql-sqlserver-table
- projects/dataplex-types/locations/global/entryTypes/cloudsql-sqlserver-view
- projects/dataplex-types/locations/global/entryTypes/dataform-code-asset
- projects/dataplex-types/locations/global/entryTypes/dataform-repository
- projects/dataplex-types/locations/global/entryTypes/dataform-workspace
- projects/dataplex-types/locations/global/entryTypes/dataproc-metastore-database
- projects/dataplex-types/locations/global/entryTypes/dataproc-metastore-service
- projects/dataplex-types/locations/global/entryTypes/dataproc-metastore-table
- projects/dataplex-types/locations/global/entryTypes/pubsub-topic
- projects/dataplex-types/locations/global/entryTypes/storage-bucket
- projects/dataplex-types/locations/global/entryTypes/storage-folder
- projects/dataplex-types/locations/global/entryTypes/vertexai-dataset
- projects/dataplex-types/locations/global/entryTypes/vertexai-feature-group
- projects/dataplex-types/locations/global/entryTypes/vertexai-feature-online-store
## Entry Groups
Entries are organized within Entry Groups, which are logical groupings of Entries. An Entry Group acts as a namespace for its Entries.
## Entry Links
Entries can be linked together using EntryLinks to represent relationships between data assets (e.g. foreign keys).
# Tool instructions
## Tool: dataplex_search_entries
## General
- Do not try to search within search results on your own.
- Do not fetch multiple pages of results unless explicitly asked.
## Search syntax
### Simple search
In its simplest form, a search query consists of a single predicate. Such a predicate can match several pieces of metadata:
- A substring of a name, display name, or description of a resource
- A substring of the type of a resource
- A substring of a column name (or nested column name) in the schema of a resource
- A substring of a project ID
- A string from an overview description
For example, the predicate foo matches the following resources:
- Resource with the name foo.bar
- Resource with the display name Foo Bar
- Resource with the description This is the foo script
- Resource with the exact type foo
- Column foo_bar in the schema of a resource
- Nested column foo_bar in the schema of a resource
- Project prod-foo-bar
- Resource with an overview containing the word foo
### Qualified predicates
You can qualify a predicate by prefixing it with a key that restricts the matching to a specific piece of metadata:
- An equal sign (=) restricts the search to an exact match.
- A colon (:) after the key matches the predicate to either a substring or a token within the value in the search results.
Tokenization splits the stream of text into a series of tokens, with each token usually corresponding to a single word. For example:
- name:foo selects resources with names that contain the foo substring, like foo1 and barfoo.
- description:foo selects resources with the foo token in the description, like bar and foo.
- location=foo matches resources in a specified location with foo as the location name.
The predicate keys type, system, location, and orgid support only the exact match (=) qualifier, not the substring qualifier (:). For example, type=foo or orgid=number.
Search syntax supports the following qualifiers:
- "name:x" - Matches x as a substring of the resource ID.
- "displayname:x" - Match x as a substring of the resource display name.
- "column:x" - Matches x as a substring of the column name (or nested column name) in the schema of the resource.
- "description:x" - Matches x as a token in the resource description.
- "label:bar" - Matches BigQuery resources that have a label (with some value) and the label key has bar as a substring.
- "label=bar" - Matches BigQuery resources that have a label (with some value) and the label key equals bar as a string.
- "label:bar:x" - Matches x as a substring in the value of a label with a key bar attached to a BigQuery resource.
- "label=foo:bar" - Matches BigQuery resources where the key equals foo and the key value equals bar.
- "label.foo=bar" - Matches BigQuery resources where the key equals foo and the key value equals bar.
- "label.foo" - Matches BigQuery resources that have a label whose key equals foo as a string.
- "type=TYPE" - Matches resources of a specific entry type or its type alias.
- "projectid:bar" - Matches resources within Google Cloud projects that match bar as a substring in the ID.
- "parent:x" - Matches x as a substring of the hierarchical path of a resource. The parent path is a fully_qualified_name of the parent resource.
- "orgid=number" - Matches resources within a Google Cloud organization with the exact ID value of the number.
- "system=SYSTEM" - Matches resources from a specified system. For example, system=bigquery matches BigQuery resources.
- "location=LOCATION" - Matches resources in a specified location with an exact name. For example, location=us-central1 matches assets hosted in Iowa. BigQuery Omni assets support this qualifier by using the BigQuery Omni location name. For example, location=aws-us-east-1 matches BigQuery Omni assets in Northern Virginia.
- "createtime" -
Finds resources that were created within, before, or after a given date or time. For example "createtime:2019-01-01" matches resources created on 2019-01-01.
- "updatetime" - Finds resources that were updated within, before, or after a given date or time. For example "updatetime>2019-01-01" matches resources updated after 2019-01-01.
- "fully_qualified_name:x" - Matches x as a substring of fully_qualified_name.
- "fully_qualified_name=x" - Matches x as fully_qualified_name.
### Logical operators
A query can consist of several predicates with logical operators. If you don't specify an operator, logical AND is implied. For example, foo bar returns resources that match both predicate foo and predicate bar.
Logical AND and logical OR are supported. For example, foo OR bar.
You can negate a predicate with a - (hyphen) or NOT prefix. For example, -name:foo returns resources with names that don't match the predicate foo.
Logical operators aren't case-sensitive. For example, both or and OR are acceptable.
### Request
1. Always try to rewrite the prompt using search syntax.
### Response
1. If there are multiple search results found
1.1. Present the list of search results
1.2. Format the output in nested ordered list, for example:
1. Present the list of search results
2. Format the output in nested ordered list, for example:
Given
```
{
@@ -75,14 +278,19 @@ Whenever you will receive response from dataplex_search_entries tool decide what
- location: us-central1
- description: Table contains list of best customers.
```
1.3. Ask to select one of the presented search results
3. Ask to select one of the presented search results
2. If there is only one search result found
2.1. Present the search result immediately.
1. Present the search result immediately.
3. If there are no search result found
3.1. Explain that no search result was found
3.2. Suggest to provide a more specific search query.
1. Explain that no search result was found
2. Suggest to provide a more specific search query.
Do not try to search within search results on your own.
## Tool: dataplex_lookup_entry
### Request
1. Always try to limit the size of the response by specifying the `aspect_types` parameter. Make sure to select view=CUSTOM when using the `aspect_types` parameter.
2. If you do not know the name of the entry, use the `dataplex_search_entries` tool
### Response
1. Unless asked for a specific aspect, respond with all aspects attached to the entry.
```
## Reference
@@ -90,4 +298,4 @@ Do not try to search within search results on your own.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|----------------------------------------------------------------------------------|
| kind | string | true | Must be "dataplex". |
| project | string | true | Id of the GCP project used for quota and billing purposes (e.g. "my-project-id").|
| project | string | true | ID of the GCP project used for quota and billing purposes (e.g. "my-project-id").|

View File

@@ -0,0 +1,73 @@
---
title: DuckDB
linkTitle: DuckDB
type: docs
weight: 1
description: >
DuckDB is an in-process SQL OLAP database management system designed for analytical query processing.
---
## About
[DuckDB](https://duckdb.org/) is an embedded analytical database management system that runs in-process with the client application. It is optimized for analytical workloads, providing high performance for complex queries with minimal setup.
DuckDB has the following notable characteristics:
- In-process, serverless database engine
- Supports complex SQL queries for analytical processing
- Can operate on in-memory or persistent storage
- Zero-configuration - no external dependencies or server setup required
- Highly optimized for columnar data storage and query execution
For more details, refer to the [DuckDB Documentation](https://duckdb.org/).
## Available Tools
- [`duckdb-sql`](../tools/duckdb/duckdb-sql.md)
Execute pre-defined prepared SQL queries in DuckDB.
## Requirements
### Database File
To use DuckDB, you can either:
- Specify a file path for a persistent database stored on the filesystem
- Omit the file path to use an in-memory database
## Example
For a persistent DuckDB database:
```yaml
sources:
my-duckdb:
kind: "duckdb"
dbFilePath: "/path/to/database.db"
configuration:
memory_limit: "2GB"
threads: "4"
```
For an in-memory DuckDB database:
```yaml
sources:
my-duckdb-memory:
name: "my-duckdb-memory"
kind: "duckdb"
```
## Reference
### Configuration Fields
| **field** | **type** | **required** | **description** |
|-------------------|:-----------------:|:------------:|---------------------------------------------------------------------------------|
| kind | string | true | Must be "duckdb". |
| dbFilePath | string | false | Path to the DuckDB database file. Omit for an in-memory database. |
| configuration | map[string]string | false | Additional DuckDB configuration options (e.g., `memory_limit`, `threads`). |
For a complete list of available configuration options, refer to the [DuckDB Configuration Documentation](https://duckdb.org/docs/stable/configuration/overview.html#local-configuration-options).
For more details on the Go implementation, see the [go-duckdb package documentation](https://pkg.go.dev/github.com/scottlepp/go-duckdb#section-readme).

View File

@@ -0,0 +1,81 @@
---
title: "TiDB"
type: docs
weight: 1
description: >
TiDB is a distributed SQL database that combines the best of traditional RDBMS and NoSQL databases.
---
## About
[TiDB][tidb-docs] is an open-source distributed SQL database that supports Hybrid Transactional and Analytical Processing (HTAP) workloads. It is MySQL-compatible and features horizontal scalability, strong consistency, and high availability.
[tidb-docs]: https://docs.pingcap.com/tidb/stable
## Requirements
### Database User
This source uses standard MySQL protocol authentication. You will need to [create a TiDB user][tidb-users] to log in to the database.
For TiDB Cloud users, you can create database users through the TiDB Cloud console.
[tidb-users]: https://docs.pingcap.com/tidb/stable/user-account-management
## SSL Configuration
- TiDB Cloud
For TiDB Cloud instances, SSL is automatically enabled when the hostname matches the TiDB Cloud pattern (`gateway*.*.*.tidbcloud.com`). You don't need to explicitly set `ssl: true` for TiDB Cloud connections.
- Self-Hosted TiDB
For self-hosted TiDB instances, you can optionally enable SSL by setting `ssl: true` in your configuration.
## Example
- TiDB Cloud
```yaml
sources:
my-tidb-cloud-source:
kind: tidb
host: gateway01.us-west-2.prod.aws.tidbcloud.com
port: 4000
database: my_db
user: ${TIDB_USERNAME}
password: ${TIDB_PASSWORD}
# SSL is automatically enabled for TiDB Cloud
```
- Self-Hosted TiDB
```yaml
sources:
my-tidb-source:
kind: tidb
host: 127.0.0.1
port: 4000
database: my_db
user: ${TIDB_USERNAME}
password: ${TIDB_PASSWORD}
# ssl: true # Optional: enable SSL for secure connections
```
{{< notice tip >}}
Use environment variable replacement with the format ${ENV_NAME}
instead of hardcoding your secrets into the configuration file.
{{< /notice >}}
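As a sketch of that pattern, the `${TIDB_USERNAME}` and `${TIDB_PASSWORD}` references above could be satisfied by exporting the variables before starting Toolbox (the `--tools-file` flag is the one used elsewhere in these docs):
```sh
export TIDB_USERNAME="my-tidb-user"
export TIDB_PASSWORD="my-password"
./toolbox --tools-file tools.yaml
```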
## Reference
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|--------------------------------------------------------------------------------------------|
| kind | string | true | Must be "tidb". |
| host | string | true | IP address or hostname to connect to (e.g. "127.0.0.1" or "gateway01.*.tidbcloud.com"). |
| port | string | true | Port to connect to (typically "4000" for TiDB). |
| database | string | true | Name of the TiDB database to connect to (e.g. "my_db"). |
| user | string | true | Name of the TiDB user to connect as (e.g. "my-tidb-user"). |
| password | string | true | Password of the TiDB user (e.g. "my-password"). |
| ssl | boolean | false | Whether to use SSL/TLS encryption. Automatically enabled for TiDB Cloud instances. |

View File

@@ -2,8 +2,8 @@
title: "Tools"
type: docs
weight: 2
description: >
Tools define actions an agent can take -- such as reading and writing to a
description: >
Tools define actions an agent can take -- such as reading and writing to a
source.
---
@@ -157,10 +157,10 @@ will be thrown in case of value type mismatch.
Authenticated parameters are automatically populated with user
information decoded from [ID
tokens](../authsources/#specifying-id-tokens-from-clients) that are passed in
tokens](../authServices/#specifying-id-tokens-from-clients) that are passed in
request headers. They do not take input values in request bodies like other
parameters. To use authenticated parameters, you must configure the tool to map
the required [authServices](../authservices) to specific claims within the
the required [authServices](../authServices/) to specific claims within the
user's ID token.
```yaml
@@ -183,7 +183,7 @@ user's ID token.
| **field** | **type** | **required** | **description** |
|-----------|:--------:|:------------:|-----------------------------------------------------------------------------------------|
| name | string | true | Name of the [authServices](../authservices) used to verify the OIDC auth token. |
| name | string | true | Name of the [authServices](../authServices/) used to verify the OIDC auth token. |
| field | string | true | Claim field decoded from the OIDC token used to auto-populate this parameter. |
### Template Parameters
@@ -244,7 +244,7 @@ tools:
You can require an authorization check for any Tool invocation request by
specifying an `authRequired` field. Specify a list of
[authServices](../authservices) defined in the previous section.
[authServices](../authServices/) defined in the previous section.
```yaml
tools:

View File

@@ -2,9 +2,9 @@
title: "alloydb-ai-nl"
type: docs
weight: 1
description: >
The "alloydb-ai-nl" tool leverages
[AlloyDB AI](https://cloud.google.com/alloydb/ai) next-generation Natural
description: >
The "alloydb-ai-nl" tool leverages
[AlloyDB AI](https://cloud.google.com/alloydb/ai) next-generation Natural
Language support to provide the ability to query the database directly using
natural language.
aliases:
@@ -22,7 +22,7 @@ layer.
This tool is compatible with the following sources:
- [alloydb-postgres](../sources/alloydb-pg.md)
- [alloydb-postgres](../../sources/alloydb-pg.md)
AlloyDB AI Natural Language delivers secure and accurate responses for
application end user natural language questions. Natural language streamlines
@@ -69,7 +69,7 @@ database queries.
You can use the `nlConfigParameters` to list the parameters required for your
`nl_config`. You **must** supply all parameters required for all PSVs in the
context. It's strongly recommended to use features like [Authenticated
Parameters](../tools/#array-parameters) or Bound Parameters to provide secure
Parameters](../#array-parameters) or Bound Parameters to provide secure
access to queries generated using natural language, as these parameters are not
visible to the LLM.
@@ -88,8 +88,8 @@ tools:
- name: user_email
type: string
description: User ID of the logged in user.
# note: we strongly recommend using features like Authenticated or
# Bound parameters to prevent the LLM from seeing these params and
# note: we strongly recommend using features like Authenticated or
# Bound parameters to prevent the LLM from seeing these params and
# specifying values it shouldn't in the tool input
authServices:
- name: my_google_service
@@ -104,4 +104,4 @@ tools:
| source | string | true | Name of the AlloyDB source the natural language query should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| nlConfig | string | true | The name of the `nl_config` in AlloyDB |
| nlConfigParameters | [parameters](_index#specifying-parameters) | true | List of PSV parameters defined in the `nl_config` |
| nlConfigParameters | [parameters](../#specifying-parameters) | true | List of PSV parameters defined in the `nl_config` |

View File

@@ -2,7 +2,7 @@
title: "bigquery-execute-sql"
type: docs
weight: 1
description: >
description: >
A "bigquery-execute-sql" tool executes a SQL statement against BigQuery.
aliases:
- /resources/tools/bigquery-execute-sql
@@ -13,7 +13,7 @@ aliases:
A `bigquery-execute-sql` tool executes a SQL statement against BigQuery.
It's compatible with the following sources:
- [bigquery](../sources/bigquery.md)
- [bigquery](../../sources/bigquery.md)
`bigquery-execute-sql` takes one input parameter `sql` and runs the sql
statement against the `source`.
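As an illustration, the request body for this tool would carry a single `sql` field; the statement shown here is arbitrary:
```json
{
  "sql": "SELECT 1"
}
```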

View File

@@ -2,7 +2,7 @@
title: "bigquery-get-dataset-info"
type: docs
weight: 1
description: >
description: >
A "bigquery-get-dataset-info" tool retrieves metadata for a BigQuery dataset.
aliases:
- /resources/tools/bigquery-get-dataset-info
@@ -13,7 +13,7 @@ aliases:
A `bigquery-get-dataset-info` tool retrieves metadata for a BigQuery dataset.
It's compatible with the following sources:
- [bigquery](../sources/bigquery.md)
- [bigquery](../../sources/bigquery.md)
`bigquery-get-dataset-info` takes a `dataset` parameter to specify the dataset
on the given source. It also optionally accepts a `project` parameter to

View File

@@ -2,7 +2,7 @@
title: "bigquery-get-table-info"
type: docs
weight: 1
description: >
description: >
A "bigquery-get-table-info" tool retrieves metadata for a BigQuery table.
aliases:
- /resources/tools/bigquery-get-table-info
@@ -13,7 +13,7 @@ aliases:
A `bigquery-get-table-info` tool retrieves metadata for a BigQuery table.
It's compatible with the following sources:
- [bigquery](../sources/bigquery.md)
- [bigquery](../../sources/bigquery.md)
`bigquery-get-table-info` takes `dataset` and `table` parameters to specify
the target table. It also optionally accepts a `project` parameter to define

View File

@@ -2,7 +2,7 @@
title: "bigquery-list-dataset-ids"
type: docs
weight: 1
description: >
description: >
A "bigquery-list-dataset-ids" tool returns all dataset IDs from the source.
aliases:
- /resources/tools/bigquery-list-dataset-ids
@@ -13,7 +13,7 @@ aliases:
A `bigquery-list-dataset-ids` tool returns all dataset IDs from the source.
It's compatible with the following sources:
- [bigquery](../sources/bigquery.md)
- [bigquery](../../sources/bigquery.md)
`bigquery-list-dataset-ids` optionally accepts a `project` parameter to define
the Google Cloud project ID. If the `project` parameter is not provided, the

View File

@@ -2,7 +2,7 @@
title: "bigquery-list-table-ids"
type: docs
weight: 1
description: >
description: >
A "bigquery-list-table-ids" tool returns table IDs in a given BigQuery dataset.
aliases:
- /resources/tools/bigquery-list-table-ids
@@ -13,7 +13,7 @@ aliases:
A `bigquery-list-table-ids` tool returns table IDs in a given BigQuery dataset.
It's compatible with the following sources:
- [bigquery](../sources/bigquery.md)
- [bigquery](../../sources/bigquery.md)
`bigquery-list-table-ids` takes a required `dataset` parameter to specify the dataset
from which to list table IDs. It also optionally accepts a `project` parameter to

View File

@@ -13,7 +13,7 @@ aliases:
A `bigquery-sql` tool executes a pre-defined SQL statement. It's compatible with
the following sources:
- [bigquery](../sources/bigquery.md)
- [bigquery](../../sources/bigquery.md)
### GoogleSQL
@@ -72,7 +72,7 @@ tools:
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
> [templateParameters](../#template-parameters).
```yaml
tools:
@@ -101,5 +101,5 @@ tools:
| source | string | true | Name of the source the GoogleSQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | The GoogleSQL statement to execute. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](../#template-parameters) | false | List of [templateParameters](../#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

View File

@@ -2,8 +2,8 @@
title: "bigtable-sql"
type: docs
weight: 1
description: >
A "bigtable-sql" tool executes a pre-defined SQL statement against a Google
description: >
A "bigtable-sql" tool executes a pre-defined SQL statement against a Google
Cloud Bigtable instance.
aliases:
- /resources/tools/bigtable-sql
@@ -14,7 +14,7 @@ aliases:
A `bigtable-sql` tool executes a pre-defined SQL statement against a Bigtable
instance. It's compatible with any of the following sources:
- [bigtable](../sources/bigtable.md)
- [bigtable](../../sources/bigtable.md)
### GoogleSQL
@@ -45,13 +45,13 @@ tools:
kind: bigtable-sql
source: my-bigtable-instance
statement: |
SELECT
TO_INT64(cf[ 'id' ]) as id,
CAST(cf[ 'name' ] AS string) as name,
FROM
mytable
WHERE
TO_INT64(cf[ 'id' ]) = @id
SELECT
TO_INT64(cf[ 'id' ]) as id,
CAST(cf[ 'name' ] AS string) as name,
FROM
mytable
WHERE
TO_INT64(cf[ 'id' ]) = @id
OR CAST(cf[ 'name' ] AS string) = @name;
description: |
Use this tool to get information for a specific user.
@@ -77,7 +77,7 @@ tools:
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
> [templateParameters](#template-parameters).
```yaml
tools:
@@ -106,8 +106,8 @@ tools:
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](#template-parameters) | false | List of [templateParameters](#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
## Tips

View File

@@ -2,7 +2,7 @@
title: "couchbase-sql"
type: docs
weight: 1
description: >
description: >
A "couchbase-sql" tool executes a pre-defined SQL statement against a Couchbase
database.
aliases:
@@ -14,7 +14,7 @@ aliases:
A `couchbase-sql` tool executes a pre-defined SQL statement against a Couchbase
database. It's compatible with any of the following sources:
- [couchbase](../sources/couchbase.md)
- [couchbase](../../sources/couchbase.md)
The specified SQL statement is executed as a parameterized statement, and specified
parameters will be used according to their name: e.g. `$id`.
@@ -66,7 +66,7 @@ tools:
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
> [templateParameters](#template-parameters).
```yaml
tools:
@@ -95,6 +95,6 @@ tools:
| source | string | true | Name of the source the SQL query should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be used with the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be used with the SQL statement. |
| templateParameters | [templateParameters](#template-parameters) | false | List of [templateParameters](#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| authRequired | array[string] | false | List of auth services that are required to use this tool. |

View File

@@ -0,0 +1,60 @@
---
title: "dataplex-lookup-entry"
type: docs
weight: 1
description: >
A "dataplex-lookup-entry" tool returns details of a particular entry in Dataplex Catalog.
aliases:
- /resources/tools/dataplex-lookup-entry
---
## About
A `dataplex-lookup-entry` tool returns details of a particular entry in Dataplex Catalog.
It's compatible with the following sources:
- [dataplex](../sources/dataplex.md)
`dataplex-lookup-entry` takes a required `name` parameter, which contains the project and location to which the request should be attributed, in the form `projects/{project}/locations/{location}`, and a required `entry` parameter, which is the resource name of the entry, in the form `projects/{project}/locations/{location}/entryGroups/{entryGroup}/entries/{entry}`. It also optionally accepts the following parameters:
- `view` - View to control which parts of an entry the service should return. It takes integer values from 1-4, corresponding to the view type: BASIC, FULL, CUSTOM, ALL.
- `aspectTypes` - Limits the aspects returned to the provided aspect types in the format `projects/{project}/locations/{location}/aspectTypes/{aspectType}`. It only works for CUSTOM view.
- `paths` - Limits the aspects returned to those associated with the provided paths within the Entry. It only works for CUSTOM view.
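For illustration only, a hypothetical set of parameter values in the shapes described above (project, location, and entry-group names are made up, and `view: 3` assumes the 1-4 mapping corresponds to BASIC, FULL, CUSTOM, ALL in that order):
```json
{
  "name": "projects/my-project/locations/us-central1",
  "entry": "projects/my-project/locations/us-central1/entryGroups/my-entry-group/entries/my-entry",
  "view": 3,
  "aspectTypes": ["projects/dataplex-types/locations/global/aspectTypes/schema"]
}
```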
## Requirements
### IAM Permissions
Dataplex uses [Identity and Access Management (IAM)][iam-overview] to control
user and group access to Dataplex resources. Toolbox will use your
[Application Default Credentials (ADC)][adc] to authorize and authenticate when
interacting with [Dataplex][dataplex-docs].
In addition to [setting the ADC for your server][set-adc], you need to ensure
the IAM identity has been given the correct IAM permissions for the tasks you
intend to perform. See [Dataplex Universal Catalog IAM permissions][iam-permissions]
and [Dataplex Universal Catalog IAM roles][iam-roles] for more information on
applying IAM permissions and roles to an identity.
[iam-overview]: https://cloud.google.com/dataplex/docs/iam-and-access-control
[adc]: https://cloud.google.com/docs/authentication#adc
[set-adc]: https://cloud.google.com/docs/authentication/provide-credentials-adc
[iam-permissions]: https://cloud.google.com/dataplex/docs/iam-permissions
[iam-roles]: https://cloud.google.com/dataplex/docs/iam-roles
## Example
```yaml
tools:
lookup_entry:
kind: dataplex-lookup-entry
source: my-dataplex-source
description: Use this tool to retrieve a specific entry in Dataplex Catalog.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "dataplex-lookup-entry". |
| source | string | true | Name of the source the tool should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -2,7 +2,7 @@
title: "dataplex-search-entries"
type: docs
weight: 1
description: >
description: >
A "dataplex-search-entries" tool allows to search for entries based on the provided query.
aliases:
- /resources/tools/dataplex-search-entries
@@ -14,7 +14,7 @@ A `dataplex-search-entries` tool returns all entries in Dataplex Catalog (e.g.
tables, views, models) that match the given user query.
It's compatible with the following sources:
- [dataplex](../sources/dataplex.md)
- [dataplex](../../sources/dataplex.md)
`dataplex-search-entries` takes a required `query` parameter based on which
entries are filtered and returned to the user and a required `name` parameter

View File

@@ -2,7 +2,7 @@
title: "dgraph-dql"
type: docs
weight: 1
description: >
description: >
A "dgraph-dql" tool executes a pre-defined DQL statement against a Dgraph
database.
aliases:
@@ -14,7 +14,7 @@ aliases:
A `dgraph-dql` tool executes a pre-defined DQL statement against a Dgraph
database. It's compatible with any of the following sources:
- [dgraph](../sources/dgraph.md)
- [dgraph](../../sources/dgraph.md)
To run a statement as a query, you need to set the config `isQuery=true`. For
upserts or mutations, set `isQuery=false`. You can also configure timeout for a
@@ -121,4 +121,4 @@ tools:
| statement | string | true | dql statement to execute |
| isQuery | boolean | false | To run statement as query set true otherwise false |
| timeout | string | false | To set timeout for query |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be used with the dql statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be used with the dql statement. |

View File

@@ -0,0 +1,7 @@
---
title: "DuckDB"
type: docs
weight: 1
description: >
Tools that work with DuckDB Sources.
---

View File

@@ -0,0 +1,80 @@
---
title: "duckdb-sql"
type: docs
weight: 1
description: >
Execute SQL statements against a DuckDB database using the DuckDB SQL tools configuration.
aliases:
- /resources/tools/duckdb-sql
---
## About
A `duckdb-sql` tool executes a pre-defined SQL statement against a [DuckDB](https://duckdb.org/) database. It is compatible with any DuckDB source configuration as defined in the [DuckDB source documentation](../../sources/duckdb.md).
The specified SQL statement is executed as a prepared statement, and parameters are inserted according to their position: e.g., `$1` is the first parameter, `$2` is the second, and so on. If template parameters are included, they are resolved before execution of the prepared statement.
DuckDB's SQL dialect closely follows the conventions of the PostgreSQL dialect, with a few exceptions listed in the [DuckDB PostgreSQL Compatibility documentation](https://duckdb.org/docs/stable/sql/dialect/postgresql_compatibility.html). For an introduction to DuckDB's SQL dialect, refer to the [DuckDB SQL Introduction](https://duckdb.org/docs/stable/sql/introduction).
### Concepts
DuckDB is a relational database management system (RDBMS). Data is stored in relations (tables), where each table is a named collection of rows. Each row in a table has the same set of named columns, each with a specific data type. Tables are stored within schemas, and a collection of schemas constitutes the entire database.
For more details, see the [DuckDB SQL Introduction](https://duckdb.org/docs/stable/sql/introduction).
## Example
> **Note:** This tool uses parameterized queries to prevent SQL injections. Query parameters can be used as substitutes for arbitrary expressions but cannot be used for identifiers, column names, table names, or other parts of the query.
```yaml
tools:
search-users:
kind: duckdb-sql
source: my-duckdb
description: Search users by name and age
statement: SELECT * FROM users WHERE name LIKE $1 AND age >= $2
parameters:
- name: name
type: string
description: The name to search for
- name: min_age
type: integer
description: Minimum age
```
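With that configuration, the named parameters are bound to the positional placeholders in the order they are declared, so `name` corresponds to `$1` and `min_age` to `$2`. A hypothetical invocation body:
```json
{
  "name": "%Alice%",
  "min_age": 30
}
```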
## Example with Template Parameters
> **Note:** Template parameters allow direct modifications to the SQL statement, including identifiers, column names, and table names, which makes them more vulnerable to SQL injections. Using basic parameters (see above) is recommended for performance and safety. For more details, see the [templateParameters](../#template-parameters) section.
```yaml
tools:
list_table:
kind: duckdb-sql
source: my-duckdb
statement: |
SELECT * FROM {{.tableName}};
description: |
Use this tool to list all information from a specific table.
Example:
{{
"tableName": "flights",
}}
templateParameters:
- name: tableName
type: string
description: Table to select from
```
## Reference
### Configuration Fields
| **field** | **type** | **required** | **description** |
|--------------------|:-------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "duckdb-sql". |
| source | string | true | Name of the DuckDB source configuration (see [DuckDB source documentation](../../sources/duckdb.md)). |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | The SQL statement to execute. |
| authRequired | []string | false | List of authentication requirements for the tool (if any). |
| parameters | [parameters](../#specifying-parameters) | false | List of parameters that will be inserted into the SQL statement |
| templateParameters | [templateParameters](../#template-parameters) | false | List of template parameters that will be inserted into the SQL statement before executing the prepared statement. |

View File

@@ -2,7 +2,7 @@
title: "firestore-delete-documents"
type: docs
weight: 1
description: >
description: >
A "firestore-delete-documents" tool deletes multiple documents from Firestore by their paths.
aliases:
- /resources/tools/firestore-delete-documents
@@ -14,7 +14,7 @@ A `firestore-delete-documents` tool deletes multiple documents from Firestore by
their paths.
It's compatible with the following sources:
- [firestore](../sources/firestore.md)
- [firestore](../../sources/firestore.md)
`firestore-delete-documents` takes one input parameter `documentPaths` which is
an array of document paths to delete. The tool uses Firestore's BulkWriter for

View File

@@ -2,7 +2,7 @@
title: "firestore-get-documents"
type: docs
weight: 1
description: >
description: >
A "firestore-get-documents" tool retrieves multiple documents from Firestore by their paths.
aliases:
- /resources/tools/firestore-get-documents
@@ -14,7 +14,7 @@ A `firestore-get-documents` tool retrieves multiple documents from Firestore by
their paths.
It's compatible with the following sources:
- [firestore](../sources/firestore.md)
- [firestore](../../sources/firestore.md)
`firestore-get-documents` takes one input parameter `documentPaths` which is an
array of document paths, and returns the documents' data along with metadata

View File

@@ -2,7 +2,7 @@
title: "firestore-get-rules"
type: docs
weight: 1
description: >
description: >
A "firestore-get-rules" tool retrieves the active Firestore security rules for the current project.
aliases:
- /resources/tools/firestore-get-rules
@@ -15,7 +15,7 @@ rules](https://firebase.google.com/docs/firestore/security/get-started) for the
current project.
It's compatible with the following sources:
- [firestore](../sources/firestore.md)
- [firestore](../../sources/firestore.md)
`firestore-get-rules` takes no input parameters and returns the security rules
content along with metadata such as the ruleset name, and timestamps.

View File

@@ -2,7 +2,7 @@
title: "firestore-list-collections"
type: docs
weight: 1
description: >
description: >
A "firestore-list-collections" tool lists collections in Firestore, either at the root level or as subcollections of a document.
aliases:
- /resources/tools/firestore-list-collections
@@ -17,7 +17,7 @@ in Firestore, either at the root level or as
of a specific document.
It's compatible with the following sources:
- [firestore](../sources/firestore.md)
- [firestore](../../sources/firestore.md)
`firestore-list-collections` takes an optional `parentPath` parameter to specify a document
path. If provided, it lists all subcollections of that document. If not provided, it lists

View File

@@ -2,7 +2,7 @@
title: "http"
type: docs
weight: 1
description: >
description: >
A "http" tool sends out an HTTP request to an HTTP endpoint.
aliases:
- /resources/tools/http
@@ -50,7 +50,7 @@ tools:
method: GET
path: /search
description: Tool to search information from the example API
my-dynamic-path-tool:
kind: http
source: my-http-source
@@ -256,8 +256,8 @@ my-http-tool:
| method | string | true | The HTTP method to use (e.g., GET, POST, PUT, DELETE). |
| headers | map[string]string | false | A map of headers to include in the HTTP request (overrides source headers). |
| requestBody | string | false | The request body payload. Use [go template][go-template-doc] with the parameter name as the placeholder (e.g., `{{.id}}` will be replaced with the value of the parameter that has name `id` in the `bodyParams` section). |
| queryParams | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the query string. |
| bodyParams | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the request body payload. |
| headerParams | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted as the request headers. |
| queryParams | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the query string. |
| bodyParams | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the request body payload. |
| headerParams | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted as the request headers. |
[go-template-doc]: <https://pkg.go.dev/text/template#pkg-overview>

View File

@@ -16,7 +16,7 @@ in a given mode in the source.
It's compatible with the following sources:
- [looker](../sources/looker.md)
- [looker](../../sources/looker.md)
`looker-get-dimensions` accepts two parameters, the `model` and the `explore`.

View File

@@ -16,7 +16,7 @@ for a given model from the source.
It's compatible with the following sources:
- [looker](../sources/looker.md)
- [looker](../../sources/looker.md)
`looker-get-explores` accepts one parameter, the
`model` id.

View File

@@ -16,7 +16,7 @@ in a given mode in the source.
It's compatible with the following sources:
- [looker](../sources/looker.md)
- [looker](../../sources/looker.md)
`looker-get-filters` accepts two parameters, the `model` and the `explore`.

View File

@@ -16,7 +16,7 @@ name or description.
It's compatible with the following sources:
- [looker](../sources/looker.md)
- [looker](../../sources/looker.md)
`looker-get-looks` takes four parameters, the `title`, `desc`, `limit`
and `offset`.

View File

@@ -2,7 +2,7 @@
title: "looker-get-measures"
type: docs
weight: 1
description: >
description: >
A "looker-get-measures" tool returns all the measures from a given explore
in a given model in the source.
aliases:
@@ -16,7 +16,7 @@ in a given mode in the source.
It's compatible with the following sources:
- [looker](../sources/looker.md)
- [looker](../../sources/looker.md)
`looker-get-measures` accepts two parameters, the `model` and the `explore`.

View File

@@ -2,7 +2,7 @@
title: "looker-get-models"
type: docs
weight: 1
description: >
description: >
A "looker-get-models" tool returns all the models in the source.
aliases:
- /resources/tools/looker-get-models
@@ -14,7 +14,7 @@ A `looker-get-models` tool returns all the models the source.
It's compatible with the following sources:
- [looker](../sources/looker.md)
- [looker](../../sources/looker.md)
`looker-get-models` accepts no parameters.

View File

@@ -2,7 +2,7 @@
title: "looker-get-parameters"
type: docs
weight: 1
description: >
description: >
A "looker-get-parameters" tool returns all the parameters from a given explore
in a given model in the source.
aliases:
@@ -16,7 +16,7 @@ in a given mode in the source.
It's compatible with the following sources:
- [looker](../sources/looker.md)
- [looker](../../sources/looker.md)
`looker-get-parameters` accepts two parameters, the `model` and the `explore`.

View File

@@ -16,7 +16,7 @@ semantic model.
It's compatible with the following sources:
- [looker](../sources/looker.md)
- [looker](../../sources/looker.md)
`looker-query` takes eight parameters:

View File

@@ -16,7 +16,7 @@ semantic model.
It's compatible with the following sources:
- [looker](../sources/looker.md)
- [looker](../../sources/looker.md)
`looker-query-sql` takes eight parameters:

View File

@@ -0,0 +1,53 @@
---
title: "looker-query-url"
type: docs
weight: 1
description: >
"looker-query-url" generates a url link to a Looker explore.
aliases:
- /resources/tools/looker-query-url
---
## About
The `looker-query-url` tool generates a URL link to an explore in
Looker so the query can be investigated further.
It's compatible with the following sources:
- [looker](../../sources/looker.md)
`looker-query-url` takes eight parameters:
1. the `model`
2. the `explore`
3. the `fields` list
4. an optional set of `filters`
5. an optional set of `pivots`
6. an optional set of `sorts`
7. an optional `limit`
8. an optional `tz`
## Example
```yaml
tools:
query_url:
kind: looker-query-url
source: looker-source
description: |
Query URL Tool
This tool is used to generate the URL of a query in Looker.
The user can then explore the query further inside Looker.
The tool also returns the query_id and slug. The parameters
are the same as the `looker-query` tool.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker-query-url" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -15,7 +15,7 @@ saved Look.
It's compatible with the following sources:
- [looker](../sources/looker.md)
- [looker](../../sources/looker.md)
`looker-run-look` takes one parameter, the `look_id`.

View File

@@ -2,7 +2,7 @@
title: "mssql-execute-sql"
type: docs
weight: 1
description: >
A "mssql-execute-sql" tool executes a SQL statement against a SQL Server
database.
aliases:
@@ -14,8 +14,8 @@ aliases:
A `mssql-execute-sql` tool executes a SQL statement against a SQL Server
database. It's compatible with any of the following sources:
- [cloud-sql-mssql](../sources/cloud-sql-mssql.md)
- [mssql](../sources/mssql.md)
- [cloud-sql-mssql](../../sources/cloud-sql-mssql.md)
- [mssql](../../sources/mssql.md)
`mssql-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`.
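For illustration, a minimal configuration for this tool might look like the sketch below; the tool name `execute_sql_tool` and the source name `my-mssql-instance` are placeholders, following the same pattern as the other execute-sql tools in this changeset.
```yaml
tools:
  execute_sql_tool:
    kind: mssql-execute-sql
    source: my-mssql-instance
    description: Use this tool to execute a SQL statement.
```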

View File

@@ -2,7 +2,7 @@
title: "mssql-sql"
type: docs
weight: 1
description: >
A "mssql-sql" tool executes a pre-defined SQL statement against a SQL Server
database.
aliases:
@@ -14,8 +14,8 @@ aliases:
A `mssql-sql` tool executes a pre-defined SQL statement against a SQL Server
database. It's compatible with any of the following sources:
- [cloud-sql-mssql](../sources/cloud-sql-mssql.md)
- [mssql](../sources/mssql.md)
- [cloud-sql-mssql](../../sources/cloud-sql-mssql.md)
- [mssql](../../sources/mssql.md)
Toolbox supports the [prepare statement syntax][prepare-statement] of MS SQL
Server and expects parameters in the SQL query to be in the form of either
@@ -78,7 +78,7 @@ tools:
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
> [templateParameters](#template-parameters).
```yaml
tools:
@@ -107,5 +107,5 @@ tools:
| source | string | true | Name of the source the T-SQL statement should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](#template-parameters) | false | List of [templateParameters](#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

View File

@@ -2,7 +2,7 @@
title: "mysql-execute-sql"
type: docs
weight: 1
description: >
A "mysql-execute-sql" tool executes a SQL statement against a MySQL
database.
aliases:
@@ -14,8 +14,8 @@ aliases:
A `mysql-execute-sql` tool executes a SQL statement against a MySQL
database. It's compatible with any of the following sources:
- [cloud-sql-mysql](../sources/cloud-sql-mysql.md)
- [mysql](../sources/mysql.md)
- [cloud-sql-mysql](../../sources/cloud-sql-mysql.md)
- [mysql](../../sources/mysql.md)
`mysql-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`.

View File

@@ -2,7 +2,7 @@
title: "mysql-sql"
type: docs
weight: 1
description: >
A "mysql-sql" tool executes a pre-defined SQL statement against a MySQL
database.
aliases:
@@ -14,8 +14,8 @@ aliases:
A `mysql-sql` tool executes a pre-defined SQL statement against a MySQL
database. It's compatible with any of the following sources:
- [cloud-sql-mysql](../sources/cloud-sql-mysql.md)
- [mysql](../sources/mysql.md)
- [cloud-sql-mysql](../../sources/cloud-sql-mysql.md)
- [mysql](../../sources/mysql.md)
The specified SQL statement is executed as a [prepared statement][mysql-prepare],
and expects parameters in the SQL query to be in the form of placeholders `?`.
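As a minimal sketch of the `?` placeholder binding described above (the source name `my-mysql-instance` and the `flights` table are assumptions, mirroring the `tidb-sql` example further down in this changeset):
```yaml
tools:
  search_flights_by_number:
    kind: mysql-sql
    source: my-mysql-instance
    statement: |
      SELECT * FROM flights
      WHERE airline = ? AND flight_number = ?
      LIMIT 10
    description: |
      Use this tool to look up a flight by airline code and flight number.
    parameters:
      - name: airline
        type: string
        description: Airline unique 2 letter identifier
      - name: flight_number
        type: string
        description: 1 to 4 digit number
```
The two declared parameters are bound to the two `?` placeholders in the order they appear.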
@@ -73,7 +73,7 @@ tools:
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
> [templateParameters](#template-parameters).
```yaml
tools:
@@ -102,5 +102,5 @@ tools:
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](#template-parameters) | false | List of [templateParameters](#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

View File

@@ -2,7 +2,7 @@
title: "neo4j-cypher"
type: docs
weight: 1
description: >
A "neo4j-cypher" tool executes a pre-defined cypher statement against a Neo4j
database.
aliases:
@@ -14,7 +14,7 @@ aliases:
A `neo4j-cypher` tool executes a pre-defined Cypher statement against a Neo4j
database. It's compatible with any of the following sources:
- [neo4j](../sources/neo4j.md)
- [neo4j](../../sources/neo4j.md)
The specified Cypher statement is executed as a [parameterized
statement][neo4j-parameters], and specified parameters will be used according to
@@ -63,7 +63,7 @@ tools:
description: Full actor name, "firstname lastname"
- name: year
type: integer
description: 4 digit number starting in 1900 up to the current year
```
## Reference
@@ -74,4 +74,4 @@ tools:
| source | string | true | Name of the source the Cypher query should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | Cypher statement to execute |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be used with the Cypher statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be used with the Cypher statement. |
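For illustration, a minimal `neo4j-cypher` configuration using the `name` and `year` parameters shown above might look like this sketch; the source name and the `Actor`/`Movie` graph schema are assumptions.
```yaml
tools:
  search_movies_by_actor:
    kind: neo4j-cypher
    source: my-neo4j-instance
    statement: |
      MATCH (a:Actor {name: $name})-[:ACTED_IN]->(m:Movie)
      WHERE m.year >= $year
      RETURN m.title, m.year
      LIMIT 10
    description: |
      Find movies an actor appeared in, released on or after a given year.
    parameters:
      - name: name
        type: string
        description: Full actor name, "firstname lastname"
      - name: year
        type: integer
        description: 4 digit number starting in 1900 up to the current year
```
Each declared parameter is passed to the Cypher statement as the query parameter of the same name (e.g. `$name`).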

View File

@@ -2,7 +2,7 @@
title: "neo4j-execute-cypher"
type: docs
weight: 1
description: >
A "neo4j-execute-cypher" tool executes any arbitrary Cypher statement against a Neo4j
database.
aliases:
@@ -16,7 +16,7 @@ string parameter against a Neo4j database. It's designed to be a flexible tool
for interacting with the database when a pre-defined query is not sufficient.
This tool is compatible with any of the following sources:
- [neo4j](../sources/neo4j.md)
- [neo4j](../../sources/neo4j.md)
For security, the tool can be configured to be read-only. If the `readOnly` flag
is set to `true`, the tool will analyze the incoming Cypher query and reject any
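For illustration, a read-only configuration might look like the following sketch; the tool and source names are placeholders.
```yaml
tools:
  execute_cypher_readonly:
    kind: neo4j-execute-cypher
    source: my-neo4j-instance
    readOnly: true
    description: |
      Execute an arbitrary Cypher statement. Write operations are
      rejected because readOnly is set to true.
```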

View File

@@ -2,7 +2,7 @@
title: "postgres-execute-sql"
type: docs
weight: 1
description: >
A "postgres-execute-sql" tool executes a SQL statement against a Postgres
database.
aliases:
@@ -14,9 +14,9 @@ aliases:
A `postgres-execute-sql` tool executes a SQL statement against a Postgres
database. It's compatible with any of the following sources:
- [alloydb-postgres](../sources/alloydb-pg.md)
- [cloud-sql-postgres](../sources/cloud-sql-pg.md)
- [postgres](../sources/postgres.md)
- [alloydb-postgres](../../sources/alloydb-pg.md)
- [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
- [postgres](../../sources/postgres.md)
`postgres-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`.

View File

@@ -2,7 +2,7 @@
title: "postgres-sql"
type: docs
weight: 1
description: >
A "postgres-sql" tool executes a pre-defined SQL statement against a Postgres
database.
aliases:
@@ -14,9 +14,9 @@ aliases:
A `postgres-sql` tool executes a pre-defined SQL statement against a Postgres
database. It's compatible with any of the following sources:
- [alloydb-postgres](../sources/alloydb-pg.md)
- [cloud-sql-postgres](../sources/cloud-sql-pg.md)
- [postgres](../sources/postgres.md)
- [alloydb-postgres](../../sources/alloydb-pg.md)
- [cloud-sql-postgres](../../sources/cloud-sql-pg.md)
- [postgres](../../sources/postgres.md)
The specified SQL statement is executed as a [prepared statement][pg-prepare],
and specified parameters will be inserted according to their position: e.g. `$1`
@@ -77,7 +77,7 @@ tools:
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
> [templateParameters](#template-parameters).
```yaml
tools:
@@ -106,5 +106,5 @@ tools:
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](#template-parameters) | false | List of [templateParameters](#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

View File

@@ -2,7 +2,7 @@
title: "spanner-execute-sql"
type: docs
weight: 1
description: >
A "spanner-execute-sql" tool executes a SQL statement against a Spanner
database.
aliases:
@@ -14,7 +14,7 @@ aliases:
A `spanner-execute-sql` tool executes a SQL statement against a Spanner
database. It's compatible with any of the following sources:
- [spanner](../sources/spanner.md)
- [spanner](../../sources/spanner.md)
`spanner-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`.

View File

@@ -2,8 +2,8 @@
title: "spanner-sql"
type: docs
weight: 1
description: >
A "spanner-sql" tool executes a pre-defined SQL statement against a Google
Cloud Spanner database.
aliases:
- /resources/tools/spanner-sql
@@ -15,7 +15,7 @@ A `spanner-sql` tool executes a pre-defined SQL statement (either `googlesql` or
`postgresql`) against a Cloud Spanner database. It's compatible with any of the
following sources:
- [spanner](../sources/spanner.md)
- [spanner](../../sources/spanner.md)
### GoogleSQL
@@ -134,7 +134,7 @@ tools:
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
> [templateParameters](#template-parameters).
```yaml
tools:
@@ -163,6 +163,6 @@ tools:
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
| readOnly | bool | false | When set to `true`, the `statement` is run as a read-only transaction. Default: `false`. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| templateParameters | [templateParameters](#template-parameters) | false | List of [templateParameters](#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
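For illustration, a minimal `spanner-sql` configuration might look like the sketch below; the source name and the `hotels` table are assumptions, and the statement uses GoogleSQL-style named parameters (`@hotel_id`) bound from the declared parameter of the same name.
```yaml
tools:
  get_hotel_by_id:
    kind: spanner-sql
    source: my-spanner-instance
    readOnly: true
    statement: |
      SELECT id, name, location FROM hotels WHERE id = @hotel_id
    description: |
      Look up a single hotel by its id.
    parameters:
      - name: hotel_id
        type: integer
        description: The id of the hotel.
```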

View File

@@ -13,7 +13,7 @@ aliases:
A `sqlite-sql` tool executes SQL statements against a SQLite database.
It's compatible with any of the following sources:
- [sqlite](../sources/sqlite.md)
- [sqlite](../../sources/sqlite.md)
SQLite uses the `?` placeholder for parameters in SQL statements. Parameters are
bound in the order they are provided.
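As a minimal sketch of that ordering (the source name `my-sqlite-db` and the `users` table are assumptions):
```yaml
tools:
  get_user_by_email:
    kind: sqlite-sql
    source: my-sqlite-db
    statement: |
      SELECT id, name, email FROM users WHERE email = ?
    description: |
      Look up a user record by email address.
    parameters:
      - name: email
        type: string
        description: The user's email address.
```
The single declared parameter is bound to the single `?` placeholder.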
@@ -51,7 +51,7 @@ tools:
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
> [templateParameters](#template-parameters).
```yaml
tools:
@@ -80,5 +80,5 @@ tools:
| source | string | true | Name of the SQLite source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | The SQL statement to execute. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |
| parameters | [parameters](../#specifying-parameters) | false | List of [parameters](../#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](#template-parameters) | false | List of [templateParameters](#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

View File

@@ -0,0 +1,7 @@
---
title: "TiDB"
type: docs
weight: 1
description: >
Tools that work with TiDB Sources, such as TiDB Cloud and self-hosted TiDB.
---

View File

@@ -0,0 +1,41 @@
---
title: "tidb-execute-sql"
type: docs
weight: 1
description: >
A "tidb-execute-sql" tool executes a SQL statement against a TiDB
database.
aliases:
- /resources/tools/tidb-execute-sql
---
## About
A `tidb-execute-sql` tool executes a SQL statement against a TiDB
database. It's compatible with the following source:
- [tidb](../sources/tidb.md)
`tidb-execute-sql` takes one input parameter `sql` and runs the SQL
statement against the `source`.
> **Note:** This tool is intended for developer assistant workflows with
> human-in-the-loop and shouldn't be used for production agents.
## Example
```yaml
tools:
  execute_sql_tool:
    kind: tidb-execute-sql
    source: my-tidb-instance
    description: Use this tool to execute a SQL statement.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "tidb-execute-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,105 @@
---
title: "tidb-sql"
type: docs
weight: 1
description: >
A "tidb-sql" tool executes a pre-defined SQL statement against a TiDB
database.
aliases:
- /resources/tools/tidb-sql
---
## About
A `tidb-sql` tool executes a pre-defined SQL statement against a TiDB
database. It's compatible with the following source:
- [tidb](../sources/tidb.md)
The specified SQL statement is executed as a [prepared statement][tidb-prepare],
and expects parameters in the SQL query to be in the form of placeholders `?`.
[tidb-prepare]: https://docs.pingcap.com/tidb/stable/sql-prepared-plan-cache
## Example
> **Note:** This tool uses parameterized queries to prevent SQL injections.
> Query parameters can be used as substitutes for arbitrary expressions.
> Parameters cannot be used as substitutes for identifiers, column names, table
> names, or other parts of the query.
```yaml
tools:
  search_flights_by_number:
    kind: tidb-sql
    source: my-tidb-instance
    statement: |
      SELECT * FROM flights
      WHERE airline = ?
      AND flight_number = ?
      LIMIT 10
    description: |
      Use this tool to get information for a specific flight.
      Takes an airline code and flight number and returns info on the flight.
      Do NOT use this tool with a flight id. Do NOT guess an airline code or flight number.
      An airline code consists of a two-character airline designator followed by a
      flight number, which is a 1 to 4 digit number.
      For example, if given CY 0123, the airline is "CY", and flight_number is "123".
      Another example is DL 1234: the airline is "DL", and flight_number is "1234".
      If the tool returns more than one option, choose the date closest to today.
      Example:
      {{
          "airline": "CY",
          "flight_number": "888",
      }}
      Example:
      {{
          "airline": "DL",
          "flight_number": "1234",
      }}
    parameters:
      - name: airline
        type: string
        description: Airline unique 2 letter identifier
      - name: flight_number
        type: string
        description: 1 to 4 digit number
```
### Example with Template Parameters
> **Note:** This tool allows direct modifications to the SQL statement,
> including identifiers, column names, and table names. **This makes it more
> vulnerable to SQL injections**. Using basic parameters only (see above) is
> recommended for performance and safety reasons. For more details, please check
> [templateParameters](_index#template-parameters).
```yaml
tools:
  list_table:
    kind: tidb-sql
    source: my-tidb-instance
    statement: |
      SELECT * FROM {{.tableName}};
    description: |
      Use this tool to list all information from a specific table.
      Example:
      {{
          "tableName": "flights",
      }}
    templateParameters:
      - name: tableName
        type: string
        description: Table to select from
```
## Reference
| **field** | **type** | **required** | **description** |
|--------------------|:------------------------------------------------:|:------------:|--------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "tidb-sql". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
| statement | string | true | SQL statement to execute on. |
| parameters | [parameters](_index#specifying-parameters) | false | List of [parameters](_index#specifying-parameters) that will be inserted into the SQL statement. |
| templateParameters | [templateParameters](_index#template-parameters) | false | List of [templateParameters](_index#template-parameters) that will be inserted into the SQL statement before executing prepared statement. |

View File

@@ -0,0 +1,340 @@
---
title: "AlloyDB"
type: docs
weight: 2
description: >
How to get started running Toolbox with MCP Inspector and AlloyDB as the source.
---
## Overview
[Model Context Protocol](https://modelcontextprotocol.io) is an open protocol
that standardizes how applications provide context to LLMs. Check out this page
on how to [connect to Toolbox via MCP](../../how-to/connect_via_mcp.md).
## Before you begin
This guide assumes you have already done the following:
1. [Create an AlloyDB cluster and instance](https://cloud.google.com/alloydb/docs/cluster-create) with a database and user.
1. Connect to the instance using [AlloyDB Studio](https://cloud.google.com/alloydb/docs/manage-data-using-studio), the [`psql` command-line tool](https://www.postgresql.org/download/), or any other PostgreSQL client.
1. Enable the `pgvector` and `google_ml_integration` [extensions](https://cloud.google.com/alloydb/docs/ai). These are required for the Semantic Search and Natural Language to SQL tools. Run the following SQL commands:
```sql
CREATE EXTENSION IF NOT EXISTS "vector";
CREATE EXTENSION IF NOT EXISTS "google_ml_integration";
CREATE EXTENSION IF NOT EXISTS alloydb_ai_nl cascade;
CREATE EXTENSION IF NOT EXISTS parameterized_views;
```
## Step 1: Set up your AlloyDB database
In this section, we will create the necessary tables and functions in your AlloyDB instance.
1. Create tables using the following commands:
```sql
CREATE TABLE products (
product_id SERIAL PRIMARY KEY,
name VARCHAR(255) NOT NULL,
description TEXT,
price DECIMAL(10, 2) NOT NULL,
category_id INT,
embedding vector(3072) -- Vector size for model(gemini-embedding-001)
);
CREATE TABLE customers (
customer_id SERIAL PRIMARY KEY,
name VARCHAR(255) NOT NULL,
email VARCHAR(255) UNIQUE NOT NULL
);
CREATE TABLE cart (
cart_id SERIAL PRIMARY KEY,
customer_id INT UNIQUE NOT NULL,
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (customer_id) REFERENCES customers(customer_id)
);
CREATE TABLE cart_items (
cart_item_id SERIAL PRIMARY KEY,
cart_id INT NOT NULL,
product_id INT NOT NULL,
quantity INT NOT NULL,
price DECIMAL(10, 2) NOT NULL,
FOREIGN KEY (cart_id) REFERENCES cart(cart_id),
FOREIGN KEY (product_id) REFERENCES products(product_id)
);
CREATE TABLE categories (
category_id SERIAL PRIMARY KEY,
name VARCHAR(255) NOT NULL
);
```
2. Insert sample data into the tables:
```sql
INSERT INTO categories (category_id, name) VALUES
(1, 'Flowers'),
(2, 'Vases');
INSERT INTO products (product_id, name, description, price, category_id, embedding) VALUES
(1, 'Rose', 'A beautiful red rose', 2.50, 1, embedding('gemini-embedding-001', 'A beautiful red rose')),
(2, 'Tulip', 'A colorful tulip', 1.50, 1, embedding('gemini-embedding-001', 'A colorful tulip')),
(3, 'Glass Vase', 'A transparent glass vase', 10.00, 2, embedding('gemini-embedding-001', 'A transparent glass vase')),
(4, 'Ceramic Vase', 'A handmade ceramic vase', 15.00, 2, embedding('gemini-embedding-001', 'A handmade ceramic vase'));
INSERT INTO customers (customer_id, name, email) VALUES
(1, 'John Doe', 'john.doe@example.com'),
(2, 'Jane Smith', 'jane.smith@example.com');
INSERT INTO cart (cart_id, customer_id) VALUES
(1, 1),
(2, 2);
INSERT INTO cart_items (cart_id, product_id, quantity, price) VALUES
(1, 1, 2, 2.50),
(1, 3, 1, 10.00),
(2, 2, 5, 1.50);
```
## Step 2: Install Toolbox
In this section, we will download and install the Toolbox binary.
1. Download the latest version of Toolbox as a binary:
{{< notice tip >}}
Select the
[correct binary](https://github.com/googleapis/genai-toolbox/releases)
corresponding to your OS and CPU architecture.
{{< /notice >}}
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
export VERSION="0.10.0"
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/$OS/toolbox
```
<!-- {x-release-please-end} -->
1. Make the binary executable:
```bash
chmod +x toolbox
```
## Step 3: Configure the tools
Create a `tools.yaml` file and add the following content. You must replace the placeholders with your actual AlloyDB configuration.
First, define the data source for your tools. This tells Toolbox how to connect to your AlloyDB instance.
```yaml
sources:
  alloydb-pg-source:
    kind: alloydb-postgres
    project: YOUR_PROJECT_ID
    region: YOUR_REGION
    cluster: YOUR_CLUSTER
    instance: YOUR_INSTANCE
    database: YOUR_DATABASE
    user: YOUR_USER
    password: YOUR_PASSWORD
```
Next, define the tools the agent can use. We will categorize them into three types:
### 1. Structured Queries Tools
These tools execute predefined SQL statements. They are ideal for common, structured queries like managing a shopping cart. Add the following to your `tools.yaml` file:
```yaml
tools:
  access-cart-information:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      List items in customer cart.
      Use this tool to list items in a customer cart. This tool requires the cart ID.
    parameters:
      - name: cart_id
        type: integer
        description: The id of the cart.
    statement: |
      SELECT
        p.name AS product_name,
        ci.quantity,
        ci.price AS item_price,
        (ci.quantity * ci.price) AS total_item_price,
        c.created_at AS cart_created_at,
        ci.product_id AS product_id
      FROM
        cart_items ci JOIN cart c ON ci.cart_id = c.cart_id
        JOIN products p ON ci.product_id = p.product_id
      WHERE
        c.cart_id = $1;
  add-to-cart:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      Add items to customer cart using the product ID and product prices from the product list.
      Use this tool to add items to a customer cart.
      This tool requires the cart ID, product ID, quantity, and price.
    parameters:
      - name: cart_id
        type: integer
        description: The id of the cart.
      - name: product_id
        type: integer
        description: The id of the product.
      - name: quantity
        type: integer
        description: The quantity of items to add.
      - name: price
        type: float
        description: The price of items to add.
    statement: |
      INSERT INTO
        cart_items (cart_id, product_id, quantity, price)
      VALUES($1,$2,$3,$4);
  delete-from-cart:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      Remove products from customer cart.
      Use this tool to remove products from a customer cart.
      This tool requires the cart ID and product ID.
    parameters:
      - name: cart_id
        type: integer
        description: The id of the cart.
      - name: product_id
        type: integer
        description: The id of the product.
    statement: |
      DELETE FROM
        cart_items
      WHERE
        cart_id = $1 AND product_id = $2;
```
### 2. Semantic Search Tools
These tools use vector embeddings to find the most relevant results based on the meaning of a user's query, rather than just keywords. Append the following tools to the `tools` section in your `tools.yaml`:
```yaml
  search-product-recommendations:
    kind: postgres-sql
    source: alloydb-pg-source
    description: >-
      Search for products based on user needs.
      Use this tool to search for products. This tool requires the user's needs.
    parameters:
      - name: query
        type: string
        description: The product characteristics
    statement: |
      SELECT
        product_id,
        name,
        description,
        ROUND(CAST(price AS numeric), 2) as price
      FROM
        products
      ORDER BY
        embedding('gemini-embedding-001', $1)::vector <=> embedding
      LIMIT 5;
```
### 3. Natural Language to SQL (NL2SQL) Tools
1. Create a [natural language configuration](https://cloud.google.com/alloydb/docs/ai/use-natural-language-generate-sql-queries#create-config) for your AlloyDB cluster.
{{< notice tip >}}Before using NL2SQL tools,
you must first install the `alloydb_ai_nl` extension and
create the [semantic layer](https://cloud.google.com/alloydb/docs/ai/natural-language-overview) under a configuration named `flower_shop`.
{{< /notice >}}
2. Configure your NL2SQL tool to use your configuration. These tools translate natural language questions into SQL queries, allowing users to interact with the database conversationally. Append the following tool to the `tools` section:
```yaml
  ask-questions-about-products:
    kind: alloydb-ai-nl
    source: alloydb-pg-source
    nlConfig: flower_shop
    description: >-
      Ask questions related to products or brands.
      Use this tool to ask questions about products or brands.
      Always SELECT the IDs of objects when generating queries.
```
Finally, group the tools into a `toolset` to make them available to the model. Add the following to the end of your `tools.yaml` file:
```yaml
toolsets:
  flower_shop:
    - access-cart-information
    - search-product-recommendations
    - ask-questions-about-products
    - add-to-cart
    - delete-from-cart
```
For more info on tools, check out the
[Tools](../../resources/tools/) section.
## Step 4: Run the Toolbox server
Run the Toolbox server, pointing to the `tools.yaml` file created earlier:
```bash
./toolbox --tools-file "tools.yaml"
```
## Step 5: Connect to MCP Inspector
1. Run the MCP Inspector:
```bash
npx @modelcontextprotocol/inspector
```
1. Type `y` when it asks to install the inspector package.
1. It should show the following when the MCP Inspector is up and running (please take note of `<YOUR_SESSION_TOKEN>`):
```bash
Starting MCP inspector...
⚙️ Proxy server listening on localhost:6277
🔑 Session token: <YOUR_SESSION_TOKEN>
Use this token to authenticate requests or set DANGEROUSLY_OMIT_AUTH=true to disable auth
🚀 MCP Inspector is up and running at:
http://localhost:6274/?MCP_PROXY_AUTH_TOKEN=<YOUR_SESSION_TOKEN>
```
1. Open the above link in your browser.
1. For `Transport Type`, select `Streamable HTTP`.
1. For `URL`, type in `http://127.0.0.1:5000/mcp`.
1. For `Configuration` -> `Proxy Session Token`, make sure `<YOUR_SESSION_TOKEN>` is present.
1. Click Connect.
1. Select `List Tools`; you will see the list of tools configured in `tools.yaml`.
1. Test out your tools here!
## What's next
- Learn more about [MCP Inspector](../../how-to/connect_via_mcp.md).
- Learn more about [Toolbox Resources](../../resources/).
- Learn more about [Toolbox How-to guides](../../how-to/).

View File

@@ -220,7 +220,7 @@
},
"outputs": [],
"source": [
"version = \"0.10.0\" # x-release-please-version\n",
"version = \"0.11.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -3,7 +3,7 @@ title: "Quickstart (Local with BigQuery)"
type: docs
weight: 1
description: >
How to get started running Toolbox locally with Python, BigQuery, and
LangGraph, LlamaIndex, or ADK.
---
@@ -693,7 +693,7 @@ with ToolboxSyncClient("<http://127.0.0.1:5000>") as toolbox_client:
{{< tabpane text=true persist=header >}}
{{% tab header="Core" lang="en" %}}
To learn more about the Core SDK, check out the [Toolbox Core SDK
documentation.](https://github.com/googleapis/genai-toolbox/tree/main/sdks/toolbox-core)
documentation.](https://github.com/googleapis/mcp-toolbox-sdk-python/blob/main/packages/toolbox-core/README.md)
{{% /tab %}}
{{% tab header="Langchain" lang="en" %}}
To learn more about Agents in LangChain, check out the [LangGraph Agent

View File

@@ -10,7 +10,7 @@ description: >
[Model Context Protocol](https://modelcontextprotocol.io) is an open protocol
that standardizes how applications provide context to LLMs. Check out this page
on how to [connect to Toolbox via MCP](../../how-to/connect_via_mcp.md).
on how to [connect to Toolbox via MCP](../../../how-to/connect_via_mcp.md).
## Step 1: Set up your BigQuery Dataset and Table
@@ -190,7 +190,7 @@ In this section, we will download Toolbox, configure our tools in a
```
For more info on tools, check out the
[Tools](../../resources/tools/_index.md) section.
[Tools](../../../resources/tools/) section.
1. Run the Toolbox server, pointing to the `tools.yaml` file created earlier:

View File

@@ -46,7 +46,8 @@ In this section, we will download Toolbox and run the Toolbox server.
1. Edit the file `~/.gemini/settings.json` and add the following
to the list of mcpServers. Use the Client Id and Client Secret
you obtained earlier.
you obtained earlier. The name of the server - here
`looker-toolbox` - can be anything meaningful to you.
```json
"mcpServers": {
@@ -99,6 +100,7 @@ In this section, we will download Toolbox and run the Toolbox server.
- looker-toolbox__query_sql
- looker-toolbox__get_dimensions
- looker-toolbox__run_look
- looker-toolbox__query_url
```
1. Start exploring your Looker instance with commands like
@@ -106,4 +108,5 @@ In this section, we will download Toolbox and run the Toolbox server.
inventory broken down by item category`.
1. Gemini will prompt you for your approval before using
a tool. You can approve all the tools.
a tool. You can approve all the tools at once or
one at a time.

46
go.mod
View File

@@ -1,6 +1,6 @@
module github.com/googleapis/genai-toolbox
go 1.23.8
go 1.24
toolchain go1.24.5
@@ -20,6 +20,7 @@ require (
github.com/go-chi/chi/v5 v5.2.2
github.com/go-chi/httplog/v2 v2.1.1
github.com/go-chi/render v1.0.3
github.com/go-goquery/goquery v1.0.1
github.com/go-playground/validator/v10 v10.27.0
github.com/go-sql-driver/mysql v1.9.3
github.com/goccy/go-yaml v1.18.0
@@ -44,14 +45,25 @@ require (
go.opentelemetry.io/otel/sdk/metric v1.37.0
go.opentelemetry.io/otel/trace v1.37.0
golang.org/x/oauth2 v0.30.0
google.golang.org/api v0.243.0
modernc.org/sqlite v1.38.0
google.golang.org/api v0.244.0
modernc.org/sqlite v1.38.2
)
require golang.org/x/exp v0.0.0-20250408133849-7e4ce0ab07d0 // indirect
require (
github.com/andybalholm/cascadia v1.3.3 // indirect
github.com/duckdb/duckdb-go-bindings v0.1.17 // indirect
github.com/duckdb/duckdb-go-bindings/darwin-amd64 v0.1.12 // indirect
github.com/duckdb/duckdb-go-bindings/darwin-arm64 v0.1.12 // indirect
github.com/duckdb/duckdb-go-bindings/linux-amd64 v0.1.12 // indirect
github.com/duckdb/duckdb-go-bindings/linux-arm64 v0.1.12 // indirect
github.com/duckdb/duckdb-go-bindings/windows-amd64 v0.1.12 // indirect
github.com/marcboeker/go-duckdb/arrowmapping v0.0.10 // indirect
github.com/marcboeker/go-duckdb/mapping v0.0.11 // indirect
golang.org/x/exp v0.0.0-20250620022241-b7579e27df2b // indirect
)
require (
cel.dev/expr v0.23.0 // indirect
cel.dev/expr v0.24.0 // indirect
cloud.google.com/go v0.121.2 // indirect
cloud.google.com/go/alloydb v1.18.0 // indirect
cloud.google.com/go/auth v0.16.3 // indirect
@@ -65,11 +77,13 @@ require (
github.com/GoogleCloudPlatform/grpc-gcp-go/grpcgcp v1.5.3 // indirect
github.com/GoogleCloudPlatform/opentelemetry-operations-go/detectors/gcp v1.27.0 // indirect
github.com/GoogleCloudPlatform/opentelemetry-operations-go/internal/resourcemapping v0.53.0 // indirect
github.com/PuerkitoBio/goquery v1.10.3 // indirect
github.com/ajg/form v1.5.1 // indirect
github.com/apache/arrow-go/v18 v18.4.0 // indirect
github.com/apache/arrow/go/v15 v15.0.2 // indirect
github.com/cenkalti/backoff/v5 v5.0.2 // indirect
github.com/cespare/xxhash/v2 v2.3.0 // indirect
github.com/cncf/xds/go v0.0.0-20250326154945-ae57f3c0d45f // indirect
github.com/cncf/xds/go v0.0.0-20250501225837-2ac532fd4443 // indirect
github.com/couchbase/gocbcore/v10 v10.7.1 // indirect
github.com/couchbase/gocbcoreps v0.1.3 // indirect
github.com/couchbase/goprotostellar v1.0.2 // indirect
@@ -86,12 +100,13 @@ require (
github.com/go-logr/stdr v1.2.2 // indirect
github.com/go-playground/locales v0.14.1 // indirect
github.com/go-playground/universal-translator v0.18.1 // indirect
github.com/goccy/go-json v0.10.2 // indirect
github.com/go-viper/mapstructure/v2 v2.3.0 // indirect
github.com/goccy/go-json v0.10.5 // indirect
github.com/golang-sql/civil v0.0.0-20220223132316-b832511892a9 // indirect
github.com/golang-sql/sqlexp v0.1.0 // indirect
github.com/golang/groupcache v0.0.0-20241129210726-2c02b8208cf8 // indirect
github.com/golang/snappy v0.0.4 // indirect
github.com/google/flatbuffers v23.5.26+incompatible // indirect
github.com/golang/snappy v1.0.0 // indirect
github.com/google/flatbuffers v25.2.10+incompatible // indirect
github.com/google/s2a-go v0.1.9 // indirect
github.com/googleapis/enterprise-certificate-proxy v0.3.6 // indirect
github.com/googleapis/gax-go/v2 v2.15.0 // indirect
@@ -102,15 +117,16 @@ require (
github.com/jackc/pgpassfile v1.0.0 // indirect
github.com/jackc/pgservicefile v0.0.0-20240606120523-5a60cdf6a761 // indirect
github.com/jackc/puddle/v2 v2.2.2 // indirect
github.com/klauspost/compress v1.16.7 // indirect
github.com/klauspost/cpuid/v2 v2.2.5 // indirect
github.com/klauspost/compress v1.18.0 // indirect
github.com/klauspost/cpuid/v2 v2.2.11 // indirect
github.com/leodido/go-urn v1.4.0 // indirect
github.com/marcboeker/go-duckdb/v2 v2.3.4
github.com/mattn/go-isatty v0.0.20 // indirect
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
github.com/modern-go/reflect2 v1.0.2 // indirect
github.com/montanaflynn/stats v0.7.1 // indirect
github.com/ncruces/go-strftime v0.1.9 // indirect
github.com/pierrec/lz4/v4 v4.1.18 // indirect
github.com/pierrec/lz4/v4 v4.1.22 // indirect
github.com/planetscale/vtprotobuf v0.6.1-0.20240319094008-0393e58bdf10 // indirect
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec // indirect
github.com/spf13/pflag v1.0.6 // indirect
@@ -145,11 +161,11 @@ require (
golang.org/x/xerrors v0.0.0-20240903120638-7835f813f4da // indirect
google.golang.org/genproto v0.0.0-20250603155806-513f23925822 // indirect
google.golang.org/genproto/googleapis/api v0.0.0-20250603155806-513f23925822 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250715232539-7130f93afb79 // indirect
google.golang.org/grpc v1.73.0 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250728155136-f173205681a0 // indirect
google.golang.org/grpc v1.74.2 // indirect
google.golang.org/protobuf v1.36.6 // indirect
gopkg.in/ini.v1 v1.67.0 // indirect
modernc.org/libc v1.65.10 // indirect
modernc.org/libc v1.66.3 // indirect
modernc.org/mathutil v1.7.1 // indirect
modernc.org/memory v1.11.0 // indirect
)

138
go.sum
View File

@@ -1,5 +1,5 @@
cel.dev/expr v0.23.0 h1:wUb94w6OYQS4uXraxo9U+wUAs9jT47Xvl4iPgAwM2ss=
cel.dev/expr v0.23.0/go.mod h1:hLPLo1W4QUmuYdA72RBX06QTs6MXw941piREPl3Yfiw=
cel.dev/expr v0.24.0 h1:56OvJKSH3hDGL0ml5uSxZmz3/3Pq4tJ+fb1unVLAFcY=
cel.dev/expr v0.24.0/go.mod h1:hLPLo1W4QUmuYdA72RBX06QTs6MXw941piREPl3Yfiw=
cloud.google.com/go v0.26.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMTw=
cloud.google.com/go v0.34.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMTw=
cloud.google.com/go v0.38.0/go.mod h1:990N+gfupTy94rShfmMCWGDn0LpTmnzTp2qbd1dvSRU=
@@ -665,6 +665,8 @@ github.com/GoogleCloudPlatform/opentelemetry-operations-go/internal/resourcemapp
github.com/GoogleCloudPlatform/opentelemetry-operations-go/internal/resourcemapping v0.53.0/go.mod h1:cSgYe11MCNYunTnRXrKiR/tHc0eoKjICUuWpNZoVCOo=
github.com/JohnCGriffin/overflow v0.0.0-20211019200055-46fa312c352c/go.mod h1:X0CRv0ky0k6m906ixxpzmDRLvX58TFUKS2eePweuyxk=
github.com/OneOfOne/xxhash v1.2.2/go.mod h1:HSdplMjZKSmBqAxg5vPj2TmRDmfkzw+cTzAElWljhcU=
github.com/PuerkitoBio/goquery v1.10.3 h1:pFYcNSqHxBD06Fpj/KsbStFRsgRATgnf3LeXiUkhzPo=
github.com/PuerkitoBio/goquery v1.10.3/go.mod h1:tMUX0zDMHXYlAQk6p35XxQMqMweEKB7iK7iLNd4RH4Y=
github.com/ajg/form v1.5.1 h1:t9c7v8JUKu/XxOGBU0yjNpaMloxGEJhUkqFRq0ibGeU=
github.com/ajg/form v1.5.1/go.mod h1:uL1WgH+h2mgNtvBq0339dVnzXdBETtL2LeUXaIv25UY=
github.com/ajstarks/deck v0.0.0-20200831202436-30c9fc6549a9/go.mod h1:JynElWSGnm/4RlzPXRlREEwqTHAN3T56Bv2ITsFT3gY=
@@ -672,12 +674,20 @@ github.com/ajstarks/deck/generate v0.0.0-20210309230005-c3f852c02e19/go.mod h1:T
github.com/ajstarks/svgo v0.0.0-20180226025133-644b8db467af/go.mod h1:K08gAheRH3/J6wwsYMMT4xOr94bZjxIelGM0+d/wbFw=
github.com/ajstarks/svgo v0.0.0-20211024235047-1546f124cd8b/go.mod h1:1KcenG0jGWcpt8ov532z81sp/kMMUG485J2InIOyADM=
github.com/andybalholm/brotli v1.0.4/go.mod h1:fO7iG3H7G2nSZ7m0zPUDn85XEX2GTukHGRSepvi9Eig=
github.com/andybalholm/brotli v1.2.0 h1:ukwgCxwYrmACq68yiUqwIWnGY0cTPox/M94sVwToPjQ=
github.com/andybalholm/brotli v1.2.0/go.mod h1:rzTDkvFWvIrjDXZHkuS16NPggd91W3kUSvPlQ1pLaKY=
github.com/andybalholm/cascadia v1.3.3 h1:AG2YHrzJIm4BZ19iwJ/DAua6Btl3IwJX+VI4kktS1LM=
github.com/andybalholm/cascadia v1.3.3/go.mod h1:xNd9bqTn98Ln4DwST8/nG+H0yuB8Hmgu1YHNnWw0GeA=
github.com/antihax/optional v1.0.0/go.mod h1:uupD/76wgC+ih3iEmQUL+0Ugr19nfwCT1kdvxnR2qWY=
github.com/apache/arrow-go/v18 v18.4.0 h1:/RvkGqH517iY8bZKc4FD5/kkdwXJGjxf28JIXbJ/oB0=
github.com/apache/arrow-go/v18 v18.4.0/go.mod h1:Aawvwhj8x2jURIzD9Moy72cF0FyJXOpkYpdmGRHcw14=
github.com/apache/arrow/go/v10 v10.0.1/go.mod h1:YvhnlEePVnBS4+0z3fhPfUy7W1Ikj0Ih0vcRo/gZ1M0=
github.com/apache/arrow/go/v11 v11.0.0/go.mod h1:Eg5OsL5H+e299f7u5ssuXsuHQVEGC4xei5aX110hRiI=
github.com/apache/arrow/go/v15 v15.0.2 h1:60IliRbiyTWCWjERBCkO1W4Qun9svcYoZrSLcyOsMLE=
github.com/apache/arrow/go/v15 v15.0.2/go.mod h1:DGXsR3ajT524njufqf95822i+KTh+yea1jass9YXgjA=
github.com/apache/thrift v0.16.0/go.mod h1:PHK3hniurgQaNMZYaCLEqXKsYK8upmhPbmdP2FXSqgU=
github.com/apache/thrift v0.22.0 h1:r7mTJdj51TMDe6RtcmNdQxgn9XcyfGDOzegMDRg47uc=
github.com/apache/thrift v0.22.0/go.mod h1:1e7J/O1Ae6ZQMTYdy9xa3w9k+XHWPfRvdPyJeynQ+/g=
github.com/benbjohnson/clock v1.1.0/go.mod h1:J11/hYXuz8f4ySSvYwY0FKfm+ezbsZBKZxNJlLklBHA=
github.com/boombuler/barcode v1.0.0/go.mod h1:paBWMcWSl3LHKBqUq+rly7CNSldXjb2rDl3JlRe0mD8=
github.com/boombuler/barcode v1.0.1/go.mod h1:paBWMcWSl3LHKBqUq+rly7CNSldXjb2rDl3JlRe0mD8=
@@ -712,8 +722,8 @@ github.com/cncf/xds/go v0.0.0-20211011173535-cb28da3451f1/go.mod h1:eXthEFrGJvWH
github.com/cncf/xds/go v0.0.0-20220314180256-7f1daf1720fc/go.mod h1:eXthEFrGJvWHgFFCl3hGmgk+/aYT6PnTQLykKQRLhEs=
github.com/cncf/xds/go v0.0.0-20230105202645-06c439db220b/go.mod h1:eXthEFrGJvWHgFFCl3hGmgk+/aYT6PnTQLykKQRLhEs=
github.com/cncf/xds/go v0.0.0-20230607035331-e9ce68804cb4/go.mod h1:eXthEFrGJvWHgFFCl3hGmgk+/aYT6PnTQLykKQRLhEs=
github.com/cncf/xds/go v0.0.0-20250326154945-ae57f3c0d45f h1:C5bqEmzEPLsHm9Mv73lSE9e9bKV23aB1vxOsmZrkl3k=
github.com/cncf/xds/go v0.0.0-20250326154945-ae57f3c0d45f/go.mod h1:W+zGtBO5Y1IgJhy4+A9GOqVhqLpfZi+vwmdNXUehLA8=
github.com/cncf/xds/go v0.0.0-20250501225837-2ac532fd4443 h1:aQ3y1lwWyqYPiWZThqv1aFbZMiM9vblcSArJRf2Irls=
github.com/cncf/xds/go v0.0.0-20250501225837-2ac532fd4443/go.mod h1:W+zGtBO5Y1IgJhy4+A9GOqVhqLpfZi+vwmdNXUehLA8=
github.com/couchbase/gocb/v2 v2.10.1 h1:5r1jngGxw3dTZdtq6Xmjq3pdU6hOwRvynvbVIp58T64=
github.com/couchbase/gocb/v2 v2.10.1/go.mod h1:GGEJuYjrfnPHCQLcxTcIco+Puy63PS2p8QQd8FRw66I=
github.com/couchbase/gocbcore/v10 v10.7.1 h1:6jsNDtqyfoQ8Xg6kv99rzccc3CrHbp7FjeY+ahWXTF4=
@@ -739,6 +749,18 @@ github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc/go.mod h1:J7Y8Yc
github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f h1:lO4WD4F/rVNCu3HqELle0jiPLLBs70cWOduZpkS1E78=
github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f/go.mod h1:cuUVRXasLTGF7a8hSLbxyZXjz+1KgoB3wDUb6vlszIc=
github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815/go.mod h1:WwZ+bS3ebgob9U8Nd0kOddGdZWjyMGR8Wziv+TBNwSE=
github.com/duckdb/duckdb-go-bindings v0.1.17 h1:SjpRwrJ7v0vqnIvLeVFHlhuS72+Lp8xxQ5jIER2LZP4=
github.com/duckdb/duckdb-go-bindings v0.1.17/go.mod h1:pBnfviMzANT/9hi4bg+zW4ykRZZPCXlVuvBWEcZofkc=
github.com/duckdb/duckdb-go-bindings/darwin-amd64 v0.1.12 h1:8CLBnsq9YDhi2Gmt3sjSUeXxMzyMQAKefjqUy9zVPFk=
github.com/duckdb/duckdb-go-bindings/darwin-amd64 v0.1.12/go.mod h1:Ezo7IbAfB8NP7CqPIN8XEHKUg5xdRRQhcPPlCXImXYA=
github.com/duckdb/duckdb-go-bindings/darwin-arm64 v0.1.12 h1:wjO4I0GhMh2xIpiUgRpzuyOT4KxXLoUS/rjU7UUVvCE=
github.com/duckdb/duckdb-go-bindings/darwin-arm64 v0.1.12/go.mod h1:eS7m/mLnPQgVF4za1+xTyorKRBuK0/BA44Oy6DgrGXI=
github.com/duckdb/duckdb-go-bindings/linux-amd64 v0.1.12 h1:HzKQi2C+1jzmwANsPuYH6x9Sfw62SQTjNAEq3OySKFI=
github.com/duckdb/duckdb-go-bindings/linux-amd64 v0.1.12/go.mod h1:1GOuk1PixiESxLaCGFhag+oFi7aP+9W8byymRAvunBk=
github.com/duckdb/duckdb-go-bindings/linux-arm64 v0.1.12 h1:YGSR7AFLw2gJ7IbgLE6DkKYmgKv1LaRSd/ZKF1yh2oE=
github.com/duckdb/duckdb-go-bindings/linux-arm64 v0.1.12/go.mod h1:o7crKMpT2eOIi5/FY6HPqaXcvieeLSqdXXaXbruGX7w=
github.com/duckdb/duckdb-go-bindings/windows-amd64 v0.1.12 h1:2aduW6fnFnT2Q45PlIgHbatsPOxV9WSZ5B2HzFfxaxA=
github.com/duckdb/duckdb-go-bindings/windows-amd64 v0.1.12/go.mod h1:IlOhJdVKUJCAPj3QsDszUo8DVdvp1nBFp4TUJVdw99s=
github.com/dustin/go-humanize v1.0.0/go.mod h1:HtrtbFcZ19U5GC7JDqmcUSB87Iq5E25KnS6fMYU6eOk=
github.com/dustin/go-humanize v1.0.1 h1:GzkhY7T5VNhEkwH0PVJgjz+fX1rhBrR7pRT3mDkpeCY=
github.com/dustin/go-humanize v1.0.1/go.mod h1:Mu1zIs6XwVuF/gI1OepvI0qD18qycQx+mFykh5fBlto=
@@ -788,6 +810,8 @@ github.com/go-fonts/stix v0.1.0/go.mod h1:w/c1f0ldAUlJmLBvlbkvVXLAD+tAMqobIIQpmn
github.com/go-gl/glfw v0.0.0-20190409004039-e6da0acd62b1/go.mod h1:vR7hzQXu2zJy9AVAgeJqvqgH9Q5CA+iKCZ2gyEVpxRU=
github.com/go-gl/glfw/v3.3/glfw v0.0.0-20191125211704-12ad95a8df72/go.mod h1:tQ2UAYgL5IevRw8kRxooKSPJfGvJ9fJQFa0TUsXzTg8=
github.com/go-gl/glfw/v3.3/glfw v0.0.0-20200222043503-6f7a984d4dc4/go.mod h1:tQ2UAYgL5IevRw8kRxooKSPJfGvJ9fJQFa0TUsXzTg8=
github.com/go-goquery/goquery v1.0.1 h1:kpchVA1LdOFWdRpkDPESVdlb1JQI6ixsJ5MiNUITO7U=
github.com/go-goquery/goquery v1.0.1/go.mod h1:W5s8OWbqWf6lG0LkXWBeh7U1Y/X5XTI0Br65MHF8uJk=
github.com/go-jose/go-jose/v4 v4.0.5 h1:M6T8+mKZl/+fNNuFHvGIzDz7BTLQPIounk/b9dw3AaE=
github.com/go-jose/go-jose/v4 v4.0.5/go.mod h1:s3P1lRrkT8igV8D9OjyL4WRyHvjB6a4JSllnOrmmBOA=
github.com/go-kit/log v0.1.0/go.mod h1:zbhenjAZHb184qTLMA9ZjW7ThYL0H2mk7Q6pNt4vbaY=
@@ -812,9 +836,11 @@ github.com/go-playground/validator/v10 v10.27.0/go.mod h1:I5QpIEbmr8On7W0TktmJAu
github.com/go-sql-driver/mysql v1.9.3 h1:U/N249h2WzJ3Ukj8SowVFjdtZKfu9vlLZxjPXV1aweo=
github.com/go-sql-driver/mysql v1.9.3/go.mod h1:qn46aNg1333BRMNU69Lq93t8du/dwxI64Gl8i5p1WMU=
github.com/go-stack/stack v1.8.0/go.mod h1:v0f6uXyyMGvRgIKkXu+yp6POWl0qKG85gN/melR3HDY=
github.com/go-viper/mapstructure/v2 v2.3.0 h1:27XbWsHIqhbdR5TIC911OfYvgSaW93HM+dX7970Q7jk=
github.com/go-viper/mapstructure/v2 v2.3.0/go.mod h1:oJDH3BJKyqBA2TXFhDsKDGDTlndYOZ6rGS0BRZIxGhM=
github.com/goccy/go-json v0.9.11/go.mod h1:6MelG93GURQebXPDq3khkgXZkazVtN9CRI+MGFi0w8I=
github.com/goccy/go-json v0.10.2 h1:CrxCmQqYDkv1z7lO7Wbh2HN93uovUHgrECaO5ZrCXAU=
github.com/goccy/go-json v0.10.2/go.mod h1:6MelG93GURQebXPDq3khkgXZkazVtN9CRI+MGFi0w8I=
github.com/goccy/go-json v0.10.5 h1:Fq85nIqj+gXn/S5ahsiTlK3TmC85qgirsdTP/+DeaC4=
github.com/goccy/go-json v0.10.5/go.mod h1:oq7eo15ShAhp70Anwd5lgX2pLfOS3QCiwU/PULtXL6M=
github.com/goccy/go-yaml v1.18.0 h1:8W7wMFS12Pcas7KU+VVkaiCng+kG8QiFeFwzFb+rwuw=
github.com/goccy/go-yaml v1.18.0/go.mod h1:XBurs7gK8ATbW4ZPGKgcbrY1Br56PdM69F7LkFRi1kA=
github.com/gogo/protobuf v1.3.2/go.mod h1:P1XiOD3dCwIKUDQYPy72D8LYyHL2YPYrpS2s69NZV8Q=
@@ -865,15 +891,16 @@ github.com/golang/protobuf v1.5.3/go.mod h1:XVQd3VNwM+JqD3oG2Ue2ip4fOMUkwXdXDdiu
github.com/golang/protobuf v1.5.4 h1:i7eJL8qZTpSEXOPTxNKhASYpMn+8e5Q6AdndVa1dWek=
github.com/golang/protobuf v1.5.4/go.mod h1:lnTiLA8Wa4RWRcIUkrtSVa5nRhsEGBg48fD6rSs7xps=
github.com/golang/snappy v0.0.3/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/golang/snappy v0.0.4 h1:yAGX7huGHXlcLOEtBnF4w7FQwA26wojNCwOYAEhLjQM=
github.com/golang/snappy v0.0.4/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/golang/snappy v1.0.0 h1:Oy607GVXHs7RtbggtPBnr2RmDArIsAefDwvrdWvRhGs=
github.com/golang/snappy v1.0.0/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/google/btree v0.0.0-20180813153112-4030bb1f1f0c/go.mod h1:lNA+9X1NB3Zf8V7Ke586lFgjr2dZNuvo3lPJSGZ5JPQ=
github.com/google/btree v1.0.0/go.mod h1:lNA+9X1NB3Zf8V7Ke586lFgjr2dZNuvo3lPJSGZ5JPQ=
github.com/google/btree v1.1.3 h1:CVpQJjYgC4VbzxeGVHfvZrv1ctoYCAI8vbl07Fcxlyg=
github.com/google/btree v1.1.3/go.mod h1:qOPhT0dTNdNzV6Z/lhRX0YXUafgPLFUh+gZMl761Gm4=
github.com/google/flatbuffers v2.0.8+incompatible/go.mod h1:1AeVuKshWv4vARoZatz6mlQ0JxURH0Kv5+zNeJKJCa8=
github.com/google/flatbuffers v23.5.26+incompatible h1:M9dgRyhJemaM4Sw8+66GHBu8ioaQmyPLg1b8VwK5WJg=
github.com/google/flatbuffers v23.5.26+incompatible/go.mod h1:1AeVuKshWv4vARoZatz6mlQ0JxURH0Kv5+zNeJKJCa8=
github.com/google/flatbuffers v25.2.10+incompatible h1:F3vclr7C3HpB1k9mxCGRMXq6FdUalZ6H/pNX4FP1v0Q=
github.com/google/flatbuffers v25.2.10+incompatible/go.mod h1:1AeVuKshWv4vARoZatz6mlQ0JxURH0Kv5+zNeJKJCa8=
github.com/google/go-cmp v0.2.0/go.mod h1:oXzfMopK8JAjlY9xF4vHSVASa0yLyX7SntLO5aqRK0M=
github.com/google/go-cmp v0.3.0/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
github.com/google/go-cmp v0.3.1/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
@@ -889,6 +916,7 @@ github.com/google/go-cmp v0.5.6/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/
github.com/google/go-cmp v0.5.7/go.mod h1:n+brtR0CgQNWTVd5ZUFpTBC8YFBDLK/h/bpaJ8/DtOE=
github.com/google/go-cmp v0.5.8/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-cmp v0.5.9/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-cmp v0.6.0/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=
@@ -993,13 +1021,14 @@ github.com/jung-kurt/gofpdf v1.0.3-0.20190309125859-24315acbbda5/go.mod h1:7Id9E
github.com/kballard/go-shellquote v0.0.0-20180428030007-95032a82bc51/go.mod h1:CzGEWj7cYgsdH8dAjBGEr58BoE7ScuLd+fwFZ44+/x8=
github.com/kisielk/errcheck v1.5.0/go.mod h1:pFxgyoBC7bSaBwPgfKdkLd5X25qrDl4LWUI2bnpBCr8=
github.com/kisielk/gotool v1.0.0/go.mod h1:XhKaO+MFFWcvkIS/tQcRk01m1F5IRFswLeQ+oQHNcck=
github.com/klauspost/asmfmt v1.3.2 h1:4Ri7ox3EwapiOjCki+hw14RyKk201CN4rzyCJRFLpK4=
github.com/klauspost/asmfmt v1.3.2/go.mod h1:AG8TuvYojzulgDAMCnYn50l/5QV3Bs/tp6j0HLHbNSE=
github.com/klauspost/compress v1.15.9/go.mod h1:PhcZ0MbTNciWF3rruxRgKxI5NkcHHrHUDtV4Yw2GlzU=
github.com/klauspost/compress v1.16.7 h1:2mk3MPGNzKyxErAw8YaohYh69+pa4sIQSC0fPGCFR9I=
github.com/klauspost/compress v1.16.7/go.mod h1:ntbaceVETuRiXiv4DpjP66DpAtAGkEQskQzEyD//IeE=
github.com/klauspost/compress v1.18.0 h1:c/Cqfb0r+Yi+JtIEq73FWXVkRonBlf0CRNYc8Zttxdo=
github.com/klauspost/compress v1.18.0/go.mod h1:2Pp+KzxcywXVXMr50+X0Q/Lsb43OQHYWRCY2AiWywWQ=
github.com/klauspost/cpuid/v2 v2.0.9/go.mod h1:FInQzS24/EEf25PyTYn52gqo7WaD8xa0213Md/qVLRg=
github.com/klauspost/cpuid/v2 v2.2.5 h1:0E5MSMDEoAulmXNFquVs//DdoomxaoTY1kUhbc/qbZg=
github.com/klauspost/cpuid/v2 v2.2.5/go.mod h1:Lcz8mBdAVJIBVzewtcLocK12l3Y+JytZYpaMropDUws=
github.com/klauspost/cpuid/v2 v2.2.11 h1:0OwqZRYI2rFrjS4kvkDnqJkKHdHaRnCm68/DY4OxRzU=
github.com/klauspost/cpuid/v2 v2.2.11/go.mod h1:hqwkgyIinND0mEev00jJYCxPNVRVXFQeu1XKlok6oO0=
github.com/konsorten/go-windows-terminal-sequences v1.0.1/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=
github.com/kr/fs v0.1.0/go.mod h1:FFnZGqtBN9Gxj7eW1uZ42v5BccTP0vu6NEaFoC2HwRg=
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
@@ -1017,6 +1046,12 @@ github.com/looker-open-source/sdk-codegen/go v0.25.10/go.mod h1:YM/IYSsTPk7I54j4
github.com/lyft/protoc-gen-star v0.6.0/go.mod h1:TGAoBVkt8w7MPG72TrKIu85MIdXwDuzJYeZuUPFPNwA=
github.com/lyft/protoc-gen-star v0.6.1/go.mod h1:TGAoBVkt8w7MPG72TrKIu85MIdXwDuzJYeZuUPFPNwA=
github.com/lyft/protoc-gen-star/v2 v2.0.1/go.mod h1:RcCdONR2ScXaYnQC5tUzxzlpA3WVYF7/opLeUgcQs/o=
github.com/marcboeker/go-duckdb/arrowmapping v0.0.10 h1:G1W+GVnUefR8uy7jHdNO+CRMsmFG5mFPIHVAespfFCA=
github.com/marcboeker/go-duckdb/arrowmapping v0.0.10/go.mod h1:jccUb8TYD0p5TsEEeN4SXuslNJHo23QaKOqKD+U6uFU=
github.com/marcboeker/go-duckdb/mapping v0.0.11 h1:fusN1b1l7Myxafifp596I6dNLNhN5Uv/rw31qAqBwqw=
github.com/marcboeker/go-duckdb/mapping v0.0.11/go.mod h1:aYBjFLgfKO0aJIbDtXPiaL5/avRQISveX/j9tMf9JhU=
github.com/marcboeker/go-duckdb/v2 v2.3.4 h1:o98wrefPbH0IdJRix4pF0+jZiXoFQ+FSR8InMsCUZD0=
github.com/marcboeker/go-duckdb/v2 v2.3.4/go.mod h1:8adNrftF4Ye29XMrpIl5NYNosTVsZu1mz3C82WdVvrk=
github.com/mattn/go-isatty v0.0.12/go.mod h1:cbi8OIDigv2wuxKPP5vlRcQ1OAZbq2CE4Kysco4FUpU=
github.com/mattn/go-isatty v0.0.16/go.mod h1:kYGgaQfpe5nmfYZH+SKPsOc2e4SrIfOl2e/yFXSvRLM=
github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY=
@@ -1024,7 +1059,9 @@ github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D
github.com/mattn/go-sqlite3 v1.14.14/go.mod h1:NyWgC/yNuGj7Q9rpYnZvas74GogHl5/Z4A/KQRfk6bU=
github.com/microsoft/go-mssqldb v1.9.2 h1:nY8TmFMQOHpm2qVWo6y4I2mAmVdZqlGiMGAYt64Ibbs=
github.com/microsoft/go-mssqldb v1.9.2/go.mod h1:GBbW9ASTiDC+mpgWDGKdm3FnFLTUsLYN3iFL90lQ+PA=
github.com/minio/asm2plan9s v0.0.0-20200509001527-cdd76441f9d8 h1:AMFGa4R4MiIpspGNG7Z948v4n35fFGB3RR3G/ry4FWs=
github.com/minio/asm2plan9s v0.0.0-20200509001527-cdd76441f9d8/go.mod h1:mC1jAcsrzbxHt8iiaC+zU4b1ylILSosueou12R++wfY=
github.com/minio/c2goasm v0.0.0-20190812172519-36a3d3bbc4f3 h1:+n/aFZefKZp7spd8DFdX7uMikMLXX4oubIzJF4kv/wI=
github.com/minio/c2goasm v0.0.0-20190812172519-36a3d3bbc4f3/go.mod h1:RagcQ7I8IeTMnF8JTXieKnO4Z6JCsikNEzj0DwauVzE=
github.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd h1:TRLaZ9cD/w8PVh93nsPXa1VrQ6jlwL5oN8l14QlcNfg=
@@ -1044,8 +1081,8 @@ github.com/phpdave11/gofpdf v1.4.2/go.mod h1:zpO6xFn9yxo3YLyMvW8HcKWVdbNqgIfOOp2
github.com/phpdave11/gofpdi v1.0.12/go.mod h1:vBmVV0Do6hSBHC8uKUQ71JGW+ZGQq74llk/7bXwjDoI=
github.com/phpdave11/gofpdi v1.0.13/go.mod h1:vBmVV0Do6hSBHC8uKUQ71JGW+ZGQq74llk/7bXwjDoI=
github.com/pierrec/lz4/v4 v4.1.15/go.mod h1:gZWDp/Ze/IJXGXf23ltt2EXimqmTUXEy0GFuRQyBid4=
github.com/pierrec/lz4/v4 v4.1.18 h1:xaKrnTkyoqfh1YItXl56+6KJNVYWlEEPuAQW9xsplYQ=
github.com/pierrec/lz4/v4 v4.1.18/go.mod h1:gZWDp/Ze/IJXGXf23ltt2EXimqmTUXEy0GFuRQyBid4=
github.com/pierrec/lz4/v4 v4.1.22 h1:cKFw6uJDK+/gfw5BcDL0JL5aBsAFdsIT18eRtLj7VIU=
github.com/pierrec/lz4/v4 v4.1.22/go.mod h1:gZWDp/Ze/IJXGXf23ltt2EXimqmTUXEy0GFuRQyBid4=
github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c h1:+mdjkGKdHQG3305AYmdv1U2eRNDiU2ErMBj1gwrq8eQ=
github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c/go.mod h1:7rwL4CYBLnjLxUqIJNnCWiEdr3bn6IUYi15bNlnbCCU=
github.com/pkg/diff v0.0.0-20210226163009-20ebb0f2a09e/go.mod h1:pJLUxLENpZxwdsKMEsNbx1VGcRFpLqf3715MtcvvzbA=
@@ -1196,6 +1233,10 @@ golang.org/x/crypto v0.0.0-20200622213623-75b288015ac9/go.mod h1:LzIPMQfyMNhhGPh
golang.org/x/crypto v0.0.0-20210421170649-83a5a9bb288b/go.mod h1:T9bdIzuCu7OtxOm1hfPfRQxPLYneinmdGuTeoZ9dtd4=
golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.0.0-20211108221036-ceb1ce70b4fa/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.13.0/go.mod h1:y6Z2r+Rw4iayiXXAIxJIDAJ1zMW4yaTpebo8fPOliYc=
golang.org/x/crypto v0.19.0/go.mod h1:Iy9bg/ha4yyC70EfRS8jz+B6ybOBKMaSxLj6P6oBDfU=
golang.org/x/crypto v0.23.0/go.mod h1:CKFgDieR+mRhux2Lsu27y0fO304Db0wZe70UKqHu0v8=
golang.org/x/crypto v0.31.0/go.mod h1:kDsLvtWBEx7MV9tJOj9bnXsPbxwJQ6csT/x4KIN4Ssk=
golang.org/x/crypto v0.40.0 h1:r4x+VvoG5Fm+eJcxMaY8CQM7Lb0l1lsmjGBQ6s8BfKM=
golang.org/x/crypto v0.40.0/go.mod h1:Qr1vMER5WyS2dfPHAlsOj01wgLbsyWtFn/aY+5+ZdxY=
golang.org/x/exp v0.0.0-20180321215751-8460e604b9de/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=
@@ -1213,8 +1254,8 @@ golang.org/x/exp v0.0.0-20200119233911-0405dc783f0a/go.mod h1:2RIsYlXP63K8oxa1u0
golang.org/x/exp v0.0.0-20200207192155-f17229e696bd/go.mod h1:J/WKrq2StrnmMY6+EHIKF9dgMWnmCNThgcyBT1FY9mM=
golang.org/x/exp v0.0.0-20200224162631-6cc2880d07d6/go.mod h1:3jZMyOhIsHpP37uCMkUooju7aAi5cS1Q23tOzKc+0MU=
golang.org/x/exp v0.0.0-20220827204233-334a2380cb91/go.mod h1:cyybsKvd6eL0RnXn6p/Grxp8F5bW7iYuBgsNCOHpMYE=
golang.org/x/exp v0.0.0-20250408133849-7e4ce0ab07d0 h1:R84qjqJb5nVJMxqWYb3np9L5ZsaDtB+a39EqjV0JSUM=
golang.org/x/exp v0.0.0-20250408133849-7e4ce0ab07d0/go.mod h1:S9Xr4PYopiDyqSyp5NjCrhFrqg6A5zA2E/iPHPhqnS8=
golang.org/x/exp v0.0.0-20250620022241-b7579e27df2b h1:M2rDM6z3Fhozi9O7NWsxAkg/yqS/lQJ6PmkyIV3YP+o=
golang.org/x/exp v0.0.0-20250620022241-b7579e27df2b/go.mod h1:3//PLf8L/X+8b4vuAfHzxeRUl04Adcb341+IGKfnqS8=
golang.org/x/image v0.0.0-20180708004352-c73c2afc3b81/go.mod h1:ux5Hcp/YLpHSI86hEcLt0YII63i6oz57MZXIpbrjZUs=
golang.org/x/image v0.0.0-20190227222117-0694c2d4d067/go.mod h1:kZ7UVZpmo3dzQBMxlp+ypCbDeSB+sBbTgSJuh5dn5js=
golang.org/x/image v0.0.0-20190802002840-cff245a6509b/go.mod h1:FeLwcggjj3mMvU+oOTbSwawSJRM1uh48EjtB4UJZlP0=
@@ -1257,6 +1298,9 @@ golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91
golang.org/x/mod v0.7.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/mod v0.9.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/mod v0.12.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/mod v0.15.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=
golang.org/x/mod v0.17.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=
golang.org/x/mod v0.25.0 h1:n7a+ZbQKQA/Ysbyb0/6IbB1H/X41mKgbhfv7AfG/44w=
golang.org/x/mod v0.25.0/go.mod h1:IXM97Txy2VM4PJ3gI61r1YEk/gAj6zAHN3AdZt6S9Ww=
golang.org/x/net v0.0.0-20180724234803-3673e40ba225/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
@@ -1316,6 +1360,11 @@ golang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
golang.org/x/net v0.7.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
golang.org/x/net v0.8.0/go.mod h1:QVkue5JL9kW//ek3r6jTKnTFis1tRmNAW2P1shuFdJc=
golang.org/x/net v0.9.0/go.mod h1:d48xBJpPfHeWQsugry2m+kC02ZBRGRgulfHnEXEuWns=
golang.org/x/net v0.10.0/go.mod h1:0qNGK6F8kojg2nk9dLZ2mShWaEBan6FAoqfSigmmuDg=
golang.org/x/net v0.15.0/go.mod h1:idbUs1IY1+zTqbi8yxTbhexhEEk5ur9LInksu6HrEpk=
golang.org/x/net v0.21.0/go.mod h1:bIjVDfnllIU7BJ2DNgfnXvpSvtn8VRwhlsaeUTyUS44=
golang.org/x/net v0.25.0/go.mod h1:JkAGAh7GEvH74S6FOH42FLoXpXbE/aqXSrIQjXgsiwM=
golang.org/x/net v0.33.0/go.mod h1:HXLR5J+9DxmrqMwG9qjGCxZ+zKXxBru04zlTvWlWuN4=
golang.org/x/net v0.42.0 h1:jzkYrhi3YQWD6MLBJcsklgQsoAcw89EcZbJw8Z614hs=
golang.org/x/net v0.42.0/go.mod h1:FF1RA5d3u7nAYA4z2TkclSCKh68eSXtiFwcWQpPXdt8=
golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
@@ -1365,6 +1414,10 @@ golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJ
golang.org/x/sync v0.0.0-20220819030929-7fc1605a5dde/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20220929204114-8fcdb60fdcc0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.3.0/go.mod h1:FU7BRWz2tNW+3quACPkgCx/L+uEAv1htQ0V83Z9Rj+Y=
golang.org/x/sync v0.6.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sync v0.7.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sync v0.10.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sync v0.16.0 h1:ycBJEhp9p4vXvUZNszeOq0kGTPghopOL8q0fq3vstxw=
golang.org/x/sync v0.16.0/go.mod h1:1dzgHSNfp02xaA81J2MS99Qcpr2w7fw1gpm99rleRqA=
golang.org/x/sys v0.0.0-20180830151530-49385e6e1522/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
@@ -1447,8 +1500,14 @@ golang.org/x/sys v0.4.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.7.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.8.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.12.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.17.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.20.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.28.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.34.0 h1:H5Y5sJ2L2JRdyv7ROF1he/lPdvFsd0mJHFw2ThKHxLA=
golang.org/x/sys v0.34.0/go.mod h1:BJP2sWEmIv4KK5OTEluFJCKSidICx8ciO85XgH3Ak8k=
golang.org/x/telemetry v0.0.0-20240228155512-f48c80bd79b2/go.mod h1:TeRTkGYfJXctD9OcfyVLyj2J3IxLnKwHJR8f4D8a3YE=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
golang.org/x/term v0.2.0/go.mod h1:TVmDHMZPmdnySmBfhjOoOdhjzdE1h4u1VwSiw2l1Nuc=
@@ -1457,6 +1516,11 @@ golang.org/x/term v0.4.0/go.mod h1:9P2UbLfCdcvo3p/nzKvsmas4TnlujnuoV9hGgYzW1lQ=
golang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=
golang.org/x/term v0.6.0/go.mod h1:m6U89DPEgQRMq3DNkDClhWw02AUbt2daBVO4cn4Hv9U=
golang.org/x/term v0.7.0/go.mod h1:P32HKFT3hSsZrRxla30E9HqToFYAQPCMs/zFMBUFqPY=
golang.org/x/term v0.8.0/go.mod h1:xPskH00ivmX89bAKVGSKKtLOWNx2+17Eiy94tnKShWo=
golang.org/x/term v0.12.0/go.mod h1:owVbMEjm3cBLCHdkQu9b1opXd4ETQWc3BhuQGKgXgvU=
golang.org/x/term v0.17.0/go.mod h1:lLRBjIVuehSbZlaOtGMbcMncT+aqLLLmKrsjNrUguwk=
golang.org/x/term v0.20.0/go.mod h1:8UkIAJTvZgivsXaD6/pH6U9ecQzZ45awqEOzuCvwpFY=
golang.org/x/term v0.27.0/go.mod h1:iMsnZpn0cago0GOrHO2+Y7u7JPn5AylBrcoWkElMTSM=
golang.org/x/text v0.0.0-20170915032832-14c0d48ead0c/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.1-0.20180807135948-17ff2d5776d2/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
@@ -1473,6 +1537,10 @@ golang.org/x/text v0.6.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
golang.org/x/text v0.7.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
golang.org/x/text v0.8.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=
golang.org/x/text v0.9.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=
golang.org/x/text v0.13.0/go.mod h1:TvPlkZtksWOMsz7fbANvkp4WM8x/WCo/om8BMLbz+aE=
golang.org/x/text v0.14.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=
golang.org/x/text v0.15.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=
golang.org/x/text v0.21.0/go.mod h1:4IBbMaMmOPCJ8SecivzSH54+73PCFmPWxNTLm+vZkEQ=
golang.org/x/text v0.27.0 h1:4fGWRpyh641NLlecmyl4LOe6yDdfaYNrGb2zdfo4JV4=
golang.org/x/text v0.27.0/go.mod h1:1D28KMCvyooCX9hBiosv5Tz/+YLxj0j7XhWjpSUF7CU=
golang.org/x/time v0.0.0-20181108054448-85acf8d2951c/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
@@ -1547,6 +1615,8 @@ golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc
golang.org/x/tools v0.3.0/go.mod h1:/rWhSS2+zyEVwoJf8YAX6L2f0ntZ7Kn/mGgAWcipA5k=
golang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=
golang.org/x/tools v0.7.0/go.mod h1:4pg6aUX35JBAogB10C9AtvVL+qowtN4pT3CGSQex14s=
golang.org/x/tools v0.13.0/go.mod h1:HvlwmtVNQAhOuCjW7xxvovg8wbNq7LwfXh/k7wXUl58=
golang.org/x/tools v0.21.1-0.20240508182429-e35e4ccd0d2d/go.mod h1:aiJjzUbINMkxbQROHiO6hDPo2LHcIPhhQsa9DLh0yGk=
golang.org/x/tools v0.34.0 h1:qIpSLOxeCYGg9TrcJokLBG4KFA6d795g0xkBkiESGlo=
golang.org/x/tools v0.34.0/go.mod h1:pAP9OwEaY1CAW3HOmg3hLZC5Z0CCmzjAF2UQMSqNARg=
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
@@ -1563,8 +1633,8 @@ gonum.org/v1/gonum v0.0.0-20180816165407-929014505bf4/go.mod h1:Y+Yx5eoAFn32cQvJ
gonum.org/v1/gonum v0.8.2/go.mod h1:oe/vMfY3deqTw+1EZJhuvEW2iwGF1bW9wwu7XCu0+v0=
gonum.org/v1/gonum v0.9.3/go.mod h1:TZumC3NeyVQskjXqmyWt4S3bINhy7B4eYwW69EbyX+0=
gonum.org/v1/gonum v0.11.0/go.mod h1:fSG4YDCxxUZQJ7rKsQrj0gMOg00Il0Z96/qMA4bVQhA=
gonum.org/v1/gonum v0.12.0 h1:xKuo6hzt+gMav00meVPUlXwSdoEJP46BR+wdxQEFK2o=
gonum.org/v1/gonum v0.12.0/go.mod h1:73TDxJfAAHeA8Mk9mf8NlIppyhQNo5GLTcYeqgo2lvY=
gonum.org/v1/gonum v0.16.0 h1:5+ul4Swaf3ESvrOnidPp4GZbzf0mxVQpDCYUQE7OJfk=
gonum.org/v1/gonum v0.16.0/go.mod h1:fef3am4MQ93R2HHpKnLk4/Tbh/s0+wqD5nfa6Pnwy4E=
gonum.org/v1/netlib v0.0.0-20190313105609-8cb42192e0e0/go.mod h1:wa6Ws7BG/ESfp6dHfk7C6KdzKA7wR7u/rKwOGE66zvw=
gonum.org/v1/plot v0.0.0-20190515093506-e2840ee46a6b/go.mod h1:Wt8AAjI+ypCyYX3nZBvf6cAIx93T+c/OS2HFAYskSZc=
gonum.org/v1/plot v0.9.0/go.mod h1:3Pcqqmp6RHvJI72kgb8fThyUnav364FOsdDo2aGW5lY=
@@ -1626,8 +1696,8 @@ google.golang.org/api v0.108.0/go.mod h1:2Ts0XTHNVWxypznxWOYUeI4g3WdP9Pk2Qk58+a/
google.golang.org/api v0.110.0/go.mod h1:7FC4Vvx1Mooxh8C5HWjzZHcavuS2f6pmJpZx60ca7iI=
google.golang.org/api v0.111.0/go.mod h1:qtFHvU9mhgTJegR31csQ+rwxyUTHOKFqCKWp1J0fdw0=
google.golang.org/api v0.114.0/go.mod h1:ifYI2ZsFK6/uGddGfAD5BMxlnkBqCmqHSDUVi45N5Yg=
google.golang.org/api v0.243.0 h1:sw+ESIJ4BVnlJcWu9S+p2Z6Qq1PjG77T8IJ1xtp4jZQ=
google.golang.org/api v0.243.0/go.mod h1:GE4QtYfaybx1KmeHMdBnNnyLzBZCVihGBXAmJu/uUr8=
google.golang.org/api v0.244.0 h1:lpkP8wVibSKr++NCD36XzTk/IzeKJ3klj7vbj+XU5pE=
google.golang.org/api v0.244.0/go.mod h1:dMVhVcylamkirHdzEBAIQWUCgqY885ivNeZYd7VAVr8=
google.golang.org/appengine v1.1.0/go.mod h1:EbEs0AVv82hx2wNQdGPgUI5lhzA/G0D9YwlJXL52JkM=
google.golang.org/appengine v1.4.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=
google.golang.org/appengine v1.5.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=
@@ -1772,8 +1842,8 @@ google.golang.org/genproto v0.0.0-20250603155806-513f23925822 h1:rHWScKit0gvAPuO
google.golang.org/genproto v0.0.0-20250603155806-513f23925822/go.mod h1:HubltRL7rMh0LfnQPkMH4NPDFEWp0jw3vixw7jEM53s=
google.golang.org/genproto/googleapis/api v0.0.0-20250603155806-513f23925822 h1:oWVWY3NzT7KJppx2UKhKmzPq4SRe0LdCijVRwvGeikY=
google.golang.org/genproto/googleapis/api v0.0.0-20250603155806-513f23925822/go.mod h1:h3c4v36UTKzUiuaOKQ6gr3S+0hovBtUrXzTG/i3+XEc=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250715232539-7130f93afb79 h1:1ZwqphdOdWYXsUHgMpU/101nCtf/kSp9hOrcvFsnl10=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250715232539-7130f93afb79/go.mod h1:qQ0YXyHHx3XkvlzUtpXDkS29lDSafHMZBAZDc03LQ3A=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250728155136-f173205681a0 h1:MAKi5q709QWfnkkpNQ0M12hYJ1+e8qYVDyowc4U1XZM=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250728155136-f173205681a0/go.mod h1:qQ0YXyHHx3XkvlzUtpXDkS29lDSafHMZBAZDc03LQ3A=
google.golang.org/grpc v1.19.0/go.mod h1:mqu4LbDTu4XGKhr4mRzUsmM4RtVoemTSY81AxZiDr8c=
google.golang.org/grpc v1.20.1/go.mod h1:10oTOabMzJvdu6/UiuZezV6QK5dSlG84ov/aaiqXj38=
google.golang.org/grpc v1.21.1/go.mod h1:oYelfM1adQP15Ek0mdvEgi9Df8B9CZIaU1084ijfRaM=
@@ -1815,8 +1885,8 @@ google.golang.org/grpc v1.52.3/go.mod h1:pu6fVzoFb+NBYNAvQL08ic+lvB2IojljRYuun5v
google.golang.org/grpc v1.53.0/go.mod h1:OnIrk0ipVdj4N5d9IUoFUx72/VlD7+jUsHwZgwSMQpw=
google.golang.org/grpc v1.54.0/go.mod h1:PUSEXI6iWghWaB6lXM4knEgpJNu2qUcKfDtNci3EC2g=
google.golang.org/grpc v1.56.3/go.mod h1:I9bI3vqKfayGqPUAwGdOSu7kt6oIJLixfffKrpXqQ9s=
google.golang.org/grpc v1.73.0 h1:VIWSmpI2MegBtTuFt5/JWy2oXxtjJ/e89Z70ImfD2ok=
google.golang.org/grpc v1.73.0/go.mod h1:50sbHOUqWoCQGI8V2HQLJM0B+LMlIUjNSZmow7EVBQc=
google.golang.org/grpc v1.74.2 h1:WoosgB65DlWVC9FqI82dGsZhWFNBSLjQ84bjROOpMu4=
google.golang.org/grpc v1.74.2/go.mod h1:CtQ+BGjaAIXHs/5YS3i473GqwBBa1zGQNevxdeBEXrM=
google.golang.org/grpc/cmd/protoc-gen-go-grpc v1.1.0/go.mod h1:6Kw0yEErY5E/yWrBtf03jp27GLLJujG4z/JK95pnjjw=
google.golang.org/protobuf v0.0.0-20200109180630-ec00e32a8dfd/go.mod h1:DFci5gLYBciE7Vtevhsrf46CRTquxDuWsQurQQe4oz8=
google.golang.org/protobuf v0.0.0-20200221191635-4d8936d0db64/go.mod h1:kwYJMbMJ01Woi6D6+Kah6886xMZcty6N08ah7+eCXa0=
@@ -1864,8 +1934,8 @@ lukechampine.com/uint128 v1.2.0/go.mod h1:c4eWIwlEGaxC/+H1VguhU4PHXNWDCDMUlWdIWl
modernc.org/cc/v3 v3.36.0/go.mod h1:NFUHyPn4ekoC/JHeZFfZurN6ixxawE1BnVonP/oahEI=
modernc.org/cc/v3 v3.36.2/go.mod h1:NFUHyPn4ekoC/JHeZFfZurN6ixxawE1BnVonP/oahEI=
modernc.org/cc/v3 v3.36.3/go.mod h1:NFUHyPn4ekoC/JHeZFfZurN6ixxawE1BnVonP/oahEI=
modernc.org/cc/v4 v4.26.1 h1:+X5NtzVBn0KgsBCBe+xkDC7twLb/jNVj9FPgiwSQO3s=
modernc.org/cc/v4 v4.26.1/go.mod h1:uVtb5OGqUKpoLWhqwNQo/8LwvoiEBLvZXIQ/SmO6mL0=
modernc.org/cc/v4 v4.26.2 h1:991HMkLjJzYBIfha6ECZdjrIYz2/1ayr+FL8GN+CNzM=
modernc.org/cc/v4 v4.26.2/go.mod h1:uVtb5OGqUKpoLWhqwNQo/8LwvoiEBLvZXIQ/SmO6mL0=
modernc.org/ccgo/v3 v3.0.0-20220428102840-41399a37e894/go.mod h1:eI31LL8EwEBKPpNpA4bU1/i+sKOwOrQy8D87zWUcRZc=
modernc.org/ccgo/v3 v3.0.0-20220430103911-bc99d88307be/go.mod h1:bwdAnOoaIt8Ax9YdWGjxWsdkPcZyRPHqrOvJxaKAKGw=
modernc.org/ccgo/v3 v3.16.4/go.mod h1:tGtX0gE9Jn7hdZFeU88slbTh1UtCYKusWOoCJuvkWsQ=
@@ -1875,10 +1945,12 @@ modernc.org/ccgo/v3 v3.16.9/go.mod h1:zNMzC9A9xeNUepy6KuZBbugn3c0Mc9TeiJO4lgvkJD
modernc.org/ccgo/v4 v4.28.0 h1:rjznn6WWehKq7dG4JtLRKxb52Ecv8OUGah8+Z/SfpNU=
modernc.org/ccgo/v4 v4.28.0/go.mod h1:JygV3+9AV6SmPhDasu4JgquwU81XAKLd3OKTUDNOiKE=
modernc.org/ccorpus v1.11.6/go.mod h1:2gEUTrWqdpH2pXsmTM1ZkjeSrUWDpjMu2T6m29L/ErQ=
modernc.org/fileutil v1.3.3 h1:3qaU+7f7xxTUmvU1pJTZiDLAIoJVdUSSauJNHg9yXoA=
modernc.org/fileutil v1.3.3/go.mod h1:HxmghZSZVAz/LXcMNwZPA/DRrQZEVP9VX0V4LQGQFOc=
modernc.org/fileutil v1.3.8 h1:qtzNm7ED75pd1C7WgAGcK4edm4fvhtBsEiI/0NQ54YM=
modernc.org/fileutil v1.3.8/go.mod h1:HxmghZSZVAz/LXcMNwZPA/DRrQZEVP9VX0V4LQGQFOc=
modernc.org/gc/v2 v2.6.5 h1:nyqdV8q46KvTpZlsw66kWqwXRHdjIlJOhG6kxiV/9xI=
modernc.org/gc/v2 v2.6.5/go.mod h1:YgIahr1ypgfe7chRuJi2gD7DBQiKSLMPgBQe9oIiito=
modernc.org/goabi0 v0.2.0 h1:HvEowk7LxcPd0eq6mVOAEMai46V+i7Jrj13t4AzuNks=
modernc.org/goabi0 v0.2.0/go.mod h1:CEFRnnJhKvWT1c1JTI3Avm+tgOWbkOu5oPA8eH8LnMI=
modernc.org/httpfs v1.0.6/go.mod h1:7dosgurJGp0sPaRanU53W4xZYKh14wfzX420oZADeHM=
modernc.org/libc v0.0.0-20220428101251-2d5f3daf273b/go.mod h1:p7Mg4+koNjc8jkqwcoFBJx7tXkpj00G77X7A72jXPXA=
modernc.org/libc v1.16.0/go.mod h1:N4LD6DBE9cf+Dzf9buBlzVJndKr/iJHG97vGLHYnb5A=
@@ -1887,8 +1959,8 @@ modernc.org/libc v1.16.17/go.mod h1:hYIV5VZczAmGZAnG15Vdngn5HSF5cSkbvfz2B7GRuVU=
modernc.org/libc v1.16.19/go.mod h1:p7Mg4+koNjc8jkqwcoFBJx7tXkpj00G77X7A72jXPXA=
modernc.org/libc v1.17.0/go.mod h1:XsgLldpP4aWlPlsjqKRdHPqCxCjISdHfM/yeWC5GyW0=
modernc.org/libc v1.17.1/go.mod h1:FZ23b+8LjxZs7XtFMbSzL/EhPxNbfZbErxEHc7cbD9s=
modernc.org/libc v1.65.10 h1:ZwEk8+jhW7qBjHIT+wd0d9VjitRyQef9BnzlzGwMODc=
modernc.org/libc v1.65.10/go.mod h1:StFvYpx7i/mXtBAfVOjaU0PWZOvIRoZSgXhrwXzr8Po=
modernc.org/libc v1.66.3 h1:cfCbjTUcdsKyyZZfEUKfoHcP3S0Wkvz3jgSzByEWVCQ=
modernc.org/libc v1.66.3/go.mod h1:XD9zO8kt59cANKvHPXpx7yS2ELPheAey0vjIuZOhOU8=
modernc.org/mathutil v1.2.2/go.mod h1:mZW8CKdRPY1v87qxC/wUdX5O1qDzXMP5TH3wjfpga6E=
modernc.org/mathutil v1.4.1/go.mod h1:mZW8CKdRPY1v87qxC/wUdX5O1qDzXMP5TH3wjfpga6E=
modernc.org/mathutil v1.5.0/go.mod h1:mZW8CKdRPY1v87qxC/wUdX5O1qDzXMP5TH3wjfpga6E=
@@ -1906,8 +1978,8 @@ modernc.org/opt v0.1.4/go.mod h1:03fq9lsNfvkYSfxrfUhZCWPk1lm4cq4N+Bh//bEtgns=
modernc.org/sortutil v1.2.1 h1:+xyoGf15mM3NMlPDnFqrteY07klSFxLElE2PVuWIJ7w=
modernc.org/sortutil v1.2.1/go.mod h1:7ZI3a3REbai7gzCLcotuw9AC4VZVpYMjDzETGsSMqJE=
modernc.org/sqlite v1.18.1/go.mod h1:6ho+Gow7oX5V+OiOQ6Tr4xeqbx13UZ6t+Fw9IRUG4d4=
modernc.org/sqlite v1.38.0 h1:+4OrfPQ8pxHKuWG4md1JpR/EYAh3Md7TdejuuzE7EUI=
modernc.org/sqlite v1.38.0/go.mod h1:1Bj+yES4SVvBZ4cBOpVZ6QgesMCKpJZDq0nxYzOpmNE=
modernc.org/sqlite v1.38.2 h1:Aclu7+tgjgcQVShZqim41Bbw9Cho0y/7WzYptXqkEek=
modernc.org/sqlite v1.38.2/go.mod h1:cPTJYSlgg3Sfg046yBShXENNtPrWrDX8bsbAQBzgQ5E=
modernc.org/strutil v1.1.1/go.mod h1:DE+MQQ/hjKBZS2zNInV5hhcipt5rLPWkmpbGeW5mmdw=
modernc.org/strutil v1.1.3/go.mod h1:MEHNA7PdEnEwLvspRMtWTNnp2nnyvMfkimT1NKNAGbw=
modernc.org/strutil v1.2.1 h1:UneZBkQA+DX2Rp35KcM69cSsNES9ly8mQWD71HKlOA0=


@@ -7,9 +7,13 @@ tools:
dataplex_search_entries:
kind: dataplex-search-entries
source: dataplex-source
description: |
Use this tool to search for entries in Dataplex Catalog that represent data assets (e.g. tables, views, models) based on the provided search query.
description: Use this tool to search for entries in Dataplex Catalog based on the provided search query.
dataplex_lookup_entry:
kind: dataplex-lookup-entry
source: dataplex-source
description: Use this tool to retrieve a specific entry from Dataplex Catalog.
toolsets:
dataplex-tools:
- dataplex_search_entries
- dataplex_search_entries
- dataplex_lookup_entry
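As a quick illustration of how these definitions surface once the config is loaded, the sketch below fetches the manifest for the new dataplex_lookup_entry tool over the same /api/tool/{name} endpoint that the web UI added in this change uses; the localhost address and port are assumptions, not part of the diff.

package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Host and port are placeholders for a locally running Toolbox instance.
	resp, err := http.Get("http://127.0.0.1:5000/api/tool/dataplex_lookup_entry")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	// The response is a JSON manifest describing the tool and its parameters.
	fmt.Println(string(body))
}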


@@ -114,6 +114,20 @@ tools:
The result of the query sql tool is SQL text.
query_url:
kind: looker-query-url
source: looker-source
description: |
Query URL Tool
This tool generates the URL of a query in Looker so that the user
can explore the query further inside Looker. It also returns the
query_id and slug. The parameters are the same as those of the
query tool. The result is a JSON object with the id, slug, the
url, and the long_url.
get_looks:
kind: looker-get-looks
source: looker-source
@@ -154,5 +168,6 @@ toolsets:
- get_parameters
- query
- query_sql
- query_url
- get_looks
- run_look
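Since the description above spells out the shape of the query_url result (id, slug, url, long_url), a small decoding sketch may help non-Looker clients; the struct, field types, and sample payload below are illustrative assumptions, not taken from the tool's source.

package main

import (
	"encoding/json"
	"fmt"
)

// queryURLResult mirrors the fields named in the description above; the
// concrete types are assumptions.
type queryURLResult struct {
	ID      string `json:"id"`
	Slug    string `json:"slug"`
	URL     string `json:"url"`
	LongURL string `json:"long_url"`
}

func main() {
	// Illustrative payload only.
	raw := []byte(`{"id":"123","slug":"abc123","url":"https://looker.example.com/x/abc123","long_url":"https://looker.example.com/explore/..."}`)
	var r queryURLResult
	if err := json.Unmarshal(raw, &r); err != nil {
		panic(err)
	}
	fmt.Println(r.URL, r.LongURL)
}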


@@ -55,6 +55,8 @@ type ServerConfig struct {
Stdio bool
// DisableReload indicates if the user has disabled dynamic reloading for Toolbox.
DisableReload bool
// UI indicates if the Toolbox UI endpoints (/ui) are available.
UI bool
}
type logFormat string
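A minimal sketch of how the new field is meant to be used: construct a ServerConfig with UI set and hand it to NewServer, which mounts the /ui routes (see the NewServer hunk further down). The import path and the surrounding main function are assumptions; all other configuration fields are elided.

package main

import (
	"context"
	"log"

	// Import path assumed from the repository layout; code importing the
	// internal server package has to live inside the Toolbox module.
	"github.com/googleapis/genai-toolbox/internal/server"
)

func main() {
	ctx := context.Background()
	cfg := server.ServerConfig{
		// Sources, tools, logging, and other fields elided for brevity.
		UI: true, // ask NewServer to mount the /ui routes
	}
	s, err := server.NewServer(ctx, cfg)
	if err != nil {
		log.Fatal(err)
	}
	_ = s // starting the server is outside the scope of this sketch
}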


@@ -214,7 +214,7 @@ func (s *stdioSession) write(ctx context.Context, response any) error {
func mcpRouter(s *Server) (chi.Router, error) {
r := chi.NewRouter()
r.Use(middleware.AllowContentType("application/json"))
r.Use(middleware.AllowContentType("application/json", "application/json-rpc", "application/jsonrequest"))
r.Use(middleware.StripSlashes)
r.Use(render.SetContentType(render.ContentTypeJSON))
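To show the effect of the widened AllowContentType middleware, the sketch below posts a JSON-RPC payload to the /mcp endpoint with the newly accepted application/json-rpc content type; the host, port, and request body are illustrative assumptions rather than part of this change.

package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Illustrative JSON-RPC request; the exact MCP method set is not shown in this hunk.
	body := []byte(`{"jsonrpc":"2.0","id":1,"method":"tools/list"}`)

	req, err := http.NewRequest(http.MethodPost, "http://127.0.0.1:5000/mcp", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	// Previously only "application/json" passed the middleware; this and
	// "application/jsonrequest" are now accepted as well.
	req.Header.Set("Content-Type", "application/json-rpc")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, string(out))
}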


@@ -330,6 +330,13 @@ func NewServer(ctx context.Context, cfg ServerConfig) (*Server, error) {
return nil, err
}
r.Mount("/mcp", mcpR)
if cfg.UI {
webR, err := webRouter()
if err != nil {
return nil, err
}
r.Mount("/ui", webR)
}
// default endpoint for validating server is running
r.Get("/", func(w http.ResponseWriter, r *http.Request) {
_, _ = w.Write([]byte("🧰 Hello, World! 🧰"))
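webRouter itself is not part of this excerpt. Purely as a hedged sketch of what serving the new UI assets could look like, here is one plausible shape built on the same chi router, assuming the static files (index.html, css/, js/, assets/) are embedded from a local "static" directory; the package name, embed directory, and chi import path are assumptions.

package server

import (
	"embed"
	"io/fs"
	"net/http"

	"github.com/go-chi/chi/v5"
)

//go:embed all:static
var staticFS embed.FS

// webRouter serves the embedded UI assets. Sketch only; the real
// implementation in this change may differ.
func webRouter() (chi.Router, error) {
	sub, err := fs.Sub(staticFS, "static")
	if err != nil {
		return nil, err
	}
	r := chi.NewRouter()
	// The router is mounted at /ui (see above), so strip that prefix before
	// handing the path to the file server.
	r.Handle("/*", http.StripPrefix("/ui", http.FileServer(http.FS(sub))))
	return r, nil
}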

Binary file not shown.

New image asset added (57 KiB).


@@ -0,0 +1,580 @@
:root {
--toolbox-blue: #4285f4;
--text-primary-gray: #444444;
--text-secondary-gray: #6e6e6e;
--button-primary: var(--toolbox-blue);
--button-secondary: #616161;
}
body {
display: flex;
height: 100vh;
margin: 0;
font-family: 'Trebuchet MS';
background-color: #f8f9fa;
box-sizing: border-box;
}
*, *:before, *:after {
box-sizing: inherit;
}
#navbar-container {
flex: 0 0 250px;
height: 100%;
position: relative;
z-index: 10;
}
#main-content-container {
flex: 1;
display: flex;
flex-direction: column;
min-width: 0;
overflow-x: hidden;
}
.left-nav {
background-color: #fff;
box-shadow: 4px 0px 12px rgba(0, 0, 0, 0.15);
display: flex;
flex-direction: column;
padding: 15px;
align-items: center;
width: 100%;
height: 100%;
z-index: 3;
ul {
font-family: 'Verdana';
list-style: none;
padding: 0;
margin: 0;
width: 100%;
li {
margin-bottom: 5px;
a {
display: flex;
align-items: center;
padding: 12px;
text-decoration: none;
color: #333;
border-radius: 0;
&:hover {
background-color: #e9e9e9;
border-radius: 35px;
}
&.active {
background-color: #d0d0d0;
font-weight: bold;
border-radius: 35px;
}
}
}
}
}
.second-nav {
flex: 0 0 250px;
background-color: #fff;
box-shadow: 4px 0px 12px rgba(0, 0, 0, 0.15);
z-index: 2;
display: flex;
flex-direction: column;
padding: 15px;
align-items: center;
position: relative;
}
.nav-logo {
width: 90%;
margin-bottom: 40px;
flex-shrink: 0;
img {
max-width: 100%;
height: auto;
display: block;
}
}
.main-content-area {
flex: 1;
display: flex;
flex-direction: column;
min-width: 0;
overflow-x: hidden;
}
.top-bar {
background-color: #fff;
padding: 30px 30px;
display: flex;
justify-content: flex-end;
align-items: center;
border-bottom: 1px solid #eee;
}
.content {
padding: 20px;
flex-grow: 1;
overflow-y: auto;
display: flex;
flex-direction: column;
}
.btn {
display: flex;
align-items: center;
justify-content: center;
padding: 10px 20px;
color: white;
border: none;
border-radius: 30px;
font: inherit;
font-size: 1em;
font-weight: bolder;
cursor: pointer;
&:hover {
opacity: 0.8;
}
}
.btn--run {
background-color: var(--button-primary);
}
.btn--editHeaders {
background-color: var(--button-secondary)
}
.btn--saveHeaders {
background-color: var(--button-primary)
}
.btn--closeHeaders {
background-color: var(--button-secondary)
}
.tool-button {
display: flex;
align-items: center;
padding: 12px;
text-decoration: none;
color: #333;
background-color: transparent;
border: none;
border-radius: 0;
width: 100%;
text-align: left;
cursor: pointer;
font-family: inherit;
font-size: inherit;
transition: background-color 0.1s ease-in-out, border-radius 0.1s ease-in-out;
&:hover {
background-color: #e9e9e9;
border-radius: 35px;
}
&:focus {
outline: none;
box-shadow: 0 0 0 2px rgba(208, 208, 208, 0.5);
}
&.active {
background-color: #d0d0d0;
font-weight: bold;
border-radius: 35px;
&:hover {
background-color: #d0d0d0;
}
}
}
#secondary-panel-content {
flex: 1;
overflow-y: auto;
width: 100%;
min-height: 0;
ul {
list-style: none;
padding: 0;
margin: 0;
width: 100%;
}
}
.tool-details-grid {
display: grid;
grid-template-columns: 1fr 2fr;
gap: 20px;
margin: 0 0 20px 0;
align-items: start;
flex-shrink: 0;
}
.tool-info {
display: flex;
flex-direction: column;
gap: 15px;
}
.tool-execution-area {
display: flex;
flex-direction: column;
gap: 12px;
}
.tool-params {
background-color: #ffffff;
padding: 15px;
border-radius: 4px;
border: 1px solid #ddd;
h5 {
margin-bottom: 0;
}
}
.tool-box {
background-color: #ffffff;
padding: 15px;
border-radius: 4px;
border: 1px solid #eee;
h5 {
color: var(--toolbox-blue);
margin-top: 0;
font-weight: bold;
}
}
.params-header {
display: flex;
justify-content: flex-end;
margin-bottom: 8px;
padding-right: 6px;
font-weight: bold;
font-size: 0.9em;
color: var(--text-secondary-gray);
}
.params-disclaimer {
font-style: italic;
color: var(--text-secondary-gray);
font-size: 0.8em;
margin-bottom: 10px;
width: 100%;
word-wrap: break-word;
}
.param-item {
margin-bottom: 12px;
label {
display: block;
margin-bottom: 4px;
font-family: inherit;
}
&.disabled-param {
> label {
color: #888;
text-decoration: line-through;
}
.param-input-element {
background-color: #f5f5f5;
opacity: 0.6;
}
}
input[type="text"],
input[type="number"],
select,
textarea {
width: calc(100% - 12px);
padding: 6px;
border: 1px solid #ccc;
border-radius: 4px;
font-family: inherit;
}
input[type="checkbox"].param-input-element {
width: auto;
padding: 0;
border: initial;
border-radius: initial;
vertical-align: middle;
margin-right: 4px;
accent-color: var(--toolbox-blue);
flex-grow: 0;
}
}
.input-checkbox-wrapper {
display: flex;
align-items: center;
gap: 10px;
}
.param-input-element-container {
flex-grow: 1;
}
.param-input-element {
box-sizing: border-box;
}
.include-param-container {
display: flex;
align-items: center;
white-space: nowrap;
input[type="checkbox"] {
width: auto;
padding: 0;
border: initial;
border-radius: initial;
vertical-align: middle;
margin-right: 0;
accent-color: var(--toolbox-blue);
}
}
.include-param-container input[type="checkbox"] {
width: auto;
padding: 0;
border: initial;
border-radius: initial;
vertical-align: middle;
margin: 0;
accent-color: var(--toolbox-blue);
}
.checkbox-bool-label {
margin-left: 5px;
font-style: italic;
color: var(--text-primary-gray);
}
.checkbox-bool-label.disabled {
color: #aaa;
cursor: not-allowed;
}
.param-label-extras {
font-style: italic;
font-weight: lighter;
color: var(--text-secondary-gray);
}
.auth-param-input {
background-color: #e0e0e0;
cursor: not-allowed;
}
.run-button-container {
display: flex;
justify-content: flex-end;
gap: 20px;
}
.header-modal {
display: none;
position: fixed;
z-index: 1000;
left: 0;
top: 0;
width: 100%;
height: 100%;
overflow: auto;
background-color: rgba(0,0,0,0.4);
li {
margin-bottom: 10px;
}
.header-modal-content {
background-color: #fefefe;
margin: 10% auto;
padding: 20px;
border: 1px solid #888;
width: 80%;
max-width: 50%;
border-radius: 8px;
display: flex;
flex-direction: column;
gap: 15px;
align-items: center;
h5 {
margin-top: 0;
font-size: 1.2em;
}
.headers-textarea {
width: calc(100% - 16px);
padding: 8px;
font-family: monospace;
border: 1px solid #ccc;
border-radius: 4px;
min-height: 150px;
}
.header-modal-actions {
display: flex;
justify-content: center;
gap: 30px;
width: 100%;
}
.auth-token-details {
width: 100%;
max-width: calc(100% - 16px);
margin-left: 8px;
margin-right: 8px;
summary {
cursor: pointer;
text-align: left;
padding: 5px 0;
}
.auth-token-content {
padding: 10px;
border: 1px solid #eee;
margin-top: 5px;
background-color: #f9f9f9;
text-align: left;
max-width: 100%;
overflow-wrap: break-word;
.auth-tab-group {
display: flex;
border-bottom: 1px solid #ccc;
margin-bottom: 10px;
}
.auth-tab-picker {
padding: 8px 12px;
cursor: pointer;
border: 1px solid transparent;
border-bottom: 1px solid transparent;
margin-bottom: -1px;
background-color: #f0f0f0;
&.active {
background-color: #fff;
border-color: #ccc;
border-bottom-color: #fff;
font-weight: bold;
}
}
.auth-tab-content {
display: none;
overflow-wrap: break-word;
word-wrap: break-word;
max-width: 100%;
&.active {
display: block;
}
pre {
white-space: pre-wrap;
word-wrap: break-word;
overflow-x: auto;
background-color: #f5f5f5;
padding: 10px;
border: 1px solid #ccc;
border-radius: 4px;
max-width: 100%;
code {
display: block;
word-wrap: break-word;
color: inherit;
}
}
}
}
}
}
}
.tool-response {
margin: 20px 0 0 0;
textarea {
width: 100%;
min-height: 150px;
padding: 12px;
border: 1px solid #ddd;
border-radius: 4px;
font-family: monospace;
}
}
.search-container {
display: flex;
width: 100%;
margin-bottom: 15px;
#toolset-search-input {
flex-grow: 1;
padding: 10px 12px;
border: 1px solid #ccc;
border-radius: 20px 0 0 20px;
border-right: none;
font-family: inherit;
font-size: 0.9em;
color: var(--text-primary-gray);
&:focus {
outline: none;
border-color: var(--toolbox-blue);
box-shadow: 0 0 0 2px rgba(66, 133, 244, 0.3);
}
&::placeholder {
color: var(--text-secondary-gray);
}
}
#toolset-search-button {
padding: 10px 15px;
border: 1px solid var(--button-primary);
background-color: var(--button-primary);
color: white;
border-radius: 0 20px 20px 0;
cursor: pointer;
font-family: inherit;
font-size: 0.9em;
font-weight: bold;
transition: opacity 0.2s ease-in-out;
flex-shrink: 0;
line-height: 1;
&:hover {
opacity: 0.8;
}
&:focus {
outline: none;
box-shadow: 0 0 0 2px rgba(66, 133, 244, 0.3);
}
}
}


@@ -0,0 +1,24 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Toolbox UI</title>
<link rel="stylesheet" href="/ui/css/style.css">
</head>
<body>
<div id="navbar-container" data-active-nav=""></div>
<div id="main-content-container"></div>
<script src="/ui/js/navbar.js"></script>
<script src="/ui/js/mainContent.js"></script>
<script>
document.addEventListener('DOMContentLoaded', () => {
const navbarContainer = document.getElementById('navbar-container');
const activeNav = navbarContainer.getAttribute('data-active-nav');
renderNavbar('navbar-container', activeNav);
renderMainContent('main-content-container', '')
});
</script>
</body>
</html>


@@ -0,0 +1,173 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
import { renderToolInterface } from "./toolDisplay.js";
let toolDetailsAbortController = null;
/**
* Fetches a toolset from the /api/toolset endpoint and initiates rendering of the tool list.
* @param {!HTMLElement} secondNavContent The HTML element where the tool list will be rendered.
* @param {!HTMLElement} toolDisplayArea The HTML element where the details of a selected tool will be displayed.
* @param {string} toolsetName The name of the toolset to load (empty string loads all tools).
* @returns {!Promise<void>} A promise that resolves when the tools are loaded and rendered, or rejects on error.
*/
export async function loadTools(secondNavContent, toolDisplayArea, toolsetName) {
secondNavContent.innerHTML = '<p>Fetching tools...</p>';
try {
const response = await fetch(`/api/toolset/${toolsetName}`);
if (!response.ok) {
throw new Error(`HTTP error! status: ${response.status}`);
}
const apiResponse = await response.json();
renderToolList(apiResponse, secondNavContent, toolDisplayArea);
} catch (error) {
console.error('Failed to load tools:', error);
secondNavContent.innerHTML = `<p class="error">Failed to load tools: <pre><code>${error}</code></pre></p>`;
}
}
/**
* Renders the list of tools as buttons within the provided HTML element.
* @param {?{tools: ?Object<string,*>} } apiResponse The API response object containing the tools.
* @param {!HTMLElement} secondNavContent The HTML element to render the tool list into.
* @param {!HTMLElement} toolDisplayArea The HTML element for displaying tool details (passed to event handlers).
*/
function renderToolList(apiResponse, secondNavContent, toolDisplayArea) {
secondNavContent.innerHTML = '';
if (!apiResponse || typeof apiResponse.tools !== 'object' || apiResponse.tools === null) {
console.error('Error: Expected an object with a "tools" property, but received:', apiResponse);
secondNavContent.textContent = 'Error: Invalid response format from toolset API.';
return;
}
const toolsObject = apiResponse.tools;
const toolNames = Object.keys(toolsObject);
if (toolNames.length === 0) {
secondNavContent.textContent = 'No tools found.';
return;
}
const ul = document.createElement('ul');
toolNames.forEach(toolName => {
const li = document.createElement('li');
const button = document.createElement('button');
button.textContent = toolName;
button.dataset.toolname = toolName;
button.classList.add('tool-button');
button.addEventListener('click', (event) => handleToolClick(event, secondNavContent, toolDisplayArea));
li.appendChild(button);
ul.appendChild(li);
});
secondNavContent.appendChild(ul);
}
/**
* Handles the click event on a tool button.
* @param {!Event} event The click event object.
* @param {!HTMLElement} secondNavContent The parent element containing the tool buttons.
* @param {!HTMLElement} toolDisplayArea The HTML element where tool details will be shown.
*/
function handleToolClick(event, secondNavContent, toolDisplayArea) {
const toolName = event.target.dataset.toolname;
if (toolName) {
const currentActive = secondNavContent.querySelector('.tool-button.active');
if (currentActive) {
currentActive.classList.remove('active');
}
event.target.classList.add('active');
fetchToolDetails(toolName, toolDisplayArea);
}
}
/**
* Fetches details for a specific tool from the /api/tool endpoint.
* It aborts any previous in-flight request for tool details to prevent race conditions.
* @param {string} toolName The name of the tool to fetch details for.
* @param {!HTMLElement} toolDisplayArea The HTML element to display the tool interface in.
* @returns {!Promise<void>} A promise that resolves when the tool details are fetched and rendered, or rejects on error.
*/
async function fetchToolDetails(toolName, toolDisplayArea) {
if (toolDetailsAbortController) {
toolDetailsAbortController.abort();
console.debug("Aborted previous tool fetch.");
}
toolDetailsAbortController = new AbortController();
const signal = toolDetailsAbortController.signal;
toolDisplayArea.innerHTML = '<p>Loading tool details...</p>';
try {
const response = await fetch(`/api/tool/${encodeURIComponent(toolName)}`, { signal });
if (!response.ok) {
throw new Error(`HTTP error! status: ${response.status}`);
}
const apiResponse = await response.json();
if (!apiResponse.tools || !apiResponse.tools[toolName]) {
throw new Error(`Tool "${toolName}" data not found in API response.`);
}
const toolObject = apiResponse.tools[toolName];
console.debug("Received tool object: ", toolObject)
const toolInterfaceData = {
id: toolName,
name: toolName,
description: toolObject.description || "No description provided.",
parameters: (toolObject.parameters || []).map(param => {
let inputType = 'text';
const apiType = param.type ? param.type.toLowerCase() : 'string';
let valueType = 'string';
let label = param.description || param.name;
if (apiType === 'integer' || apiType === 'float') {
inputType = 'number';
valueType = 'number';
} else if (apiType === 'boolean') {
inputType = 'checkbox';
valueType = 'boolean';
} else if (apiType === 'array') {
inputType = 'textarea';
const itemType = param.items && param.items.type ? param.items.type.toLowerCase() : 'string';
valueType = `array<${itemType}>`;
label += ' (Array)';
}
return {
name: param.name,
type: inputType,
valueType: valueType,
label: label,
authServices: param.authSources,
required: param.required || false,
// defaultValue: param.default, not possible yet because the tool manifest doesn't include a default
};
})
};
console.debug("Transformed toolInterfaceData:", toolInterfaceData);
renderToolInterface(toolInterfaceData, toolDisplayArea);
} catch (error) {
if (error.name === 'AbortError') {
console.debug("Previous fetch was aborted, expected behavior.");
} else {
console.error(`Failed to load details for tool "${toolName}":`, error);
toolDisplayArea.innerHTML = `<p class="error">Failed to load details for ${toolName}. ${error.message}</p>`;
}
}
}
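For non-JavaScript clients, the manifest this script consumes can be decoded with a few Go types. The sketch below mirrors only the fields loadTools.js actually reads from /api/toolset/{name} and /api/tool/{name}; the JSON tags are assumptions inferred from those property accesses, not the server's canonical schema.

package main

import (
	"encoding/json"
	"fmt"
)

// toolManifest mirrors the {"tools": {...}} shape read above.
type toolManifest struct {
	Tools map[string]toolDefinition `json:"tools"`
}

type toolDefinition struct {
	Description string      `json:"description"`
	Parameters  []parameter `json:"parameters"`
}

type parameter struct {
	Name        string     `json:"name"`
	Type        string     `json:"type"` // "string", "integer", "float", "boolean", "array", ...
	Description string     `json:"description"`
	Required    bool       `json:"required"`
	AuthSources []string   `json:"authSources"`
	Items       *parameter `json:"items,omitempty"` // element type for array parameters
}

func main() {
	// Tiny usage example with an illustrative manifest.
	raw := []byte(`{"tools":{"hello":{"description":"demo","parameters":[{"name":"id","type":"integer","description":"row id","required":true}]}}}`)
	var m toolManifest
	if err := json.Unmarshal(raw, &m); err != nil {
		panic(err)
	}
	fmt.Println(m.Tools["hello"].Parameters[0].Name)
}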


@@ -0,0 +1,40 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
/**
* Renders the main content area into the HTML.
* @param {string} containerId The ID of the DOM element to inject the content into.
* @param {string} idString The id of the item inside the main content area.
*/
function renderMainContent(containerId, idString) {
const mainContentContainer = document.getElementById(containerId);
if (!mainContentContainer) {
console.error(`Content container with ID "${containerId}" not found.`);
return;
}
const idAttribute = idString ? `id="${idString}"` : '';
const contentHTML = `
<div class="main-content-area">
<div class="top-bar">
</div>
<main class="content" ${idAttribute}>
<h1>Welcome to MCP Toolbox UI</h1>
<p>This is the main content area. Click a tab on the left to navigate.</p>
</main>
</div>
`;
mainContentContainer.innerHTML = contentHTML;
}


@@ -0,0 +1,53 @@
// Copyright 2025 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
/**
* Renders the navigation bar HTML content into the specified container element.
* @param {string} containerId The ID of the DOM element to inject the navbar into.
* @param {string | null} activePath The active tab from the navbar.
*/
function renderNavbar(containerId, activePath) {
const navbarContainer = document.getElementById(containerId);
if (!navbarContainer) {
console.error(`Navbar container with ID "${containerId}" not found.`);
return;
}
const navbarHTML = `
<nav class="left-nav">
<div class="nav-logo">
<img src="/ui/assets/mcptoolboxlogo.png" alt="App Logo">
</div>
<ul>
<!--<li><a href="/ui/sources">Sources</a></li>-->
<!--<li><a href="/ui/authservices">Auth Services</a></li>-->
<li><a href="/ui/tools">Tools</a></li>
<li><a href="/ui/toolsets">Toolsets</a></li>
</ul>
</nav>
`;
navbarContainer.innerHTML = navbarHTML;
if (activePath) {
const navLinks = navbarContainer.querySelectorAll('.left-nav ul li a');
navLinks.forEach(link => {
const linkPath = new URL(link.href).pathname;
if (linkPath === activePath) {
link.classList.add('active');
} else {
link.classList.remove('active');
}
});
}
}

Some files were not shown because too many files have changed in this diff.