Compare commits

...

61 Commits

Author SHA1 Message Date
Mike DeAngelo
2959576afa feat(tools/looker): render a query, look, or dash and send output to agent 2025-12-19 19:11:04 -05:00
Yuan Teoh
7daa4111f4 fix: add import for cloudgda source (#2217) 2025-12-19 15:36:17 -08:00
Averi Kitsch
18885f6433 ci: update renovate to use dep groups (#2142)
## Description

> Should include a concise description of the changes (bug or feature), its
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-12-19 22:19:01 +00:00
Mend Renovate
21d676ed58 chore(deps): update module github.com/redis/go-redis/v9 to v9.17.2 (#1994)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [github.com/redis/go-redis/v9](https://redirect.github.com/redis/go-redis) | `v9.16.0` -> `v9.17.2` | ![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fredis%2fgo-redis%2fv9/v9.17.2?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fredis%2fgo-redis%2fv9/v9.16.0/v9.17.2?slim=true) |

---

### Release Notes

<details>
<summary>redis/go-redis (github.com/redis/go-redis/v9)</summary>

###
[`v9.17.2`](https://redirect.github.com/redis/go-redis/releases/tag/v9.17.2):
9.17.2

[Compare
Source](https://redirect.github.com/redis/go-redis/compare/v9.17.1...v9.17.2)

#### 🐛 Bug Fixes

- **Connection Pool**: Fixed critical race condition in turn management
that could cause connection leaks when dial goroutines complete after
request timeout
([#&#8203;3626](https://redirect.github.com/redis/go-redis/pull/3626))
by [@&#8203;cyningsun](https://redirect.github.com/cyningsun)
- **Context Timeout**: Improved context timeout calculation to use
minimum of remaining time and DialTimeout, preventing goroutines from
waiting longer than necessary
([#&#8203;3626](https://redirect.github.com/redis/go-redis/pull/3626))
by [@&#8203;cyningsun](https://redirect.github.com/cyningsun)

#### 🧰 Maintenance

- chore(deps): bump rojopolis/spellcheck-github-actions from 0.54.0 to
0.55.0
([#&#8203;3627](https://redirect.github.com/redis/go-redis/pull/3627))

#### Contributors

We'd like to thank all the contributors who worked on this release!

[@&#8203;cyningsun](https://redirect.github.com/cyningsun) and
[@&#8203;ndyakov](https://redirect.github.com/ndyakov)

###
[`v9.17.1`](https://redirect.github.com/redis/go-redis/releases/tag/v9.17.1):
9.17.1

[Compare
Source](https://redirect.github.com/redis/go-redis/compare/v9.17.0...v9.17.1)

#### 🐛 Bug Fixes

- add wait to keyless commands list
([#&#8203;3615](https://redirect.github.com/redis/go-redis/pull/3615))
by [@&#8203;marcoferrer](https://redirect.github.com/marcoferrer)
- fix(time): remove cached time optimization
([#&#8203;3611](https://redirect.github.com/redis/go-redis/pull/3611))
by [@&#8203;ndyakov](https://redirect.github.com/ndyakov)

#### 🧰 Maintenance

- chore(deps): bump golangci/golangci-lint-action from 9.0.0 to 9.1.0
([#&#8203;3609](https://redirect.github.com/redis/go-redis/pull/3609))
- chore(deps): bump actions/checkout from 5 to 6
([#&#8203;3610](https://redirect.github.com/redis/go-redis/pull/3610))
- chore(script): fix help call in tag.sh
([#&#8203;3606](https://redirect.github.com/redis/go-redis/pull/3606))
by [@&#8203;ndyakov](https://redirect.github.com/ndyakov)

#### Contributors

We'd like to thank all the contributors who worked on this release!

[@&#8203;marcoferrer](https://redirect.github.com/marcoferrer) and
[@&#8203;ndyakov](https://redirect.github.com/ndyakov)

###
[`v9.17.0`](https://redirect.github.com/redis/go-redis/releases/tag/v9.17.0):
9.17.0

[Compare
Source](https://redirect.github.com/redis/go-redis/compare/v9.16.0...v9.17.0)

#### 🚀 Highlights

##### Redis 8.4 Support

Added support for Redis 8.4, including new commands and features
([#&#8203;3572](https://redirect.github.com/redis/go-redis/pull/3572))

##### Typed Errors

Introduced typed errors for better error handling using `errors.As`
instead of string checks. Errors can now be wrapped and set to commands
in hooks without breaking library functionality
([#&#8203;3602](https://redirect.github.com/redis/go-redis/pull/3602))

##### New Commands

- **CAS/CAD Commands**: Added support for
Compare-And-Set/Compare-And-Delete operations with conditional matching
(`IFEQ`, `IFNE`, `IFDEQ`, `IFDNE`)
([#&#8203;3583](https://redirect.github.com/redis/go-redis/pull/3583),
[#&#8203;3595](https://redirect.github.com/redis/go-redis/pull/3595))
- **MSETEX**: Atomically set multiple key-value pairs with expiration
options and conditional modes
([#&#8203;3580](https://redirect.github.com/redis/go-redis/pull/3580))
- **XReadGroup CLAIM**: Consume both incoming and idle pending entries
from streams in a single call
([#&#8203;3578](https://redirect.github.com/redis/go-redis/pull/3578))
- **ACL Commands**: Added `ACLGenPass`, `ACLUsers`, and `ACLWhoAmI`
([#&#8203;3576](https://redirect.github.com/redis/go-redis/pull/3576))
- **SLOWLOG Commands**: Added `SLOWLOG LEN` and `SLOWLOG RESET`
([#&#8203;3585](https://redirect.github.com/redis/go-redis/pull/3585))
- **LATENCY Commands**: Added `LATENCY LATEST` and `LATENCY RESET`
([#&#8203;3584](https://redirect.github.com/redis/go-redis/pull/3584))

##### Search & Vector Improvements

- **Hybrid Search**: Added **EXPERIMENTAL** support for the new
`FT.HYBRID` command
([#&#8203;3573](https://redirect.github.com/redis/go-redis/pull/3573))
- **Vector Range**: Added `VRANGE` command for vector sets
([#&#8203;3543](https://redirect.github.com/redis/go-redis/pull/3543))
- **FT.INFO Enhancements**: Added vector-specific attributes in FT.INFO
response
([#&#8203;3596](https://redirect.github.com/redis/go-redis/pull/3596))

##### Connection Pool Improvements

- **Improved Connection Success Rate**: Implemented FIFO queue-based
fairness and context pattern for connection creation to prevent
premature cancellation under high concurrency
([#&#8203;3518](https://redirect.github.com/redis/go-redis/pull/3518))
- **Connection State Machine**: Resolved race conditions and improved
pool performance with proper state tracking
([#&#8203;3559](https://redirect.github.com/redis/go-redis/pull/3559))
- **Pool Performance**: Significant performance improvements with faster
semaphores, lockless hook manager, and reduced allocations (47-67%
faster Get/Put operations)
([#&#8203;3565](https://redirect.github.com/redis/go-redis/pull/3565))

##### Metrics & Observability

- **Canceled Metric Attribute**: Added 'canceled' metrics attribute to
distinguish context cancellation errors from other errors
([#&#8203;3566](https://redirect.github.com/redis/go-redis/pull/3566))

####  New Features

- Typed errors with wrapping support
([#&#8203;3602](https://redirect.github.com/redis/go-redis/pull/3602))
by [@&#8203;ndyakov](https://redirect.github.com/ndyakov)
- CAS/CAD commands (marked as experimental)
([#&#8203;3583](https://redirect.github.com/redis/go-redis/pull/3583),
[#&#8203;3595](https://redirect.github.com/redis/go-redis/pull/3595)) by
[@&#8203;ndyakov](https://redirect.github.com/ndyakov),
[@&#8203;htemelski-redis](https://redirect.github.com/htemelski-redis)
- MSETEX command support
([#&#8203;3580](https://redirect.github.com/redis/go-redis/pull/3580))
by [@&#8203;ofekshenawa](https://redirect.github.com/ofekshenawa)
- XReadGroup CLAIM argument
([#&#8203;3578](https://redirect.github.com/redis/go-redis/pull/3578))
by [@&#8203;ofekshenawa](https://redirect.github.com/ofekshenawa)
- ACL commands: GenPass, Users, WhoAmI
([#&#8203;3576](https://redirect.github.com/redis/go-redis/pull/3576))
by [@&#8203;destinyoooo](https://redirect.github.com/destinyoooo)
- SLOWLOG commands: LEN, RESET
([#&#8203;3585](https://redirect.github.com/redis/go-redis/pull/3585))
by [@&#8203;destinyoooo](https://redirect.github.com/destinyoooo)
- LATENCY commands: LATEST, RESET
([#&#8203;3584](https://redirect.github.com/redis/go-redis/pull/3584))
by [@&#8203;destinyoooo](https://redirect.github.com/destinyoooo)
- Hybrid search command (FT.HYBRID)
([#&#8203;3573](https://redirect.github.com/redis/go-redis/pull/3573))
by
[@&#8203;htemelski-redis](https://redirect.github.com/htemelski-redis)
- Vector range command (VRANGE)
([#&#8203;3543](https://redirect.github.com/redis/go-redis/pull/3543))
by [@&#8203;cxljs](https://redirect.github.com/cxljs)
- Vector-specific attributes in FT.INFO
([#&#8203;3596](https://redirect.github.com/redis/go-redis/pull/3596))
by [@&#8203;ndyakov](https://redirect.github.com/ndyakov)
- Improved connection pool success rate with FIFO queue
([#&#8203;3518](https://redirect.github.com/redis/go-redis/pull/3518))
by [@&#8203;cyningsun](https://redirect.github.com/cyningsun)
- Canceled metrics attribute for context errors
([#&#8203;3566](https://redirect.github.com/redis/go-redis/pull/3566))
by [@&#8203;pvragov](https://redirect.github.com/pvragov)

#### 🐛 Bug Fixes

- Fixed Failover Client MaintNotificationsConfig
([#&#8203;3600](https://redirect.github.com/redis/go-redis/pull/3600))
by [@&#8203;ajax16384](https://redirect.github.com/ajax16384)
- Fixed ACLGenPass function to use the bit parameter
([#&#8203;3597](https://redirect.github.com/redis/go-redis/pull/3597))
by [@&#8203;destinyoooo](https://redirect.github.com/destinyoooo)
- Return error instead of panic from commands
([#&#8203;3568](https://redirect.github.com/redis/go-redis/pull/3568))
by [@&#8203;dragneelfps](https://redirect.github.com/dragneelfps)
- Safety harness in `joinErrors` to prevent panic
([#&#8203;3577](https://redirect.github.com/redis/go-redis/pull/3577))
by [@&#8203;manisharma](https://redirect.github.com/manisharma)

####  Performance

- Connection state machine with race condition fixes
([#&#8203;3559](https://redirect.github.com/redis/go-redis/pull/3559))
by [@&#8203;ndyakov](https://redirect.github.com/ndyakov)
- Pool performance improvements: 47-67% faster Get/Put, 33% less memory,
50% fewer allocations
([#&#8203;3565](https://redirect.github.com/redis/go-redis/pull/3565))
by [@&#8203;ndyakov](https://redirect.github.com/ndyakov)

#### 🧪 Testing & Infrastructure

- Updated to Redis 8.4.0 image
([#&#8203;3603](https://redirect.github.com/redis/go-redis/pull/3603))
by [@&#8203;ndyakov](https://redirect.github.com/ndyakov)
- Added Redis 8.4-RC1-pre to CI
([#&#8203;3572](https://redirect.github.com/redis/go-redis/pull/3572))
by [@&#8203;ndyakov](https://redirect.github.com/ndyakov)
- Refactored tests for idiomatic Go
([#&#8203;3561](https://redirect.github.com/redis/go-redis/pull/3561),
[#&#8203;3562](https://redirect.github.com/redis/go-redis/pull/3562),
[#&#8203;3563](https://redirect.github.com/redis/go-redis/pull/3563)) by
[@&#8203;12ya](https://redirect.github.com/12ya)

#### 👥 Contributors

We'd like to thank all the contributors who worked on this release!

[@&#8203;12ya](https://redirect.github.com/12ya),
[@&#8203;ajax16384](https://redirect.github.com/ajax16384),
[@&#8203;cxljs](https://redirect.github.com/cxljs),
[@&#8203;cyningsun](https://redirect.github.com/cyningsun),
[@&#8203;destinyoooo](https://redirect.github.com/destinyoooo),
[@&#8203;dragneelfps](https://redirect.github.com/dragneelfps),
[@&#8203;htemelski-redis](https://redirect.github.com/htemelski-redis),
[@&#8203;manisharma](https://redirect.github.com/manisharma),
[@&#8203;ndyakov](https://redirect.github.com/ndyakov),
[@&#8203;ofekshenawa](https://redirect.github.com/ofekshenawa),
[@&#8203;pvragov](https://redirect.github.com/pvragov)

***

**Full Changelog**:
<https://github.com/redis/go-redis/compare/v9.16.0...v9.17.0>

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-19 13:59:00 -08:00
Mend Renovate
1c353a3c8e chore(deps): update module github.com/elastic/elastic-transport-go/v8 to v8.8.0 (#1989)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [github.com/elastic/elastic-transport-go/v8](https://redirect.github.com/elastic/elastic-transport-go) | `v8.7.0` -> `v8.8.0` | ![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2felastic%2felastic-transport-go%2fv8/v8.8.0?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2felastic%2felastic-transport-go%2fv8/v8.7.0/v8.8.0?slim=true) |

---

### Release Notes

<details>
<summary>elastic/elastic-transport-go
(github.com/elastic/elastic-transport-go/v8)</summary>

###
[`v8.8.0`](https://redirect.github.com/elastic/elastic-transport-go/releases/tag/v8.8.0)

[Compare
Source](https://redirect.github.com/elastic/elastic-transport-go/compare/v8.7.0...v8.8.0)

##### Features

- add a Close method to transport
([#&#8203;36](https://redirect.github.com/elastic/elastic-transport-go/issues/36))
([b2d94de](b2d94deb8a))
- add interceptor pattern
([#&#8203;35](https://redirect.github.com/elastic/elastic-transport-go/issues/35))
([c2d0c18](c2d0c18106))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-19 20:14:14 +00:00
Mend Renovate
a02ca45ba3 chore(deps): update module github.com/godror/godror to v0.49.6 (#2199)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [github.com/godror/godror](https://redirect.github.com/godror/godror) | `v0.49.4` -> `v0.49.6` | ![age](https://developer.mend.io/api/mc/badges/age/go/github.com%2fgodror%2fgodror/v0.49.6?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/go/github.com%2fgodror%2fgodror/v0.49.4/v0.49.6?slim=true) |

---

### Release Notes

<details>
<summary>godror/godror (github.com/godror/godror)</summary>

###
[`v0.49.6`](https://redirect.github.com/godror/godror/blob/HEAD/CHANGELOG.md#v0496)

[Compare
Source](https://redirect.github.com/godror/godror/compare/v0.49.5...v0.49.6)

##### Added

- \*bool == nil -> NULL in DB.

###
[`v0.49.5`](https://redirect.github.com/godror/godror/blob/HEAD/CHANGELOG.md#v0495)

[Compare
Source](https://redirect.github.com/godror/godror/compare/v0.49.4...v0.49.5)

- ODPI-C v5.6.4

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-12-19 03:39:25 +00:00
Yuan Teoh
8217d1424d chore: dedup userAgentRoundTripper into util (#2198)
Dedup userAgentRoundTripper into util, where userAgent-related code is placed.
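
A minimal Go sketch of what such a round tripper typically looks like (hypothetical names, not the actual util package code):

```go
package main

import (
	"fmt"
	"net/http"
)

// userAgentRoundTripper stamps every outgoing request with the toolbox
// User-Agent before delegating to the wrapped transport.
type userAgentRoundTripper struct {
	next      http.RoundTripper
	userAgent string
}

func (u userAgentRoundTripper) RoundTrip(req *http.Request) (*http.Response, error) {
	// Clone the request so the caller's copy is not mutated.
	r := req.Clone(req.Context())
	r.Header.Set("User-Agent", u.userAgent)
	return u.next.RoundTrip(r)
}

func main() {
	client := &http.Client{Transport: userAgentRoundTripper{
		next:      http.DefaultTransport,
		userAgent: "genai-toolbox/example",
	}}
	fmt.Printf("client configured with transport %T\n", client.Transport)
}
```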
2025-12-18 19:19:14 -08:00
release-please[bot]
f520b4ed8a chore(main): release 0.24.0 (#2162)
🤖 I have created a release *beep* *boop*
---


##
[0.24.0](https://github.com/googleapis/genai-toolbox/compare/v0.23.0...v0.24.0)
(2025-12-19)


### Features

* **sources/cloud-gemini-data-analytics:** Add the Gemini Data Analytics
(GDA) integration for DB NL2SQL conversion to Toolbox
([#2181](https://github.com/googleapis/genai-toolbox/issues/2181))
([aa270b2](aa270b2630))
* **source/cloudsqlmysql:** Add support for IAM authentication in Cloud
SQL MySQL source
([#2050](https://github.com/googleapis/genai-toolbox/issues/2050))
([af3d3c5](af3d3c5204))
* **sources/oracle:** Add Oracle OCI and Wallet support
([#1945](https://github.com/googleapis/genai-toolbox/issues/1945))
([8ea39ec](8ea39ec32f))
* Support combining prebuilt and custom tool configurations
([#2188](https://github.com/googleapis/genai-toolbox/issues/2188))
([5788605](5788605818))
* **tools/mysql-get-query-plan:** Add new `mysql-get-query-plan` tool
for MySQL source
([#2123](https://github.com/googleapis/genai-toolbox/issues/2123))
([0641da0](0641da0353))


### Bug Fixes

* **spanner:** Move list graphs validation to runtime
([#2154](https://github.com/googleapis/genai-toolbox/issues/2154))
([914b3ee](914b3eefda))


---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-12-19 02:07:06 +00:00
Yuan Teoh
80315a0ebd chore: release 0.24.0 (#2201)
Release-As: 0.24.0
2025-12-19 01:44:04 +00:00
dishaprakash
5788605818 feat: Support combining prebuilt and custom tool configurations (#2188)
## Description

This PR updates the CLI to allow the --prebuilt flag to be used
simultaneously with custom tool flags (--tools-file, --tools-files, or
--tools-folder). This enables users to extend a standard prebuilt
environment with their own custom tools and configurations.

### Key changes

- Sequential Loading: Load prebuilt configurations first, then
accumulate any specified custom configurations before merging.

- Smart Defaults: Updated logic to only default to tools.yaml if no
configuration flags are provided.

- Legacy Auth Compatibility: Implemented an additive merge strategy for
authentication. Legacy authSources from custom files are merged into the
modern authServices map used by prebuilt tools.

- Strict Validation: To prevent ambiguity, the server will throw an
explicit error if a legacy authSource name conflicts with an existing
authService name (e.g., from a prebuilt config); a sketch of this merge
follows this list.
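
A minimal Go sketch of the additive merge with conflict detection described above; the type and function names are hypothetical, not the server's actual code:

```go
package main

import "fmt"

// AuthConfig stands in for the toolbox's auth configuration type (hypothetical).
type AuthConfig struct{ Kind string }

// mergeLegacyAuth folds legacy authSources into the modern authServices map,
// returning an explicit error when a legacy name collides with an existing
// authService (e.g., one defined by a prebuilt config).
func mergeLegacyAuth(authServices, authSources map[string]AuthConfig) error {
	for name, src := range authSources {
		if _, exists := authServices[name]; exists {
			return fmt.Errorf("authSource %q conflicts with an existing authService of the same name", name)
		}
		authServices[name] = src
	}
	return nil
}

func main() {
	services := map[string]AuthConfig{"my-google-auth": {Kind: "google"}}
	legacy := map[string]AuthConfig{"legacy-auth": {Kind: "google"}}
	if err := mergeLegacyAuth(services, legacy); err != nil {
		fmt.Println("config error:", err)
		return
	}
	fmt.Println("merged auth services:", len(services))
}
```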

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes https://github.com/googleapis/genai-toolbox/issues/1220

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-12-18 17:21:08 -08:00
gRedHeadphone
0641da0353 feat(tools/mysql-get-query-plan): tool impl + docs + tests (#2123)
## Description

Tool mysql-get-query-plan implementation, along with tests and docs.
The tool is used to get information about how MySQL executes a SQL
statement (EXPLAIN).
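
A minimal Go sketch of the kind of statement the tool wraps; the DSN, query, and driver choice are illustrative assumptions, not the tool's actual implementation:

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql" // registers the "mysql" driver
)

func main() {
	// Placeholder DSN; use real credentials in practice.
	db, err := sql.Open("mysql", "user:pass@tcp(127.0.0.1:3306)/mydb")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// EXPLAIN FORMAT=JSON returns the optimizer's plan as a single JSON document.
	var plan string
	const query = "SELECT * FROM orders WHERE customer_id = ?"
	if err := db.QueryRow("EXPLAIN FORMAT=JSON "+query, 42).Scan(&plan); err != nil {
		log.Fatal(err)
	}
	fmt.Println(plan)
}
```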

## PR Checklist

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1692

---------

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-12-19 01:02:16 +00:00
Yuan Teoh
c9b775d38e tests: add if exists to spanner drop table sql (#2200)
Update `DROP TABLE %table_name` to `DROP TABLE IF EXISTS %tablename`.
The drop table statement often fails to run, which halts the process,
causes a context timeout, and eventually fails the integration tests.
2025-12-19 00:39:50 +00:00
Wenxin Du
8ea39ec32f feat(sources/oracle): Add Oracle OCI and Wallet support (#1945)
Previously we used go-ora (a pure Go Oracle driver) because our release
pipeline did not support cross-compilation with CGO. Now that this is
fixed, we want to add support for the Oracle OCI driver for advanced
features, including digital wallet support.

Users will be able to configure a source to use OCI by specifying a
`UseOCI: true` field. Otherwise, the source defaults to the pure Go
driver.

Oracle Wallet:
- OCI users should use the `tnsAdmin` field to set the wallet location.
- Non-OCI users should use the `walletLocation` field.
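
A minimal Go sketch of how the toggle can select a driver; the struct and driver names ("godror" for OCI, "oracle" for go-ora) are assumptions for illustration, not the source's actual code:

```go
package main

import "fmt"

// OracleConfig sketches the source options described above; field names
// mirror the YAML keys, but the real struct may differ.
type OracleConfig struct {
	UseOCI         bool   // use the CGO-based OCI driver instead of the pure Go driver
	TnsAdmin       string // wallet location for OCI connections
	WalletLocation string // wallet location for pure Go (non-OCI) connections
}

// driverName returns the database/sql driver to open for this config.
func driverName(c OracleConfig) string {
	if c.UseOCI {
		return "godror" // OCI driver (wallet via tnsAdmin)
	}
	return "oracle" // go-ora pure Go driver (wallet via walletLocation)
}

func main() {
	cfg := OracleConfig{UseOCI: true, TnsAdmin: "/etc/oracle/wallet"}
	fmt.Println("selected driver:", driverName(cfg))
}
```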

fix: https://github.com/googleapis/genai-toolbox/issues/1779
2025-12-18 19:02:17 +00:00
Juexin Wang
aa270b2630 feat: add the Gemini Data Analytics (GDA) integration for DB NL2SQL conversion to Toolbox (#2181)
## Description

This PR is to add the Gemini Data Analytics (GDA) integration for DB
NL2SQL conversion to Toolbox. It allows the user to convert a natural
language query to SQL statement based on their database instance. See
the doc section for details.

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #2180

---------

Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-12-18 09:58:46 -08:00
Dr. Strangelove
e1bd98ef5b chore: fix "unused parameter" lint in vscode (#2119)
## Description

Remove warning about unused parameter in vscode

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change
2025-12-18 17:09:24 +00:00
Yuan Teoh
fa148c60a7 docs: update contributing.md integration test code reference to v0.23.0 tag (#2197)
Update links to point to files in the v0.23.0 tag. The referenced
function has been updated since the previous link was generated.
2025-12-18 16:39:49 +00:00
Yuan Teoh
6e87349431 docs: telemetry docs to provide endpoint without scheme or path (#2179)
## Description

According to the OTEL
([docs](https://pkg.go.dev/go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp#WithEndpoint)),
`WithEndpoint()` sets the target endpoint (host and port) the Exporter
will connect to. The provided endpoint should resemble
"example.com:4318" (no scheme or path). And it requires the endpoint to
be secure using `https://`.

To provide an insecure endpoint with `http://`, users need to set
`OTEL_EXPORTER_OTLP_INSECURE=true`. This PR updates the docs to reflect
this.
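
For reference, a minimal Go sketch of configuring the exporter this way; the endpoint value is a placeholder, and `WithInsecure()` is roughly the in-code counterpart of setting `OTEL_EXPORTER_OTLP_INSECURE=true`:

```go
package main

import (
	"context"
	"log"

	"go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp"
)

func main() {
	ctx := context.Background()

	// Endpoint is host and port only; no scheme, no path.
	exporter, err := otlptracehttp.New(ctx,
		otlptracehttp.WithEndpoint("collector.example.com:4318"),
		otlptracehttp.WithInsecure(), // use http:// instead of the default https://
	)
	if err != nil {
		log.Fatal(err)
	}
	defer func() {
		if err := exporter.Shutdown(ctx); err != nil {
			log.Println("shutdown:", err)
		}
	}()
	log.Println("exporter configured")
}
```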

🛠️ Fixes #1539
2025-12-18 16:24:22 +00:00
Srividya Reddy
3fe4e2b671 test(source/postgres): fix list_database_stats integration test (#2196)
## Description

> Should include a concise description of the changes (bug or feature), its
> impact, along with a summary of the solution

The list_database_stats test is flaky when run in parallel on the same
shared instance. It fails with the error: failed to create test_user1:
ERROR: role "test_user1" already exists. This test is updated to create
random role and database names to avoid conflicts with other
simultaneously running tests.
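
A minimal Go sketch of the naming approach (a hypothetical helper, not the test's actual code):

```go
package main

import (
	"fmt"
	"math/rand"
)

// uniqueName appends a random suffix so parallel test runs on a shared
// instance do not collide on role or database names.
func uniqueName(prefix string) string {
	return fmt.Sprintf("%s_%06d", prefix, rand.Intn(1_000_000))
}

func main() {
	role := uniqueName("test_user")
	db := uniqueName("test_db")
	fmt.Printf("CREATE ROLE %s; CREATE DATABASE %s;\n", role, db)
}
```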

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738
2025-12-18 05:44:25 +00:00
dependabot[bot]
271f39d4b9 chore(deps): bump jws from 4.0.0 to 4.0.1 in /docs/en/getting-started/quickstart/js/langchain (#2118)
Bumps [jws](https://github.com/brianloveswords/node-jws) from 4.0.0 to
4.0.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/brianloveswords/node-jws/releases">jws's
releases</a>.</em></p>
<blockquote>
<h2>v4.0.1</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 2.0.1, addressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/auth0/node-jws/blob/master/CHANGELOG.md">jws's
changelog</a>.</em></p>
<blockquote>
<h2>[4.0.1]</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 2.0.1, adressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
<h2>[3.2.3]</h2>
<h3>Changed</h3>
<ul>
<li>Fix advisory GHSA-869p-cjfg-cm3x: createSign and createVerify now
require
that a non empty secret is provided (via opts.secret, opts.privateKey or
opts.key)
when using HMAC algorithms.</li>
<li>Upgrading JWA version to 1.4.2, adressing a compatibility issue for
Node &gt;= 25.</li>
</ul>
<h2>[3.0.0]</h2>
<h3>Changed</h3>
<ul>
<li><strong>BREAKING</strong>: <code>jwt.verify</code> now requires an
<code>algorithm</code> parameter, and
<code>jws.createVerify</code> requires an <code>algorithm</code> option.
The <code>&quot;alg&quot;</code> field
signature headers is ignored. This mitigates a critical security flaw
in the library which would allow an attacker to generate signatures with
arbitrary contents that would be accepted by <code>jwt.verify</code>.
See
<a
href="https://auth0.com/blog/2015/03/31/critical-vulnerabilities-in-json-web-token-libraries/">https://auth0.com/blog/2015/03/31/critical-vulnerabilities-in-json-web-token-libraries/</a>
for details.</li>
</ul>
<h2><a
href="https://github.com/brianloveswords/node-jws/compare/v1.0.1...v2.0.0">2.0.0</a>
- 2015-01-30</h2>
<h3>Changed</h3>
<ul>
<li>
<p><strong>BREAKING</strong>: Default payload encoding changed from
<code>binary</code> to
<code>utf8</code>. <code>utf8</code> is a is a more sensible default
than <code>binary</code> because
many payloads, as far as I can tell, will contain user-facing
strings that could be in any language. (<!-- raw HTML omitted
-->[6b6de48]<!-- raw HTML omitted -->)</p>
</li>
<li>
<p>Code reorganization, thanks [<a
href="https://github.com/fearphage"><code>@​fearphage</code></a>]! (<!--
raw HTML omitted --><a
href="https://github.com/brianloveswords/node-jws/commit/7880050">7880050</a><!--
raw HTML omitted -->)</p>
</li>
</ul>
<h3>Added</h3>
<ul>
<li>Option in all relevant methods for <code>encoding</code>. For those
few users
that might be depending on a <code>binary</code> encoding of the
messages, this
is for them. (<!-- raw HTML omitted -->[6b6de48]<!-- raw HTML omitted
-->)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="34c45b2c04"><code>34c45b2</code></a>
Merge commit from fork</li>
<li><a
href="49bc39b1f5"><code>49bc39b</code></a>
version 4.0.1</li>
<li><a
href="d42350ccab"><code>d42350c</code></a>
Enhance tests for HMAC streaming sign and verify</li>
<li><a
href="5cb007cf82"><code>5cb007c</code></a>
Improve secretOrKey initialization in VerifyStream</li>
<li><a
href="f9a2e1c8c6"><code>f9a2e1c</code></a>
Improve secret handling in SignStream</li>
<li><a
href="b9fb8d30e9"><code>b9fb8d3</code></a>
Merge pull request <a
href="https://redirect.github.com/brianloveswords/node-jws/issues/102">#102</a>
from auth0/SRE-57-Upload-opslevel-yaml</li>
<li><a
href="95b75ee56c"><code>95b75ee</code></a>
Upload OpsLevel YAML</li>
<li><a
href="8857ee7762"><code>8857ee7</code></a>
test: remove unused variable (<a
href="https://redirect.github.com/brianloveswords/node-jws/issues/96">#96</a>)</li>
<li>See full diff in <a
href="https://github.com/brianloveswords/node-jws/compare/v4.0.0...v4.0.1">compare
view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a
href="https://www.npmjs.com/~julien.wollscheid">julien.wollscheid</a>, a
new releaser for jws since your current version.</p>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=jws&package-manager=npm_and_yarn&previous-version=4.0.0&new-version=4.0.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Yuan Teoh <45984206+Yuan325@users.noreply.github.com>
2025-12-17 11:32:09 -08:00
Dr. Strangelove
97b0e7d3ac ci: Improved integration tests for looker (#2187)
## Description

Improved integration tests for looker

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change
2025-12-17 13:30:37 -05:00
ishatilwani1301
914b3eefda fix(spanner): move list graphs validation to runtime (#2154)
This pull request resolves the issue by changing the validation logic
for the spanner-list-graphs tool to prevent server crashes on PostgreSQL
dialects.
The spanner-list-graphs tool currently supports only the GoogleSQL
dialect. Previously, this check was enforced during initialization,
causing the entire server to crash on startup when connected to a
PostgreSQL-dialect database.

**Changes Implemented**

- The modification is in
internal/tools/spanner/spannerlistgraphs/spannerlistgraphs.go.
- Removed the dialect validation from the Initialize method (startup).
- Added the dialect validation to the Invoke method (runtime).

This change ensures the tool initializes successfully regardless of the
dialect, allowing the server to start. It now returns a graceful error
message only if a user explicitly attempts to execute the tool on an
unsupported dialect.
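
A minimal Go sketch of the runtime check; the type, field, and constant names are illustrative rather than the exact code in spannerlistgraphs.go:

```go
package main

import (
	"context"
	"errors"
	"fmt"
)

// Tool is a hypothetical stand-in for the spanner-list-graphs tool.
type Tool struct {
	dialect string // e.g. "GOOGLE_STANDARD_SQL" or "POSTGRESQL"
}

// Invoke validates the dialect at call time, so a PostgreSQL-dialect source
// no longer crashes the server at startup; it only fails this one tool.
func (t *Tool) Invoke(ctx context.Context) (any, error) {
	if t.dialect != "GOOGLE_STANDARD_SQL" {
		return nil, errors.New("operation not supported: The 'spanner-list-graphs' tool is only available for GoogleSQL dialect databases.")
	}
	// ...run the graph-listing query here...
	return []string{}, nil
}

func main() {
	pg := &Tool{dialect: "POSTGRESQL"}
	if _, err := pg.Invoke(context.Background()); err != nil {
		fmt.Println(err) // graceful error instead of a startup crash
	}
}
```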

**Validation Process**
Validated changes by running the toolbox locally against a Spanner
instance using the PostgreSQL dialect.
CLI Configuration: Ran the server using the standard Spanner prebuilt
set: go run . --prebuilt spanner --ui
Testing: Confirmed the logic by testing two scenarios

1. PostgreSQL Dialect (Graceful Failure)
    Set SPANNER_DIALECT="postgresql".
Result: The server started successfully without crashing (fixing the
bug). When running the spanner-list-graphs tool in the UI, it returned a
clear error message: "operation not supported: The 'spanner-list-graphs'
tool is only available for GoogleSQL dialect databases."
    
<img width="2171" height="462" alt="Screenshot 2025-12-10 11 24 53 PM"
src="https://github.com/user-attachments/assets/c57e64e4-ddce-42a2-998d-b291d77b7d1d"
/>
<img width="2233" height="971" alt="Screenshot 2025-12-10 11 22 53 PM"
src="https://github.com/user-attachments/assets/8e510a72-4598-4d55-a175-bf9e2f85489d"
/>
<img width="2233" height="971" alt="Screenshot 2025-12-10 11 23 26 PM"
src="https://github.com/user-attachments/assets/a904a45f-25be-42ef-9d5a-a7975cc03a44"
/>

2. GoogleSQL Dialect (Success)
    Set SPANNER_DIALECT="googlesql".
Result: The tool executed successfully and returned the graph schema (or
empty results), confirming that normal functionality is preserved.
<img width="2171" height="572" alt="Screenshot 2025-12-10 11 26 59 PM"
src="https://github.com/user-attachments/assets/d9c6c677-cb38-4343-be39-d542439685c4"
/>
<img width="2250" height="938" alt="Screenshot 2025-12-10 11 27 32 PM"
src="https://github.com/user-attachments/assets/6e8f3628-8079-4c99-993a-7ada02a124b0"
/>
<img width="2250" height="938" alt="Screenshot 2025-12-10 11 27 45 PM"
src="https://github.com/user-attachments/assets/a3091228-d73a-44a0-acd2-e7fb463de4e2"
/>



🛠️ Fixes #2136

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-12 13:48:07 +00:00
Anubhav Dhawan
776a5ca438 docs: Update Antigravity MCP plugin documentation (#2157)
This updates the documentation for the MCP Toolbox Antigravity plugin
according to the new configuration option for the plugin in the MCP
server window of Antigravity.
2025-12-12 13:12:18 +05:30
Mend Renovate
d08dd144ad chore(deps): update dependency llama-index to v0.14.10 (#2092)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [llama-index](https://redirect.github.com/run-llama/llama_index) | `==0.14.8` -> `==0.14.10` | ![age](https://developer.mend.io/api/mc/badges/age/pypi/llama-index/0.14.10?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/llama-index/0.14.8/0.14.10?slim=true) |

---

### Release Notes

<details>
<summary>run-llama/llama_index (llama-index)</summary>

###
[`v0.14.10`](https://redirect.github.com/run-llama/llama_index/blob/HEAD/CHANGELOG.md#2025-12-04)

[Compare
Source](https://redirect.github.com/run-llama/llama_index/compare/v0.14.9...v0.14.10)

##### llama-index-core \[0.14.10]

- feat: add mock function calling llm
([#&#8203;20331](https://redirect.github.com/run-llama/llama_index/pull/20331))

##### llama-index-llms-qianfan \[0.4.1]

- test: fix typo 'reponse' to 'response' in variable names
([#&#8203;20329](https://redirect.github.com/run-llama/llama_index/pull/20329))

##### llama-index-tools-airweave \[0.1.0]

- feat: add Airweave tool integration with advanced search features
([#&#8203;20111](https://redirect.github.com/run-llama/llama_index/pull/20111))

##### llama-index-utils-qianfan \[0.4.1]

- test: fix typo 'reponse' to 'response' in variable names
([#&#8203;20329](https://redirect.github.com/run-llama/llama_index/pull/20329))

###
[`v0.14.9`](https://redirect.github.com/run-llama/llama_index/blob/HEAD/CHANGELOG.md#2025-12-02)

[Compare
Source](https://redirect.github.com/run-llama/llama_index/compare/v0.14.8...v0.14.9)

##### llama-index-agent-azure \[0.2.1]

- fix: Pin azure-ai-projects version to prevent breaking changes
([#&#8203;20255](https://redirect.github.com/run-llama/llama_index/pull/20255))

##### llama-index-core \[0.14.9]

- MultiModalVectorStoreIndex now returns a multi-modal
ContextChatEngine.
([#&#8203;20265](https://redirect.github.com/run-llama/llama_index/pull/20265))
- Ingestion to vector store now ensures that \_node-content is readable
([#&#8203;20266](https://redirect.github.com/run-llama/llama_index/pull/20266))
- fix: ensure context is copied with async utils run\_async
([#&#8203;20286](https://redirect.github.com/run-llama/llama_index/pull/20286))
- fix(memory): ensure first message in queue is always a user message
after flush
([#&#8203;20310](https://redirect.github.com/run-llama/llama_index/pull/20310))

##### llama-index-embeddings-bedrock \[0.7.2]

- feat(embeddings-bedrock): Add support for Amazon Bedrock Application
Inference Profiles
([#&#8203;20267](https://redirect.github.com/run-llama/llama_index/pull/20267))
- fix:(embeddings-bedrock) correct extraction of provider from
model\_name
([#&#8203;20295](https://redirect.github.com/run-llama/llama_index/pull/20295))
- Bump version of bedrock-embedding
([#&#8203;20304](https://redirect.github.com/run-llama/llama_index/pull/20304))

##### llama-index-embeddings-voyageai \[0.5.1]

- VoyageAI correction and documentation
([#&#8203;20251](https://redirect.github.com/run-llama/llama_index/pull/20251))

##### llama-index-llms-anthropic \[0.10.3]

- feat: add anthropic opus 4.5
([#&#8203;20306](https://redirect.github.com/run-llama/llama_index/pull/20306))

##### llama-index-llms-bedrock-converse \[0.12.2]

- fix(bedrock-converse): Only use guardrail\_stream\_processing\_mode in
streaming functions
([#&#8203;20289](https://redirect.github.com/run-llama/llama_index/pull/20289))
- feat: add anthropic opus 4.5
([#&#8203;20306](https://redirect.github.com/run-llama/llama_index/pull/20306))
- feat(bedrock-converse): Additional support for Claude Opus 4.5
([#&#8203;20317](https://redirect.github.com/run-llama/llama_index/pull/20317))

##### llama-index-llms-google-genai \[0.7.4]

- Fix gemini-3 support and gemini function call support
([#&#8203;20315](https://redirect.github.com/run-llama/llama_index/pull/20315))

##### llama-index-llms-helicone \[0.1.1]

- update helicone docs + examples
([#&#8203;20208](https://redirect.github.com/run-llama/llama_index/pull/20208))

##### llama-index-llms-openai \[0.6.10]

- Smallest Nit
([#&#8203;20252](https://redirect.github.com/run-llama/llama_index/pull/20252))
- Feat: Add gpt-5.1-chat model support
([#&#8203;20311](https://redirect.github.com/run-llama/llama_index/pull/20311))

##### llama-index-llms-ovhcloud \[0.1.0]

- Add OVHcloud AI Endpoints provider
([#&#8203;20288](https://redirect.github.com/run-llama/llama_index/pull/20288))

##### llama-index-llms-siliconflow \[0.4.2]

- \[Bugfix] None check on content in delta in siliconflow LLM
([#&#8203;20327](https://redirect.github.com/run-llama/llama_index/pull/20327))

##### llama-index-node-parser-docling \[0.4.2]

- Relax docling Python constraints
([#&#8203;20322](https://redirect.github.com/run-llama/llama_index/pull/20322))

##### llama-index-packs-resume-screener \[0.9.3]

- feat: Update pypdf to latest version
([#&#8203;20285](https://redirect.github.com/run-llama/llama_index/pull/20285))

##### llama-index-postprocessor-voyageai-rerank \[0.4.1]

- VoyageAI correction and documentation
([#&#8203;20251](https://redirect.github.com/run-llama/llama_index/pull/20251))

##### llama-index-protocols-ag-ui \[0.2.3]

- fix: correct order of ag-ui events to avoid event conflicts
([#&#8203;20296](https://redirect.github.com/run-llama/llama_index/pull/20296))

##### llama-index-readers-confluence \[0.6.0]

- Refactor Confluence integration: Update license to MIT, remove
requirements.txt, and implement HtmlTextParser for HTML to Markdown
conversion. Update dependencies and tests accordingly.
([#&#8203;20262](https://redirect.github.com/run-llama/llama_index/pull/20262))

##### llama-index-readers-docling \[0.4.2]

- Relax docling Python constraints
([#&#8203;20322](https://redirect.github.com/run-llama/llama_index/pull/20322))

##### llama-index-readers-file \[0.5.5]

- feat: Update pypdf to latest version
([#&#8203;20285](https://redirect.github.com/run-llama/llama_index/pull/20285))

##### llama-index-readers-reddit \[0.4.1]

- Fix typo in README.md for Reddit integration
([#&#8203;20283](https://redirect.github.com/run-llama/llama_index/pull/20283))

##### llama-index-storage-chat-store-postgres \[0.3.2]

- \[FIX] Postgres ChatStore automatically prefix table name with
"data\_"
([#&#8203;20241](https://redirect.github.com/run-llama/llama_index/pull/20241))

##### llama-index-vector-stores-azureaisearch \[0.4.4]

- `vector-azureaisearch`: check if user agent already in policy before
add it to azure client
([#&#8203;20243](https://redirect.github.com/run-llama/llama_index/pull/20243))
- fix(azureaisearch): Add close/aclose methods to fix unclosed client
session warnings
([#&#8203;20309](https://redirect.github.com/run-llama/llama_index/pull/20309))

##### llama-index-vector-stores-milvus \[0.9.4]

- Fix/consistency level param for milvus
([#&#8203;20268](https://redirect.github.com/run-llama/llama_index/pull/20268))

##### llama-index-vector-stores-postgres \[0.7.2]

- Fix postgresql dispose
([#&#8203;20312](https://redirect.github.com/run-llama/llama_index/pull/20312))

##### llama-index-vector-stores-qdrant \[0.9.0]

- fix: Update qdrant-client version constraints
([#&#8203;20280](https://redirect.github.com/run-llama/llama_index/pull/20280))
- Feat: update Qdrant client to 1.16.0
([#&#8203;20287](https://redirect.github.com/run-llama/llama_index/pull/20287))

##### llama-index-vector-stores-vertexaivectorsearch \[0.3.2]

- fix: update blob path in batch\_update\_index
([#&#8203;20281](https://redirect.github.com/run-llama/llama_index/pull/20281))

##### llama-index-voice-agents-openai \[0.2.2]

- Smallest Nit
([#&#8203;20252](https://redirect.github.com/run-llama/llama_index/pull/20252))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-11 20:29:47 -05:00
Mend Renovate
fbd92c68ba chore(deps): update dependency go to v1.25.5 (#2003)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [go](https://go.dev/) ([source](https://redirect.github.com/golang/go)) | toolchain | patch | `1.25.3` -> `1.25.5` |

---

### Release Notes

<details>
<summary>golang/go (go)</summary>

###
[`v1.25.5`](https://redirect.github.com/golang/go/compare/go1.25.4...go1.25.5)

###
[`v1.25.4`](https://redirect.github.com/golang/go/compare/go1.25.3...go1.25.4)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-11 23:28:47 +00:00
dishaprakash
af3d3c5204 feat(source/cloudsqlmysql): add support for IAM authentication in Cloud SQL MySQL source (#2050)
## Description

This PR adds support for IAM authentication in the Cloud SQL MySQL
source.

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-12-11 22:48:34 +00:00
release-please[bot]
466aef024f chore(main): release 0.23.0 (#2138)
🤖 I have created a release *beep* *boop*
---


##
[0.23.0](https://github.com/googleapis/genai-toolbox/compare/v0.22.0...v0.23.0)
(2025-12-11)


### ⚠ BREAKING CHANGES

* **serverless-spark:** add URLs to create batch tool outputs
* **serverless-spark:** add URLs to list_batches output
* **serverless-spark:** add Cloud Console and Logging URLs to get_batch
* **tools/postgres:** Add additional filter params for existing postgres
tools ([#2033](https://github.com/googleapis/genai-toolbox/issues/2033))

### Features

* **tools/postgres:** Add list-table-stats-tool to list table
statistics.
([#2055](https://github.com/googleapis/genai-toolbox/issues/2055))
([78b02f0](78b02f08c3))
* **looker/tools:** Enhance dashboard creation with dashboard filters
([#2133](https://github.com/googleapis/genai-toolbox/issues/2133))
([285aa46](285aa46b88))
* **serverless-spark:** Add Cloud Console and Logging URLs to get_batch
([e29c061](e29c0616d6))
* **serverless-spark:** Add URLs to create batch tool outputs
([c6ccf4b](c6ccf4bd87))
* **serverless-spark:** Add URLs to list_batches output
([5605eab](5605eabd69))
* **sources/mariadb:** Add MariaDB source and MySQL tools integration
([#1908](https://github.com/googleapis/genai-toolbox/issues/1908))
([3b40fea](3b40fea25e))
* **tools/postgres:** Add additional filter params for existing postgres
tools ([#2033](https://github.com/googleapis/genai-toolbox/issues/2033))
([489117d](489117d747))
* **tools/postgres:** Add list_pg_settings, list_database_stats tools
for postgres
([#2030](https://github.com/googleapis/genai-toolbox/issues/2030))
([32367a4](32367a472f))
* **tools/postgres:** Add new postgres-list-roles tool
([#2038](https://github.com/googleapis/genai-toolbox/issues/2038))
([bea9705](bea9705450))


### Bug Fixes

* List tables tools null fix
([#2107](https://github.com/googleapis/genai-toolbox/issues/2107))
([2b45266](2b45266598))
* **tools/mongodb:** Removed sortPayload and sortParams
([#1238](https://github.com/googleapis/genai-toolbox/issues/1238))
([c5a6daa](c5a6daa768))


### Miscellaneous Chores
* **looker:** Upgrade to latest go sdk
([#2159](https://github.com/googleapis/genai-toolbox/issues/2159))
([78e015d](78e015d7df))
---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-11 22:26:26 +00:00
Wenxin Du
a6830744fc chore: release 0.23.0 (#2160)
Release-As: 0.23.0
2025-12-11 22:02:23 +00:00
Wenxin Du
615b5f0130 chore: add v0.23.0 doc version (#2161) 2025-12-11 21:27:01 +00:00
gRedHeadphone
2b45266598 fix: list tables tools null fix (#2107)
## Description

Return an empty list instead of null in list tables tools when no
tables are found.
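
For context, a minimal Go sketch of the underlying behavior: a nil slice marshals to JSON null while an initialized empty slice marshals to [] (the actual fix initializes the tools' result slices; this is not its exact code):

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// A nil slice marshals to JSON null, which is what callers saw when
	// no tables matched.
	var nilTables []string
	a, _ := json.Marshal(nilTables)
	fmt.Println(string(a)) // null

	// An initialized empty slice marshals to [], the behavior after the fix.
	emptyTables := []string{}
	b, _ := json.Marshal(emptyTables)
	fmt.Println(string(b)) // []
}
```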

## PR Checklist

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #2027

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-11 20:58:52 +00:00
Twisha Bansal
26ead2ed78 docs: include npx method to run server (#2094)
## Description

> Should include a concise description of the changes (bug or feature), its
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
Co-authored-by: Anubhav Dhawan <anubhavdhawan@google.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-11 18:25:12 +00:00
Twisha Bansal
1f31c2c9b2 docs: add prompts quickstart using gemini cli (#2158)
## Description

> Should include a concise description of the changes (bug or feature), its
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [ ] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>

---------

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-12-11 23:16:16 +05:30
Dr. Strangelove
78e015d7df fix(looker): upgrade to latest go sdk (#2159)
## Description

Upgrade to the latest version of the Looker SDK, with a fix for
expiring credentials.

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1597
2025-12-11 17:11:48 +00:00
Dave Borowitz
c6ccf4bd87 feat(serverless-spark)!: add URLs to create batch tool outputs 2025-12-10 15:10:40 -08:00
Dave Borowitz
5605eabd69 feat(serverless-spark)!: add URLs to list_batches output
Unlike get_batch, in this case we are not returning a JSON type directly
from the server, so we can add the new fields in our top-level object
rather than wrapping.
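
As a rough sketch of the difference (hypothetical struct and field names, not the repository's actual types): returning the raw server JSON forces a wrapper object, while a summary object the tool already owns can simply grow new fields.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// get_batch-style output: the server's JSON is passed through verbatim, so
// extra links have to be added by wrapping it in a new top-level object.
type getBatchOutput struct {
	Batch      json.RawMessage `json:"batch"`      // raw server response
	ConsoleURL string          `json:"consoleUrl"` // added link (hypothetical name)
}

// list_batches-style output: the tool builds its own summary objects, so new
// URL fields can be added directly alongside the existing ones.
type batchSummary struct {
	Name       string `json:"name"`
	State      string `json:"state"`
	ConsoleURL string `json:"consoleUrl"` // added link (hypothetical name)
}

func main() {
	raw := json.RawMessage(`{"name":"b1","state":"SUCCEEDED"}`)
	wrapped, _ := json.Marshal(getBatchOutput{Batch: raw, ConsoleURL: "https://console.cloud.google.com/"})
	flat, _ := json.Marshal(batchSummary{Name: "b1", State: "SUCCEEDED", ConsoleURL: "https://console.cloud.google.com/"})
	fmt.Println(string(wrapped))
	fmt.Println(string(flat))
}
```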
2025-12-10 15:10:40 -08:00
Dave Borowitz
e29c0616d6 feat(serverless-spark)!: add Cloud Console and Logging URLs to get_batch
These are useful links for humans to follow for more information
(output, metrics, logs) that's not readily available via MCP.
2025-12-10 15:10:40 -08:00
Dr. Strangelove
285aa46b88 feat(looker/tools): Enhance dashboard creation with dashboard filters (#2133)
## Description

Enhance dashboard creation with dashboard-level filters. Also improve
tool descriptions.

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [X] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change
2025-12-10 13:30:20 -08:00
Ganga4060
c5a6daa768 fix: removed sortPayload and sortParams from the reference (#1238)
Removed sortPayload and sortParams from the reference

---------

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-10 19:19:07 +00:00
Siddharth Ravi
78b02f08c3 feat: add list-table-stats-tool to list table statistics. (#2055)
Adds the following tools for Postgres:
(1) list_table_stats: Lists table statistics in the database.

<img width="3446" height="1304" alt="image"
src="https://github.com/user-attachments/assets/68951edc-8d99-460e-a1ac-2d3da9388baf"
/>

<img width="2870" height="1338" alt="image"
src="https://github.com/user-attachments/assets/100a3b7d-202d-4dfd-b046-5dab4390ba41"
/>


> Should include a concise description of the changes (bug or feature),
its
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738
2025-12-10 11:11:33 +00:00
Anubhav Dhawan
18d0440f4e docs: separate Windows installation instructions for Command Prompt and PowerShell (#2097)
## Description
Updates the Windows installation instructions in `README.md` and the
Hugo documentation to provide separate, clear steps for Command Prompt
and PowerShell.

## Before
<img width="1836" height="1054" alt="image"
src="https://github.com/user-attachments/assets/46856ba5-e99f-4ea1-b851-921c1f885c40"
/>

## After
<img width="1842" height="1046" alt="image"
src="https://github.com/user-attachments/assets/80212630-1233-496e-98d2-9039de8e4cd0"
/>
<img width="1858" height="1070" alt="image"
src="https://github.com/user-attachments/assets/348a879d-7337-41c1-8358-fbe341c80525"
/>
2025-12-10 06:59:57 +00:00
Mend Renovate
7a135ce078 chore(deps): update module google.golang.org/genai to v1.36.0 (#1971)
This PR contains the following updates:

| Package | Change | Age | Confidence |
|---|---|---|---|
|
[google.golang.org/genai](https://redirect.github.com/googleapis/go-genai)
| `v1.35.0` -> `v1.36.0` |
[![age](https://developer.mend.io/api/mc/badges/age/go/google.golang.org%2fgenai/v1.36.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|
[![confidence](https://developer.mend.io/api/mc/badges/confidence/go/google.golang.org%2fgenai/v1.35.0/v1.36.0?slim=true)](https://docs.renovatebot.com/merge-confidence/)
|

---

### Release Notes

<details>
<summary>googleapis/go-genai (google.golang.org/genai)</summary>

###
[`v1.36.0`](https://redirect.github.com/googleapis/go-genai/releases/tag/v1.36.0)

[Compare
Source](https://redirect.github.com/googleapis/go-genai/compare/v1.35.0...v1.36.0)

##### Features

- add display name to FunctionResponseBlob
([66bd0fb](66bd0fbf35))
- add display name to FunctionResponseFileData
([f470cff](f470cff5a2))
- Add generate\_content\_config.thinking\_level
([93b2586](93b2586dfb))
- Add image output options to ImageConfig for Vertex
([c4b28c3](c4b28c3414))
- Add part.media\_resolution
([93b2586](93b2586dfb))
- support Function call argument streaming for all languages
([dd5ec01](dd5ec01f21))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/genai-toolbox).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0MS4xNzMuMSIsInVwZGF0ZWRJblZlciI6IjQxLjE3My4xIiwidGFyZ2V0QnJhbmNoIjoibWFpbiIsImxhYmVscyI6W119-->

Co-authored-by: Anmol Shukla <shuklaanmol@google.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-10 00:19:55 +00:00
gRedHeadphone
ea9e2d12bd docs: update unit tests execute command (#2074)
## Description

update unit tests execute command in DEVELOPER.md to exclude integration
tests

## PR Checklist

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-09 23:48:54 +00:00
Srividya Reddy
bea9705450 feat(tools/postgres): Add new postgres-list-roles tool (#2038)
## Description
Adds a PostgreSQL custom list_roles tool that lists all the
user-created roles in the instance. It provides details about each
role's attributes and memberships.


> Should include a concise description of the changes (bug or feature),
its
> impact, along with a summary of the solution

<img width="1065" height="145" alt="Screenshot 2025-11-26 at 12 59
56 AM"
src="https://github.com/user-attachments/assets/d90131b1-d369-4108-b4db-ee5dc9aafe38"
/>


## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-09 21:48:20 +00:00
Srividya Reddy
489117d747 feat(tools/postgres)!: Add additional filter params for existing postgres tools (#2033)
## Description

Add additional filter parameters for existing PostgreSQL tools:

1.  `list_views`:
- Add a new optional `"schema_name"` filter parameter to return results
based on a specific schema name pattern.
- Add an additional column `"definition"` to return the view definition.
2.  `list_schemas`:
- Add a new optional `"owner"` filter parameter to return results based
on a specific owner name pattern.
- Add a new optional `"limit"` parameter to return a specific number of
rows.
3.  `list_indexes`:
- Add a new optional `"only_unused"` filter parameter to return only
unused indexes.

> Should include a concise description of the changes (bug or feature),
its
> impact, along with a summary of the solution

list_views
<img width="1531" height="763" alt="Screenshot 2025-11-25 at 1 36 39 PM"
src="https://github.com/user-attachments/assets/bd6805b3-43d2-46c7-adc8-62d3a4521d36"
/>

list_schemas
<img width="1519" height="755" alt="Screenshot 2025-11-25 at 1 35 54 PM"
src="https://github.com/user-attachments/assets/62d3e987-b64e-442b-ba1a-84def1df7a58"
/>


list_indexes
<img width="1523" height="774" alt="Screenshot 2025-11-25 at 1 35 32 PM"
src="https://github.com/user-attachments/assets/c6f73b3f-f8a2-4b76-9218-64d7011a2241"
/>


## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738

Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-09 20:16:45 +00:00
Srividya Reddy
32367a472f feat(tools/postgres): add list_pg_settings, list_database_stats tools for postgres (#2030)
## Description
Adds the following tools for Postgres:
(1) list_pg_settings: List configuration parameters for the PostgreSQL
server.
(2) list_database_stats: Lists the key performance and activity
statistics for each database in the PostgreSQL server.

> Should include a concise description of the changes (bug or feature),
its
> impact, along with a summary of the solution

list_pg_settings:
<img width="1526" height="803" alt="Screenshot 2025-11-25 at 10 19
48 AM"
src="https://github.com/user-attachments/assets/73634b9b-4936-4bf0-a94b-6b31fe3642a1"
/>
<img width="1064" height="715" alt="Screenshot 2025-11-25 at 10 27
19 AM"
src="https://github.com/user-attachments/assets/36c13585-27e4-4294-b451-1c1a963c0d6c"
/>

list_database_stats:
<img width="1511" height="779" alt="Screenshot 2025-11-25 at 10 21
12 AM"
src="https://github.com/user-attachments/assets/d283e018-ea81-427d-b1b4-7aaf79b9696b"
/>
<img width="1017" height="506" alt="Screenshot 2025-11-25 at 10 27
47 AM"
src="https://github.com/user-attachments/assets/47b72bd7-7114-4f2a-8a9d-cecc80bf47e9"
/>



## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738

Co-authored-by: Averi Kitsch <akitsch@google.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-09 11:55:53 -08:00
Pranjul Kalsi
3b40fea25e feat(sources/mariadb): add MariaDB source and MySQL tools integration (#1908)
## Description
This PR 
1. Adds **MariaDB** as a Source - Implementation is similar to **MySQL**
source
2. Utilises pre implemented **MySQL** Tools
- `mysql-execute-sql`
- `mysql-list-active-queries`
- `mysql-list-table-fragmentation`
- `mysql-list-tables`
- `mysql-list-tables-missing-unique-indexes`
- `mysql-sql`
**Note:** After discussion with @duwenxin99 in issue #1768, I initially
assumed MariaDB required new tools due to different metadata structures
and system tables. That is true for older MariaDB versions, but current
MySQL tooling already works with MariaDB (verified), so a separate tool
set was not needed.

3. Adds a source doc for **MariaDB** in docs 
4. Adds MariaDB integration tests using the existing MySQL test flow.
Note: The test file is based on the MySQL integration test, but
`GetMariaDBWants()` and
`RunMariDBListTablesTest()` are implemented because MariaDB returns
different metadata
and list-tables output, so the assertions must be MariaDB-specific.
5. Updates CI 

Lastly, I considered adding a MariaDB-exclusive Galera cluster monitoring tool,
but skipped it because it requires a multi-node Galera setup for
integration testing and would significantly increase CI complexity with
unclear usage demand.

## PR Checklist
- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1712 #1768

---------

Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-09 18:45:05 +00:00
Anubhav Dhawan
f6b6a9fb5d ci: enable mongodb integration tests (#2111)
This PR enables MongoDB integration tests in Cloud Build. It adds the
necessary build step and secret configuration.

Fixes https://github.com/googleapis/genai-toolbox/issues/1816

---------

Co-authored-by: AnmolShukla2002 <anmol.28422@gmail.com>
Co-authored-by: Anmol Shukla <shuklaanmol@google.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-09 17:42:51 +00:00
dependabot[bot]
1dd971b8d5 chore(deps): bump golang.org/x/crypto from 0.43.0 to 0.45.0 in /docs/en/getting-started/quickstart/go/langchain (#2041)
Bumps [golang.org/x/crypto](https://github.com/golang/crypto) from
0.43.0 to 0.45.0.
<details>
<summary>Commits</summary>
<ul>
<li><a
href="4e0068c009"><code>4e0068c</code></a>
go.mod: update golang.org/x dependencies</li>
<li><a
href="e79546e28b"><code>e79546e</code></a>
ssh: curb GSSAPI DoS risk by limiting number of specified OIDs</li>
<li><a
href="f91f7a7c31"><code>f91f7a7</code></a>
ssh/agent: prevent panic on malformed constraint</li>
<li><a
href="2df4153a03"><code>2df4153</code></a>
acme/autocert: let automatic renewal work with short lifetime certs</li>
<li><a
href="bcf6a849ef"><code>bcf6a84</code></a>
acme: pass context to request</li>
<li><a
href="b4f2b62076"><code>b4f2b62</code></a>
ssh: fix error message on unsupported cipher</li>
<li><a
href="79ec3a51fc"><code>79ec3a5</code></a>
ssh: allow to bind to a hostname in remote forwarding</li>
<li><a
href="122a78f140"><code>122a78f</code></a>
go.mod: update golang.org/x dependencies</li>
<li><a
href="c0531f9c34"><code>c0531f9</code></a>
all: eliminate vet diagnostics</li>
<li><a
href="0997000b45"><code>0997000</code></a>
all: fix some comments</li>
<li>Additional commits viewable in <a
href="https://github.com/golang/crypto/compare/v0.43.0...v0.45.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=golang.org/x/crypto&package-manager=go_modules&previous-version=0.43.0&new-version=0.45.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/googleapis/genai-toolbox/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-05 20:02:27 +00:00
release-please[bot]
cb4529cbaa chore(main): release 0.22.0 (#1997)
🤖 I have created a release *beep* *boop*
---


##
[0.22.0](https://github.com/googleapis/genai-toolbox/compare/v0.21.0...v0.22.0)
(2025-12-04)


### Features

* Add allowed-origins flag
([#1984](https://github.com/googleapis/genai-toolbox/issues/1984))
([862868f](862868f284))
* **tools/postgres:** Add list-query-stats and get-column-cardinality
functions
([#1976](https://github.com/googleapis/genai-toolbox/issues/1976))
([9f76026](9f76026925))
* **tools/spanner:** Add spanner list graphs to prebuiltconfigs
([#2056](https://github.com/googleapis/genai-toolbox/issues/2056))
([0e7fbf4](0e7fbf465c))
* **prebuilt/cloud-sql:** Add clone instance tool for cloud sql
([#1845](https://github.com/googleapis/genai-toolbox/issues/1845))
([5e43630](5e43630907))
* **serverless-spark:** Add create_pyspark_batch tool
([1bf0b51](1bf0b51f03))
* **serverless-spark:** Add create_spark_batch tool
([17a9792](17a979207d))
* Support alternate accessToken header name
([#1968](https://github.com/googleapis/genai-toolbox/issues/1968))
([18017d6](18017d6545))
* Support for annotations
([#2007](https://github.com/googleapis/genai-toolbox/issues/2007))
([ac21335](ac21335f4e))
* **tool/mssql:** Set default host and port for MSSQL source
([#1943](https://github.com/googleapis/genai-toolbox/issues/1943))
([7a9cc63](7a9cc63376))
* **tools/cloudsqlpg:** Add CloudSQL PostgreSQL pre-check tool
([#1722](https://github.com/googleapis/genai-toolbox/issues/1722))
([8752e05](8752e05ab6))
* **tools/postgres-list-publication-tables:** Add new
postgres-list-publication-tables tool
([#1919](https://github.com/googleapis/genai-toolbox/issues/1919))
([f4b1f0a](f4b1f0a680))
* **tools/postgres-list-tablespaces:** Add new postgres-list-tablespaces
tool ([#1934](https://github.com/googleapis/genai-toolbox/issues/1934))
([5ad7c61](5ad7c6127b))
* **tools/spanner-list-graph:** Tool impl + docs + tests
([#1923](https://github.com/googleapis/genai-toolbox/issues/1923))
([a0f44d3](a0f44d34ea))


### Bug Fixes

* Add import for firebirdsql
([#2045](https://github.com/googleapis/genai-toolbox/issues/2045))
([fb7aae9](fb7aae9d35))
* Correct FAQ to mention HTTP tools
([#2036](https://github.com/googleapis/genai-toolbox/issues/2036))
([7b44237](7b44237d4a))
* Format BigQuery numeric output as decimal strings
([#2084](https://github.com/googleapis/genai-toolbox/issues/2084))
([155bff8](155bff80c1))
* Set default annotations for tools in code if annotation not provided
in yaml
([#2049](https://github.com/googleapis/genai-toolbox/issues/2049))
([565460c](565460c4ea))
* **tools/alloydb-postgres-list-tables:** Exclude google_ml schema from
list_tables
([#2046](https://github.com/googleapis/genai-toolbox/issues/2046))
([a03984c](a03984cc15))
* **tools/alloydbcreateuser:** Remove duplication of project param
([#2028](https://github.com/googleapis/genai-toolbox/issues/2028))
([730ac6d](730ac6d228))
* **tools/mongodb:** Remove `required` tag from the `canonical` field
([#2099](https://github.com/googleapis/genai-toolbox/issues/2099))
([744214e](744214e04c))

---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-04 19:10:51 -05:00
Wenxin Du
ac375114fd chore: add v0.22.0 docs version number (#2115) 2025-12-04 23:37:44 +00:00
Wenxin Du
8a0eba9d62 chore: release 0.22.0 (#2116)
Release-As: 0.22.0
2025-12-04 23:11:24 +00:00
Srividya Reddy
5ad7c6127b feat(tools/postgres-list-tablespaces): add new postgres-list-tablespaces tool (#1934)
## Description
Adds a PostgreSQL custom list_tablespaces tool that returns the details
of tablespaces present in the database.

<img width="1719" height="698" alt="Screenshot 2025-11-12 at 9 11 13 AM"
src="https://github.com/user-attachments/assets/03964a1b-27e0-4da8-85a2-57db905163ed"
/>
<img width="1077" height="141" alt="Screenshot 2025-11-12 at 9 12 42 AM"
src="https://github.com/user-attachments/assets/f93f5692-eb62-4f30-8192-40c8873d4d00"
/>


> Should include a concise description of the changes (bug or feature),
its
> impact, along with a summary of the solution

Lists all tablespaces in the database. Returns the tablespace name,
owner name, size in bytes, internal object ID, the access control list
regarding permissions, and any specific tablespace options.

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738
2025-12-04 20:38:54 +00:00
Srividya Reddy
f4b1f0a680 feat(tools/postgres-list-publication-tables): add new postgres-list-publication-tables tool (#1919)
## Description
Adds a PostgreSQL custom list_publication_tables tool that returns the
details of publication tables present in the database.

Test Output:

<img width="845" height="239" alt="Screenshot 2025-11-11 at 12 50 59 AM"
src="https://github.com/user-attachments/assets/b7606e44-c5f6-4fc7-865e-7efadd112eff"
/>

<img width="1529" height="648" alt="Screenshot 2025-11-11 at 1 15 18 AM"
src="https://github.com/user-attachments/assets/6192b772-f0bc-4fb4-8032-ca487434d77c"
/>


> Should include a concise description of the changes (bug or feature),
its
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [x] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [x] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1738

Co-authored-by: Averi Kitsch <akitsch@google.com>
Co-authored-by: Wenxin Du <117315983+duwenxin99@users.noreply.github.com>
2025-12-04 19:26:45 +00:00
Dave Borowitz
17a979207d feat(serverless-spark): add create_spark_batch tool
This tool is almost identical to create_pyspark_batch, but for Java
Spark batches. There are some minor differences in how the main files
and args are provided.
2025-12-04 11:05:53 -08:00
Dave Borowitz
3bf3fe8fa7 refactor(serverless-spark): extract common create batch tool & config
We are planning to add several very similar tools for creating batches
like the existing pyspark batches: spark (Java), R, etc. They will use
an identical approach of specifying environment and runtime config in
the YAML, differing only in how language-specific args are passed. We
can streamline this.
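
A minimal sketch of that factoring, with hypothetical type and field names (the real tools live under internal/tools/serverlessspark):

```go
package main

import "fmt"

// commonBatchConfig holds the environment and runtime settings shared by all
// create-batch tools and specified in the YAML config (illustrative fields).
type commonBatchConfig struct {
	Project       string
	Region        string
	RuntimeConfig map[string]string
}

// Each language-specific tool embeds the shared config and adds only the
// arguments that differ per language.
type createPySparkBatchConfig struct {
	commonBatchConfig
	MainPythonFileURI string
}

type createSparkBatchConfig struct {
	commonBatchConfig
	MainJarFileURI string
	MainClass      string
}

func main() {
	shared := commonBatchConfig{Project: "my-project", Region: "us-central1"}
	py := createPySparkBatchConfig{commonBatchConfig: shared, MainPythonFileURI: "gs://bucket/job.py"}
	fmt.Println(py.Project, py.Region, py.MainPythonFileURI)
}
```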
2025-12-04 11:05:53 -08:00
Dave Borowitz
1bf0b51f03 feat(serverless-spark): add create_pyspark_batch tool
This tool creates a PySpark batch from a minimal set of parameters, to
keep things simple for the LLM. Advanced runtime and environment config
can be specified in tools.yaml.
2025-12-04 11:05:53 -08:00
Wenxin Du
744214e04c fix(tools/mongodb): remove required tag from the canonical field (#2099)
The `required` tag raises a validation error when a boolean field
is defined as `false`:
```
ERROR "unable to parse tool file at "mongodb_tools.yaml": unable to parse tool "insert-one-device" as kind "mongodb-insert-one": [2:12] Key: 'Config.Canonical' Error:Field validation for 'Canonical' failed on the 'required' tag\n   1 | authRequired: []\n>  2 | canonical: false\n                  ^\n   3 | collection: Device\n   4 | database: xiar\n   5 | description: Inserts a single new document into the Device collection. The 'data' parameter must be a string containing the JSON object to insert.\n   6 | "
```
All the `required` tags are removed from the boolean `canonical` field
of the MongoDB tools. Unit tests are added.
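
A standalone reproduction of that failure mode, assuming the go-playground/validator library whose error format appears above (struct and field names here are illustrative, not the actual tool config):

```go
package main

import (
	"fmt"

	"github.com/go-playground/validator/v10"
)

type insertOneConfig struct {
	Collection string `validate:"required"`
	// `required` rejects a field left at its zero value; the zero value of a
	// bool is false, so `canonical: false` fails validation even though it is
	// a deliberate, valid setting.
	Canonical bool `validate:"required"`
}

func main() {
	v := validator.New()
	err := v.Struct(insertOneConfig{Collection: "Device", Canonical: false})
	fmt.Println(err) // ...'Canonical' failed on the 'required' tag
	// Removing the `required` tag from the boolean field, as this commit does,
	// lets false pass through unchanged.
}
```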
2025-12-04 18:25:16 +00:00
Shobhit Singh
155bff80c1 fix: format BigQuery numeric output as decimal strings (#2084)
## Description

> Should include a concise description of the changes (bug or feature),
its
> impact, along with a summary of the solution

This change updates both `bigquery-sql` and `bigquery-execute-sql` tools
to format `NUMERIC` and `BIGNUMERIC` values as decimal strings (e.g.,
"9.5") instead of rational fractions (e.g., "19/2"). This ensures the
tools' output matches the BigQuery REST API JSON format.

Key changes:
- Added `NormalizeValue` function in
`internal/tools/bigquery/bigquerycommon` to handle `*big.Rat` conversion
with 38-digit precision and trailing zero trimming.
- Updated `bigquery-sql` and `bigquery-execute-sql` to use
`NormalizeValue`.
- Added comprehensive tests in
`internal/tools/bigquery/bigquerycommon/conversion_test.go`.

With these changes the formatting for NUMERIC and BIGNUMERIC is fixed.

**Before:**

```
[
  {
    "id": 3,
    "numeric_value": "1"
  },
  {
    "id": 2,
    "numeric_value": "333333333/1000000000"
  },
  {
    "id": 4,
    "numeric_value": "12341/10"
  },
  {
    "id": 1,
    "numeric_value": "19/2"
  }
]
```

**After:**

```
[
  {
    "id": 3,
    "numeric_value": "1"
  },
  {
    "id": 2,
    "numeric_value": "0.333333333"
  },
  {
    "id": 4,
    "numeric_value": "1234.1"
  },
  {
    "id": 1,
    "numeric_value": "9.5"
  }
]
```
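
A minimal sketch of the conversion (not the repository's `NormalizeValue`, which also covers other value types), using only the standard library's math/big: render the *big.Rat with 38 fractional digits, then trim trailing zeros and a dangling decimal point.

```go
package main

import (
	"fmt"
	"math/big"
	"strings"
)

// normalizeRat renders a *big.Rat as a plain decimal string, assuming 38
// digits of fractional precision as described above.
func normalizeRat(r *big.Rat) string {
	s := r.FloatString(38)         // e.g. "9.50000000000000000000000000000000000000"
	s = strings.TrimRight(s, "0")  // drop trailing zeros
	s = strings.TrimSuffix(s, ".") // "1." -> "1"
	return s
}

func main() {
	fmt.Println(normalizeRat(big.NewRat(19, 2)))                 // 9.5
	fmt.Println(normalizeRat(big.NewRat(12341, 10)))             // 1234.1
	fmt.Println(normalizeRat(big.NewRat(333333333, 1000000000))) // 0.333333333
	fmt.Println(normalizeRat(big.NewRat(1, 1)))                  // 1
}
```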

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #1194

---------

Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
Co-authored-by: Averi Kitsch <akitsch@google.com>
2025-12-04 10:06:12 -08:00
Wenxin Du
e84252feb4 ci: fix multiple source test failures (#2114) 2025-12-04 09:43:18 -08:00
Twisha Bansal
1e67810740 docs(toolbox-adk): Add quickstart for ADK JS SDK (#1862)
## Description

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-12-04 12:17:05 +05:30
Twisha Bansal
bdbc6239cd ci: ignore dep updates for GCA reviews (#1991)
This PR introduces the configuration file (.gemini/config.yaml) for the
Gemini Code Assist GitHub App. The goal is to enable AI code reviews
while preventing quota exhaustion caused by high-volume automated PRs
(e.g., Renovate/Dependabot).

See
[schema](https://developers.google.com/gemini-code-assist/docs/customize-gemini-behavior-github#config.yaml-schema)

## Description

> Should include a concise description of the changes (bug or feature),
its
> impact, along with a summary of the solution

## PR Checklist

> Thank you for opening a Pull Request! Before submitting your PR, there
are a
> few things you can do to make sure it goes smoothly:

- [x] Make sure you reviewed

[CONTRIBUTING.md](https://github.com/googleapis/genai-toolbox/blob/main/CONTRIBUTING.md)
- [ ] Make sure to open an issue as a

[bug/issue](https://github.com/googleapis/genai-toolbox/issues/new/choose)
  before writing your code! That way we can discuss the change, evaluate
  designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)
- [ ] Make sure to add `!` if this involves a breaking change

🛠️ Fixes #<issue_number_goes_here>
2025-12-03 16:48:13 +05:30
Anubhav Dhawan
05799475b4 docs: update antigravity docs installation instructions and prerequisites (#2088)
This PR updates the documentation across the repository to reflect the
new installation workflow using `npx` and Node.js, replacing the
previous binary download instructions. It also standardizes the
prerequisites and adds helpful configuration notes for Windows users.

These changes simplify the setup process for users by leveraging `npx`
for executing the tools, ensuring they always use the latest version
without manual binary management. It also addresses feedback from PR
#2079 regarding installation clarity and Windows support.

---------

Co-authored-by: Twisha Bansal <twishabansal07@gmail.com>
2025-12-02 12:25:31 -08:00
239 changed files with 15636 additions and 1877 deletions

View File

@@ -305,4 +305,4 @@ substitutions:
_AR_HOSTNAME: ${_REGION}-docker.pkg.dev
_AR_REPO_NAME: toolbox-dev
_BUCKET_NAME: genai-toolbox-dev
_DOCKER_URI: ${_AR_HOSTNAME}/${PROJECT_ID}/${_AR_REPO_NAME}/toolbox
_DOCKER_URI: ${_AR_HOSTNAME}/${PROJECT_ID}/${_AR_REPO_NAME}/toolbox

View File

@@ -212,6 +212,26 @@ steps:
bigquery \
bigquery
- id: "cloud-gda"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "CLOUD_GDA_PROJECT=$PROJECT_ID"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"Cloud Gemini Data Analytics" \
cloudgda \
cloudgda
- id: "dataplex"
name: golang:1
waitFor: ["compile-test-binary"]
@@ -318,7 +338,7 @@ steps:
.ci/test_with_coverage.sh \
"Spanner" \
spanner \
spanner
spanner || echo "Integration tests failed." # ignore test failures
- id: "neo4j"
name: golang:1
@@ -589,6 +609,26 @@ steps:
firestore \
firestore
- id: "mongodb"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "MONGODB_DATABASE=$_DATABASE_NAME"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["MONGODB_URI", "CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
.ci/test_with_coverage.sh \
"MongoDB" \
mongodb \
mongodb
- id: "looker"
name: golang:1
waitFor: ["compile-test-binary"]
@@ -806,8 +846,8 @@ steps:
cassandra
- id: "oracle"
name: golang:1
waitFor: ["compile-test-binary"]
name: ghcr.io/oracle/oraclelinux9-instantclient:23
waitFor: ["install-dependencies"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
@@ -820,10 +860,25 @@ steps:
args:
- -c
- |
.ci/test_with_coverage.sh \
"Oracle" \
oracle \
oracle
# Install the C compiler and Oracle SDK headers needed for cgo
dnf install -y gcc oracle-instantclient-devel
# Install Go
curl -L -o go.tar.gz "https://go.dev/dl/go1.25.1.linux-amd64.tar.gz"
tar -C /usr/local -xzf go.tar.gz
export PATH="/usr/local/go/bin:$$PATH"
go test -v ./internal/sources/oracle/... \
-coverprofile=oracle_coverage.out \
-coverpkg=./internal/sources/oracle/...,./internal/tools/oracle/...
# Coverage check
total_coverage=$(go tool cover -func=oracle_coverage.out | grep "total:" | awk '{print $3}')
echo "Oracle total coverage: $total_coverage"
coverage_numeric=$(echo "$total_coverage" | sed 's/%//')
if awk -v cov="$coverage_numeric" 'BEGIN {exit !(cov < 30)}'; then
echo "Coverage failure: $total_coverage is below 30%."
exit 1
fi
- id: "serverless-spark"
name: golang:1
@@ -867,6 +922,26 @@ steps:
singlestore \
singlestore
- id: "mariadb"
name: golang:1
waitFor: ["compile-test-binary"]
entrypoint: /bin/bash
env:
- "GOPATH=/gopath"
- "MARIADB_DATABASE=$_MARIADB_DATABASE"
- "MARIADB_PORT=$_MARIADB_PORT"
- "SERVICE_ACCOUNT_EMAIL=$SERVICE_ACCOUNT_EMAIL"
secretEnv: ["MARIADB_USER", "MARIADB_PASS", "MARIADB_HOST", "CLIENT_ID"]
volumes:
- name: "go"
path: "/gopath"
args:
- -c
- |
# skip coverage check as it re-uses current MySQL implementation
go test ./tests/mariadb
availableSecrets:
secretManager:
- versionName: projects/$PROJECT_ID/secrets/cloud_sql_pg_user/versions/latest
@@ -979,6 +1054,14 @@ availableSecrets:
env: SINGLESTORE_PASSWORD
- versionName: projects/$PROJECT_ID/secrets/singlestore_host/versions/latest
env: SINGLESTORE_HOST
- versionName: projects/$PROJECT_ID/secrets/mariadb_user/versions/latest
env: MARIADB_USER
- versionName: projects/$PROJECT_ID/secrets/mariadb_pass/versions/latest
env: MARIADB_PASS
- versionName: projects/$PROJECT_ID/secrets/mariadb_host/versions/latest
env: MARIADB_HOST
- versionName: projects/$PROJECT_ID/secrets/mongodb_uri/versions/latest
env: MONGODB_URI
options:
logging: CLOUD_LOGGING_ONLY
@@ -1039,3 +1122,6 @@ substitutions:
_SINGLESTORE_PORT: "3308"
_SINGLESTORE_DATABASE: "singlestore"
_SINGLESTORE_USER: "root"
_MARIADB_PORT: "3307"
_MARIADB_DATABASE: test_database

View File

@@ -13,7 +13,7 @@
# limitations under the License.
steps:
- name: 'node:20'
- name: 'node:22'
id: 'js-quickstart-test'
entrypoint: 'bash'
args:
@@ -44,4 +44,4 @@ availableSecrets:
timeout: 1000s
options:
logging: CLOUD_LOGGING_ONLY
logging: CLOUD_LOGGING_ONLY

.gemini/config.yaml Normal file
View File

@@ -0,0 +1,18 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
ignore_patterns:
- "package-lock.json"
- "go.sum"
- "requirements.txt"

View File

@@ -24,5 +24,23 @@
],
pinDigests: true,
},
{
groupName: 'Go',
matchManagers: [
'gomod',
],
},
{
groupName: 'Node',
matchManagers: [
'npm',
],
},
{
groupName: 'Pip',
matchManagers: [
'pip_requirements',
],
},
],
}

View File

@@ -1 +1,9 @@
@import 'td/code-dark';
@import 'td/code-dark';
// Make tabs scrollable horizontally instead of wrapping
.nav-tabs {
flex-wrap: nowrap;
white-space: nowrap;
overflow-x: auto;
overflow-y: hidden;
}

View File

@@ -51,6 +51,18 @@ ignoreFiles = ["quickstart/shared", "quickstart/python", "quickstart/js", "quick
# Add a new version block here before every release
# The order of versions in this file is mirrored into the dropdown
[[params.versions]]
version = "v0.24.0"
url = "https://googleapis.github.io/genai-toolbox/v0.24.0/"
[[params.versions]]
version = "v0.23.0"
url = "https://googleapis.github.io/genai-toolbox/v0.23.0/"
[[params.versions]]
version = "v0.22.0"
url = "https://googleapis.github.io/genai-toolbox/v0.22.0/"
[[params.versions]]
version = "v0.21.0"
url = "https://googleapis.github.io/genai-toolbox/v0.21.0/"

View File

@@ -1,5 +1,84 @@
# Changelog
## [0.24.0](https://github.com/googleapis/genai-toolbox/compare/v0.23.0...v0.24.0) (2025-12-19)
### Features
* **sources/cloud-gemini-data-analytics:** Add the Gemini Data Analytics (GDA) integration for DB NL2SQL conversion to Toolbox ([#2181](https://github.com/googleapis/genai-toolbox/issues/2181)) ([aa270b2](https://github.com/googleapis/genai-toolbox/commit/aa270b2630da2e3d618db804ca95550445367dbc))
* **source/cloudsqlmysql:** Add support for IAM authentication in Cloud SQL MySQL source ([#2050](https://github.com/googleapis/genai-toolbox/issues/2050)) ([af3d3c5](https://github.com/googleapis/genai-toolbox/commit/af3d3c52044bea17781b89ce4ab71ff0f874ac20))
* **sources/oracle:** Add Oracle OCI and Wallet support ([#1945](https://github.com/googleapis/genai-toolbox/issues/1945)) ([8ea39ec](https://github.com/googleapis/genai-toolbox/commit/8ea39ec32fbbaa97939c626fec8c5d86040ed464))
* Support combining prebuilt and custom tool configurations ([#2188](https://github.com/googleapis/genai-toolbox/issues/2188)) ([5788605](https://github.com/googleapis/genai-toolbox/commit/57886058188aa5d2a51d5846a98bc6d8a650edd1))
* **tools/mysql-get-query-plan:** Add new `mysql-get-query-plan` tool for MySQL source ([#2123](https://github.com/googleapis/genai-toolbox/issues/2123)) ([0641da0](https://github.com/googleapis/genai-toolbox/commit/0641da0353857317113b2169e547ca69603ddfde))
### Bug Fixes
* **spanner:** Move list graphs validation to runtime ([#2154](https://github.com/googleapis/genai-toolbox/issues/2154)) ([914b3ee](https://github.com/googleapis/genai-toolbox/commit/914b3eefda40a650efe552d245369e007277dab5))
## [0.23.0](https://github.com/googleapis/genai-toolbox/compare/v0.22.0...v0.23.0) (2025-12-11)
### ⚠ BREAKING CHANGES
* **serverless-spark:** add URLs to create batch tool outputs
* **serverless-spark:** add URLs to list_batches output
* **serverless-spark:** add Cloud Console and Logging URLs to get_batch
* **tools/postgres:** Add additional filter params for existing postgres tools ([#2033](https://github.com/googleapis/genai-toolbox/issues/2033))
### Features
* **tools/postgres:** Add list-table-stats-tool to list table statistics. ([#2055](https://github.com/googleapis/genai-toolbox/issues/2055)) ([78b02f0](https://github.com/googleapis/genai-toolbox/commit/78b02f08c3cc3062943bb2f91cf60d5149c8d28d))
* **looker/tools:** Enhance dashboard creation with dashboard filters ([#2133](https://github.com/googleapis/genai-toolbox/issues/2133)) ([285aa46](https://github.com/googleapis/genai-toolbox/commit/285aa46b887d9acb2da8766e107bbf1ab75b8812))
* **serverless-spark:** Add Cloud Console and Logging URLs to get_batch ([e29c061](https://github.com/googleapis/genai-toolbox/commit/e29c0616d6b9ecda2badcaf7b69614e511ac031b))
* **serverless-spark:** Add URLs to create batch tool outputs ([c6ccf4b](https://github.com/googleapis/genai-toolbox/commit/c6ccf4bd87026484143a2d0f5527b2edab03b54a))
* **serverless-spark:** Add URLs to list_batches output ([5605eab](https://github.com/googleapis/genai-toolbox/commit/5605eabd696696ade07f52431a28ef65c0fb1f77))
* **sources/mariadb:** Add MariaDB source and MySQL tools integration ([#1908](https://github.com/googleapis/genai-toolbox/issues/1908)) ([3b40fea](https://github.com/googleapis/genai-toolbox/commit/3b40fea25edae607e02c1e8fc2b0c957fa2c8e9a))
* **tools/postgres:** Add additional filter params for existing postgres tools ([#2033](https://github.com/googleapis/genai-toolbox/issues/2033)) ([489117d](https://github.com/googleapis/genai-toolbox/commit/489117d74711ac9260e7547163ca463eb45eeaa2))
* **tools/postgres:** Add list_pg_settings, list_database_stats tools for postgres ([#2030](https://github.com/googleapis/genai-toolbox/issues/2030)) ([32367a4](https://github.com/googleapis/genai-toolbox/commit/32367a472fae9653fed7f126428eba0252978bd5))
* **tools/postgres:** Add new postgres-list-roles tool ([#2038](https://github.com/googleapis/genai-toolbox/issues/2038)) ([bea9705](https://github.com/googleapis/genai-toolbox/commit/bea97054502cfa236aa10e2ebc8ff58eb00ad035))
### Bug Fixes
* List tables tools null fix ([#2107](https://github.com/googleapis/genai-toolbox/issues/2107)) ([2b45266](https://github.com/googleapis/genai-toolbox/commit/2b452665983154041d4cd0ed7d82532e4af682eb))
* **tools/mongodb:** Removed sortPayload and sortParams ([#1238](https://github.com/googleapis/genai-toolbox/issues/1238)) ([c5a6daa](https://github.com/googleapis/genai-toolbox/commit/c5a6daa7683d2f9be654300d977692c368e55e31))
### Miscellaneous Chores
* **looker:** Upgrade to latest go sdk ([#2159](https://github.com/googleapis/genai-toolbox/issues/2159)) ([78e015d](https://github.com/googleapis/genai-toolbox/commit/78e015d7dfd9cce7e2b444ed934da17eb355bc86))
## [0.22.0](https://github.com/googleapis/genai-toolbox/compare/v0.21.0...v0.22.0) (2025-12-04)
### Features
* **tools/postgres:** Add allowed-origins flag ([#1984](https://github.com/googleapis/genai-toolbox/issues/1984)) ([862868f](https://github.com/googleapis/genai-toolbox/commit/862868f28476ea981575ce412faa7d6a03138f31))
* **tools/postgres:** Add list-query-stats and get-column-cardinality functions ([#1976](https://github.com/googleapis/genai-toolbox/issues/1976)) ([9f76026](https://github.com/googleapis/genai-toolbox/commit/9f760269253a8cc92a357e995c6993ccc4a0fb7b))
* **tools/spanner:** Add spanner list graphs to prebuiltconfigs ([#2056](https://github.com/googleapis/genai-toolbox/issues/2056)) ([0e7fbf4](https://github.com/googleapis/genai-toolbox/commit/0e7fbf465c488397aa9d8cab2e55165fff4eb53c))
* **prebuilt/cloud-sql:** Add clone instance tool for cloud sql ([#1845](https://github.com/googleapis/genai-toolbox/issues/1845)) ([5e43630](https://github.com/googleapis/genai-toolbox/commit/5e43630907aa2d7bc6818142483a33272eab060b))
* **serverless-spark:** Add create_pyspark_batch tool ([1bf0b51](https://github.com/googleapis/genai-toolbox/commit/1bf0b51f033c956790be1577bf5310d0b17e9c12))
* **serverless-spark:** Add create_spark_batch tool ([17a9792](https://github.com/googleapis/genai-toolbox/commit/17a979207dbc4fe70acd0ebda164d1a8d34c1ed3))
* Support alternate accessToken header name ([#1968](https://github.com/googleapis/genai-toolbox/issues/1968)) ([18017d6](https://github.com/googleapis/genai-toolbox/commit/18017d6545335a6fc1c472617101c35254d9a597))
* Support for annotations ([#2007](https://github.com/googleapis/genai-toolbox/issues/2007)) ([ac21335](https://github.com/googleapis/genai-toolbox/commit/ac21335f4e88ca52d954d7f8143a551a35661b94))
* **tool/mssql:** Set default host and port for MSSQL source ([#1943](https://github.com/googleapis/genai-toolbox/issues/1943)) ([7a9cc63](https://github.com/googleapis/genai-toolbox/commit/7a9cc633768d9ae9a7ff8230002da69d6a36ca86))
* **tools/cloudsqlpg:** Add CloudSQL PostgreSQL pre-check tool ([#1722](https://github.com/googleapis/genai-toolbox/issues/1722)) ([8752e05](https://github.com/googleapis/genai-toolbox/commit/8752e05ab6e98812d95673a6f1ff67e9a6ae48d2))
* **tools/postgres-list-publication-tables:** Add new postgres-list-publication-tables tool ([#1919](https://github.com/googleapis/genai-toolbox/issues/1919)) ([f4b1f0a](https://github.com/googleapis/genai-toolbox/commit/f4b1f0a68000ca2fc0325f55a1905705417c38a2))
* **tools/postgres-list-tablespaces:** Add new postgres-list-tablespaces tool ([#1934](https://github.com/googleapis/genai-toolbox/issues/1934)) ([5ad7c61](https://github.com/googleapis/genai-toolbox/commit/5ad7c6127b3e47504fc4afda0b7f3de1dff78b8b))
* **tools/spanner-list-graph:** Tool impl + docs + tests ([#1923](https://github.com/googleapis/genai-toolbox/issues/1923)) ([a0f44d3](https://github.com/googleapis/genai-toolbox/commit/a0f44d34ea3f044dd08501be616f70ddfd63ab45))
### Bug Fixes
* Add import for firebirdsql ([#2045](https://github.com/googleapis/genai-toolbox/issues/2045)) ([fb7aae9](https://github.com/googleapis/genai-toolbox/commit/fb7aae9d35b760d3471d8379642f835a0d84ec41))
* Correct FAQ to mention HTTP tools ([#2036](https://github.com/googleapis/genai-toolbox/issues/2036)) ([7b44237](https://github.com/googleapis/genai-toolbox/commit/7b44237d4a21bfbf8d3cebe4d32a15affa29584d))
* Format BigQuery numeric output as decimal strings ([#2084](https://github.com/googleapis/genai-toolbox/issues/2084)) ([155bff8](https://github.com/googleapis/genai-toolbox/commit/155bff80c1da4fae1e169e425fd82e1dc3373041))
* Set default annotations for tools in code if annotation not provided in yaml ([#2049](https://github.com/googleapis/genai-toolbox/issues/2049)) ([565460c](https://github.com/googleapis/genai-toolbox/commit/565460c4ea8953dbe80070a8e469f957c0f7a70c))
* **tools/alloydb-postgres-list-tables:** Exclude google_ml schema from list_tables ([#2046](https://github.com/googleapis/genai-toolbox/issues/2046)) ([a03984c](https://github.com/googleapis/genai-toolbox/commit/a03984cc15254c928f30085f8fa509ded6a79a0c))
* **tools/alloydbcreateuser:** Remove duplication of project param ([#2028](https://github.com/googleapis/genai-toolbox/issues/2028)) ([730ac6d](https://github.com/googleapis/genai-toolbox/commit/730ac6d22805fd50b4a675b74c1865f4e7689e7c))
* **tools/mongodb:** Remove `required` tag from the `canonical` field ([#2099](https://github.com/googleapis/genai-toolbox/issues/2099)) ([744214e](https://github.com/googleapis/genai-toolbox/commit/744214e04cd12b11d166e6eb7da8ce4714904abc))
## [0.21.0](https://github.com/googleapis/genai-toolbox/compare/v0.20.0...v0.21.0) (2025-11-19)

View File

@@ -167,15 +167,15 @@ tools.
[integration.cloudbuild.yaml](.ci/integration.cloudbuild.yaml).
[tool-get]:
https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L31
https://github.com/googleapis/genai-toolbox/blob/v0.23.0/tests/tool.go#L41
[tool-call]:
<https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L79>
https://github.com/googleapis/genai-toolbox/blob/v0.23.0/tests/tool.go#L229
[mcp-call]:
https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L554
https://github.com/googleapis/genai-toolbox/blob/v0.23.0/tests/tool.go#L789
[execute-sql]:
<https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L431>
https://github.com/googleapis/genai-toolbox/blob/v0.23.0/tests/tool.go#L609
[temp-param]:
<https://github.com/googleapis/genai-toolbox/blob/fd300dc606d88bf9f7bba689e2cee4e3565537dd/tests/tool.go#L297>
https://github.com/googleapis/genai-toolbox/blob/v0.23.0/tests/tool.go#L454
[temp-param-doc]:
https://googleapis.github.io/genai-toolbox/resources/tools/#template-parameters

View File

@@ -109,7 +109,7 @@ golangci-lint run --fix
Execute unit tests locally:
```bash
go test -race -v ./...
go test -race -v ./cmd/... ./internal/...
```
### Integration Tests

View File

@@ -105,6 +105,21 @@ redeploying your application.
## Getting Started
### (Non-production) Running Toolbox
You can run Toolbox directly with a [configuration file](#configuration):
```sh
npx @toolbox-sdk/server --tools-file tools.yaml
```
This runs the latest version of the toolbox server with your configuration file.
> [!NOTE]
> This method should only be used for non-production use cases such as
> experimentation. For any production use-cases, please consider [Installing the
> server](#installing-the-server) and then [running it](#running-the-server).
### Installing the server
For the latest version, check the [releases page][releases] and use the
@@ -125,7 +140,7 @@ To install Toolbox as a binary:
>
> ```sh
> # see releases page for other versions
> export VERSION=0.21.0
> export VERSION=0.24.0
> curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
> chmod +x toolbox
> ```
@@ -138,7 +153,7 @@ To install Toolbox as a binary:
>
> ```sh
> # see releases page for other versions
> export VERSION=0.21.0
> export VERSION=0.24.0
> curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox
> chmod +x toolbox
> ```
@@ -151,21 +166,33 @@ To install Toolbox as a binary:
>
> ```sh
> # see releases page for other versions
> export VERSION=0.21.0
> export VERSION=0.24.0
> curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox
> chmod +x toolbox
> ```
>
> </details>
> <details>
> <summary>Windows (AMD64)</summary>
> <summary>Windows (Command Prompt)</summary>
>
> To install Toolbox as a binary on Windows (AMD64):
> To install Toolbox as a binary on Windows (Command Prompt):
>
> ```cmd
> :: see releases page for other versions
> set VERSION=0.24.0
> curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v%VERSION%/windows/amd64/toolbox.exe"
> ```
>
> </details>
> <details>
> <summary>Windows (PowerShell)</summary>
>
> To install Toolbox as a binary on Windows (PowerShell):
>
> ```powershell
> :: see releases page for other versions
> set VERSION=0.21.0
> curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v%VERSION%/windows/amd64/toolbox.exe"
> # see releases page for other versions
> $VERSION = "0.24.0"
> curl.exe -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe"
> ```
>
> </details>
@@ -177,7 +204,7 @@ You can also install Toolbox as a container:
```sh
# see releases page for other versions
export VERSION=0.21.0
export VERSION=0.24.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```
@@ -201,7 +228,7 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:
```sh
go install github.com/googleapis/genai-toolbox@v0.21.0
go install github.com/googleapis/genai-toolbox@v0.24.0
```
<!-- {x-release-please-end} -->
@@ -291,6 +318,16 @@ toolbox --tools-file "tools.yaml"
</details>
<details>
<summary>NPM</summary>
To run Toolbox directly without manually downloading the binary (requires Node.js):
```sh
npx @toolbox-sdk/server --tools-file tools.yaml
```
</details>
<details>
<summary>Gemini CLI</summary>
@@ -515,6 +552,36 @@ For more detailed instructions on using the Toolbox Core SDK, see the
```
</details>
<details>
<summary>ADK</summary>
1. Install [Toolbox ADK SDK][toolbox-adk-js]:
```bash
npm install @toolbox-sdk/adk
```
2. Load tools:
```javascript
import { ToolboxClient } from '@toolbox-sdk/adk';
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const tools = await client.loadToolset('toolsetName');
```
For more detailed instructions on using the Toolbox ADK SDK, see the
[project's README][toolbox-adk-js-readme].
[toolbox-adk-js]: https://www.npmjs.com/package/@toolbox-sdk/adk
[toolbox-adk-js-readme]:
https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-adk/README.md
</details>
</details>
</blockquote>
<details>

View File

@@ -73,6 +73,7 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/clickhouse/clickhouselistdatabases"
_ "github.com/googleapis/genai-toolbox/internal/tools/clickhouse/clickhouselisttables"
_ "github.com/googleapis/genai-toolbox/internal/tools/clickhouse/clickhousesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/cloudgda"
_ "github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/cloudhealthcarefhirfetchpage"
_ "github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/cloudhealthcarefhirpatienteverything"
_ "github.com/googleapis/genai-toolbox/internal/tools/cloudhealthcare/cloudhealthcarefhirpatientsearch"
@@ -120,6 +121,7 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/firestore/firestorevalidaterules"
_ "github.com/googleapis/genai-toolbox/internal/tools/http"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookeradddashboardelement"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookeradddashboardfilter"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerconversationalanalytics"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookercreateprojectfile"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerdeleteprojectfile"
@@ -150,6 +152,9 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerquerysql"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerqueryurl"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerrundashboard"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerrenderdashboard"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerrenderlook"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerrenderquery"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerrunlook"
_ "github.com/googleapis/genai-toolbox/internal/tools/looker/lookerupdateprojectfile"
_ "github.com/googleapis/genai-toolbox/internal/tools/mindsdb/mindsdbexecutesql"
@@ -167,6 +172,7 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/mssql/mssqllisttables"
_ "github.com/googleapis/genai-toolbox/internal/tools/mssql/mssqlsql"
_ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlexecutesql"
_ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqlgetqueryplan"
_ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqllistactivequeries"
_ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqllisttablefragmentation"
_ "github.com/googleapis/genai-toolbox/internal/tools/mysql/mysqllisttables"
@@ -184,13 +190,19 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgresgetcolumncardinality"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistactivequeries"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistavailableextensions"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistdatabasestats"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistindexes"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistinstalledextensions"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistlocks"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistpgsettings"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistpublicationtables"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistquerystats"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistroles"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistschemas"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistsequences"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslisttables"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslisttablespaces"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslisttablestats"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslisttriggers"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslistviews"
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgreslongrunningtransactions"
@@ -198,6 +210,8 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/tools/postgres/postgressql"
_ "github.com/googleapis/genai-toolbox/internal/tools/redis"
_ "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/serverlesssparkcancelbatch"
_ "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/serverlesssparkcreatepysparkbatch"
_ "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/serverlesssparkcreatesparkbatch"
_ "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/serverlesssparkgetbatch"
_ "github.com/googleapis/genai-toolbox/internal/tools/serverlessspark/serverlesssparklistbatches"
_ "github.com/googleapis/genai-toolbox/internal/tools/singlestore/singlestoreexecutesql"
@@ -224,6 +238,7 @@ import (
_ "github.com/googleapis/genai-toolbox/internal/sources/bigtable"
_ "github.com/googleapis/genai-toolbox/internal/sources/cassandra"
_ "github.com/googleapis/genai-toolbox/internal/sources/clickhouse"
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudgda"
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudhealthcare"
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudmonitoring"
_ "github.com/googleapis/genai-toolbox/internal/sources/cloudsqladmin"
@@ -344,12 +359,12 @@ func NewCommand(opts ...Option) *Command {
flags.StringVarP(&cmd.cfg.Address, "address", "a", "127.0.0.1", "Address of the interface the server will listen on.")
flags.IntVarP(&cmd.cfg.Port, "port", "p", 5000, "Port the server will listen on.")
flags.StringVar(&cmd.tools_file, "tools_file", "", "File path specifying the tool configuration. Cannot be used with --prebuilt.")
flags.StringVar(&cmd.tools_file, "tools_file", "", "File path specifying the tool configuration. Cannot be used with --tools-files, or --tools-folder.")
// deprecate tools_file
_ = flags.MarkDeprecated("tools_file", "please use --tools-file instead")
flags.StringVar(&cmd.tools_file, "tools-file", "", "File path specifying the tool configuration. Cannot be used with --prebuilt, --tools-files, or --tools-folder.")
flags.StringSliceVar(&cmd.tools_files, "tools-files", []string{}, "Multiple file paths specifying tool configurations. Files will be merged. Cannot be used with --prebuilt, --tools-file, or --tools-folder.")
flags.StringVar(&cmd.tools_folder, "tools-folder", "", "Directory path containing YAML tool configuration files. All .yaml and .yml files in the directory will be loaded and merged. Cannot be used with --prebuilt, --tools-file, or --tools-files.")
flags.StringVar(&cmd.tools_file, "tools-file", "", "File path specifying the tool configuration. Cannot be used with --tools-files, or --tools-folder.")
flags.StringSliceVar(&cmd.tools_files, "tools-files", []string{}, "Multiple file paths specifying tool configurations. Files will be merged. Cannot be used with --tools-file, or --tools-folder.")
flags.StringVar(&cmd.tools_folder, "tools-folder", "", "Directory path containing YAML tool configuration files. All .yaml and .yml files in the directory will be loaded and merged. Cannot be used with --tools-file, or --tools-files.")
flags.Var(&cmd.cfg.LogLevel, "log-level", "Specify the minimum level logged. Allowed: 'DEBUG', 'INFO', 'WARN', 'ERROR'.")
flags.Var(&cmd.cfg.LoggingFormat, "logging-format", "Specify logging format to use. Allowed: 'standard' or 'JSON'.")
flags.BoolVar(&cmd.cfg.TelemetryGCP, "telemetry-gcp", false, "Enable exporting directly to Google Cloud Monitoring.")
@@ -357,7 +372,7 @@ func NewCommand(opts ...Option) *Command {
flags.StringVar(&cmd.cfg.TelemetryServiceName, "telemetry-service-name", "toolbox", "Sets the value of the service.name resource attribute for telemetry data.")
// Fetch prebuilt tools sources to customize the help description
prebuiltHelp := fmt.Sprintf(
"Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. Allowed: '%s'.",
"Use a prebuilt tool configuration by source type. Allowed: '%s'.",
strings.Join(prebuiltconfigs.GetPrebuiltSources(), "', '"),
)
flags.StringVar(&cmd.prebuiltConfig, "prebuilt", "", prebuiltHelp)
@@ -451,6 +466,9 @@ func mergeToolsFiles(files ...ToolsFile) (ToolsFile, error) {
if _, exists := merged.AuthSources[name]; exists {
conflicts = append(conflicts, fmt.Sprintf("authSource '%s' (file #%d)", name, fileIndex+1))
} else {
if merged.AuthSources == nil {
merged.AuthSources = make(server.AuthServiceConfigs)
}
merged.AuthSources[name] = authSource
}
}
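For context, this hunk gives legacy `authSources` the same merge semantics as the other sections: the first file to define a name keeps it, and a later duplicate is recorded as a conflict rather than silently overwriting it. A small illustrative sketch of two files that would collide (names and values are made up, echoing the test fixtures further down):
```yaml
# a.yaml
authSources:
  legacy-auth:
    kind: google
    clientId: "client-id-a"
---
# b.yaml -- same authSource name; mergeToolsFiles records
# "authSource 'legacy-auth' (file #2)" as a conflict
authSources:
  legacy-auth:
    kind: google
    clientId: "client-id-b"
```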
@@ -827,16 +845,10 @@ func run(cmd *Command) error {
}
}()
var toolsFile ToolsFile
var allToolsFiles []ToolsFile
// Load Prebuilt Configuration
if cmd.prebuiltConfig != "" {
// Make sure --prebuilt and --tools-file/--tools-files/--tools-folder flags are mutually exclusive
if cmd.tools_file != "" || len(cmd.tools_files) > 0 || cmd.tools_folder != "" {
errMsg := fmt.Errorf("--prebuilt and --tools-file/--tools-files/--tools-folder flags cannot be used simultaneously")
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
// Use prebuilt tools
buf, err := prebuiltconfigs.Get(cmd.prebuiltConfig)
if err != nil {
cmd.logger.ErrorContext(ctx, err.Error())
@@ -847,72 +859,96 @@ func run(cmd *Command) error {
// Append prebuilt.source to Version string for the User Agent
cmd.cfg.Version += "+prebuilt." + cmd.prebuiltConfig
toolsFile, err = parseToolsFile(ctx, buf)
parsed, err := parseToolsFile(ctx, buf)
if err != nil {
errMsg := fmt.Errorf("unable to parse prebuilt tool configuration: %w", err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
} else if len(cmd.tools_files) > 0 {
// Make sure --tools-file, --tools-files, and --tools-folder flags are mutually exclusive
if cmd.tools_file != "" || cmd.tools_folder != "" {
errMsg := fmt.Errorf("--tools-file, --tools-files, and --tools-folder flags cannot be used simultaneously")
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
// Use multiple tools files
cmd.logger.InfoContext(ctx, fmt.Sprintf("Loading and merging %d tool configuration files", len(cmd.tools_files)))
var err error
toolsFile, err = loadAndMergeToolsFiles(ctx, cmd.tools_files)
if err != nil {
cmd.logger.ErrorContext(ctx, err.Error())
return err
}
} else if cmd.tools_folder != "" {
// Make sure --tools-folder and other flags are mutually exclusive
if cmd.tools_file != "" || len(cmd.tools_files) > 0 {
errMsg := fmt.Errorf("--tools-file, --tools-files, and --tools-folder flags cannot be used simultaneously")
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
// Use tools folder
cmd.logger.InfoContext(ctx, fmt.Sprintf("Loading and merging all YAML files from directory: %s", cmd.tools_folder))
var err error
toolsFile, err = loadAndMergeToolsFolder(ctx, cmd.tools_folder)
if err != nil {
cmd.logger.ErrorContext(ctx, err.Error())
return err
}
} else {
// Set default value of tools-file flag to tools.yaml
if cmd.tools_file == "" {
cmd.tools_file = "tools.yaml"
}
// Read single tool file contents
buf, err := os.ReadFile(cmd.tools_file)
if err != nil {
errMsg := fmt.Errorf("unable to read tool file at %q: %w", cmd.tools_file, err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
toolsFile, err = parseToolsFile(ctx, buf)
if err != nil {
errMsg := fmt.Errorf("unable to parse tool file at %q: %w", cmd.tools_file, err)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
allToolsFiles = append(allToolsFiles, parsed)
}
cmd.cfg.SourceConfigs, cmd.cfg.AuthServiceConfigs, cmd.cfg.ToolConfigs, cmd.cfg.ToolsetConfigs, cmd.cfg.PromptConfigs = toolsFile.Sources, toolsFile.AuthServices, toolsFile.Tools, toolsFile.Toolsets, toolsFile.Prompts
// Determine if Custom Files should be loaded
// Check for explicit custom flags
isCustomConfigured := cmd.tools_file != "" || len(cmd.tools_files) > 0 || cmd.tools_folder != ""
authSourceConfigs := toolsFile.AuthSources
// Determine if default 'tools.yaml' should be used (No prebuilt AND No custom flags)
useDefaultToolsFile := cmd.prebuiltConfig == "" && !isCustomConfigured
if useDefaultToolsFile {
cmd.tools_file = "tools.yaml"
isCustomConfigured = true
}
// Load Custom Configurations
if isCustomConfigured {
// Enforce exclusivity among custom flags (tools-file vs tools-files vs tools-folder)
if (cmd.tools_file != "" && len(cmd.tools_files) > 0) ||
(cmd.tools_file != "" && cmd.tools_folder != "") ||
(len(cmd.tools_files) > 0 && cmd.tools_folder != "") {
errMsg := fmt.Errorf("--tools-file, --tools-files, and --tools-folder flags cannot be used simultaneously")
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
var customTools ToolsFile
var err error
if len(cmd.tools_files) > 0 {
// Use tools-files
cmd.logger.InfoContext(ctx, fmt.Sprintf("Loading and merging %d tool configuration files", len(cmd.tools_files)))
customTools, err = loadAndMergeToolsFiles(ctx, cmd.tools_files)
} else if cmd.tools_folder != "" {
// Use tools-folder
cmd.logger.InfoContext(ctx, fmt.Sprintf("Loading and merging all YAML files from directory: %s", cmd.tools_folder))
customTools, err = loadAndMergeToolsFolder(ctx, cmd.tools_folder)
} else {
// Use single file (tools-file or default `tools.yaml`)
buf, readFileErr := os.ReadFile(cmd.tools_file)
if readFileErr != nil {
errMsg := fmt.Errorf("unable to read tool file at %q: %w", cmd.tools_file, readFileErr)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
customTools, err = parseToolsFile(ctx, buf)
if err != nil {
err = fmt.Errorf("unable to parse tool file at %q: %w", cmd.tools_file, err)
}
}
if err != nil {
cmd.logger.ErrorContext(ctx, err.Error())
return err
}
allToolsFiles = append(allToolsFiles, customTools)
}
// Merge Everything
// This will error if custom tools collide with prebuilt tools
finalToolsFile, err := mergeToolsFiles(allToolsFiles...)
if err != nil {
cmd.logger.ErrorContext(ctx, err.Error())
return err
}
cmd.cfg.SourceConfigs = finalToolsFile.Sources
cmd.cfg.AuthServiceConfigs = finalToolsFile.AuthServices
cmd.cfg.ToolConfigs = finalToolsFile.Tools
cmd.cfg.ToolsetConfigs = finalToolsFile.Toolsets
cmd.cfg.PromptConfigs = finalToolsFile.Prompts
authSourceConfigs := finalToolsFile.AuthSources
if authSourceConfigs != nil {
cmd.logger.WarnContext(ctx, "`authSources` is deprecated, use `authServices` instead")
cmd.cfg.AuthServiceConfigs = authSourceConfigs
for k, v := range authSourceConfigs {
if _, exists := cmd.cfg.AuthServiceConfigs[k]; exists {
errMsg := fmt.Errorf("resource conflict detected: authSource '%s' has the same name as an existing authService. Please rename your authSource", k)
cmd.logger.ErrorContext(ctx, errMsg.Error())
return errMsg
}
cmd.cfg.AuthServiceConfigs[k] = v
}
}
instrumentation, err := telemetry.CreateTelemetryInstrumentation(versionString)
@@ -963,9 +999,8 @@ func run(cmd *Command) error {
}()
}
watchDirs, watchedFiles := resolveWatcherInputs(cmd.tools_file, cmd.tools_files, cmd.tools_folder)
if !cmd.cfg.DisableReload {
if isCustomConfigured && !cmd.cfg.DisableReload {
watchDirs, watchedFiles := resolveWatcherInputs(cmd.tools_file, cmd.tools_files, cmd.tools_folder)
// start watching the file(s) or folder for changes to trigger dynamic reloading
go watchChanges(ctx, watchDirs, watchedFiles, s)
}
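Net effect of the changes in this file: --prebuilt is no longer mutually exclusive with --tools-file/--tools-files/--tools-folder, the prebuilt configuration and custom files are merged with duplicate names surfacing as resource conflicts, and the tools.yaml default only applies when no other configuration flag is given. A minimal sketch of the newly allowed combination (file name illustrative):
```bash
# Prebuilt SQLite tools merged with a custom tools file; duplicate
# tool/source/toolset names would fail with "resource conflicts detected".
export SQLITE_DATABASE=test.db
toolbox --prebuilt sqlite --tools-file custom.yaml
```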

View File

@@ -92,6 +92,21 @@ func invokeCommand(args []string) (*Command, string, error) {
return c, buf.String(), err
}
// invokeCommandWithContext executes the command with a context and returns the captured output.
func invokeCommandWithContext(ctx context.Context, args []string) (*Command, string, error) {
// Capture output using a buffer
buf := new(bytes.Buffer)
c := NewCommand(WithStreams(buf, buf))
c.SetArgs(args)
c.SilenceUsage = true
c.SilenceErrors = true
c.SetContext(ctx)
err := c.Execute()
return c, buf.String(), err
}
func TestVersion(t *testing.T) {
data, err := os.ReadFile("version.txt")
if err != nil {
@@ -1488,7 +1503,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"alloydb_postgres_database_tools": tools.ToolsetConfig{
Name: "alloydb_postgres_database_tools",
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality"},
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality", "list_publication_tables", "list_tablespaces", "list_pg_settings", "list_database_stats", "list_roles", "list_table_stats"},
},
},
},
@@ -1518,7 +1533,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"cloud_sql_postgres_database_tools": tools.ToolsetConfig{
Name: "cloud_sql_postgres_database_tools",
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality"},
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality", "list_publication_tables", "list_tablespaces", "list_pg_settings", "list_database_stats", "list_roles", "list_table_stats"},
},
},
},
@@ -1558,7 +1573,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"serverless_spark_tools": tools.ToolsetConfig{
Name: "serverless_spark_tools",
ToolNames: []string{"list_batches", "get_batch", "cancel_batch"},
ToolNames: []string{"list_batches", "get_batch", "cancel_batch", "create_pyspark_batch", "create_spark_batch"},
},
},
},
@@ -1598,7 +1613,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"looker_tools": tools.ToolsetConfig{
Name: "looker_tools",
ToolNames: []string{"get_models", "get_explores", "get_dimensions", "get_measures", "get_filters", "get_parameters", "query", "query_sql", "query_url", "get_looks", "run_look", "make_look", "get_dashboards", "run_dashboard", "make_dashboard", "add_dashboard_element", "health_pulse", "health_analyze", "health_vacuum", "dev_mode", "get_projects", "get_project_files", "get_project_file", "create_project_file", "update_project_file", "delete_project_file", "get_connections", "get_connection_schemas", "get_connection_databases", "get_connection_tables", "get_connection_table_columns"},
ToolNames: []string{"get_models", "get_explores", "get_dimensions", "get_measures", "get_filters", "get_parameters", "query", "query_sql", "query_url", "get_looks", "run_look", "make_look", "get_dashboards", "run_dashboard", "render_query", "render_look", "render_dashboard", "make_dashboard", "add_dashboard_element", "add_dashboard_filter", "generate_embed_url", "health_pulse", "health_analyze", "health_vacuum", "dev_mode", "get_projects", "get_project_files", "get_project_file", "create_project_file", "update_project_file", "delete_project_file", "get_connections", "get_connection_schemas", "get_connection_databases", "get_connection_tables", "get_connection_table_columns"},
},
},
},
@@ -1618,7 +1633,7 @@ func TestPrebuiltTools(t *testing.T) {
wantToolset: server.ToolsetConfigs{
"postgres_database_tools": tools.ToolsetConfig{
Name: "postgres_database_tools",
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality"},
ToolNames: []string{"execute_sql", "list_tables", "list_active_queries", "list_available_extensions", "list_installed_extensions", "list_autovacuum_configurations", "list_memory_configurations", "list_top_bloated_tables", "list_replication_slots", "list_invalid_indexes", "get_query_plan", "list_views", "list_schemas", "database_overview", "list_triggers", "list_indexes", "list_sequences", "long_running_transactions", "list_locks", "replication_stats", "list_query_stats", "get_column_cardinality", "list_publication_tables", "list_tablespaces", "list_pg_settings", "list_database_stats", "list_roles", "list_table_stats"},
},
},
},
@@ -1755,11 +1770,6 @@ func TestMutuallyExclusiveFlags(t *testing.T) {
args []string
errString string
}{
{
desc: "--prebuilt and --tools-file",
args: []string{"--prebuilt", "alloydb", "--tools-file", "my.yaml"},
errString: "--prebuilt and --tools-file/--tools-files/--tools-folder flags cannot be used simultaneously",
},
{
desc: "--tools-file and --tools-files",
args: []string{"--tools-file", "my.yaml", "--tools-files", "a.yaml,b.yaml"},
@@ -1902,3 +1912,228 @@ func TestMergeToolsFiles(t *testing.T) {
})
}
}
func TestPrebuiltAndCustomTools(t *testing.T) {
t.Setenv("SQLITE_DATABASE", "test.db")
// Setup custom tools file
customContent := `
tools:
custom_tool:
kind: http
source: my-http
method: GET
path: /
description: "A custom tool for testing"
sources:
my-http:
kind: http
baseUrl: http://example.com
`
customFile := filepath.Join(t.TempDir(), "custom.yaml")
if err := os.WriteFile(customFile, []byte(customContent), 0644); err != nil {
t.Fatal(err)
}
// Tool Conflict File
// SQLite prebuilt has a tool named 'list_tables'
toolConflictContent := `
tools:
list_tables:
kind: http
source: my-http
method: GET
path: /
description: "Conflicting tool"
sources:
my-http:
kind: http
baseUrl: http://example.com
`
toolConflictFile := filepath.Join(t.TempDir(), "tool_conflict.yaml")
if err := os.WriteFile(toolConflictFile, []byte(toolConflictContent), 0644); err != nil {
t.Fatal(err)
}
// Source Conflict File
// SQLite prebuilt has a source named 'sqlite-source'
sourceConflictContent := `
sources:
sqlite-source:
kind: http
baseUrl: http://example.com
tools:
dummy_tool:
kind: http
source: sqlite-source
method: GET
path: /
description: "Dummy"
`
sourceConflictFile := filepath.Join(t.TempDir(), "source_conflict.yaml")
if err := os.WriteFile(sourceConflictFile, []byte(sourceConflictContent), 0644); err != nil {
t.Fatal(err)
}
// Toolset Conflict File
// SQLite prebuilt has a toolset named 'sqlite_database_tools'
toolsetConflictContent := `
sources:
dummy-src:
kind: http
baseUrl: http://example.com
tools:
dummy_tool:
kind: http
source: dummy-src
method: GET
path: /
description: "Dummy"
toolsets:
sqlite_database_tools:
- dummy_tool
`
toolsetConflictFile := filepath.Join(t.TempDir(), "toolset_conflict.yaml")
if err := os.WriteFile(toolsetConflictFile, []byte(toolsetConflictContent), 0644); err != nil {
t.Fatal(err)
}
// Legacy Auth File
authContent := `
authSources:
legacy-auth:
kind: google
clientId: "test-client-id"
`
authFile := filepath.Join(t.TempDir(), "auth.yaml")
if err := os.WriteFile(authFile, []byte(authContent), 0644); err != nil {
t.Fatal(err)
}
testCases := []struct {
desc string
args []string
wantErr bool
errString string
cfgCheck func(server.ServerConfig) error
}{
{
desc: "success mixed",
args: []string{"--prebuilt", "sqlite", "--tools-file", customFile},
wantErr: false,
cfgCheck: func(cfg server.ServerConfig) error {
if _, ok := cfg.ToolConfigs["custom_tool"]; !ok {
return fmt.Errorf("custom tool not found")
}
if _, ok := cfg.ToolConfigs["list_tables"]; !ok {
return fmt.Errorf("prebuilt tool 'list_tables' not found")
}
return nil
},
},
{
desc: "tool conflict error",
args: []string{"--prebuilt", "sqlite", "--tools-file", toolConflictFile},
wantErr: true,
errString: "resource conflicts detected",
},
{
desc: "source conflict error",
args: []string{"--prebuilt", "sqlite", "--tools-file", sourceConflictFile},
wantErr: true,
errString: "resource conflicts detected",
},
{
desc: "toolset conflict error",
args: []string{"--prebuilt", "sqlite", "--tools-file", toolsetConflictFile},
wantErr: true,
errString: "resource conflicts detected",
},
{
desc: "legacy auth additive",
args: []string{"--prebuilt", "sqlite", "--tools-file", authFile},
wantErr: false,
cfgCheck: func(cfg server.ServerConfig) error {
if _, ok := cfg.AuthServiceConfigs["legacy-auth"]; !ok {
return fmt.Errorf("legacy auth source not merged into auth services")
}
return nil
},
},
}
for _, tc := range testCases {
t.Run(tc.desc, func(t *testing.T) {
ctx, cancel := context.WithTimeout(context.Background(), 500*time.Millisecond)
defer cancel()
cmd, output, err := invokeCommandWithContext(ctx, tc.args)
if tc.wantErr {
if err == nil {
t.Fatalf("expected an error but got none")
}
if !strings.Contains(err.Error(), tc.errString) {
t.Errorf("expected error message to contain %q, but got %q", tc.errString, err.Error())
}
} else {
if err != nil && err != context.DeadlineExceeded && err != context.Canceled {
t.Fatalf("unexpected error: %v", err)
}
if !strings.Contains(output, "Server ready to serve!") {
t.Errorf("server did not start successfully (no ready message found). Output:\n%s", output)
}
if tc.cfgCheck != nil {
if err := tc.cfgCheck(cmd.cfg); err != nil {
t.Errorf("config check failed: %v", err)
}
}
}
})
}
}
func TestDefaultToolsFileBehavior(t *testing.T) {
t.Setenv("SQLITE_DATABASE", "test.db")
testCases := []struct {
desc string
args []string
expectRun bool
errString string
}{
{
desc: "no flags (defaults to tools.yaml)",
args: []string{},
expectRun: false,
errString: "tools.yaml", // Expect error because tools.yaml doesn't exist in test env
},
{
desc: "prebuilt only (skips tools.yaml)",
args: []string{"--prebuilt", "sqlite"},
expectRun: true,
},
}
for _, tc := range testCases {
t.Run(tc.desc, func(t *testing.T) {
ctx, cancel := context.WithTimeout(context.Background(), 500*time.Millisecond)
defer cancel()
_, output, err := invokeCommandWithContext(ctx, tc.args)
if tc.expectRun {
if err != nil && err != context.DeadlineExceeded && err != context.Canceled {
t.Fatalf("expected server start, got error: %v", err)
}
// Verify it actually started
if !strings.Contains(output, "Server ready to serve!") {
t.Errorf("server did not start successfully (no ready message found). Output:\n%s", output)
}
} else {
if err == nil {
t.Fatalf("expected error reading default file, got nil")
}
if !strings.Contains(err.Error(), tc.errString) {
t.Errorf("expected error message to contain %q, but got %q", tc.errString, err.Error())
}
}
})
}
}

View File

@@ -1 +1 @@
0.21.0
0.24.0

View File

@@ -12,53 +12,7 @@ To connect to the database to explore and query data, search the MCP store for t
## Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: You may need to restart Antigravity for changes to take effect.):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* [Node.js](https://nodejs.org/) installed.
* A Google Cloud project with the **AlloyDB API** enabled.
* Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
* IAM Permissions:
@@ -67,10 +21,13 @@ To connect to the database to explore and query data, search the MCP store for t
## Install & Configuration
1. In the Antigravity MCP Store, click the "Install" button.
In the Antigravity MCP Store, click the "Install" button.
You'll now be able to see all enabled tools in the "Tools" tab.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
## Usage
Once configured, the MCP server will automatically provide AlloyDB capabilities to your AI assistant. You can:
@@ -104,8 +61,8 @@ Add the following configuration to your MCP client (e.g., `settings.json` for Ge
{
"mcpServers": {
"alloydb-admin": {
"command": "toolbox",
"args": ["--prebuilt", "alloydb-postgres-admin", "--stdio"]
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "alloydb-postgres-admin", "--stdio"]
}
}
}
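The same launch command the MCP client will now run can be tried directly in a shell to confirm the package resolves and the server starts (a sketch; it assumes Node.js is installed and simply reuses the command and args from the config above):
```bash
npx -y @toolbox-sdk/server --prebuilt alloydb-postgres-admin --stdio
```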

View File

@@ -15,53 +15,7 @@ For AlloyDB infrastructure management, search the MCP store for the AlloyDB for
## Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: You may need to restart Antigravity for changes to take effect.):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* [Node.js](https://nodejs.org/) installed.
* A Google Cloud project with the **AlloyDB API** enabled.
* Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
* IAM Permissions:
@@ -76,6 +30,9 @@ For AlloyDB infrastructure management, search the MCP store for the AlloyDB for
2. Add the required inputs for your [cluster](https://docs.cloud.google.com/alloydb/docs/cluster-list) in the configuration pop-up, then click "Save". You can update this configuration at any time in the "Configure" tab.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
You'll now be able to see all enabled tools in the "Tools" tab.
## Usage
@@ -125,8 +82,8 @@ Add the following configuration to your MCP client (e.g., `settings.json` for Ge
{
"mcpServers": {
"alloydb-postgres": {
"command": "toolbox",
"args": ["--prebuilt", "alloydb-postgres", "--stdio"]
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "alloydb-postgres", "--stdio"]
}
}
}

View File

@@ -12,53 +12,7 @@ An editor configured to use the BigQuery MCP server can use its AI capabilities
## Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: You may need to restart Antigravity for changes to take effect.):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* [Node.js](https://nodejs.org/) installed.
* A Google Cloud project with the **BigQuery API** enabled.
* Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
* IAM Permissions:
@@ -70,6 +24,9 @@ An editor configured to use the BigQuery MCP server can use its AI capabilities
2. Add the required inputs in the configuration pop-up, then click "Save". You can update this configuration at any time in the "Configure" tab.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
You'll now be able to see all enabled tools in the "Tools" tab.
@@ -119,8 +76,8 @@ Add the following configuration to your MCP client (e.g., `settings.json` for Ge
{
"mcpServers": {
"bigquery": {
"command": "toolbox",
"args": ["--prebuilt", "bigquery", "--stdio"]
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "bigquery", "--stdio"]
}
}
}

View File

@@ -12,53 +12,7 @@ To connect to the database to explore and query data, search the MCP store for t
## Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: You may need to restart Antigravity for changes to take effect.):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* [Node.js](https://nodejs.org/) installed.
* A Google Cloud project with the **Cloud SQL Admin API** enabled.
* Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
* IAM Permissions:
@@ -67,10 +21,13 @@ To connect to the database to explore and query data, search the MCP store for t
## Install & Configuration
1. In the Antigravity MCP Store, click the "Install" button.
In the Antigravity MCP Store, click the "Install" button.
You'll now be able to see all enabled tools in the "Tools" tab.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
## Usage
Once configured, the MCP server will automatically provide Cloud SQL for SQL Server capabilities to your AI assistant. You can:
@@ -101,8 +58,8 @@ Add the following configuration to your MCP client (e.g., `settings.json` for Ge
{
"mcpServers": {
"cloud-sql-sqlserver-admin": {
"command": "toolbox",
"args": ["--prebuilt", "cloud-sql-mssql-admin", "--stdio"]
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "cloud-sql-mssql-admin", "--stdio"]
}
}
}

View File

@@ -13,53 +13,7 @@ For Cloud SQL infrastructure management, search the MCP store for the Cloud SQL
## Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: You may need to restart Antigravity for changes to take effect.):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* [Node.js](https://nodejs.org/) installed.
* A Google Cloud project with the **Cloud SQL Admin API** enabled.
* Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
* IAM Permissions:
@@ -75,6 +29,9 @@ For Cloud SQL infrastructure management, search the MCP store for the Cloud SQL
You'll now be able to see all enabled tools in the "Tools" tab.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
## Usage
Once configured, the MCP server will automatically provide Cloud SQL for SQL Server capabilities to your AI assistant. You can:
@@ -112,8 +69,8 @@ Add the following configuration to your MCP client (e.g., `settings.json` for Ge
{
"mcpServers": {
"cloud-sql-mssql": {
"command": "toolbox",
"args": ["--prebuilt", "cloud-sql-mssql", "--stdio"],
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "cloud-sql-mssql", "--stdio"],
"env": {
"CLOUD_SQL_MSSQL_PROJECT": "your-project-id",
"CLOUD_SQL_MSSQL_REGION": "your-region",

View File

@@ -12,53 +12,7 @@ To connect to the database to explore and query data, search the MCP store for t
## Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: You may need to restart Antigravity for changes to take effect.):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* [Node.js](https://nodejs.org/) installed.
* A Google Cloud project with the **Cloud SQL Admin API** enabled.
* Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
* IAM Permissions:
@@ -67,10 +21,13 @@ To connect to the database to explore and query data, search the MCP store for t
## Install & Configuration
1. In the Antigravity MCP Store, click the "Install" button.
In the Antigravity MCP Store, click the "Install" button.
You'll now be able to see all enabled tools in the "Tools" tab.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
## Usage
Once configured, the MCP server will automatically provide Cloud SQL for MySQL capabilities to your AI assistant. You can:
@@ -100,8 +57,8 @@ Add the following configuration to your MCP client (e.g., `settings.json` for Ge
{
"mcpServers": {
"cloud-sql-mysql-admin": {
"command": "toolbox",
"args": ["--prebuilt", "cloud-sql-mysql-admin", "--stdio"]
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "cloud-sql-mysql-admin", "--stdio"]
}
}
}

View File

@@ -15,53 +15,7 @@ For Cloud SQL infrastructure management, search the MCP store for the Cloud SQL
## Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: You may need to restart Antigravity for changes to take effect.):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* [Node.js](https://nodejs.org/) installed.
* A Google Cloud project with the **Cloud SQL Admin API** enabled.
* Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
* IAM Permissions:
@@ -77,6 +31,9 @@ For Cloud SQL infrastructure management, search the MCP store for the Cloud SQL
You'll now be able to see all enabled tools in the "Tools" tab.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
## Usage
Once configured, the MCP server will automatically provide Cloud SQL for MySQL capabilities to your AI assistant. You can:
@@ -118,8 +75,8 @@ Add the following configuration to your MCP client (e.g., `settings.json` for Ge
{
"mcpServers": {
"cloud-sql-mysql": {
"command": "toolbox",
"args": ["--prebuilt", "cloud-sql-mysql", "--stdio"],
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "cloud-sql-mysql", "--stdio"],
"env": {
"CLOUD_SQL_MYSQL_PROJECT": "your-project-id",
"CLOUD_SQL_MYSQL_REGION": "your-region",

View File

@@ -12,53 +12,7 @@ To connect to the database to explore and query data, search the MCP store for t
## Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: You may need to restart Antigravity for changes to take effect.):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* [Node.js](https://nodejs.org/) installed.
* A Google Cloud project with the **Cloud SQL Admin API** enabled.
* Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
* IAM Permissions:
@@ -67,10 +21,13 @@ To connect to the database to explore and query data, search the MCP store for t
## Install & Configuration
1. In the Antigravity MCP Store, click the "Install" button.
In the Antigravity MCP Store, click the "Install" button.
You'll now be able to see all enabled tools in the "Tools" tab.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
## Usage
Once configured, the MCP server will automatically provide Cloud SQL for PostgreSQL capabilities to your AI assistant. You can:
@@ -100,8 +57,8 @@ Add the following configuration to your MCP client (e.g., `settings.json` for Ge
{
"mcpServers": {
"cloud-sql-postgres-admin": {
"command": "toolbox",
"args": ["--prebuilt", "cloud-sql-postgres-admin", "--stdio"]
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "cloud-sql-postgres-admin", "--stdio"]
}
}
}

View File

@@ -15,53 +15,7 @@ For Cloud SQL infrastructure management, search the MCP store for the Cloud SQL
## Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: You may need to restart Antigravity for changes to take effect.):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* [Node.js](https://nodejs.org/) installed.
* A Google Cloud project with the **Cloud SQL Admin API** enabled.
* Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
* IAM Permissions:
@@ -77,6 +31,9 @@ For Cloud SQL infrastructure management, search the MCP store for the Cloud SQL
You'll now be able to see all enabled tools in the "Tools" tab.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
## Usage
Once configured, the MCP server will automatically provide Cloud SQL for PostgreSQL capabilities to your AI assistant. You can:
@@ -130,8 +87,8 @@ Add the following configuration to your MCP client (e.g., `settings.json` for Ge
{
"mcpServers": {
"cloud-sql-postgres": {
"command": "toolbox",
"args": ["--prebuilt", "cloud-sql-postgres", "--stdio"],
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "cloud-sql-postgres", "--stdio"],
"env": {
"CLOUD_SQL_POSTGRES_PROJECT": "your-project-id",
"CLOUD_SQL_POSTGRES_REGION": "your-region",

View File

@@ -11,53 +11,7 @@ An editor configured to use the Dataplex MCP server can use its AI capabilities
## Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: You may need to restart Antigravity for changes to take effect.):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* [Node.js](https://nodejs.org/) installed.
* A Google Cloud project with the **Dataplex API** enabled.
* Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
* IAM Permissions:
@@ -71,6 +25,9 @@ An editor configured to use the Dataplex MCP server can use its AI capabilities
You'll now be able to see all enabled tools in the "Tools" tab.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
## Usage
Once configured, the MCP server will automatically provide Dataplex capabilities to your AI assistant. You can:
@@ -102,8 +59,8 @@ Add the following configuration to your MCP client (e.g., `settings.json` for Ge
{
"mcpServers": {
"dataplex": {
"command": "toolbox",
"args": ["--prebuilt", "dataplex", "--stdio"],
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "dataplex", "--stdio"],
"env": {
"DATAPLEX_PROJECT": "your-project-id"
}

View File

@@ -14,53 +14,7 @@ An editor configured to use the Looker MCP server can use its AI capabilities to
## Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: you may need to restart Antigravity for changes to take effect):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* [Node.js](https://nodejs.org/) installed.
* Access to a Looker instance.
* API Credentials (`Client ID` and `Client Secret`) or OAuth configuration.
@@ -72,6 +26,9 @@ An editor configured to use the Looker MCP server can use its AI capabilities to
You'll now be able to see all enabled tools in the "Tools" tab.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
## Usage
Once configured, the MCP server will automatically provide Looker capabilities to your AI assistant. You can:
@@ -118,8 +75,8 @@ Add the following configuration to your MCP client (e.g., `settings.json` for Ge
{
"mcpServers": {
"looker": {
"command": "toolbox",
"args": ["--prebuilt", "looker", "--stdio"],
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "looker", "--stdio"],
"env": {
"LOOKER_BASE_URL": "https://your.looker.instance.com",
"LOOKER_CLIENT_ID": "your-client-id",

View File

@@ -11,53 +11,7 @@ An editor configured to use the Cloud Spanner MCP server can use its AI capabili
## Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: you may need to restart Antigravity for changes to take effect):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* [Node.js](https://nodejs.org/) installed.
* A Google Cloud project with the **Cloud Spanner API** enabled.
* Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
* IAM Permissions:
@@ -72,6 +26,9 @@ An editor configured to use the Cloud Spanner MCP server can use its AI capabili
You'll now be able to see all enabled tools in the "Tools" tab.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
## Usage
Once configured, the MCP server will automatically provide Cloud Spanner capabilities to your AI assistant. You can:
@@ -108,8 +65,8 @@ Add the following configuration to your MCP client (e.g., `settings.json` for Ge
{
"mcpServers": {
"spanner": {
"command": "toolbox",
"args": ["--prebuilt", "spanner", "--stdio"],
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--prebuilt", "spanner", "--stdio"],
"env": {
"SPANNER_PROJECT": "your-project-id",
"SPANNER_INSTANCE": "your-instance-id",

View File

@@ -2,66 +2,23 @@
The MCP Toolbox for Databases Server gives AI-powered development tools the ability to work with your custom tools. It is designed to simplify and secure the development of tools for interacting with databases.
# Prerequisites
* Download and install [MCP Toolbox](https://github.com/googleapis/genai-toolbox):
1. **Download the Toolbox binary**:
Download the latest binary for your operating system and architecture from the storage bucket. Check the [releases page](https://github.com/googleapis/genai-toolbox/releases) for additional versions:
<!-- {x-release-please-start-version} -->
* To install Toolbox as a binary on Linux (AMD64):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
```
## Prerequisites
* To install Toolbox as a binary on macOS (Apple Silicon):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
```
* To install Toolbox as a binary on macOS (Intel):
```bash
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
```
* To install Toolbox as a binary on Windows (AMD64):
```powershell
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe"
```
<!-- {x-release-please-end} -->
2. **Make it executable**:
```bash
chmod +x toolbox
```
3. **Add the binary to $PATH in `~/.bash_profile`** (Note: you may need to restart Antigravity for changes to take effect):
```bash
export PATH=$PATH:path/to/folder
```
**On Windows, move binary to the `WindowsApps\` folder**:
```
move "C:\Users\<path-to-binary>\toolbox.exe" "C:\Users\<username>\AppData\Local\Microsoft\WindowsApps\"
```
**Tip:** Ensure the destination folder for your binary is included in
your system's PATH environment variable. To check `PATH`, use `echo
$PATH` (or `echo %PATH%` on Windows).
* Any required APIs and permissions for connecting to your database.
> **Note:** If your database instance uses private IPs, you must run the MCP server in the same Virtual Private Cloud (VPC) network.
* [Node.js](https://nodejs.org/) installed.
* A Google Cloud project with relevant APIs enabled.
* Ensure [Application Default Credentials](https://cloud.google.com/docs/authentication/gcloud) are available in your environment.
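To sanity-check the setup before continuing, an optional quick check might look like this (assuming the binary's folder was added to `PATH` above and the `gcloud` CLI is installed):
```bash
# Confirm the Toolbox binary is discoverable on PATH
toolbox --version

# One common way to provide Application Default Credentials locally
gcloud auth application-default login
```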
## Install & Configuration
1. In the Antigravity MCP Store, click the "Install" button.
1. In the Antigravity MCP Store, click the **Install** button. A configuration window will appear.
2. Add your [`tools.yaml` configuration
file](https://googleapis.github.io/genai-toolbox/getting-started/configure/) to
the directory you are running Antigravity
2. Create your [`tools.yaml` configuration file](https://googleapis.github.io/genai-toolbox/getting-started/configure/).
3. In the configuration window, enter the full absolute path to your `tools.yaml` file and click **Save**.
> [!NOTE]
> If you encounter issues with Windows Defender blocking the execution, you may need to configure an allowlist. See [Configure exclusions for Microsoft Defender Antivirus](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/configure-exclusions-microsoft-defender-antivirus?view=o365-worldwide) for more details.
## Usage
@@ -73,8 +30,8 @@ Interact with your custom tools using natural language.
{
"mcpServers": {
"mcp-toolbox": {
"command": "toolbox",
"args": ["--tools-file", "you-tool-file.yaml"],
"command": "npx",
"args": ["-y", "@toolbox-sdk/server", "--tools-file", "your-tool-file.yaml"],
"env": {
"ENV_VAR_NAME": "ENV_VAR_VALUE",
}

View File

@@ -183,11 +183,11 @@ Protocol (OTLP). If you would like to use a collector, please refer to this
The following flags are used to determine Toolbox's telemetry configuration:
| **flag** | **type** | **description** |
|----------------------------|----------|------------------------------------------------------------------------------------------------------------------|
| `--telemetry-gcp` | bool | Enable exporting directly to Google Cloud Monitoring. Default is `false`. |
| `--telemetry-otlp` | string | Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. "<http://127.0.0.1:4318>"). |
| `--telemetry-service-name` | string | Sets the value of the `service.name` resource attribute. Default is `toolbox`. |
| **flag** | **type** | **description** |
|----------------------------|----------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `--telemetry-gcp` | bool | Enable exporting directly to Google Cloud Monitoring. Default is `false`. |
| `--telemetry-otlp` | string | Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. "127.0.0.1:4318"). To pass an insecure endpoint here, set environment variable `OTEL_EXPORTER_OTLP_INSECURE=true`. |
| `--telemetry-service-name` | string | Sets the value of the `service.name` resource attribute. Default is `toolbox`. |
In addition to the flags noted above, you can also make additional configuration
for OpenTelemetry via the [General SDK Configuration][sdk-configuration] through
@@ -207,5 +207,5 @@ To enable Google Cloud Exporter:
To enable OTLP Exporter, provide Collector endpoint:
```bash
./toolbox --telemetry-otlp="http://127.0.0.1:4553"
./toolbox --telemetry-otlp="127.0.0.1:4553"
```
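If the collector endpoint is insecure (plain HTTP), the flag description above mentions the `OTEL_EXPORTER_OTLP_INSECURE` environment variable; a minimal sketch combining it with the flag:
```bash
# Allow exporting to an insecure (non-TLS) OTLP endpoint
export OTEL_EXPORTER_OTLP_INSECURE=true
./toolbox --telemetry-otlp="127.0.0.1:4553"
```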

View File

@@ -234,7 +234,7 @@
},
"outputs": [],
"source": [
"version = \"0.21.0\" # x-release-please-version\n",
"version = \"0.24.0\" # x-release-please-version\n",
"! curl -O https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n",
"\n",
"# Make the binary executable\n",

View File

@@ -71,6 +71,22 @@ redeploying your application.
## Getting Started
### (Non-production) Running Toolbox
You can run Toolbox directly with a [configuration file](../configure.md):
```sh
npx @toolbox-sdk/server --tools-file tools.yaml
```
This runs the latest version of the Toolbox server with your configuration file.
{{< notice note >}}
This method should only be used for non-production use cases such as
experimentation. For any production use-cases, please consider [Installing the
server](#installing-the-server) and then [running it](#running-the-server).
{{< /notice >}}
### Installing the server
For the latest version, check the [releases page][releases] and use the
@@ -87,7 +103,7 @@ To install Toolbox as a binary on Linux (AMD64):
```sh
# see releases page for other versions
export VERSION=0.21.0
export VERSION=0.24.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
```
@@ -98,7 +114,7 @@ To install Toolbox as a binary on macOS (Apple Silicon):
```sh
# see releases page for other versions
export VERSION=0.21.0
export VERSION=0.24.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox
chmod +x toolbox
```
@@ -109,19 +125,29 @@ To install Toolbox as a binary on macOS (Intel):
```sh
# see releases page for other versions
export VERSION=0.21.0
export VERSION=0.24.0
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox
chmod +x toolbox
```
{{% /tab %}}
{{% tab header="Windows (AMD64)" lang="en" %}}
To install Toolbox as a binary on Windows (AMD64):
{{% tab header="Windows (Command Prompt)" lang="en" %}}
To install Toolbox as a binary on Windows (Command Prompt):
```cmd
:: see releases page for other versions
set VERSION=0.24.0
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v%VERSION%/windows/amd64/toolbox.exe"
```
{{% /tab %}}
{{% tab header="Windows (PowerShell)" lang="en" %}}
To install Toolbox as a binary on Windows (PowerShell):
```powershell
:: see releases page for other versions
set VERSION=0.21.0
curl -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v%VERSION%/windows/amd64/toolbox.exe"
# see releases page for other versions
$VERSION = "0.24.0"
curl.exe -o toolbox.exe "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe"
```
{{% /tab %}}
@@ -132,7 +158,7 @@ You can also install Toolbox as a container:
```sh
# see releases page for other versions
export VERSION=0.21.0
export VERSION=0.24.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
```
@@ -151,7 +177,7 @@ To install from source, ensure you have the latest version of
[Go installed](https://go.dev/doc/install), and then run the following command:
```sh
go install github.com/googleapis/genai-toolbox@v0.21.0
go install github.com/googleapis/genai-toolbox@v0.24.0
```
{{% /tab %}}
@@ -294,6 +320,10 @@ let client = new ToolboxClient(URL);
const toolboxTools = await client.loadToolset('toolsetName');
{{< /highlight >}}
For more detailed instructions on using the Toolbox Core SDK, see the
[project's
README](https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md).
{{% /tab %}}
{{% tab header="LangChain/Langraph" lang="en" %}}
@@ -318,6 +348,10 @@ const getTool = (toolboxTool) => tool(currTool, {
const tools = toolboxTools.map(getTool);
{{< /highlight >}}
For more detailed instructions on using the Toolbox Core SDK, see the
[project's
README](https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md).
{{% /tab %}}
{{% tab header="Genkit" lang="en" %}}
@@ -353,6 +387,10 @@ const getTool = (toolboxTool) => ai.defineTool({
const tools = toolboxTools.map(getTool);
{{< /highlight >}}
For more detailed instructions on using the Toolbox Core SDK, see the
[project's
README](https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md).
{{% /tab %}}
{{% tab header="LlamaIndex" lang="en" %}}
@@ -380,13 +418,33 @@ const tools = toolboxTools.map(getTool);
{{< /highlight >}}
{{% /tab %}}
{{< /tabpane >}}
For more detailed instructions on using the Toolbox Core SDK, see the
[project's
README](https://github.com/googleapis/mcp-toolbox-sdk-js/blob/main/packages/toolbox-core/README.md).
{{% /tab %}}
{{% tab header="ADK TS" lang="en" %}}
{{< highlight javascript >}}
import { ToolboxClient } from '@toolbox-sdk/adk';
// Replace with the actual URL where your Toolbox service is running
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
const tools = await client.loadToolset();
// Use the client and tools as per requirement
{{< /highlight >}}
For detailed samples on using the Toolbox JS SDK with ADK JS, see the [project's
README](https://github.com/googleapis/mcp-toolbox-sdk-js/tree/main/packages/toolbox-adk/README.md).
{{% /tab %}}
{{< /tabpane >}}
#### Go
Once you've installed the [Toolbox Go

View File

@@ -40,11 +40,24 @@ from Toolbox.
```
1. In a new terminal, install the
[SDK](https://www.npmjs.com/package/@toolbox-sdk/core).
```bash
npm install @toolbox-sdk/core
```
SDK package.
{{< tabpane persist=header >}}
{{< tab header="LangChain" lang="bash" >}}
npm install @toolbox-sdk/core
{{< /tab >}}
{{< tab header="GenkitJS" lang="bash" >}}
npm install @toolbox-sdk/core
{{< /tab >}}
{{< tab header="LlamaIndex" lang="bash" >}}
npm install @toolbox-sdk/core
{{< /tab >}}
{{< tab header="GoogleGenAI" lang="bash" >}}
npm install @toolbox-sdk/core
{{< /tab >}}
{{< tab header="ADK" lang="bash" >}}
npm install @toolbox-sdk/adk
{{< /tab >}}
{{< /tabpane >}}
1. Install other required dependencies
@@ -61,6 +74,9 @@ npm install llamaindex @llamaindex/google @llamaindex/workflow
{{< tab header="GoogleGenAI" lang="bash" >}}
npm install @google/genai
{{< /tab >}}
{{< tab header="ADK" lang="bash" >}}
npm install @google/adk
{{< /tab >}}
{{< /tabpane >}}
1. Create a new file named `hotelAgent.js` and copy the following code to create
@@ -91,6 +107,12 @@ npm install @google/genai
{{< /tab >}}
{{< tab header="ADK" lang="js" >}}
{{< include "quickstart/js/adk/quickstart.js" >}}
{{< /tab >}}
{{< /tabpane >}}
1. Run your agent, and observe the results:

View File

@@ -105,7 +105,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -0,0 +1,245 @@
---
title: "Prompts using Gemini CLI"
type: docs
weight: 5
description: >
How to get started using Toolbox prompts locally with PostgreSQL and [Gemini CLI](https://pypi.org/project/gemini-cli/).
---
## Before you begin
This guide assumes you have already done the following:
1. Installed [PostgreSQL 16+ and the `psql` client][install-postgres].
[install-postgres]: https://www.postgresql.org/download/
## Step 1: Set up your database
In this section, we will create a database, insert some data that needs to be
accessed by our agent, and create a database user for Toolbox to connect with.
1. Connect to postgres using the `psql` command:
```bash
psql -h 127.0.0.1 -U postgres
```
Here, `postgres` denotes the default postgres superuser.
{{< notice info >}}
#### **Having trouble connecting?**
* **Password Prompt:** If you are prompted for a password for the `postgres`
user and do not know it (or a blank password doesn't work), your PostgreSQL
installation might require a password or a different authentication method.
* **`FATAL: role "postgres" does not exist`:** This error means the default
`postgres` superuser role isn't available under that name on your system.
* **`Connection refused`:** Ensure your PostgreSQL server is actually running.
You can typically check with `sudo systemctl status postgresql` and start it
with `sudo systemctl start postgresql` on Linux systems.
<br/>
#### **Common Solution**
For password issues or if the `postgres` role seems inaccessible directly, try
switching to the `postgres` operating system user first. This user often has
permission to connect without a password for local connections (this is called
peer authentication).
```bash
sudo -i -u postgres
psql -h 127.0.0.1
```
Once you are in the `psql` shell using this method, you can proceed with the
database creation steps below. Afterwards, type `\q` to exit `psql`, and then
`exit` to return to your normal user shell.
If desired, once connected to `psql` as the `postgres` OS user, you can set a
password for the `postgres` *database* user using: `ALTER USER postgres WITH
PASSWORD 'your_chosen_password';`. This would allow direct connection with `-U
postgres` and a password next time.
{{< /notice >}}
1. Create a new database and a new user:
{{< notice tip >}}
For a real application, it's best to follow the principle of least permission
and only grant the privileges your application needs.
{{< /notice >}}
```sql
CREATE USER toolbox_user WITH PASSWORD 'my-password';
CREATE DATABASE toolbox_db;
GRANT ALL PRIVILEGES ON DATABASE toolbox_db TO toolbox_user;
ALTER DATABASE toolbox_db OWNER TO toolbox_user;
```
1. End the database session:
```bash
\q
```
(If you used `sudo -i -u postgres` and then `psql`, remember you might also
need to type `exit` after `\q` to leave the `postgres` user's shell
session.)
1. Connect to your database with your new user:
```bash
psql -h 127.0.0.1 -U toolbox_user -d toolbox_db
```
1. Create the required tables using the following commands:
```sql
CREATE TABLE users (
id SERIAL PRIMARY KEY,
username VARCHAR(50) NOT NULL,
email VARCHAR(100) UNIQUE NOT NULL,
created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE TABLE restaurants (
id SERIAL PRIMARY KEY,
name VARCHAR(100) NOT NULL,
location VARCHAR(100)
);
CREATE TABLE reviews (
id SERIAL PRIMARY KEY,
user_id INT REFERENCES users(id),
restaurant_id INT REFERENCES restaurants(id),
rating INT CHECK (rating >= 1 AND rating <= 5),
review_text TEXT,
is_published BOOLEAN DEFAULT false,
moderation_status VARCHAR(50) DEFAULT 'pending_manual_review',
created_at TIMESTAMPTZ DEFAULT NOW()
);
```
1. Insert dummy data into the tables.
```sql
INSERT INTO users (id, username, email) VALUES
(123, 'jane_d', 'jane.d@example.com'),
(124, 'john_s', 'john.s@example.com'),
(125, 'sam_b', 'sam.b@example.com');
INSERT INTO restaurants (id, name, location) VALUES
(455, 'Pizza Palace', '123 Main St'),
(456, 'The Corner Bistro', '456 Oak Ave'),
(457, 'Sushi Spot', '789 Pine Ln');
INSERT INTO reviews (user_id, restaurant_id, rating, review_text, is_published, moderation_status) VALUES
(124, 455, 5, 'Best pizza in town! The crust was perfect.', true, 'approved'),
(125, 457, 4, 'Great sushi, very fresh. A bit pricey but worth it.', true, 'approved'),
(123, 457, 5, 'Absolutely loved the dragon roll. Will be back!', true, 'approved'),
(123, 456, 4, 'The atmosphere was lovely and the food was great. My photo upload might have been weird though.', false, 'pending_manual_review'),
(125, 456, 1, 'This review contains inappropriate language.', false, 'rejected');
```
1. End the database session:
```bash
\q
```
## Step 2: Configure Toolbox
Create a file named `tools.yaml`. This file defines the database connection, the
SQL tools available, and the prompts the agents will use.
```yaml
sources:
my-foodiefind-db:
kind: postgres
host: 127.0.0.1
port: 5432
database: toolbox_db
user: toolbox_user
password: my-password
tools:
find_user_by_email:
kind: postgres-sql
source: my-foodiefind-db
description: Find a user's ID by their email address.
parameters:
- name: email
type: string
description: The email address of the user to find.
statement: SELECT id FROM users WHERE email = $1;
find_restaurant_by_name:
kind: postgres-sql
source: my-foodiefind-db
description: Find a restaurant's ID by its exact name.
parameters:
- name: name
type: string
description: The name of the restaurant to find.
statement: SELECT id FROM restaurants WHERE name = $1;
find_review_by_user_and_restaurant:
kind: postgres-sql
source: my-foodiefind-db
description: Find the full record for a specific review using the user's ID and the restaurant's ID.
parameters:
- name: user_id
type: integer
description: The numerical ID of the user.
- name: restaurant_id
type: integer
description: The numerical ID of the restaurant.
statement: SELECT * FROM reviews WHERE user_id = $1 AND restaurant_id = $2;
prompts:
investigate_missing_review:
description: "Investigates a user's missing review by finding the user, restaurant, and the review itself, then analyzing its status."
arguments:
- name: "user_email"
description: "The email of the user who wrote the review."
- name: "restaurant_name"
description: "The name of the restaurant being reviewed."
messages:
- content: >-
**Goal:** Find the review written by the user with email '{{.user_email}}' for the restaurant named '{{.restaurant_name}}' and understand its status.
**Workflow:**
1. Use the `find_user_by_email` tool with the email '{{.user_email}}' to get the `user_id`.
2. Use the `find_restaurant_by_name` tool with the name '{{.restaurant_name}}' to get the `restaurant_id`.
3. Use the `find_review_by_user_and_restaurant` tool with the `user_id` and `restaurant_id` you just found.
4. Analyze the results from the final tool call. Examine the `is_published` and `moderation_status` fields and explain the review's status to the user in a clear, human-readable sentence.
```
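Before connecting Gemini CLI, start the Toolbox server so that `http://localhost:5000/mcp` (used in the next step) is reachable. A minimal sketch, assuming the `toolbox` binary is installed and the default port of 5000 is used:
```bash
# Start Toolbox with the configuration file created above
./toolbox --tools-file tools.yaml
```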
## Step 3: Connect to Gemini CLI
Configure the Gemini CLI to talk to your local Toolbox MCP server.
1. Open or create your Gemini settings file: `~/.gemini/settings.json`.
2. Add the following configuration to the file:
```json
{
"mcpServers": {
"MCPToolbox": {
"httpUrl": "http://localhost:5000/mcp"
}
},
"mcp": {
"allowed": ["MCPToolbox"]
}
}
```
3. Start Gemini CLI using:
```sh
gemini
```
If Gemini CLI is already running, use `/mcp refresh` to refresh the MCP server.
4. Use gemini slash commands to run your prompt:
```sh
/investigate_missing_review --user_email="jane.d@example.com" --restaurant_name="The Corner Bistro"
```

View File

@@ -5,7 +5,7 @@ go 1.24.4
require (
github.com/googleapis/mcp-toolbox-sdk-go v0.4.0
google.golang.org/adk v0.1.0
google.golang.org/genai v1.35.0
google.golang.org/genai v1.36.0
)
require (

View File

@@ -108,8 +108,8 @@ google.golang.org/adk v0.1.0 h1:+w/fHuqRVolotOATlujRA+2DKUuDrFH2poRdEX2QjB8=
google.golang.org/adk v0.1.0/go.mod h1:NvtSLoNx7UzZIiUAI1KoJQLMmt9sG3oCgiCx1TLqKFw=
google.golang.org/api v0.255.0 h1:OaF+IbRwOottVCYV2wZan7KUq7UeNUQn1BcPc4K7lE4=
google.golang.org/api v0.255.0/go.mod h1:d1/EtvCLdtiWEV4rAEHDHGh2bCnqsWhw+M8y2ECN4a8=
google.golang.org/genai v1.35.0 h1:Jo6g25CzVqFzGrX5mhWyBgQqXAUzxcx5jeK7U74zv9c=
google.golang.org/genai v1.35.0/go.mod h1:A3kkl0nyBjyFlNjgxIwKq70julKbIxpSxqKO5gw/gmk=
google.golang.org/genai v1.36.0 h1:sJCIjqTAmwrtAIaemtTiKkg2TO1RxnYEusTmEQ3nGxM=
google.golang.org/genai v1.36.0/go.mod h1:A3kkl0nyBjyFlNjgxIwKq70julKbIxpSxqKO5gw/gmk=
google.golang.org/genproto v0.0.0-20251014184007-4626949a642f h1:vLd1CJuJOUgV6qijD7KT5Y2ZtC97ll4dxjTUappMnbo=
google.golang.org/genproto v0.0.0-20251014184007-4626949a642f/go.mod h1:PI3KrSadr00yqfv6UDvgZGFsmLqeRIwt8x4p5Oo7CdM=
google.golang.org/genproto/googleapis/api v0.0.0-20251014184007-4626949a642f h1:OiFuztEyBivVKDvguQJYWq1yDcfAHIID/FVrPR4oiI0=

View File

@@ -4,7 +4,7 @@ go 1.24.6
require (
github.com/googleapis/mcp-toolbox-sdk-go v0.4.0
google.golang.org/genai v1.35.0
google.golang.org/genai v1.36.0
)
require (

View File

@@ -102,8 +102,8 @@ gonum.org/v1/gonum v0.16.0 h1:5+ul4Swaf3ESvrOnidPp4GZbzf0mxVQpDCYUQE7OJfk=
gonum.org/v1/gonum v0.16.0/go.mod h1:fef3am4MQ93R2HHpKnLk4/Tbh/s0+wqD5nfa6Pnwy4E=
google.golang.org/api v0.255.0 h1:OaF+IbRwOottVCYV2wZan7KUq7UeNUQn1BcPc4K7lE4=
google.golang.org/api v0.255.0/go.mod h1:d1/EtvCLdtiWEV4rAEHDHGh2bCnqsWhw+M8y2ECN4a8=
google.golang.org/genai v1.35.0 h1:Jo6g25CzVqFzGrX5mhWyBgQqXAUzxcx5jeK7U74zv9c=
google.golang.org/genai v1.35.0/go.mod h1:A3kkl0nyBjyFlNjgxIwKq70julKbIxpSxqKO5gw/gmk=
google.golang.org/genai v1.36.0 h1:sJCIjqTAmwrtAIaemtTiKkg2TO1RxnYEusTmEQ3nGxM=
google.golang.org/genai v1.36.0/go.mod h1:A3kkl0nyBjyFlNjgxIwKq70julKbIxpSxqKO5gw/gmk=
google.golang.org/genproto v0.0.0-20251014184007-4626949a642f h1:vLd1CJuJOUgV6qijD7KT5Y2ZtC97ll4dxjTUappMnbo=
google.golang.org/genproto v0.0.0-20251014184007-4626949a642f/go.mod h1:PI3KrSadr00yqfv6UDvgZGFsmLqeRIwt8x4p5Oo7CdM=
google.golang.org/genproto/googleapis/api v0.0.0-20251014184007-4626949a642f h1:OiFuztEyBivVKDvguQJYWq1yDcfAHIID/FVrPR4oiI0=

View File

@@ -33,12 +33,12 @@ require (
go.opentelemetry.io/otel v1.38.0 // indirect
go.opentelemetry.io/otel/metric v1.38.0 // indirect
go.opentelemetry.io/otel/trace v1.38.0 // indirect
golang.org/x/crypto v0.43.0 // indirect
golang.org/x/net v0.46.0 // indirect
golang.org/x/crypto v0.45.0 // indirect
golang.org/x/net v0.47.0 // indirect
golang.org/x/oauth2 v0.32.0 // indirect
golang.org/x/sync v0.17.0 // indirect
golang.org/x/sys v0.37.0 // indirect
golang.org/x/text v0.30.0 // indirect
golang.org/x/sync v0.18.0 // indirect
golang.org/x/sys v0.38.0 // indirect
golang.org/x/text v0.31.0 // indirect
golang.org/x/time v0.14.0 // indirect
google.golang.org/api v0.255.0 // indirect
google.golang.org/genproto v0.0.0-20251014184007-4626949a642f // indirect

View File

@@ -100,18 +100,18 @@ go.opentelemetry.io/otel/sdk/metric v1.38.0 h1:aSH66iL0aZqo//xXzQLYozmWrXxyFkBJ6
go.opentelemetry.io/otel/sdk/metric v1.38.0/go.mod h1:dg9PBnW9XdQ1Hd6ZnRz689CbtrUp0wMMs9iPcgT9EZA=
go.opentelemetry.io/otel/trace v1.38.0 h1:Fxk5bKrDZJUH+AMyyIXGcFAPah0oRcT+LuNtJrmcNLE=
go.opentelemetry.io/otel/trace v1.38.0/go.mod h1:j1P9ivuFsTceSWe1oY+EeW3sc+Pp42sO++GHkg4wwhs=
golang.org/x/crypto v0.43.0 h1:dduJYIi3A3KOfdGOHX8AVZ/jGiyPa3IbBozJ5kNuE04=
golang.org/x/crypto v0.43.0/go.mod h1:BFbav4mRNlXJL4wNeejLpWxB7wMbc79PdRGhWKncxR0=
golang.org/x/net v0.46.0 h1:giFlY12I07fugqwPuWJi68oOnpfqFnJIJzaIIm2JVV4=
golang.org/x/net v0.46.0/go.mod h1:Q9BGdFy1y4nkUwiLvT5qtyhAnEHgnQ/zd8PfU6nc210=
golang.org/x/crypto v0.45.0 h1:jMBrvKuj23MTlT0bQEOBcAE0mjg8mK9RXFhRH6nyF3Q=
golang.org/x/crypto v0.45.0/go.mod h1:XTGrrkGJve7CYK7J8PEww4aY7gM3qMCElcJQ8n8JdX4=
golang.org/x/net v0.47.0 h1:Mx+4dIFzqraBXUugkia1OOvlD6LemFo1ALMHjrXDOhY=
golang.org/x/net v0.47.0/go.mod h1:/jNxtkgq5yWUGYkaZGqo27cfGZ1c5Nen03aYrrKpVRU=
golang.org/x/oauth2 v0.32.0 h1:jsCblLleRMDrxMN29H3z/k1KliIvpLgCkE6R8FXXNgY=
golang.org/x/oauth2 v0.32.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
golang.org/x/sync v0.17.0 h1:l60nONMj9l5drqw6jlhIELNv9I0A4OFgRsG9k2oT9Ug=
golang.org/x/sync v0.17.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.37.0 h1:fdNQudmxPjkdUTPnLn5mdQv7Zwvbvpaxqs831goi9kQ=
golang.org/x/sys v0.37.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.30.0 h1:yznKA/E9zq54KzlzBEAWn1NXSQ8DIp/NYMy88xJjl4k=
golang.org/x/text v0.30.0/go.mod h1:yDdHFIX9t+tORqspjENWgzaCVXgk0yYnYuSZ8UzzBVM=
golang.org/x/sync v0.18.0 h1:kr88TuHDroi+UVf+0hZnirlk8o8T+4MrK6mr60WkH/I=
golang.org/x/sync v0.18.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.38.0 h1:3yZWxaJjBmCWXqhN1qh02AkOnCQ1poK6oF+a7xWL6Gc=
golang.org/x/sys v0.38.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/text v0.31.0 h1:aC8ghyu4JhP8VojJ2lEHBnochRno1sgL6nEi9WGFGMM=
golang.org/x/text v0.31.0/go.mod h1:tKRAlv61yKIjGGHX/4tP1LTbc13YSec1pxVEWXzfoeM=
golang.org/x/time v0.14.0 h1:MRx4UaLrDotUKUdCIqzPC48t1Y9hANFKIRpNx+Te8PI=
golang.org/x/time v0.14.0/go.mod h1:eL/Oa2bBBK0TkX57Fyni+NgnyQQN4LitPmob2Hjnqw4=
gonum.org/v1/gonum v0.16.0 h1:5+ul4Swaf3ESvrOnidPp4GZbzf0mxVQpDCYUQE7OJfk=

File diff suppressed because it is too large.

View File

@@ -0,0 +1,17 @@
{
"name": "adk",
"version": "1.0.0",
"description": "",
"main": "quickstart.js",
"type": "module",
"scripts": {
"test": "node --test"
},
"keywords": [],
"author": "",
"license": "ISC",
"dependencies": {
"@google/adk": "^0.1.3",
"@toolbox-sdk/adk": "^0.1.5"
}
}

View File

@@ -0,0 +1,56 @@
import { InMemoryRunner, LlmAgent, LogLevel } from '@google/adk';
import { ToolboxClient } from '@toolbox-sdk/adk';
const prompt = `
You're a helpful hotel assistant. You handle hotel searching, booking, and
cancellations. When the user searches for a hotel, mention its name, id,
location and price tier. Always mention hotel ids while performing any
searches. This is very important for any operations. For any bookings or
cancellations, please provide the appropriate confirmation. Be sure to
update checkin or checkout dates if mentioned by the user.
Don't ask for confirmations from the user.
`;
const queries = [
"Find hotels with Basel in its name.",
"Can you book the Hilton Basel for me?",
"Oh wait, this is too expensive. Please cancel it and book the Hyatt Regency instead.",
"My check in dates would be from April 10, 2024 to April 19, 2024.",
];
process.env.GOOGLE_GENAI_API_KEY = process.env.GOOGLE_API_KEY || 'your-api-key'; // Replace it with your API key
export async function main() {
const userId = 'test_user';
const client = new ToolboxClient('http://127.0.0.1:5000');
const tools = await client.loadToolset("my-toolset");
const rootAgent = new LlmAgent({
name: 'hotel_agent',
model: 'gemini-2.5-flash',
description: 'Agent for hotel bookings and administration.',
instruction: prompt,
tools: tools,
});
const appName = rootAgent.name;
const runner = new InMemoryRunner({ agent: rootAgent, appName, logLevel: LogLevel.ERROR, });
const session = await runner.sessionService.createSession({ appName, userId });
for (const query of queries) {
await runPrompt(runner, userId, session.id, query);
}
}
async function runPrompt(runner, userId, sessionId, prompt) {
const content = { role: 'user', parts: [{ text: prompt }] };
const stream = runner.runAsync({ userId, sessionId, newMessage: content });
const responses = await Array.fromAsync(stream);
const accumulatedResponse = responses
.flatMap((e) => e.content?.parts?.map((p) => p.text) ?? [])
.join('');
console.log(`\nMODEL RESPONSE: ${accumulatedResponse}\n`);
}
main();
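A typical way to run this sample locally might be the following, assuming the quickstart Toolbox server is listening on `http://127.0.0.1:5000` with a `my-toolset` toolset configured, and that a Gemini API key is available:
```bash
# Install dependencies and run the agent sample
npm install
export GOOGLE_API_KEY="your-api-key"
node quickstart.js
```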

View File

@@ -872,11 +872,12 @@
}
},
"node_modules/jws": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/jws/-/jws-4.0.0.tgz",
"integrity": "sha512-KDncfTmOZoOMTFG4mBlG0qUIOlc03fmzH+ru6RgYVZhPkyiy/92Owlt/8UEN+a4TXR1FQetfIpJE8ApdvdVxTg==",
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/jws/-/jws-4.0.1.tgz",
"integrity": "sha512-EKI/M/yqPncGUUh44xz0PxSidXFr/+r0pA70+gIYhjv+et7yxM+s29Y+VGDkovRofQem0fs7Uvf4+YmAdyRduA==",
"license": "MIT",
"dependencies": {
"jwa": "^2.0.0",
"jwa": "^2.0.1",
"safe-buffer": "^5.0.1"
}
},

View File

@@ -1,4 +1,4 @@
llama-index==0.14.8
llama-index==0.14.10
llama-index-llms-google-genai==0.7.3
toolbox-llamaindex==0.5.3
pytest==9.0.1

View File

@@ -13,7 +13,7 @@ In this section, we will download Toolbox, configure our tools in a
<!-- {x-release-please-start-version} -->
```bash
export OS="linux/amd64" # one of linux/amd64, darwin/arm64, darwin/amd64, or windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/$OS/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/$OS/toolbox
```
<!-- {x-release-please-end} -->

View File

@@ -49,19 +49,19 @@ to expose your developer assistant tools to a Looker instance:
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->
@@ -323,6 +323,8 @@ instance and create new saved content.
data
1. **make_dashboard**: Create a saved dashboard in Looker and return the URL
1. **add_dashboard_element**: Add a tile to a dashboard
1. **add_dashboard_filter**: Add a filter to a dashboard
1. **generate_embed_url**: Generate an embed url for content
### Looker Instance Health Tools

View File

@@ -45,19 +45,19 @@ instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -43,19 +43,19 @@ expose your developer assistant tools to a MySQL instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -44,19 +44,19 @@ expose your developer assistant tools to a Neo4j instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -56,19 +56,19 @@ Omni](https://cloud.google.com/alloydb/omni/current/docs/overview).
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -43,19 +43,19 @@ to expose your developer assistant tools to a SQLite instance:
<!-- {x-release-please-start-version} -->
{{< tabpane persist=header >}}
{{< tab header="linux/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/linux/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/linux/amd64/toolbox
{{< /tab >}}
{{< tab header="darwin/arm64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/arm64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/arm64/toolbox
{{< /tab >}}
{{< tab header="darwin/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/darwin/amd64/toolbox
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/darwin/amd64/toolbox
{{< /tab >}}
{{< tab header="windows/amd64" lang="bash" >}}
curl -O https://storage.googleapis.com/genai-toolbox/v0.21.0/windows/amd64/toolbox.exe
curl -O https://storage.googleapis.com/genai-toolbox/v0.24.0/windows/amd64/toolbox.exe
{{< /tab >}}
{{< /tabpane >}}
<!-- {x-release-please-end} -->

View File

@@ -79,12 +79,16 @@ There are a couple of steps to run and use a Collector.
```
1. Run toolbox with the `--telemetry-otlp` flag. Configure it to send them to
`http://127.0.0.1:4553` (for HTTP) or the Collector's URL.
`127.0.0.1:4553` (for HTTP) or the Collector's URL.
```bash
./toolbox --telemetry-otlp=http://127.0.0.1:4553
./toolbox --telemetry-otlp=127.0.0.1:4553
```
{{< notice tip >}}
To pass an insecure endpoint, set environment variable `OTEL_EXPORTER_OTLP_INSECURE=true`.
{{< /notice >}}
1. Once telemetry data is collected, you can view it in your telemetry
backend. If you are using GCP exporters, telemetry will be visible in GCP
dashboard at [Metrics Explorer][metrics-explorer] and [Trace

View File

@@ -16,14 +16,14 @@ description: >
| | `--log-level` | Specify the minimum level logged. Allowed: 'DEBUG', 'INFO', 'WARN', 'ERROR'. | `info` |
| | `--logging-format` | Specify logging format to use. Allowed: 'standard' or 'JSON'. | `standard` |
| `-p` | `--port` | Port the server will listen on. | `5000` |
| | `--prebuilt` | Use a prebuilt tool configuration by source type. Cannot be used with --tools-file. See [Prebuilt Tools Reference](prebuilt-tools.md) for allowed values. | |
| | `--prebuilt` | Use a prebuilt tool configuration by source type. See [Prebuilt Tools Reference](prebuilt-tools.md) for allowed values. | |
| | `--stdio` | Listens via MCP STDIO instead of acting as a remote HTTP server. | |
| | `--telemetry-gcp` | Enable exporting directly to Google Cloud Monitoring. | |
| | `--telemetry-otlp` | Enable exporting using OpenTelemetry Protocol (OTLP) to the specified endpoint (e.g. 'http://127.0.0.1:4318') | |
| | `--telemetry-service-name` | Sets the value of the service.name resource attribute for telemetry data. | `toolbox` |
| | `--tools-file` | File path specifying the tool configuration. Cannot be used with --prebuilt, --tools-files, or --tools-folder. | |
| | `--tools-files` | Multiple file paths specifying tool configurations. Files will be merged. Cannot be used with --prebuilt, --tools-file, or --tools-folder. | |
| | `--tools-folder` | Directory path containing YAML tool configuration files. All .yaml and .yml files in the directory will be loaded and merged. Cannot be used with --prebuilt, --tools-file, or --tools-files. | |
| | `--tools-file` | File path specifying the tool configuration. Cannot be used with --tools-files or --tools-folder. | |
| | `--tools-files` | Multiple file paths specifying tool configurations. Files will be merged. Cannot be used with --tools-file or --tools-folder. | |
| | `--tools-folder` | Directory path containing YAML tool configuration files. All .yaml and .yml files in the directory will be loaded and merged. Cannot be used with --tools-file or --tools-files. | |
| | `--ui` | Launches the Toolbox UI web server. | |
| | `--allowed-origins` | Specifies a list of origins permitted to access this server. | `*` |
| `-v` | `--version` | version for toolbox | |
@@ -46,6 +46,9 @@ description: >
```bash
# Basic server with custom port configuration
./toolbox --tools-file "tools.yaml" --port 8080
# Server with prebuilt + custom tools configurations
./toolbox --tools-file tools.yaml --prebuilt alloydb-postgres
```
### Tool Configuration Sources
@@ -72,8 +75,8 @@ The CLI supports multiple mutually exclusive ways to specify tool configurations
{{< notice tip >}}
The CLI enforces mutual exclusivity between configuration source flags,
preventing simultaneous use of `--prebuilt` with file-based options, and
ensuring only one of `--tools-file`, `--tools-files`, or `--tools-folder` is
preventing simultaneous use of the file-based options, ensuring only one of
`--tools-file`, `--tools-files`, or `--tools-folder` is
used at a time.
{{< /notice >}}

View File

@@ -13,6 +13,12 @@ allowing developers to interact with and take action on databases.
See guides, [Connect from your IDE](../how-to/connect-ide/_index.md), for
details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
{{< notice tip >}}
You can now use `--prebuilt` along with `--tools-file`, `--tools-files`, or
`--tools-folder` to combine prebuilt configs with custom tools.
See [Usage Examples](../reference/cli.md#examples).
{{< /notice >}}
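For example, a combined invocation might look like this (mirroring the example in the CLI reference):
```bash
# Load the prebuilt AlloyDB Postgres tools alongside custom tools
./toolbox --prebuilt alloydb-postgres --tools-file tools.yaml
```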
## AlloyDB Postgres
* `--prebuilt` value: `alloydb-postgres`
@@ -50,6 +56,12 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
* `list_triggers`: Lists triggers in the database.
* `list_indexes`: List available user indexes in a PostgreSQL database.
* `list_sequences`: List sequences in a PostgreSQL database.
* `list_publication_tables`: List publication tables in a PostgreSQL database.
* `list_tablespaces`: Lists tablespaces in the database.
* `list_pg_settings`: List configuration parameters for the PostgreSQL server.
* `list_database_stats`: Lists the key performance and activity statistics for
each database in the AlloyDB instance.
* `list_roles`: Lists all the user-created roles in a PostgreSQL database.
## AlloyDB Postgres Admin
@@ -227,6 +239,12 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
* `list_triggers`: Lists triggers in the database.
* `list_indexes`: List available user indexes in a PostgreSQL database.
* `list_sequences`: List sequences in a PostgreSQL database.
* `list_publication_tables`: List publication tables in a PostgreSQL database.
* `list_tablespaces`: Lists tablespaces in the database.
* `list_pg_settings`: List configuration parameters for the PostgreSQL server.
* `list_database_stats`: Lists the key performance and activity statistics for
each database in the PostgreSQL instance.
* `list_roles`: Lists all the user-created roles in a PostgreSQL database.
## Cloud SQL for PostgreSQL Observability
@@ -404,6 +422,8 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
* `run_dashboard`: Runs the queries associated with a dashboard.
* `make_dashboard`: Creates a new dashboard.
* `add_dashboard_element`: Adds a tile to a dashboard.
* `add_dashboard_filter`: Adds a filter to a dashboard.
* `generate_embed_url`: Generate an embed url for content.
* `health_pulse`: Test the health of a Looker instance.
* `health_analyze`: Analyze the LookML usage of a Looker instance.
* `health_vacuum`: Suggest LookML elements that can be removed.
@@ -532,6 +552,12 @@ details on how to connect your AI tools (IDEs) to databases via Toolbox and MCP.
* `list_triggers`: Lists triggers in the database.
* `list_indexes`: List available user indexes in a PostgreSQL database.
* `list_sequences`: List sequences in a PostgreSQL database.
* `list_publication_tables`: List publication tables in a PostgreSQL database.
* `list_tablespaces`: Lists tablespaces in the database.
* `list_pg_settings`: List configuration parameters for the PostgreSQL server.
* `list_database_stats`: Lists the key performance and activity statistics for
each database in the PostgreSQL server.
* `list_roles`: Lists all the user-created roles in a PostgreSQL database.
## Google Cloud Serverless for Apache Spark

View File

@@ -77,6 +77,25 @@ cluster][alloydb-free-trial].
- [`postgres-get-column-cardinality`](../tools/postgres/postgres-get-column-cardinality.md)
List cardinality of columns in a table in a PostgreSQL database.
- [`postgres-list-table-stats`](../tools/postgres/postgres-list-table-stats.md)
List statistics of a table in a PostgreSQL database.
- [`postgres-list-publication-tables`](../tools/postgres/postgres-list-publication-tables.md)
List publication tables in a PostgreSQL database.
- [`postgres-list-tablespaces`](../tools/postgres/postgres-list-tablespaces.md)
List tablespaces in an AlloyDB for PostgreSQL database.
- [`postgres-list-pg-settings`](../tools/postgres/postgres-list-pg-settings.md)
List configuration parameters for the PostgreSQL server.
- [`postgres-list-database-stats`](../tools/postgres/postgres-list-database-stats.md)
Lists the key performance and activity statistics for each database in the AlloyDB
instance.
- [`postgres-list-roles`](../tools/postgres/postgres-list-roles.md)
Lists all the user-created roles in a PostgreSQL database.
### Pre-built Configurations
- [AlloyDB using MCP](https://googleapis.github.io/genai-toolbox/how-to/connect-ide/alloydb_pg_mcp/)

View File

@@ -0,0 +1,40 @@
---
title: "Gemini Data Analytics"
type: docs
weight: 1
description: >
A "cloud-gemini-data-analytics" source provides a client for the Gemini Data Analytics API.
aliases:
- /resources/sources/cloud-gemini-data-analytics
---
## About
The `cloud-gemini-data-analytics` source provides a client to interact with the [Gemini Data Analytics API](https://docs.cloud.google.com/gemini/docs/conversational-analytics-api/reference/rest). This allows tools to send natural language queries to the API.
Authentication can be handled in two ways:
1. **Application Default Credentials (ADC) (Recommended):** By default, the source uses ADC to authenticate with the API. The Toolbox server will fetch the credentials from its running environment (server-side authentication). This is the recommended method.
2. **Client-side OAuth:** If `useClientOAuth` is set to `true`, the source expects the authentication token to be provided by the caller when making a request to the Toolbox server (typically via an HTTP Bearer token). The Toolbox server will then forward this token to the underlying Gemini Data Analytics API calls.
## Example
```yaml
sources:
my-gda-source:
kind: cloud-gemini-data-analytics
projectId: my-project-id
my-oauth-gda-source:
kind: cloud-gemini-data-analytics
projectId: my-project-id
useClientOAuth: true
```
## Reference
| **field** | **type** | **required** | **description** |
| -------------- | :------: | :----------: | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| kind | string | true | Must be "cloud-gemini-data-analytics". |
| projectId | string | true | The Google Cloud Project ID where the API is enabled. |
| useClientOAuth | boolean | false | If true, the source uses the token provided by the caller (forwarded to the API). Otherwise, it uses server-side Application Default Credentials (ADC). Defaults to `false`. |

View File

@@ -31,6 +31,9 @@ to a database by following these instructions][csql-mysql-quickstart].
- [`mysql-list-active-queries`](../tools/mysql/mysql-list-active-queries.md)
List active queries in Cloud SQL for MySQL.
- [`mysql-get-query-plan`](../tools/mysql/mysql-get-query-plan.md)
Provide information about how MySQL executes a SQL statement (EXPLAIN).
- [`mysql-list-tables`](../tools/mysql/mysql-list-tables.md)
List tables in a Cloud SQL for MySQL database.
@@ -88,13 +91,40 @@ mTLS.
[public-ip]: https://cloud.google.com/sql/docs/mysql/configure-ip
[conn-overview]: https://cloud.google.com/sql/docs/mysql/connect-overview
### Database User
### Authentication
Currently, this source only uses standard authentication. You will need to [create
a MySQL user][cloud-sql-users] to login to the database with.
This source supports both password-based authentication and IAM
authentication (using your [Application Default Credentials][adc]).
#### Standard Authentication
To connect using user/password, [create
a MySQL user][cloud-sql-users] and input your credentials in the `user` and
`password` fields.
```yaml
user: ${USER_NAME}
password: ${PASSWORD}
```
[cloud-sql-users]: https://cloud.google.com/sql/docs/mysql/create-manage-users
#### IAM Authentication
To connect using IAM authentication:
1. Prepare your database instance and user following this [guide][iam-guide].
2. You can choose one of two ways to log in:
- Specify your IAM email as the `user`.
- Leave your `user` field blank. Toolbox will fetch the [ADC][adc]
automatically and log in using the email associated with it.
3. Leave the `password` field blank.
[iam-guide]: https://cloud.google.com/sql/docs/mysql/iam-logins
[cloudsql-users]: https://cloud.google.com/sql/docs/mysql/create-manage-users
## Example
```yaml
@@ -124,6 +154,6 @@ instead of hardcoding your secrets into the configuration file.
| region | string | true | Name of the GCP region that the cluster was created in (e.g. "us-central1"). |
| instance | string | true | Name of the Cloud SQL instance within the cluster (e.g. "my-instance"). |
| database | string | true | Name of the MySQL database to connect to (e.g. "my_db"). |
| user | string | true | Name of the MySQL user to connect as (e.g. "my-pg-user"). |
| password | string | true | Password of the MySQL user (e.g. "my-password"). |
| user | string | false | Name of the MySQL user to connect as (e.g. "my-mysql-user"). Defaults to IAM auth using the [ADC][adc] email if unspecified. |
| password | string | false | Password of the MySQL user (e.g. "my-password"). Defaults to attempting IAM authentication if unspecified. |
| ipType | string | false | IP Type of the Cloud SQL instance, must be either `public`, `private`, or `psc`. Default: `public`. |

View File

@@ -58,6 +58,7 @@ to a database by following these instructions][csql-pg-quickstart].
- [`postgres-list-sequences`](../tools/postgres/postgres-list-sequences.md)
List sequences in a PostgreSQL database.
- [`postgres-long-running-transactions`](../tools/postgres/postgres-long-running-transactions.md)
List long running transactions in a PostgreSQL database.
@@ -73,6 +74,25 @@ to a database by following these instructions][csql-pg-quickstart].
- [`postgres-get-column-cardinality`](../tools/postgres/postgres-get-column-cardinality.md)
List cardinality of columns in a table in a PostgreSQL database.
- [`postgres-list-table-stats`](../tools/postgres/postgres-list-table-stats.md)
List statistics of a table in a PostgreSQL database.
- [`postgres-list-publication-tables`](../tools/postgres/postgres-list-publication-tables.md)
List publication tables in a PostgreSQL database.
- [`postgres-list-tablespaces`](../tools/postgres/postgres-list-tablespaces.md)
List tablespaces in a PostgreSQL database.
- [`postgres-list-pg-settings`](../tools/postgres/postgres-list-pg-settings.md)
List configuration parameters for the PostgreSQL server.
- [`postgres-list-database-stats`](../tools/postgres/postgres-list-database-stats.md)
Lists the key performance and activity statistics for each database in the PostgreSQL
instance.
- [`postgres-list-roles`](../tools/postgres/postgres-list-roles.md)
Lists all the user-created roles in a PostgreSQL database.
### Pre-built Configurations
- [Cloud SQL for Postgres using

View File

@@ -91,18 +91,17 @@ instead of hardcoding your secrets into the configuration file.
## Reference
| **field** | **type** | **required** | **description** |
|----------------------|:--------:|:------------:|-------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker". |
| base_url | string | true | The URL of your Looker server with no trailing /. |
| client_id | string | false | The client id assigned by Looker. |
| client_secret | string | false | The client secret assigned by Looker. |
| verify_ssl | string | false | Whether to check the ssl certificate of the server. |
| project | string | false | The project id to use in Google Cloud. |
| location | string | false | The location to use in Google Cloud. (default: us) |
| timeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, 120s is applied. |
| use_client_oauth | string | false | Use OAuth tokens instead of client_id and client_secret. (default: false) If a header |
| | | | name is provided, it will be used instead of "Authorization". |
| show_hidden_models | string | false | Show or hide hidden models. (default: true) |
| show_hidden_explores | string | false | Show or hide hidden explores. (default: true) |
| show_hidden_fields | string | false | Show or hide hidden fields. (default: true) |
| **field** | **type** | **required** | **description** |
|----------------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "looker". |
| base_url | string | true | The URL of your Looker server with no trailing /. |
| client_id | string | false | The client id assigned by Looker. |
| client_secret | string | false | The client secret assigned by Looker. |
| verify_ssl | string | false | Whether to check the ssl certificate of the server. |
| project | string | false | The project id to use in Google Cloud. |
| location | string | false | The location to use in Google Cloud. (default: us) |
| timeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, 120s is applied. |
| use_client_oauth | string | false | Use OAuth tokens instead of client_id and client_secret. (default: false) If a header name is provided, it will be used instead of "Authorization". |
| show_hidden_models | string | false | Show or hide hidden models. (default: true) |
| show_hidden_explores | string | false | Show or hide hidden explores. (default: true) |
| show_hidden_fields | string | false | Show or hide hidden fields. (default: true) |
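
For orientation, a source entry using the fields above might look like this sketch; values are placeholders and the `use_client_oauth` variant is omitted:

```yaml
sources:
  looker-source:
    kind: looker
    base_url: https://looker.example.com   # no trailing slash
    client_id: ${LOOKER_CLIENT_ID}
    client_secret: ${LOOKER_CLIENT_SECRET}
    verify_ssl: true
    timeout: 120s
    show_hidden_models: false
    show_hidden_explores: false
    show_hidden_fields: false
```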

View File

@@ -0,0 +1,78 @@
---
title: "MariaDB"
type: docs
weight: 1
description: >
MariaDB is an open-source relational database compatible with MySQL.
---
## About
MariaDB is a relational database management system derived from MySQL. It
implements the MySQL protocol and client libraries and supports modern SQL
features with a focus on performance and reliability.
**Note**: MariaDB is supported using the MySQL source.
## Available Tools
- [`mysql-sql`](../tools/mysql/mysql-sql.md)
Execute pre-defined prepared SQL queries in MariaDB.
- [`mysql-execute-sql`](../tools/mysql/mysql-execute-sql.md)
Run parameterized SQL queries in MariaDB.
- [`mysql-list-active-queries`](../tools/mysql/mysql-list-active-queries.md)
List active queries in MariaDB.
- [`mysql-list-tables`](../tools/mysql/mysql-list-tables.md)
List tables in a MariaDB database.
- [`mysql-list-tables-missing-unique-indexes`](../tools/mysql/mysql-list-tables-missing-unique-indexes.md)
List tables in a MariaDB database that do not have primary or unique indices.
- [`mysql-list-table-fragmentation`](../tools/mysql/mysql-list-table-fragmentation.md)
List table fragmentation in MariaDB tables.
## Requirements
### Database User
This source only uses standard authentication. You will need to [create a
MariaDB user][mariadb-users] to log in to the database.
[mariadb-users]: https://mariadb.com/kb/en/create-user/
## Example
```yaml
sources:
my_mariadb_db:
kind: mysql
host: 127.0.0.1
port: 3306
database: my_db
user: ${MARIADB_USER}
password: ${MARIADB_PASS}
# Optional TLS and other driver parameters. For example, enable preferred TLS:
# queryParams:
# tls: preferred
queryTimeout: 30s # Optional: query timeout duration
```
{{< notice tip >}}
Use environment variables instead of committing credentials to source files.
{{< /notice >}}
## Reference
| **field** | **type** | **required** | **description** |
| ------------ | :------: | :----------: | ----------------------------------------------------------------------------------------------- |
| kind | string | true | Must be `mysql`. |
| host | string | true | IP address to connect to (e.g. "127.0.0.1"). |
| port | string | true | Port to connect to (e.g. "3307"). |
| database | string | true | Name of the MariaDB database to connect to (e.g. "my_db"). |
| user | string | true | Name of the MariaDB user to connect as (e.g. "my-mysql-user"). |
| password | string | true | Password of the MariaDB user (e.g. "my-password"). |
| queryTimeout | string | false | Maximum time to wait for query execution (e.g. "30s", "2m"). By default, no timeout is applied. |
| queryParams | map<string,string> | false | Arbitrary DSN parameters passed to the driver (e.g. `tls: preferred`, `charset: utf8mb4`). Useful for enabling TLS or other connection options. |
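
Because `queryParams` is just a map of driver DSN options, enabling TLS or a specific character set is a matter of listing the key/value pairs. A small sketch with illustrative values:

```yaml
sources:
  my_mariadb_db:
    kind: mysql
    host: 127.0.0.1
    port: 3306
    database: my_db
    user: ${MARIADB_USER}
    password: ${MARIADB_PASS}
    queryParams:
      tls: preferred      # request TLS when the server supports it
      charset: utf8mb4    # connection character set
```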

View File

@@ -25,6 +25,9 @@ reliability, performance, and ease of use.
- [`mysql-list-active-queries`](../tools/mysql/mysql-list-active-queries.md)
List active queries in MySQL.
- [`mysql-get-query-plan`](../tools/mysql/mysql-get-query-plan.md)
Provide information about how MySQL executes a SQL statement (EXPLAIN).
- [`mysql-list-tables`](../tools/mysql/mysql-list-tables.md)
List tables in a MySQL database.

View File

@@ -18,10 +18,10 @@ DW) database workloads.
## Available Tools
- [`oracle-sql`](../tools/oracle/oracle-sql.md)
Execute pre-defined prepared SQL queries in Oracle.
- [`oracle-execute-sql`](../tools/oracle/oracle-execute-sql.md)
Run parameterized SQL queries in Oracle.
## Requirements
@@ -33,6 +33,25 @@ user][oracle-users] to log in to the database with the necessary permissions.
[oracle-users]:
https://docs.oracle.com/en/database/oracle/oracle-database/21/sqlrf/CREATE-USER.html
### Oracle Driver Requirement (Conditional)
The Oracle source offers two connection drivers:
1. **Pure Go Driver (`useOCI: false`, default):** Uses the `go-ora` library.
This driver is simpler and does not require any local Oracle software
installation, but it **lacks support for advanced features** like Oracle
Wallets or Kerberos authentication.
2. **OCI-Based Driver (`useOCI: true`):** Uses the `godror` library, which
provides access to **advanced Oracle features** like Digital Wallet support.
If you set `useOCI: true`, you **must** install the **Oracle Instant Client**
libraries on the machine where this tool runs.
You can download the Instant Client from the official Oracle website: [Oracle
Instant Client
Downloads](https://www.oracle.com/database/technologies/instant-client/downloads.html)
## Connection Methods
You can configure the connection to your Oracle database using one of the
@@ -66,12 +85,15 @@ using a TNS (Transparent Network Substrate) alias.
containing it. This setting will override the `TNS_ADMIN` environment
variable.
## Example
## Examples
This example demonstrates the four connection methods you could choose from:
```yaml
sources:
my-oracle-source:
kind: oracle
# --- Choose one connection method ---
# 1. Host, Port, and Service Name
host: 127.0.0.1
@@ -88,6 +110,43 @@ sources:
user: ${USER_NAME}
password: ${PASSWORD}
# Optional: Set to true to use the OCI-based driver for advanced features (Requires Oracle Instant Client)
```
### Using an Oracle Wallet
Oracle Wallet allows you to store the credentials used for a database connection. The wallet configuration differs depending on whether you are using the OCI-based driver.
#### Pure Go Driver (`useOCI: false`) - Oracle Wallet
The `go-ora` driver uses the `walletLocation` field to connect to a database secured with an Oracle Wallet without standard username and password.
```yaml
sources:
pure-go-wallet:
kind: oracle
connectionString: "127.0.0.1:1521/XEPDB1"
user: ${USER_NAME}
password: ${PASSWORD}
# The TNS Alias is often required to connect to a service registered in tnsnames.ora
tnsAlias: "SECURE_DB_ALIAS"
walletLocation: "/path/to/my/wallet/directory"
```
#### OCI-Based Driver (`useOCI: true`) - Oracle Wallet
For the OCI-based driver, wallet authentication is triggered by setting `tnsAdmin` to the wallet directory and connecting via a `tnsAlias`.
```yaml
sources:
oci-wallet:
kind: oracle
connectionString: "127.0.0.1:1521/XEPDB1"
user: ${USER_NAME}
password: ${PASSWORD}
tnsAlias: "WALLET_DB_ALIAS"
tnsAdmin: "/opt/oracle/wallet" # Directory containing tnsnames.ora, sqlnet.ora, and wallet files
useOCI: true
```
{{< notice tip >}}
@@ -97,14 +156,15 @@ instead of hardcoding your secrets into the configuration file.
## Reference
| **field** | **type** | **required** | **description** |
|------------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "oracle". |
| user | string | true | Name of the Oracle user to connect as (e.g. "my-oracle-user"). |
| password | string | true | Password of the Oracle user (e.g. "my-password"). |
| host | string | false | IP address or hostname to connect to (e.g. "127.0.0.1"). Required if not using `connectionString` or `tnsAlias`. |
| port | integer | false | Port to connect to (e.g. "1521"). Required if not using `connectionString` or `tnsAlias`. |
| serviceName | string | false | The Oracle service name of the database to connect to. Required if not using `connectionString` or `tnsAlias`. |
| connectionString | string | false | A direct connection string (e.g. "hostname:port/servicename"). Use as an alternative to `host`, `port`, and `serviceName`. |
| tnsAlias | string | false | A TNS alias from a `tnsnames.ora` file. Use as an alternative to `host`/`port` or `connectionString`. |
| tnsAdmin | string | false | Path to the directory containing the `tnsnames.ora` file. This overrides the `TNS_ADMIN` environment variable if it is set. |
| **field** | **type** | **required** | **description** |
|------------------|:--------:|:------------:|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| kind | string | true | Must be "oracle". |
| user | string | true | Name of the Oracle user to connect as (e.g. "my-oracle-user"). |
| password | string | true | Password of the Oracle user (e.g. "my-password"). |
| host | string | false | IP address or hostname to connect to (e.g. "127.0.0.1"). Required if not using `connectionString` or `tnsAlias`. |
| port | integer | false | Port to connect to (e.g. "1521"). Required if not using `connectionString` or `tnsAlias`. |
| serviceName | string | false | The Oracle service name of the database to connect to. Required if not using `connectionString` or `tnsAlias`. |
| connectionString | string | false | A direct connection string (e.g. "hostname:port/servicename"). Use as an alternative to `host`, `port`, and `serviceName`. |
| tnsAlias | string | false | A TNS alias from a `tnsnames.ora` file. Use as an alternative to `host`/`port` or `connectionString`. |
| tnsAdmin | string | false | Path to the directory containing the `tnsnames.ora` file. This overrides the `TNS_ADMIN` environment variable if it is set. |
| useOCI | bool | false | If true, uses the OCI-based driver (godror) which supports Oracle Wallet/Kerberos but requires the Oracle Instant Client libraries to be installed. Defaults to false (pure Go driver). |

View File

@@ -68,6 +68,25 @@ reputation for reliability, feature robustness, and performance.
- [`postgres-get-column-cardinality`](../tools/postgres/postgres-get-column-cardinality.md)
List cardinality of columns in a table in a PostgreSQL database.
- [`postgres-list-table-stats`](../tools/postgres/postgres-list-table-stats.md)
List statistics of a table in a PostgreSQL database.
- [`postgres-list-publication-tables`](../tools/postgres/postgres-list-publication-tables.md)
List publication tables in a PostgreSQL database.
- [`postgres-list-tablespaces`](../tools/postgres/postgres-list-tablespaces.md)
List tablespaces in a PostgreSQL database.
- [`postgres-list-pg-settings`](../tools/postgres/postgres-list-pg-settings.md)
List configuration parameters for the PostgreSQL server.
- [`postgres-list-database-stats`](../tools/postgres/postgres-list-database-stats.md)
Lists the key performance and activity statistics for each database in the PostgreSQL
server.
- [`postgres-list-roles`](../tools/postgres/postgres-list-roles.md)
Lists all the user-created roles in a PostgreSQL database.
### Pre-built Configurations
- [PostgreSQL using MCP](https://googleapis.github.io/genai-toolbox/how-to/connect-ide/postgres_mcp/)

View File

@@ -21,6 +21,10 @@ Apache Spark.
Get a Serverless Spark batch.
- [`serverless-spark-cancel-batch`](../tools/serverless-spark/serverless-spark-cancel-batch.md)
Cancel a running Serverless Spark batch operation.
- [`serverless-spark-create-pyspark-batch`](../tools/serverless-spark/serverless-spark-create-pyspark-batch.md)
Create a Serverless Spark PySpark batch operation.
- [`serverless-spark-create-spark-batch`](../tools/serverless-spark/serverless-spark-create-spark-batch.md)
Create a Serverless Spark Java batch operation.
## Requirements

View File

@@ -0,0 +1,7 @@
---
title: "Gemini Data Analytics"
type: docs
weight: 1
description: >
Tools for Gemini Data Analytics.
---

View File

@@ -0,0 +1,92 @@
---
title: "Gemini Data Analytics QueryData"
type: docs
weight: 1
description: >
A tool to convert natural language queries into SQL statements using the Gemini Data Analytics QueryData API.
aliases:
- /resources/tools/cloud-gemini-data-analytics-query
---
## About
The `cloud-gemini-data-analytics-query` tool allows you to send natural language questions to the Gemini Data Analytics API and receive structured responses containing SQL queries, natural language answers, and explanations. For details on defining data agent context for database data sources, see the official [documentation](https://docs.cloud.google.com/gemini/docs/conversational-analytics-api/data-agent-authored-context-databases).
## Example
```yaml
tools:
my-gda-query-tool:
kind: cloud-gemini-data-analytics-query
source: my-gda-source
description: "Use this tool to send natural language queries to the Gemini Data Analytics API and receive SQL, natural language answers, and explanations."
location: ${your_database_location}
context:
datasourceReferences:
cloudSqlReference:
databaseReference:
projectId: "${your_project_id}"
region: "${your_database_instance_region}"
instanceId: "${your_database_instance_id}"
databaseId: "${your_database_name}"
engine: "POSTGRESQL"
agentContextReference:
contextSetId: "${your_context_set_id}" # E.g. projects/${project_id}/locations/${context_set_location}/contextSets/${context_set_id}
generationOptions:
generateQueryResult: true
generateNaturalLanguageAnswer: true
generateExplanation: true
generateDisambiguationQuestion: true
```
### Usage Flow
When using this tool, a `prompt` parameter containing a natural language query is supplied (typically by an agent). The tool then interacts with the Gemini Data Analytics API using the context defined in your configuration.
The structure of the response depends on the `generationOptions` configured in your tool definition (e.g., enabling `generateQueryResult` will include the SQL query results).
See the [Data Analytics API REST documentation](https://cloud.google.com/gemini/docs/conversational-analytics-api/reference/rest/v1alpha/projects.locations/queryData) for details.
**Example Input Prompt:**
```text
How many accounts who have region in Prague are eligible for loans? A3 contains the data of region.
```
**Example API Response:**
```json
{
"generatedQuery": "SELECT COUNT(T1.account_id) FROM account AS T1 INNER JOIN loan AS T2 ON T1.account_id = T2.account_id INNER JOIN district AS T3 ON T1.district_id = T3.district_id WHERE T3.A3 = 'Prague'",
"intentExplanation": "I found a template that matches the user's question. The template asks about the number of accounts who have region in a given city and are eligible for loans. The question asks about the number of accounts who have region in Prague and are eligible for loans. The template's parameterized SQL is 'SELECT COUNT(T1.account_id) FROM account AS T1 INNER JOIN loan AS T2 ON T1.account_id = T2.account_id INNER JOIN district AS T3 ON T1.district_id = T3.district_id WHERE T3.A3 = ?'. I will replace the named parameter '?' with 'Prague'.",
"naturalLanguageAnswer": "There are 84 accounts from the Prague region that are eligible for loans.",
"queryResult": {
"columns": [
{
"type": "INT64"
}
],
"rows": [
{
"values": [
{
"value": "84"
}
]
}
],
"totalRowCount": "1"
}
}
```
## Reference
| **field** | **type** | **required** | **description** |
| ----------------- | :------: | :----------: | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| kind | string | true | Must be "cloud-gemini-data-analytics-query". |
| source | string | true | The name of the `cloud-gemini-data-analytics` source to use. |
| description | string | true | A description of the tool's purpose. |
| location | string | true | The Google Cloud location of the target database resource (e.g., "us-central1"). This is used to construct the parent resource name in the API call. |
| context | object | true | The context for the query, including datasource references. See [QueryDataContext](https://github.com/googleapis/googleapis/blob/b32495a713a68dd0dff90cf0b24021debfca048a/google/cloud/geminidataanalytics/v1beta/data_chat_service.proto#L156) for details. |
| generationOptions | object | false | Options for generating the response. See [GenerationOptions](https://github.com/googleapis/googleapis/blob/b32495a713a68dd0dff90cf0b24021debfca048a/google/cloud/geminidataanalytics/v1beta/data_chat_service.proto#L135) for details. |

View File

@@ -10,27 +10,18 @@ aliases:
## About
The `looker-add-dashboard-element` creates a dashboard element
in the given dashboard.
The `looker-add-dashboard-element` tool creates a new tile (element) within an existing Looker dashboard.
Tiles are added in the order this tool is called for a given `dashboard_id`.
CRITICAL ORDER OF OPERATIONS:
1. Create the dashboard using `make_dashboard`.
2. Add any dashboard-level filters using `add_dashboard_filter`.
3. Then, add elements (tiles) using this tool.
It's compatible with the following sources:
- [looker](../../sources/looker.md)
`looker-add-dashboard-element` takes eleven parameters:
1. the `model`
2. the `explore`
3. the `fields` list
4. an optional set of `filters`
5. an optional set of `pivots`
6. an optional set of `sorts`
7. an optional `limit`
8. an optional `tz`
9. an optional `vis_config`
10. the `title`
11. the `dashboard_id`
## Example
```yaml
@@ -39,24 +30,37 @@ tools:
kind: looker-add-dashboard-element
source: looker-source
description: |
add_dashboard_element Tool
This tool creates a new tile (element) within an existing Looker dashboard.
Tiles are added in the order this tool is called for a given `dashboard_id`.
This tool creates a new tile in a Looker dashboard using
the query parameters and the vis_config specified.
CRITICAL ORDER OF OPERATIONS:
1. Create the dashboard using `make_dashboard`.
2. Add any dashboard-level filters using `add_dashboard_filter`.
3. Then, add elements (tiles) using this tool.
Most of the parameters are the same as the query_url
tool. In addition, there is a title that may be provided.
The dashboard_id must be specified. That is obtained
from calling make_dashboard.
Required Parameters:
- dashboard_id: The ID of the target dashboard, obtained from `make_dashboard`.
- model_name, explore_name, fields: These query parameters are inherited
from the `query` tool and are required to define the data for the tile.
This tool can be called many times for one dashboard_id
and the resulting tiles will be added in order.
Optional Parameters:
- title: An optional title for the dashboard tile.
- pivots, filters, sorts, limit, query_timezone: These query parameters are
inherited from the `query` tool and can be used to customize the tile's query.
- vis_config: A JSON object defining the visualization settings for this tile.
The structure and options are the same as for the `query_url` tool's `vis_config`.
Connecting to Dashboard Filters:
A dashboard element can be connected to one or more dashboard filters (created with
`add_dashboard_filter`). To do this, specify the `name` of the dashboard filter
and the `field` from the element's query that the filter should apply to.
The format for specifying the field is `view_name.field_name`.
```
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind | string | true | Must be "looker-add-dashboard-element" |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |
|:------------|:--------:|:------------:|----------------------------------------------------|
| kind | string | true | Must be "looker-add-dashboard-element". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -0,0 +1,75 @@
---
title: "looker-add-dashboard-filter"
type: docs
weight: 1
description: >
The "looker-add-dashboard-filter" tool adds a filter to a specified dashboard.
aliases:
- /resources/tools/looker-add-dashboard-filter
---
## About
The `looker-add-dashboard-filter` tool adds a filter to a specified Looker dashboard.
CRITICAL ORDER OF OPERATIONS:
1. Create a dashboard using `make_dashboard`.
2. Add all desired filters using this tool (`add_dashboard_filter`).
3. Finally, add dashboard elements (tiles) using `add_dashboard_element`.
It's compatible with the following sources:
- [looker](../../sources/looker.md)
## Parameters
| **parameter** | **type** | **required** | **default** | **description** |
|:----------------------|:--------:|:-----------------:|:--------------:|-------------------------------------------------------------------------------------------------------------------------------|
| dashboard_id | string | true | none | The ID of the dashboard to add the filter to, obtained from `make_dashboard`. |
| name | string | true | none | A unique internal identifier for the filter. This name is used later in `add_dashboard_element` to bind tiles to this filter. |
| title | string | true | none | The label displayed to users in the Looker UI. |
| filter_type            | string   | true              | `field_filter` | The type of filter. Can be `date_filter`, `number_filter`, `string_filter`, or `field_filter`.                                 |
| default_value | string | false | none | The initial value for the filter. |
| model | string | if `field_filter` | none | The name of the LookML model, obtained from `get_models`. |
| explore | string | if `field_filter` | none | The name of the explore within the model, obtained from `get_explores`. |
| dimension | string | if `field_filter` | none | The name of the field (e.g., `view_name.field_name`) to base the filter on, obtained from `get_dimensions`. |
| allow_multiple_values  | boolean  | false             | true           | Whether the dashboard filter should allow multiple values.                                                                     |
| required               | boolean  | false             | false          | Whether the dashboard filter is required to run the dashboard.                                                                 |
## Example
```yaml
tools:
add_dashboard_filter:
kind: looker-add-dashboard-filter
source: looker-source
description: |
This tool adds a filter to a Looker dashboard.
CRITICAL ORDER OF OPERATIONS:
1. Create a dashboard using `make_dashboard`.
2. Add all desired filters using this tool (`add_dashboard_filter`).
3. Finally, add dashboard elements (tiles) using `add_dashboard_element`.
Parameters:
- dashboard_id (required): The ID from `make_dashboard`.
- name (required): A unique internal identifier for the filter. You will use this `name` later in `add_dashboard_element` to bind tiles to this filter.
- title (required): The label displayed to users in the UI.
- filter_type (required): One of `date_filter`, `number_filter`, `string_filter`, or `field_filter`.
- default_value (optional): The initial value for the filter.
Field Filters (`filter_type: field_filter`):
If creating a field filter, you must also provide:
- model
- explore
- dimension
The filter will inherit suggestions and type information from this LookML field.
```
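
For illustration, the parameter values an agent might pass when creating a field filter could look like the sketch below; the model, explore, and dimension names are made up, and the exact call envelope depends on the client:

```yaml
# Hypothetical parameter values for a field filter (names are illustrative only)
dashboard_id: "42"            # from make_dashboard
name: "state_filter"          # referenced later by add_dashboard_element
title: "State"
filter_type: field_filter
default_value: "California"
model: "ecommerce"            # from get_models
explore: "orders"             # from get_explores
dimension: "users.state"      # from get_dimensions
```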
## Reference
| **field** | **type** | **required** | **description** |
|-------------|:--------:|:------------:|----------------------------------------------------|
| kind | string | true | Must be "looker-add-dashboard-filter". |
| source | string | true | Name of the source the SQL should execute on. |
| description | string | true | Description of the tool that is passed to the LLM. |

View File

@@ -34,9 +34,10 @@ tools:
kind: looker-conversational-analytics
source: looker-source
description: |
Use this tool to perform data analysis, get insights,
or answer complex questions about the contents of specific
Looker explores.
Use this tool to ask questions about your data using the Looker Conversational
Analytics API. You must provide a natural language query and a list of
1 to 5 model and explore combinations (e.g. [{'model': 'the_model', 'explore': 'the_explore'}]).
Use the 'get_models' and 'get_explores' tools to discover available models and explores.
```
## Reference

View File

@@ -27,13 +27,18 @@ tools:
kind: looker-create-project-file
source: looker-source
description: |
create_project_file Tool
This tool creates a new LookML file within a specified project, populating
it with the provided content.
Given a project_id and a file path within the project, as well as the content
of a LookML file, this tool will create a new file within the project.
Prerequisite: The Looker session must be in Development Mode. Use `dev_mode: true` first.
This tool must be called after the dev_mode tool has changed the session to
dev mode.
Parameters:
- project_id (required): The unique ID of the LookML project.
- file_path (required): The desired path and filename for the new file within the project.
- content (required): The full LookML content to write into the new file.
Output:
A confirmation message upon successful file creation.
```
## Reference

View File

@@ -26,13 +26,17 @@ tools:
kind: looker-delete-project-file
source: looker-source
description: |
delete_project_file Tool
This tool permanently deletes a specified LookML file from within a project.
Use with caution, as this action cannot be undone through the API.
Given a project_id and a file path within the project, this tool will delete
the file from the project.
Prerequisite: The Looker session must be in Development Mode. Use `dev_mode: true` first.
This tool must be called after the dev_mode tool has changed the session to
dev mode.
Parameters:
- project_id (required): The unique ID of the LookML project.
- file_path (required): The exact path to the LookML file to delete within the project.
Output:
A confirmation message upon successful file deletion.
```
## Reference

View File

@@ -27,10 +27,13 @@ tools:
kind: looker-dev-mode
source: looker-source
description: |
dev_mode Tool
This tool allows toggling the Looker IDE session between Development Mode and Production Mode.
Development Mode enables making and testing changes to LookML projects.
Passing true to this tool switches the session to dev mode. Passing false to this tool switches the
session to production mode.
Parameters:
- enable (required): A boolean value.
- `true`: Switches the current session to Development Mode.
- `false`: Switches the current session to Production Mode.
```
## Reference

View File

@@ -36,11 +36,17 @@ tools:
kind: looker-generate-embed-url
source: looker-source
description: |
generate_embed_url Tool
This tool generates a signed, private embed URL for specific Looker content,
allowing users to access it directly.
This tool generates an embeddable URL for Looker content.
You need to provide the type of content (e.g., 'dashboards', 'looks', 'query-visualization')
and the ID of the content.
Parameters:
- type (required): The type of content to embed. Common values include:
- `dashboards`
- `looks`
- `explore`
- id (required): The unique identifier for the content.
- For dashboards and looks, use the numeric ID (e.g., "123").
- For explores, use the format "model_name/explore_name".
```
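
For example, the two parameters might be supplied with values like these (IDs are placeholders; the call envelope depends on the client):

```yaml
# Hypothetical parameter values (illustrative only)
type: dashboards      # or looks, or explore
id: "123"             # numeric ID for dashboards/looks; "model_name/explore_name" for explores
```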
## Reference

View File

@@ -26,10 +26,16 @@ tools:
kind: looker-get-connection-databases
source: looker-source
description: |
get_connection_databases Tool
This tool retrieves a list of databases available through a specified Looker connection.
This is only applicable for connections that support multiple databases.
Use `get_connections` to check if a connection supports multiple databases.
This tool will list the databases available from a connection if the connection
supports multiple databases.
Parameters:
- connection_name (required): The name of the database connection, obtained from `get_connections`.
Output:
A JSON array of strings, where each string is the name of an available database.
If the connection does not support multiple databases, an empty list or an error will be returned.
```
## Reference

View File

@@ -26,10 +26,16 @@ tools:
kind: looker-get-connection-schemas
source: looker-source
description: |
get_connection_schemas Tool
This tool retrieves a list of database schemas available through a specified
Looker connection.
This tool will list the schemas available from a connection, filtered by
an optional database name.
Parameters:
- connection_name (required): The name of the database connection, obtained from `get_connections`.
- database (optional): An optional database name to filter the schemas.
Only applicable for connections that support multiple databases.
Output:
A JSON array of strings, where each string is the name of an available schema.
```
## Reference

View File

@@ -26,11 +26,20 @@ tools:
kind: looker-get-connection-table-columns
source: looker-source
description: |
get_connection_table_columns Tool
This tool retrieves a list of columns for one or more specified tables within a
given database schema and connection.
This tool will list the columns available from a connection, for all the tables
given in a comma separated list of table names, filtered by the
schema name and optional database name.
Parameters:
- connection_name (required): The name of the database connection, obtained from `get_connections`.
- schema (required): The name of the schema where the tables reside, obtained from `get_connection_schemas`.
- tables (required): A comma-separated string of table names for which to retrieve columns
(e.g., "users,orders,products"), obtained from `get_connection_tables`.
- database (optional): The name of the database to filter by. Only applicable for connections
that support multiple databases (check with `get_connections`).
Output:
A JSON array of objects, where each object represents a column and contains details
such as `table_name`, `column_name`, `data_type`, and `is_nullable`.
```
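
A hypothetical set of parameter values, using made-up connection and table names, might look like:

```yaml
# Hypothetical parameter values (illustrative only)
connection_name: "warehouse"      # from get_connections
schema: "public"                  # from get_connection_schemas
tables: "users,orders,products"   # comma-separated list, from get_connection_tables
# database: "analytics"           # optional, only for multi-database connections
```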
## Reference

View File

@@ -27,10 +27,17 @@ tools:
kind: looker-get-connection-tables
source: looker-source
description: |
get_connection_tables Tool
This tool retrieves a list of tables available within a specified database schema
through a Looker connection.
This tool will list the tables available from a connection, filtered by the
schema name and optional database name.
Parameters:
- connection_name (required): The name of the database connection, obtained from `get_connections`.
- schema (required): The name of the schema to list tables from, obtained from `get_connection_schemas`.
- database (optional): The name of the database to filter by. Only applicable for connections
that support multiple databases (check with `get_connections`).
Output:
A JSON array of strings, where each string is the name of an available table.
```
## Reference

View File

@@ -26,11 +26,18 @@ tools:
kind: looker-get-connections
source: looker-source
description: |
get_connections Tool
This tool retrieves a list of all database connections configured in the Looker system.
This tool will list all the connections available in the Looker system, as
well as the dialect name, the default schema, the database if applicable,
and whether the connection supports multiple databases.
Parameters:
This tool takes no parameters.
Output:
A JSON array of objects, each representing a database connection and including details such as:
- `name`: The connection's unique identifier.
- `dialect`: The database dialect (e.g., "mysql", "postgresql", "bigquery").
- `default_schema`: The default schema for the connection.
- `database`: The associated database name (if applicable).
- `supports_multiple_databases`: A boolean indicating if the connection can access multiple databases.
```
## Reference

View File

@@ -29,25 +29,29 @@ default to 100 and 0.
```yaml
tools:
get_dashboards:
kind: looker-get-dashboards
source: looker-source
description: |
get_dashboards Tool
This tool is used to search for saved dashboards in a Looker instance.
String search params use case-insensitive matching. String search
params can contain % and '_' as SQL LIKE pattern match wildcard
expressions. example="dan%" will match "danger" and "Danzig" but
not "David" example="D_m%" will match "Damage" and "dump".
Most search params can accept "IS NULL" and "NOT NULL" as special
expressions to match or exclude (respectively) rows where the
column is null.
The limit and offset are used to paginate the results.
The result of the get_dashboards tool is a list of json objects.
get_dashboards:
kind: looker-get-dashboards
source: looker-source
description: |
This tool searches for saved dashboards in a Looker instance. It returns a list of JSON objects, each representing a dashboard.
Search Parameters:
- title (optional): Filter by dashboard title (supports wildcards).
- folder_id (optional): Filter by the ID of the folder where the dashboard is saved.
- user_id (optional): Filter by the ID of the user who created the dashboard.
- description (optional): Filter by description content (supports wildcards).
- id (optional): Filter by specific dashboard ID.
- limit (optional): Maximum number of results to return. Defaults to a system limit.
- offset (optional): Starting point for pagination.
String Search Behavior:
- Case-insensitive matching.
- Supports SQL LIKE pattern match wildcards:
- `%`: Matches any sequence of zero or more characters. (e.g., `"finan%"` matches "financial", "finance")
- `_`: Matches any single character. (e.g., `"s_les"` matches "sales")
- Special expressions for null checks:
- `"IS NULL"`: Matches dashboards where the field is null.
- `"NOT NULL"`: Excludes dashboards where the field is null.
```
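
As a sketch of how the search parameters combine, an agent looking for finance dashboards in a particular folder might pass values like these (all values are made up):

```yaml
# Hypothetical search parameter values (illustrative only)
title: "finan%"       # SQL LIKE wildcard: matches "financial", "finance", ...
folder_id: "17"
limit: 25
offset: 0
```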
## Reference

View File

@@ -28,16 +28,20 @@ tools:
kind: looker-get-dimensions
source: looker-source
description: |
The get_dimensions tool retrieves the list of dimensions defined in
an explore.
This tool retrieves a list of dimensions defined within a specific Looker explore.
Dimensions are non-aggregatable attributes or characteristics of your data
(e.g., product name, order date, customer city) that can be used for grouping,
filtering, or segmenting query results.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
Parameters:
- model_name (required): The name of the LookML model, obtained from `get_models`.
- explore_name (required): The name of the explore within the model, obtained from `get_explores`.
If this returns a suggestions field for a dimension, the contents of suggestions
can be used as filters for this field. If this returns a suggest_explore and
suggest_dimension, a query against that explore and dimension can be used to find
valid filters for this field.
Output Details:
- If a dimension includes a `suggestions` field, its contents are valid values
that can be used directly as filters for that dimension.
- If a `suggest_explore` and `suggest_dimension` are provided, you can query
that specified explore and dimension to retrieve a list of valid filter values.
```

View File

@@ -40,10 +40,13 @@ tools:
kind: looker-get-explores
source: looker-source
description: |
The get_explores tool retrieves the list of explores defined in a LookML model
in the Looker system.
This tool retrieves a list of explores defined within a specific LookML model.
Explores represent a curated view of your data, typically joining several
tables together to allow for focused analysis on a particular subject area.
The output provides details like the explore's `name` and `label`.
It takes one parameter, the model_name looked up from get_models.
Parameters:
- model_name (required): The name of the LookML model, obtained from `get_models`.
```
## Reference

View File

@@ -24,15 +24,22 @@ It's compatible with the following sources:
```yaml
tools:
get_dimensions:
get_filters:
kind: looker-get-filters
source: looker-source
description: |
The get_filters tool retrieves the list of filters defined in
an explore.
This tool retrieves a list of "filter-only fields" defined within a specific
Looker explore. These are special fields defined in LookML specifically to
create user-facing filter controls that do not directly affect the `GROUP BY`
clause of the SQL query. They are often used in conjunction with liquid templating
to create dynamic queries.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
Note: Regular dimensions and measures can also be used as filters in a query.
This tool *only* returns fields explicitly defined as `filter:` in LookML.
Parameters:
- model_name (required): The name of the LookML model, obtained from `get_models`.
- explore_name (required): The name of the explore within the model, obtained from `get_explores`.
```
The response is a json array with the following elements:

View File

@@ -34,21 +34,26 @@ tools:
kind: looker-get-looks
source: looker-source
description: |
get_looks Tool
This tool searches for saved Looks (pre-defined queries and visualizations)
in a Looker instance. It returns a list of JSON objects, each representing a Look.
This tool is used to search for saved looks in a Looker instance.
String search params use case-insensitive matching. String search
params can contain % and '_' as SQL LIKE pattern match wildcard
expressions. example="dan%" will match "danger" and "Danzig" but
not "David" example="D_m%" will match "Damage" and "dump".
Search Parameters:
- title (optional): Filter by Look title (supports wildcards).
- folder_id (optional): Filter by the ID of the folder where the Look is saved.
- user_id (optional): Filter by the ID of the user who created the Look.
- description (optional): Filter by description content (supports wildcards).
- id (optional): Filter by specific Look ID.
- limit (optional): Maximum number of results to return. Defaults to a system limit.
- offset (optional): Starting point for pagination.
Most search params can accept "IS NULL" and "NOT NULL" as special
expressions to match or exclude (respectively) rows where the
column is null.
The limit and offset are used to paginate the results.
The result of the get_looks tool is a list of json objects.
String Search Behavior:
- Case-insensitive matching.
- Supports SQL LIKE pattern match wildcards:
- `%`: Matches any sequence of zero or more characters. (e.g., `"dan%"` matches "danger", "Danzig")
- `_`: Matches any single character. (e.g., `"D_m%"` matches "Damage", "dump")
- Special expressions for null checks:
- `"IS NULL"`: Matches Looks where the field is null.
- `"NOT NULL"`: Excludes Looks where the field is null.
```
## Reference

View File

@@ -28,16 +28,19 @@ tools:
kind: looker-get-measures
source: looker-source
description: |
The get_measures tool retrieves the list of measures defined in
an explore.
This tool retrieves a list of measures defined within a specific Looker explore.
Measures are aggregatable metrics (e.g., total sales, average price, count of users)
that are used for calculations and quantitative analysis in your queries.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
Parameters:
- model_name (required): The name of the LookML model, obtained from `get_models`.
- explore_name (required): The name of the explore within the model, obtained from `get_explores`.
If this returns a suggestions field for a measure, the contents of suggestions
can be used as filters for this field. If this returns a suggest_explore and
suggest_dimension, a query against that explore and dimension can be used to find
valid filters for this field.
Output Details:
- If a measure includes a `suggestions` field, its contents are valid values
that can be used directly as filters for that measure.
- If a `suggest_explore` and `suggest_dimension` are provided, you can query
that specified explore and dimension to retrieve a list of valid filter values.
```

View File

@@ -26,9 +26,12 @@ tools:
kind: looker-get-models
source: looker-source
description: |
The get_models tool retrieves the list of LookML models in the Looker system.
This tool retrieves a list of available LookML models in the Looker instance.
LookML models define the data structure and relationships that users can query.
The output includes details like the model's `name` and `label`, which are
essential for subsequent calls to tools like `get_explores` or `query`.
It takes no parameters.
This tool takes no parameters.
```
## Reference

View File

@@ -28,11 +28,15 @@ tools:
kind: looker-get-parameters
source: looker-source
description: |
The get_parameters tool retrieves the list of parameters defined in
an explore.
This tool retrieves a list of parameters defined within a specific Looker explore.
LookML parameters are dynamic input fields that allow users to influence query
behavior without directly modifying the underlying LookML. They are often used
with `liquid` templating to create flexible dashboards and reports, enabling
users to choose dimensions, measures, or other query components at runtime.
It takes two parameters, the model_name looked up from get_models and the
explore_name looked up from get_explores.
Parameters:
- model_name (required): The name of the LookML model, obtained from `get_models`.
- explore_name (required): The name of the explore within the model, obtained from `get_explores`.
```
The response is a json array with the following elements:

View File

@@ -26,10 +26,15 @@ tools:
kind: looker-get-project-file
source: looker-source
description: |
get_project_file Tool
This tool retrieves the raw content of a specific LookML file from within a project.
Given a project_id and a file path within the project, this tool returns
the contents of the LookML file.
Parameters:
- project_id (required): The unique ID of the LookML project, obtained from `get_projects`.
- file_path (required): The path to the LookML file within the project,
typically obtained from `get_project_files`.
Output:
The raw text content of the specified LookML file.
```
## Reference

View File

@@ -26,10 +26,15 @@ tools:
kind: looker-get-project-files
source: looker-source
description: |
get_project_files Tool
This tool retrieves a list of all LookML files within a specified project,
providing details about each file.
Given a project_id this tool returns the details about
the LookML files that make up that project.
Parameters:
- project_id (required): The unique ID of the LookML project, obtained from `get_projects`.
Output:
A JSON array of objects, each representing a LookML file and containing
details such as `path`, `id`, `type`, and `git_status`.
```
## Reference

View File

@@ -26,10 +26,16 @@ tools:
kind: looker-get-projects
source: looker-source
description: |
get_projects Tool
This tool retrieves a list of all LookML projects available on the Looker instance.
It is useful for identifying projects before performing actions like retrieving
project files or making modifications.
This tool returns the project_id and project_name for
all the LookML projects on the looker instance.
Parameters:
This tool takes no parameters.
Output:
A JSON array of objects, each containing the `project_id` and `project_name`
for a LookML project.
```
## Reference

View File

@@ -42,17 +42,18 @@ tools:
kind: looker-health-analyze
source: looker-source
description: |
health-analyze Tool
This tool calculates the usage statistics for Looker projects, models, and explores.
This tool calculates the usage of projects, models and explores.
Parameters:
- action (required): The type of resource to analyze. Can be `"projects"`, `"models"`, or `"explores"`.
- project (optional): The specific project ID to analyze.
- model (optional): The specific model name to analyze. Requires `project` if used without `explore`.
- explore (optional): The specific explore name to analyze. Requires `model` if used.
- timeframe (optional): The lookback period in days for usage data. Defaults to `90` days.
- min_queries (optional): The minimum number of queries for a resource to be considered active. Defaults to `1`.
It accepts 6 parameters:
1. `action`: can be "projects", "models", or "explores"
2. `project`: the project to analyze (optional)
3. `model`: the model to analyze (optional)
4. `explore`: the explore to analyze (optional)
5. `timeframe`: the lookback period in days, default is 90
6. `min_queries`: the minimum number of queries to consider a resource as active, default is 1
Output:
The result is a JSON object containing usage metrics for the specified resources.
```
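
A hypothetical invocation analyzing a single explore over the last 30 days could use parameter values like these (the project, model, and explore names are made up):

```yaml
# Hypothetical parameter values (illustrative only)
action: explores
project: "ecommerce"
model: "orders"
explore: "order_items"
timeframe: 30      # lookback window in days (default 90)
min_queries: 5     # treat anything below 5 queries as inactive (default 1)
```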
## Reference

View File

@@ -49,20 +49,22 @@ tools:
kind: looker-health-pulse
source: looker-source
description: |
health-pulse Tool
This tool performs various health checks on a Looker instance.
This tool takes the pulse of a Looker instance by taking
one of the following actions:
1. `check_db_connections`,
2. `check_dashboard_performance`,
3. `check_dashboard_errors`,
4. `check_explore_performance`,
5. `check_schedule_failures`, or
6. `check_legacy_features`
The `check_legacy_features` action is only available in Looker Core. If
it is called on a Looker Core instance, you will get a notice. That notice
should not be reported as an error.
Parameters:
- action (required): Specifies the type of health check to perform.
Choose one of the following:
- `check_db_connections`: Verifies database connectivity.
- `check_dashboard_performance`: Assesses dashboard loading performance.
- `check_dashboard_errors`: Identifies errors within dashboards.
- `check_explore_performance`: Evaluates explore query performance.
- `check_schedule_failures`: Reports on failed scheduled deliveries.
- `check_legacy_features`: Checks for the usage of legacy features.
Note on `check_legacy_features`:
This action is exclusively available in Looker Core instances. If invoked
on a non-Looker Core instance, it will return a notice rather than an error.
This notice should be considered normal behavior and not an indication of an issue.
```
## Reference

View File

@@ -39,20 +39,19 @@ tools:
kind: looker-health-vacuum
source: looker-source
description: |
health-vacuum Tool
This tool identifies and suggests LookML models or explores that can be
safely removed due to inactivity or low usage.
This tool suggests models or explores that can be removed
because they are unused.
Parameters:
- action (required): The type of resource to analyze for removal candidates. Can be `"models"` or `"explores"`.
- project (optional): The specific project ID to consider.
- model (optional): The specific model name to consider. Requires `project` if used without `explore`.
- explore (optional): The specific explore name to consider. Requires `model` if used.
- timeframe (optional): The lookback period in days to assess usage. Defaults to `90` days.
- min_queries (optional): The minimum number of queries for a resource to be considered active. Defaults to `1`.
It accepts 6 parameters:
1. `action`: can be "models" or "explores"
2. `project`: the project to vacuum (optional)
3. `model`: the model to vacuum (optional)
4. `explore`: the explore to vacuum (optional)
5. `timeframe`: the lookback period in days, default is 90
6. `min_queries`: the minimum number of queries to consider a resource as active, default is 1
The result is a list of objects that are candidates for deletion.
Output:
A JSON array of objects, each representing a model or explore that is a candidate for deletion due to low usage.
```
| **field** | **type** | **required** | **description** |

View File

@@ -30,18 +30,19 @@ tools:
kind: looker-make-dashboard
source: looker-source
description: |
make_dashboard Tool
This tool creates a new, empty dashboard in Looker. Dashboards are stored
in the user's personal folder, and the dashboard name must be unique.
After creation, use `add_dashboard_filter` to add filters and
`add_dashboard_element` to add content tiles.
This tool creates a new dashboard in Looker. The dashboard is
initially empty and the add_dashboard_element tool is used to
add content to the dashboard.
Required Parameters:
- title (required): A unique title for the new dashboard.
- description (required): A brief description of the dashboard's purpose.
The newly created dashboard will be created in the user's
personal folder in looker. The dashboard name must be unique.
The result is a json document with a link to the newly
created dashboard and the id of the dashboard. Use the id
when calling add_dashboard_element.
Output:
A JSON object containing a link (`url`) to the newly created dashboard and
its unique `id`. This `dashboard_id` is crucial for subsequent calls to
`add_dashboard_filter` and `add_dashboard_element`.
```
## Reference

View File

@@ -40,20 +40,24 @@ tools:
kind: looker-make-look
source: looker-source
description: |
make_look Tool
This tool creates a new Look (saved query with visualization) in Looker.
The Look will be saved in the user's personal folder, and its name must be unique.
This tool creates a new look in Looker, using the query
parameters and the vis_config specified.
Required Parameters:
- title: A unique title for the new Look.
- description: A brief description of the Look's purpose.
- model_name: The name of the LookML model (from `get_models`).
- explore_name: The name of the explore (from `get_explores`).
- fields: A list of field names (dimensions, measures, filters, or parameters) to include in the query.
Most of the parameters are the same as the query_url
tool. In addition, there is a title and a description
that must be provided.
Optional Parameters:
- pivots, filters, sorts, limit, query_timezone: These parameters are identical
to those described for the `query` tool.
- vis_config: A JSON object defining the visualization settings for the Look.
The structure and options are the same as for the `query_url` tool's `vis_config`.
The newly created look will be created in the user's
personal folder in looker. The look name must be unique.
The result is a json document with a link to the newly
created look.
Output:
A JSON object containing a link (`url`) to the newly created Look, along with its `id` and `slug`.
```
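
For illustration, the parameter values for a simple Look might look like the sketch below; the model, explore, and field names are made up, and the `vis_config` shown is only one possible shape:

```yaml
# Hypothetical parameter values (illustrative only)
title: "Monthly Revenue by City"
description: "Total revenue per city over the last 12 months."
model_name: "ecommerce"            # from get_models
explore_name: "orders"             # from get_explores
fields:
  - users.city
  - orders.total_revenue
sorts:
  - "orders.total_revenue desc"
limit: 100
vis_config:
  type: looker_column              # assumed vis type; structure follows query_url's vis_config
```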
## Reference

View File

@@ -41,38 +41,17 @@ tools:
kind: looker-query-sql
source: looker-source
description: |
Query SQL Tool
This tool generates the underlying SQL query that Looker would execute
against the database for a given set of parameters. It is useful for
understanding how Looker translates a request into SQL.
This tool is used to generate a sql query against the LookML model. The
model, explore, and fields list must be specified. Pivots,
filters and sorts are optional.
Parameters:
All parameters for this tool are identical to those of the `query` tool.
This includes `model_name`, `explore_name`, `fields` (required),
and optional parameters like `pivots`, `filters`, `sorts`, `limit`, and `query_timezone`.
The model can be found from the get_models tool. The explore
can be found from the get_explores tool passing in the model.
The fields can be found from the get_dimensions, get_measures,
get_filters, and get_parameters tools, passing in the model
and the explore.
Provide a model_id and explore_name, then a list
of fields. Optionally a list of pivots can be provided.
The pivots must also be included in the fields list.
Filters are provided as a map of {"field.id": "condition",
"field.id2": "condition2", ...}. Do not put the field.id in
quotes. Filter expressions can be found at
https://cloud.google.com/looker/docs/filter-expressions.
Sorts can be specified like [ "field.id desc 0" ].
An optional row limit can be added. If not provided the limit
will default to 500. "-1" can be specified for unlimited.
An optional query timezone can be added. The query_timezone
will default to that of the workstation where this MCP server
is running, or Etc/UTC if that can't be determined. Not all
models support custom timezones.
The result of the query tool is the sql string.
Output:
The result of this tool is the raw SQL text.
```
## Reference

View File

@@ -37,17 +37,21 @@ tools:
kind: looker-query-url
source: looker-source
description: |
Query URL Tool
This tool generates a shareable URL for a Looker query, allowing users to
explore the query further within the Looker UI. It returns the generated URL,
along with the `query_id` and `slug`.
This tool is used to generate the URL of a query in Looker.
The user can then explore the query further inside Looker.
The tool also returns the query_id and slug. The parameters
are the same as the query tool with an additional vis_config
parameter.
Parameters:
All query parameters (e.g., `model_name`, `explore_name`, `fields`, `pivots`,
`filters`, `sorts`, `limit`, `query_timezone`) are the same as the `query` tool.
The vis_config is optional. If provided, it will be used to
control the default visualization for the query. Here are
some notes on making visualizations.
Additionally, it accepts an optional `vis_config` parameter:
- vis_config (optional): A JSON object that controls the default visualization
settings for the generated query.
vis_config Details:
The `vis_config` object supports a wide range of properties for various chart types.
Here are some notes on making visualizations.
### Cartesian Charts (Area, Bar, Column, Line, Scatter)

View File

@@ -41,38 +41,24 @@ tools:
kind: looker-query
source: looker-source
description: |
Query Tool
This tool runs a query against a LookML model and returns the results in JSON format.
This tool is used to run a query against the LookML model. The
model, explore, and fields list must be specified. Pivots,
filters and sorts are optional.
Required Parameters:
- model_name: The name of the LookML model (from `get_models`).
- explore_name: The name of the explore (from `get_explores`).
- fields: A list of field names (dimensions, measures, filters, or parameters) to include in the query.
The model can be found from the get_models tool. The explore
can be found from the get_explores tool passing in the model.
The fields can be found from the get_dimensions, get_measures,
get_filters, and get_parameters tools, passing in the model
and the explore.
Optional Parameters:
- pivots: A list of fields to pivot the results by. These fields must also be included in the `fields` list.
- filters: A map of filter expressions, e.g., `{"view.field": "value", "view.date": "7 days"}`.
- Do not quote field names.
- Use `not null` instead of `-NULL`.
- If a value contains a comma, enclose it in single quotes (e.g., "'New York, NY'").
- sorts: A list of fields to sort by, optionally including direction (e.g., `["view.field desc"]`).
- limit: Row limit (default 500). Use "-1" for unlimited.
- query_timezone: specific timezone for the query (e.g. `America/Los_Angeles`).
Provide a model_id and explore_name, then a list
of fields. Optionally a list of pivots can be provided.
The pivots must also be included in the fields list.
Filters are provided as a map of {"field.id": "condition",
"field.id2": "condition2", ...}. Do not put the field.id in
quotes. Filter expressions can be found at
https://cloud.google.com/looker/docs/filter-expressions.
If the condition is a string that contains a comma, use a second
set of quotes. For example, {"user.city": "'New York, NY'"}.
Sorts can be specified like [ "field.id desc 0" ].
An optional row limit can be added. If not provided the limit
will default to 500. "-1" can be specified for unlimited.
An optional query timezone can be added. The query_timezone
will default to that of the workstation where this MCP server
is running, or Etc/UTC if that can't be determined. Not all
models support custom timezones.
Note: Use `get_dimensions`, `get_measures`, `get_filters`, and `get_parameters` to find valid fields.
The result of the query tool is JSON.
```
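
Putting the filter and sort rules above together, a hypothetical set of query parameter values (the model, explore, and field names are made up) might look like:

```yaml
# Hypothetical query parameter values (illustrative only)
model_name: "ecommerce"                # from get_models
explore_name: "orders"                 # from get_explores
fields:
  - orders.created_date
  - users.city
  - orders.total_revenue
pivots:
  - users.city                         # pivots must also appear in fields
filters:
  orders.created_date: "90 days"
  users.city: "'New York, NY'"         # value contains a comma, so wrap it in single quotes
sorts:
  - "orders.total_revenue desc"
limit: 500
query_timezone: "America/Los_Angeles"
```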

View File

@@ -27,11 +27,15 @@ tools:
kind: looker-run-dashboard
source: looker-source
description: |
run_dashboard Tool
This tool executes the queries associated with each tile in a specified dashboard
and returns the aggregated data in a JSON structure.
This tool runs the query associated with each tile in a dashboard
and returns the data in a JSON structure. It accepts the dashboard_id
as the parameter.
Parameters:
- dashboard_id (required): The unique identifier of the dashboard to run,
typically obtained from the `get_dashboards` tool.
Output:
The data from all dashboard tiles is returned as a JSON object.
```
## Reference

View File

@@ -27,11 +27,15 @@ tools:
kind: looker-run-look
source: looker-source
description: |
run_look Tool
This tool executes the query associated with a saved Look and
returns the resulting data in a JSON structure.
This tool runs the query associated with a look and returns
the data in a JSON structure. It accepts the look_id as the
parameter.
Parameters:
- look_id (required): The unique identifier of the Look to run,
typically obtained from the `get_looks` tool.
Output:
The query results are returned as a JSON object.
```
## Reference

View File

@@ -27,13 +27,17 @@ tools:
kind: looker-update-project-file
source: looker-source
description: |
update_project_file Tool
This tool modifies the content of an existing LookML file within a specified project.
Given a project_id and a file path within the project, as well as the content
of a LookML file, this tool will modify the file within the project.
Prerequisite: The Looker session must be in Development Mode. Use `dev_mode: true` first.
This tool must be called after the dev_mode tool has changed the session to
dev mode.
Parameters:
- project_id (required): The unique ID of the LookML project.
- file_path (required): The exact path to the LookML file to modify within the project.
- content (required): The new, complete LookML content to overwrite the existing file.
Output:
A confirmation message upon successful file modification.
```
## Reference

Some files were not shown because too many files have changed in this diff.